Speaking with SBS News, James Roberts, general manager of group fraud management services at CBA, said scammers are now using deepfakes to impersonate people we might expect to hear from, such as celebrities, staff, friends and loved ones, a trend detailed in a new report by CBA.
“So what we’ve been talking about in the study is deepfake. So like you said, either video deepfakes or audio deepfakes,” he said.
“So, often these are impersonations of someone you trust, so it might be a celebrity, a public figure, maybe even a family member. And the content is often hosted on social media platforms, dating apps, and even messaging platforms.”
CBA said these scams are designed to seem human and exploit trust.
“Humans tend to trust faces, voices and familiar people. Deepfakes take advantage of that instinct,” said Monica Whitty, professor of human factors in cyber security at Monash University.
“Scammers are using AI to create fake investment videos, deepfake celebrities, and even voice and text clones of loved ones, senior executives and government officials. Talking openly about this technology is one of the easiest ways to help stay ahead of it,” Roberts said in the report.
The CBA study reveals that roughly 27 per cent of Australians say they have witnessed a deepfake scam in the past year, with investment scams being the most prevalent (59 per cent), followed by business email compromise (40 per cent) and relationship scams (38 per cent).
“So quite prevalent, but our expectations are that it should continue to grow as more and more criminals are also adopting this technology,” Roberts continued during his discussion with SBS.
Despite this prevalence, Australians are confident they could spot an AI-generated deepfake scam, with the report showing almost nine in 10 (89 per cent) feel they could identify one.
However, the study also found that Australians are dangerously underprepared when it comes to dealing with deepfake scams, with only 42 per cent familiar with AI-enhanced scams.
“The findings reveal a growing gap between confidence and reality – and that gap is exactly what scammers are looking to exploit as they increasingly turn to AI to target everyday Australians and small businesses,” Roberts said in the report.
Whitty added that a lack of open discussion about deepfake scams increased vulnerability.
“The data shows that many Australians don’t talk openly about deepfake scams – with only a third discussing AI-generated scams with their relatives or friends. That means fewer opportunities to share warning signs or learn from others’ experiences,” she said.
Australians also seem to know how to protect themselves, with 74 per cent agreeing they should set up a safe word with family members so they can verify who they are really speaking to. Despite this, only 20 per cent say they have actually set one up.
Fortunately, Roberts said, Australians do not need to change how they protect themselves against scams as quickly as the technology itself is changing.
“The good news is that the steps that keep people safe don’t need to evolve at the same speed as the technology does. Deepfakes might be new, but the same tried-and-tested habits – slowing down, checking details and speaking with someone you know and trust, such as a family member, remains your best defence – even against AI-powered scams,” he said.
These habits include not giving in to urgency, avoiding links in unsolicited messages, speaking with trusted people in person, looking for inconsistencies in emails and other messages, verifying senders and payment details before acting, and never sending money to romantic interests online.
Roberts added that, as tempting as it might be to get angry, the best response once a scammer has been identified is simply to disengage.
“While it probably is satisfying to cuss them out and have that engagement, it’s often best not to answer a scam call if it’s coming in,” Roberts told SBS.
“The more you answer it, they feel that maybe there’s an opportunity for scamming you, but if you’re never answering it, I think practically you end up dropping off their lists, and eventually they stop calling you.”
Daniel Croft