cyber daily logo


AI-based voice cloning is making scams even more effective and harder to detect

Deepfakes and artificial intelligence voice cloning are proving to be powerful new tools in the arsenal of scammers, according to a new report from Recorded Future.

David Hollingworth
Mon, 22 May 2023

What makes these tools so powerful is both their effectiveness at what they do and their accessibility. There are many such tools available, but one of the most well-known and easy-to-use ones is Prime Voice AI, which is browser-based and can convert text to speech for free — even with custom voice samples uploaded by the user.

Platforms such as this allow scammers to run simple impersonation scams even if they are not English speakers, while more skilled operators can mount considerably more sophisticated campaigns.

What most voice-cloning scammers are doing is using the technology to create “one-time samples” that are then used in disinformation and extortion scams or to impersonate executives and other high-level individuals. There are limits to the tech, though, especially when it comes to faking real-time conversations. Generating voice samples in non-English languages is also currently limited.

Other threat actors have decided to create their own voice-cloning tools and are selling access to them on Telegram or are operating voice-cloning-as-a-service sites.

“When it comes to deepfakes involving voice cloning, which, based on our last report, is an area of deepfakes that is growing and spreading fast, it is important Australian organisations consider ramping up their voice-cloning detection and real-time voice analysis capabilities, as well as look at anti-spoofing technologies such as ‘liveness’ detection,” said Alexander Leslie, threat intelligence analyst at Recorded Future’s Insikt Group. “They should also consider investing in call pattern monitoring capabilities, and setting up authentication protocols such as two-factor, biometrics, or, in the case of callback scams, call authentication protocols.”

“Technology isn’t the only way to fight against deepfake scams; training and education have a great role to play too. I advise Australian organisations to train their employees on the risks associated with voice cloning and how to identify suspicious activity related to voice-cloning attacks, including executive impersonations.”

You can read Recorded Future’s full report, I Have No Mouth, And I Must Do Crime, here.

David Hollingworth

David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.
