The Terrifying Deepfake Voice Scam That You Need To Be Prepared For

A new phone scam using an AI-generated clone of a loved one's voice is being reported around the world, and it's expected to reach Australia soon.

These emotive, high-quality voice replications claim there's an emergency, such as a kidnapping with a ransom to be paid.

Toby Murray, Associate Professor of Computing and Information Systems, says it takes only seconds of audio to produce a reasonable-quality clone of somebody's voice.

Murray told The Briefing how the scam works and what people can do to stay safe:

“Most people who are posting videos online have left more than enough audio to make a decent copy of their voices.”

The quality of the AI modelling is anticipated to improve over the next 12 months.

“If you can’t create a 100% convincing voice clone of someone with today’s technology, you probably will be able to by the end of the year.”

Tips to safeguard yourself include being aware of the scam, asking specific questions on the phone, and listening for audio artefacts.

To find more information about online scams or to make a report, visit the National Anti-Scam Centre's Scamwatch.

Subscribe to The Briefing, Australia’s fastest-growing news podcast on LiSTNR today. The Briefing serves up the latest news and deep dives on topics affecting you, all in under 20 minutes.