
The dangers of voice cloning and how to combat it

The rapid development of artificial intelligence (AI) has brought both benefits and risks.

Authors


  • Leo S.F. Lin

    Senior Lecturer in Policing Studies, Charles Sturt University


  • Duane Aslett

    Senior Lecturer in Policing Studies, Charles Sturt University


  • Geberew Tulu Mekonnen

    Lecturer, School of Policing Studies, Charles Sturt University


  • Mladen Zecevic

Lecturer, School of Policing Studies, Charles Sturt University

One concerning trend is the misuse of voice cloning. In seconds, scammers can clone a voice and trick people into thinking a friend or a family member urgently needs money.

News outlets warn these types of scams have the potential to affect millions of people.

As technology makes it easier for criminals to invade our personal spaces, staying cautious about its use is more important than ever.

What is voice cloning?

The rise of AI has created possibilities for image, text and voice generation, as well as machine learning.

While AI offers many benefits, it also provides fraudsters new methods to exploit individuals for money.

You may have heard of “deepfakes”, where AI is used to create fake images, videos and even audio, often involving celebrities or politicians.

Voice cloning, a type of deepfake technology, creates a digital replica of a person’s voice by capturing their speech patterns, accent and breathing from brief audio samples.

Once the speech pattern is captured, an AI voice generator can convert text input into highly realistic speech resembling the targeted person’s voice.

With advancing technology, voice cloning can be accomplished with just a few seconds of audio.

While a simple phrase like “hello, is anyone there?” can lead to a voice cloning scam, a longer conversation helps scammers capture more vocal details. It is therefore best to keep calls brief until you are sure of the caller’s identity.

Voice cloning has valuable applications in entertainment and health care, enabling remote voice work for artists and assisting people with speech disabilities.

However, it raises serious privacy and security concerns, underscoring the need for safeguards.

How it’s being exploited by criminals

Cybercriminals exploit voice cloning technology to impersonate celebrities, authorities or ordinary people for fraud.

They create urgency, gain the victim’s trust and request money via gift cards, wire transfers or cryptocurrency.

The process begins by collecting audio samples from sources such as YouTube and TikTok.

Next, the technology analyses the audio to generate new recordings.

Once the voice is cloned, it can be used in deceptive communications, often paired with caller ID spoofing to appear trustworthy.

Many voice cloning scam cases have made headlines.

For example, criminals in the United Arab Emirates used a cloned voice to orchestrate a $A51 million heist.

Another victim fell prey to a voice cloning scam involving a fake call from the Indian Embassy in Dubai.

In Australia recently, scammers used a voice clone of a well-known public figure to try to trick people into investing in Bitcoin.

Teenagers and children are also targeted. In one case in the United States, a teenager’s voice was cloned and her parents were manipulated into complying with the scammers’ demands.

How widespread is it?

Recent research shows 28% of adults in the United Kingdom faced voice cloning scams last year, with 46% unaware this type of scam existed.

It highlights a significant knowledge gap, leaving millions at risk of fraud.

In 2022, almost 240,000 Australians reported being victims of voice cloning scams, suffering significant financial losses.

How people and organisations can safeguard against it

The risks posed by voice cloning require a multi-pronged response.

People and organisations can implement several measures to safeguard against the misuse of voice cloning technology.

First, public education and awareness campaigns can help protect people and organisations and mitigate these types of fraud.

Public-private collaboration can provide clear information and consent options for voice cloning.

Second, people and organisations should look to use liveness detection, a new technology that can recognise and verify a live voice as opposed to a fake. And organisations using voice recognition should consider adopting multi-factor authentication.
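The multi-factor idea above can be illustrated with a minimal Python sketch. This is a hypothetical policy, not any vendor’s API: the `voice_match` flag stands in for whatever a real voice-biometric system returns, and the point is that a voice match alone never authenticates a caller; a one-time code delivered over a separate channel is also required.

```python
import hmac
import secrets

def issue_otp() -> str:
    """Generate a six-digit one-time code, to be delivered over a
    separate channel (e.g. an authenticator app or SMS)."""
    return f"{secrets.randbelow(10**6):06d}"

def verify_caller(voice_match: bool, supplied_otp: str, expected_otp: str) -> bool:
    """Hypothetical policy: voice biometrics are one factor, never the
    only factor. The code comparison is constant-time to avoid
    leaking information through timing."""
    otp_ok = hmac.compare_digest(supplied_otp, expected_otp)
    return voice_match and otp_ok

otp = issue_otp()  # sent out-of-band to the genuine customer
# A convincing cloned voice (voice_match=True) still fails without the code:
assert verify_caller(True, supplied_otp="??????", expected_otp=otp) is False
assert verify_caller(True, supplied_otp=otp, expected_otp=otp) is True
```

The design choice here is simply that the two factors are independent: a scammer who has cloned a voice from social-media audio still cannot produce a code that only the real customer received.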

Third, enhancing investigative capability against voice cloning is another crucial measure for law enforcement.

Finally, robust legal and regulatory frameworks are needed for managing the associated risks.

Australian law enforcement recognises the potential benefits of AI.

Yet concerns about the “dark side” of this technology have prompted calls for research into the criminal use of AI.

There are also calls for possible intervention strategies that law enforcement could use to combat this problem.

Such efforts should connect with the broader national plan to combat cybercrime, which focuses on proactive, reactive and restorative strategies.

That national plan stipulates a duty of care for service providers, reflected in the Australian government’s proposed new legislation to safeguard the public and small businesses.

The legislation would impose new obligations on regulated organisations such as telcos, banks and digital platform providers to prevent, detect, report and disrupt scams, with the goal of protecting customers.

Reducing the risk

Given the substantial estimated cost of cybercrime to the Australian economy, public awareness and strong safeguards are essential.

Countries like Australia are recognising the growing risk. The effectiveness of measures against voice cloning and other frauds depends on their adaptability, cost, feasibility and regulatory compliance.

All stakeholders – government, citizens, and law enforcement – must stay vigilant and raise public awareness to reduce the risk of victimisation.

The Conversation
