
The Deceptive World Of Voice Cloning

  • Kobe Wang
  • Apr 24
  • 4 min read

A graph showing the AI voice cloning market in the Asia Pacific, as well as future projections (via Grand View Research).


Imagine getting a phone call from your best friend. Their voice is unmistakable. They say they’re in trouble and urgently need money, so you don’t think twice and send it. But here’s the twist: it wasn’t them.


Welcome to the unsettling world of voice cloning, where artificial intelligence (AI) can mimic human voices with shocking accuracy. While AI brings many benefits, it also opens the door to new forms of scams, fraud, and deception.


Background


In today’s rapidly evolving world, artificial intelligence is reshaping nearly every aspect of life. While there are benefits, there is also a wide array of growing concerns, particularly around the misuse of AI in the form of deepfakes.


Voice cloning is a specific form of deepfake technology. With just a few seconds of audio, AI can analyze every part of a person’s speech, from their accent and tone to their breathing patterns. All it takes is a brief voice recording for a scammer to generate a voice that sounds nearly identical to the real person.


What is Voice Cloning?


Voice cloning works by feeding recorded audio into an AI model trained to analyze and replicate speech. Once the system learns how someone speaks, it can convert written text into synthetic speech that sounds just like the original speaker.
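
To make that two-step process concrete, here is a minimal Python sketch of how such a system is typically structured: a short recording is distilled into a compact "voiceprint," and a text-to-speech model conditioned on that voiceprint turns any written text into audio in the same voice. The function names and placeholder logic below are illustrative assumptions for this article, not a real voice-cloning library's API.

```python
import numpy as np

# Illustrative sketch only: these functions are hypothetical placeholders,
# not a real voice-cloning library.

def extract_speaker_embedding(audio: np.ndarray) -> np.ndarray:
    """Step 1: distill a short recording into a compact 'voiceprint'
    capturing accent, tone, and rhythm (placeholder logic)."""
    # Real systems use a trained neural encoder; here we just summarize the signal.
    return np.array([audio.mean(), audio.std()])

def synthesize_speech(text: str, voiceprint: np.ndarray,
                      sample_rate: int = 16_000) -> np.ndarray:
    """Step 2: a text-to-speech model conditioned on the voiceprint turns
    written text into audio in the cloned voice (placeholder logic)."""
    # Real systems generate a waveform; here we return silence of a plausible length.
    seconds_per_word = 0.4
    n_samples = int(len(text.split()) * seconds_per_word * sample_rate)
    return np.zeros(n_samples)

# A few seconds of "recorded" audio stands in for a scraped voice sample.
recording = np.random.randn(3 * 16_000)

voiceprint = extract_speaker_embedding(recording)
fake_audio = synthesize_speech("Hi, it's me. I'm in trouble and need money fast.", voiceprint)
print(f"Generated {len(fake_audio) / 16_000:.1f} seconds of synthetic speech.")
```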


While it can have positive uses, such as aiding those who’ve lost the ability to speak, helping in the entertainment industry, and offering translations, the darker side of voice cloning is beginning to outweigh the good.


How It’s Being Exploited


Cybercriminals and scammers have started to use voice cloning to create highly convincing fake calls. A typical scam begins with collecting voice samples, after which the voice is cloned with AI. Then, using the voice, the scammer fabricates a crisis or opportunity to pressure the target into sending money or sharing personal details. Because these calls often appear to come from trusted contacts, victims are more likely to fall for the scam.


Notorious Real-World Examples


  • United States (2024): A fake Joe Biden robocall urged voters not to participate in that year’s New Hampshire primary. The incident, which made national headlines, led to a $6 million fine for the political consultant involved and a Federal Communications Commission ruling that AI-generated voices in robocalls are illegal. There was also a disturbing fake-kidnapping case in which scammers used a teenager’s cloned voice to pressure her parents into meeting ransom demands.

  • United Arab Emirates: Criminals cloned the voice of a company director to orchestrate a $51 million heist.

  • India: A businessman in Mumbai received a convincing fake call from someone posing as a representative of the Indian Embassy in Dubai.

  • Australia: Scammers used a voice clone of Queensland Premier Steven Miles to promote Bitcoin investment schemes.


These cases show that no one’s voice is safe from being cloned, whether it belongs to a public figure, a business executive, or a private citizen.


U.S. Law


Legal protections for an individual’s voice have evolved over the years. Under U.S. common law, vocal imitations were generally not considered an infringement of a celebrity’s privacy or publicity rights until 1988, when Midler v. Ford Motor Co. established the rule that applies today: imitating a celebrity’s distinctive voice without permission for commercial purposes violates their right of publicity. This right protects a person’s name, image, persona, and voice, as well as other distinctive characteristics, from unauthorized exploitation by others.


How Widespread Is It?


  • In the United Kingdom, 28% of adults reported encountering voice cloning scams in 2023, and nearly half (46%) were unaware such scams even existed.

  • In Australia, 240,000 people reported being targeted by voice cloning scams in 2022 alone, with collective financial losses of 568 million AUD (roughly 362 million USD).


Lack of awareness and knowledge regarding new scamming methods has left millions vulnerable to this new form of fraud.


Reducing the Risk


What started as TikTok trends and viral celebrity impersonations for fun has morphed into a serious threat. To protect yourself, be skeptical of unexpected voice or video messages, even ones that sound authentic, and verify them through another channel before acting. Avoid posting detailed voice recordings online, learn how deepfakes and voice cloning work, and educate others, especially seniors and young people, who are often more susceptible to these scams.


Conclusion


The ability to replicate a voice was once the stuff of fairy tales; now it is an everyday reality with real dangers. Voice cloning has made it so hard to distinguish real from fake that hearing a familiar voice is no longer proof of identity.


Fortunately, awareness is the best defense. The more you know, the better you can protect yourself and others. As AI continues to advance, the question is not just whether technology can fool you, but whether you are prepared to spot the scam. The next time your phone rings and a familiar voice sounds suspicious, pause, question, and verify.
