What are AI voice cloning scams? How to stay safe from them? Explained

Of late, voice cloning, a by-product of AI technology, has become a tool for fraudsters to target and exploit innocent people. Hence, let us understand what AI voice cloning scams are and how to prevent them.

Explainer on AI voice cloning scams

Highlights

  • Voice cloning technology allows one to create almost perfect replicas of any voice
  • Scammers are using strikingly convincing AI voice cloning tools to impersonate somebody’s voice and extract information and funds from victims

Ever since OpenAI’s ChatGPT made its debut, both the pros and cons of AI have been in the limelight. Artificial intelligence has the dangerous potential to blur the boundaries between reality and fiction. Voice cloning, a by-product of AI technology, has nowadays become a tool for fraudsters to target and exploit innocent people.

In fact, generative AI has lowered the bar for cybercriminals looking to clone someone's voice and misuse it for their own ends.

What is AI voice cloning?

To understand AI voice cloning scams, let us first understand what exactly voice cloning is. As the name suggests, voice cloning technology allows one to create a near-perfect replica of any voice by capturing its unique characteristics from a short audio sample.

Interestingly, a simple internet search yields a variety of apps, many of them freely available, for creating AI voices from a small sample of the victim’s real voice, sometimes lasting only a few seconds, which can easily be lifted from content posted online. Several AI tools and applications on the market can be used to clone a voice, for instance, Baidu Deep Voice, Lyrebird, VocaliD, iSpeech, and Modulate, among others.

In the new breed of scams, scammers are using strikingly convincing AI voice cloning tools to impersonate somebody’s voice and extract information and funds from victims more effectively.

Moreover, scammers can reproduce varied accents and genders, or even mimic the speech patterns of people the victim knows, allowing the creation of convincing deepfakes. Researchers say cybercriminals now need as little as three seconds of someone's voice to clone it successfully and use it in a scam call.

These days, scammers prey on potential victims' empathy, invoking urgent and distressing situations to manipulate them into sending money.

As per a recent survey conducted by the US-based McAfee Labs, 70 percent of participants said they could not tell the difference between a real voice and a cloned one.

Red flags to watch out for in AI voice cloning scams

  • Urgent demand for money
  • Callers asking you to wire money, buy gift cards, send cryptocurrency, or pay using other untraceable methods
  • Inconsistencies in the information given

How to protect yourself from AI voice cloning scams

  • Be cautious when sharing content online
  • Delete personal data from data broker sites, which can otherwise be a source of information for cybercriminals
  • Be skeptical when receiving unexpected phone calls or messages
  • Never click on links from unknown sources
  • Block unwanted phone calls and spam text messages
  • If you suspect you are being scammed, report it to the authorities immediately