Beware of AI voice scams: Woman loses Rs 1.4 lakh after fraudster mimics nephew's voice

A 59-year-old woman fell victim to a Rs 1.4 lakh AI voice scam, highlighting the rise of technologically sophisticated frauds, as the perpetrator convincingly imitated her nephew's voice.

Woman fooled by nephew's mimicked AI voice, loses Rs 1.4 Lakh
New Delhi, UPDATED: Nov 17, 2023 12:34 IST

Highlights

  • A 59-year-old woman loses Rs 1.4 lakh in an AI voice scam
  • The scammer replicated the voice of her nephew, who lives in Canada, asking for immediate financial help
  • Scammers use AI tools that mimic voices precisely by gathering data from the public domain

In a distressing incident, a 59-year-old woman fell victim to a sophisticated AI-generated voice scam, losing a staggering Rs 1.4 lakh. The fraudster mimicked her nephew's voice, creating a harrowing narrative that left the victim in shock. This unfortunate event sheds light on the increasing prevalence of technology-enabled frauds that are affecting unsuspecting individuals.

The mimicked call

Late one night, the woman, Prabhjyot (a pseudonym), received a call that seemed to be from her nephew in Canada. The caller, using advanced AI technology, replicated the nephew's voice flawlessly, speaking in their native Punjabi.

The fabricated narrative included details of an accident and impending legal trouble, creating a sense of urgency. The victim, trusting the caller, unknowingly initiated multiple money transfers to the fraudulent account.

Rise of AI voice frauds

Cybersecurity experts are alarmed at the growing prevalence of AI voice frauds, particularly targeting individuals with relatives in countries like Canada and Israel.

The scammers use AI tools that mimic voices precisely by gathering data from the public domain, including social media recordings and voice samples collected through sales calls placed by the fraudsters themselves. Fabricating distressing situations set in foreign countries adds to the effectiveness of these scams.

Cautionary measures

Prasad Patibandla, the Director of Operations at the Centre for Research on Cyber Intelligence and Digital Forensics, explained the intricate workings of AI voice scams. He emphasised that confirming the urgency of a situation before transferring funds is crucial.

Authorities, while acknowledging that AI-related frauds may not be extensively reported, urge the public to exercise vigilance and verify the legitimacy of distress calls, especially from relatives residing abroad.

A stark reminder for vigilance

The incident serves as a stark reminder for individuals to remain cautious and verify the authenticity of distress calls, especially requests for immediate financial aid purportedly from family members living abroad.

KVM Prasad, ACP of Cyber Crime in Hyderabad, highlighted the importance of scepticism in dealing with such situations. While AI voice frauds are still relatively few in number, the public is urged to stay vigilant and report any suspicious activity.

Published on: Nov 17, 2023 12:08 IST | Posted by: Samira Siddiqui
