Delhi Elderly Woman Loses Lakhs to AI Voice Clone Scam: ‘Save Me, Mom…


A Sophisticated Scam: AI-Generated Voice Clones Target Vulnerable Individuals

A recent incident in Delhi highlights the growing threat of artificial intelligence-driven cybercrime, where a 65-year-old woman was duped out of ₹2 lakh by scammers using a cloned version of her daughter’s voice.

The Scam

Using the cloned voice and feigning distress and urgency, the caller pressured the victim into transferring the money immediately. The family later confirmed that the daughter was safe at home and that the call was entirely fabricated.

The Modus Operandi

This case reflects a rapidly emerging pattern in which cybercriminals exploit short voice samples from social media to create convincing voice replicas. By posing as relatives in distress, they manipulate emotional panic and push victims into making instant digital payments.

According to cyber experts, fraudsters typically scan social media profiles to gather voice samples, which they then use to create fake calls. To add credibility, they incorporate crying sounds, background noise, and phrases that create a sense of urgency, leaving victims with little time to verify the authenticity of the call.

Vulnerable Targets

Elderly individuals and women living alone are disproportionately targeted by these scammers, who depend on emotional manipulation more than on technical sophistication. Projections suggest that AI-enabled fraud cases could rise by around 40% in 2026, driven by the widespread availability of voice-cloning tools and growing social media exposure.

A Global Problem

This trend is not limited to India; similar scams have been reported in the United States, where people aged 60 and above lost billions of dollars to fraud in 2024.

Prevention and Mitigation

Because AI has made scam calls dramatically more believable, recognising a familiar voice is no longer a reliable safeguard. Authorities now recommend multi-step verification before responding to any emergency request involving money, and cybersecurity teams stress that any urgent financial demand over the phone should be treated as suspicious until independently verified. Recommended precautions include:

  • Agreeing on a family code word, and trusting only callers who can provide it
  • Calling the family member back on their known number
  • Asking personal questions that only the real person could answer
  • Tightening social media privacy settings to limit voice and video exposure
  • Never sharing banking credentials over a call
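
For technically inclined readers, the checklist above can be summarised as a simple decision flow. The Python sketch below is purely illustrative: every name and field in it is hypothetical, it models a human checklist rather than any real telephony or banking API, and it simply encodes the rule that a convincing voice alone is never sufficient proof.

```python
from dataclasses import dataclass

@dataclass
class EmergencyCall:
    # All fields are hypothetical; they model the checklist above,
    # not data from any real telephony or banking system.
    code_word_given: str              # word the caller provided
    family_code_word: str             # word the family agreed on
    verified_via_known_number: bool   # did you hang up and call back on a saved number?
    answered_personal_question: bool  # something only the real person could answer
    independently_verified: bool      # confirmed through a second channel

def should_act_on_emergency_call(call: EmergencyCall) -> bool:
    """Return True only if every verification step passes."""
    if call.code_word_given != call.family_code_word:
        return False  # wrong or missing code word
    if not call.verified_via_known_number:
        return False  # never trust the incoming number alone
    if not call.answered_personal_question:
        return False
    # Urgent money requests stay suspicious until independently verified.
    return call.independently_verified

# A cloned voice with none of the checks satisfied is rejected.
suspicious = EmergencyCall("", "bluebird", False, False, False)
print(should_act_on_emergency_call(suspicious))  # False
```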

Reporting and Support

Victims of such scams are advised to immediately report suspected fraud to the cybercrime helpline 1930, as early reporting increases the chances of freezing and recovering the transferred amount. The impact of these scams extends beyond monetary damage, as elderly victims often experience guilt, anxiety, and long-term fear of phone calls.

Families are urged to maintain regular contact with senior members and raise digital awareness to reduce vulnerability.

A Call to Action

As AI tools become increasingly sophisticated and accessible, cybercriminal tactics are evolving at an unprecedented pace. Authorities emphasize that verification before payment must become a non-negotiable rule, as voice alone can no longer be considered proof of identity in emergency situations.


