The Cyber Crime Wing of the Tamil Nadu Police has issued an advisory about a new impersonation scam that uses artificial intelligence (AI) voice cloning, and has urged the public to exercise caution when receiving unsolicited calls on their mobile phones. According to the police, cyber fraudsters have begun using advanced AI technology to clone the voices of trusted individuals, including family members, in phone calls. The scam exploits victims' trust by creating a sense of urgency or distress to pressure them into transferring money quickly.
The advisory highlights the increasing sophistication of cybercrime and underscores the need for awareness and caution so that residents do not fall victim to such fraudulent schemes. The Cyber Crime Wing explained that the scam begins with a phone call from a scammer pretending to be someone the victim knows and trusts, such as a family member or friend, who claims to be in urgent need of financial assistance because of a fabricated emergency or threat. To heighten the victim's sense of urgency and emotional distress, the scammer uses tactics such as sobbing or pleading tones, claiming to be in a dire situation that requires immediate help.
Behind the scenes, the scammer utilizes advanced AI software to clone the voice of the person they are impersonating. They obtain a voice sample of the targeted individual from social media posts, videos, or even by speaking to the person on the phone using a ‘wrong number’ tactic. This technology enables them to convincingly mimic not only the voice but also the intonation and emotional nuances of the victim’s trusted contact.
Sanjay Kumar, Additional Director General of Police (ADGP) of the Cyber Crime Wing, summed up the scam as the use of AI-generated voice clones to commit cybercrime. Once the scammer has established a sense of urgency and trust, they ask the victim to transfer money immediately to resolve the supposed crisis, often suggesting fast and convenient payment methods such as the Unified Payments Interface (UPI) to expedite the transaction. Driven by concern and a desire to help their loved one, the victim may comply with the scammer's demands without verifying the authenticity of the caller or the legitimacy of the situation.
After transferring the money, victims often discover the deception only when they independently contact the family member or friend and learn that the person was never in distress or in need of financial assistance. As a result, victims suffer financial losses along with feelings of betrayal, violation, and emotional distress, according to the police.
Sanjay Kumar urged the public to always verify the identity of the person calling, especially if they request urgent financial assistance. He advised asking probing questions or contacting the friend/relative through a known and verified number to confirm their identity before taking any action. He also stressed the importance of staying informed about common scams, including this voice cloning fraud, and learning to recognize warning signs. Individuals should be cautious of unexpected requests for money, particularly those involving urgent situations or emotional manipulation.
Anyone who suspects they have fallen victim to such fraud, or who encounters any suspicious activity, is urged to report the incident to the Cyber Crime Toll-Free Helpline 1930 or lodge a complaint on www.cybercrime.gov.in.