A 68-year-old businessman from Powai, Mumbai, was scammed out of Rs80,000 in an AI voice cloning scam. The victim received a call from someone impersonating the Indian Embassy in Dubai, informing him that his son had been arrested and needed bail money. The fraudster even played a recorded voice that resembled his son's actual voice. The shocked businessman was coerced into transferring the money immediately, fearing his son could face life imprisonment if he didn't comply.
The fraudster instructed the victim to transfer the money using GPay, a service the victim wasn't familiar with, so he asked his office staff to make the transfer. Once the money was transferred, the fraudster abruptly ended the call. Only after contacting his son in Dubai did the victim discover he had been scammed: his son was safe and had never been detained.
This AI voice cloning scam follows a now-familiar pattern in which scammers use advanced technology to manipulate and deceive unsuspecting victims. The scammer uses AI to clone the voice of a victim's family member, creating a sense of urgency and panic. Social media platforms are mined for the personal information and voice samples needed for the cloning process.
In another case, on April 2, a professor from NMIMS College in Mumbai fell victim to a similar fraud scheme, losing Rs1 lakh. She received a call from someone claiming to be an inspector from the Mumbai police, falsely stating that her son had been detained. The fraudster, armed with the victim's profile and family details obtained from social media, coerced her into transferring the money to keep her son out of jail.
Phone calls are not usually the direct source of phone hacks, but text messages can carry malware links that lead to phishing attacks or other scams. As the Mumbai businessman's case demonstrates, scammers are now also weaponizing AI to manipulate victims over voice calls. AI voice cloning scams are becoming more prevalent and sophisticated, making it crucial for individuals to remain cautious when dealing with unknown callers.
Ritesh Bhatia, a cyber expert, warns about the proliferation of AI voice cloning scams and advises the public to exercise vigilance. He suggests verifying any distressing claim by contacting the family member supposedly involved, or the relevant authorities, directly. Reporting suspicious numbers to the police is also crucial in combating these scams.