In a troubling incident, a resident of Lucknow fell victim to a cyberthug who used artificial intelligence to impersonate the voice of the victim’s relative.
The fraudster manipulated the victim, Kartikeya, into transferring Rs 45,000 under the guise of a financial transaction and also duped him into sharing his bank account credentials.
Lucknow police are investigating the case.
Kartikeya, a resident of Vineet Khand, which falls under the Gomtinagar police station jurisdiction, received a call from an unknown number in which the caller claimed to be his maternal uncle.
The impersonator explained that he was in the process of transferring Rs 90,000 to an acquaintance but was facing issues completing the transaction via UPI.
The fraudster then instructed Kartikeya to send Rs 45,000 to his account, ostensibly to work around the UPI problem. Trusting the caller, Kartikeya complied and transferred the amount.
Soon after, Kartikeya received several SMSes stating that two sums of Rs 10,000, one of Rs 30,000, and one of Rs 40,000 had been credited to his account. Believing he had been repaid, Kartikeya only later discovered that the funds had never actually been credited.
Fortunately, multiple transactions failed, limiting the potential loss to Rs 44,500. Realizing the fraud, Kartikeya promptly reported the incident to the police. An FIR has been filed, and an investigation is underway to apprehend the perpetrator.
The second such case in two weeks
This incident in Lucknow follows a similar case in Delhi, where cybercriminals used AI-based voice cloning to extort money from an elderly man, Lakshmi Chand Chawla, of Yamuna Vihar.
In this instance, scammers impersonated the voice of the victim’s cousin’s son, claiming the boy had been kidnapped and coercing the victim into transferring Rs 50,000 via Paytm.
The criminals created a realistic voice recording of the child, leveraging AI voice cloning technology to deceive the victim. When Chawla contacted the family to ask about the kidnapping, he was told there had been no such incident; only then did he realize he had been duped.
AI voice clones are on the rise
While this phenomenon is relatively new in India, scammers elsewhere have used AI-cloned voices to dupe people for over a year. One of the first reported cases involved a woman in Arizona, US, who received a call claiming that her daughter had been kidnapped. Fortunately, the woman realized it was a scam before any money changed hands.
In light of this new scam, here are a few things to keep in mind to protect yourself.
Check the phone number thoroughly before answering a call: Many such scam calls reportedly originate from numbers in Vietnam, Sudan, and Malaysia, so unless you regularly receive legitimate calls from these regions, avoid answering them. More generally, avoid picking up calls from unknown numbers if you can.
Screen your calls: There are plenty of ways to screen calls from unknown numbers. Some phones now come with AI assistants that answer a call for you, filter it, and then let you decide whether to pick up; if you have that option, use it. Otherwise, you can always text the number saying that you can’t take calls right now and asking the caller to message you on WhatsApp or Telegram instead.
Be wary of what links you click on: Be mindful of the links you receive, especially from numbers not saved in your contact list. Scammers count on their victims clicking these links, which can plant malware on your phone and send sensitive information back to the attackers.