Woman Loses Rs 1.4 Lakh in AI Voice Scam, Caller Mimics Nephew

Hyderabad: A 59-year-old woman from Hyderabad lost Rs 1.4 lakh in an AI voice scam. The caller convincingly mimicked her nephew in Canada, claiming to be in urgent distress and seeking immediate financial help.

According to TOI, the scam took place late at night, when Prabhjyot received a call that appeared to be from her nephew in Canada. The caller skillfully mimicked her nephew’s voice, speaking fluent Punjabi and claiming to be in a critical situation involving an accident and imminent legal trouble. Describing the encounter, Prabhjyot said, “He sounded just like my nephew and spoke exactly in the Punjabi we speak at home with all the nuances. He called me late in the night and said he had an accident and was about to be jailed. He requested me to transfer money and keep this conversation a secret.”

Prabhjyot realised the call was fraudulent only after she had already made several money transfers to the account provided by the caller.

Expressing concern over the growing prevalence of AI voice frauds, cybersecurity experts have issued warnings, noting that people with relatives in countries like Canada and Israel are particularly vulnerable to such scams. Prasad Patibandla, Director of Operations at the Centre for Research on Cyber Intelligence and Digital Forensics (CRCIDF) in Delhi, outlined how these scams work. “AI voice imitating tools can mimic a person’s voice precisely by utilizing data available in the public domain, such as social media recordings or even sales calls made by fraudsters. Creating a sense of urgency by fabricating a distressed situation in a foreign country adds to the effectiveness of these scams,” Patibandla explained.

While AI-related fraud cases may not be widely reported at cybercrime police stations, authorities strongly advise the public to stay vigilant and verify distress calls that appear to come from relatives living abroad. KVM Prasad, ACP of Cyber Crime in Hyderabad, emphasized the importance of skepticism: “AI voice frauds are occurring, albeit in fewer numbers. It’s crucial for people to confirm the urgency of the situation before transferring any funds.”
