The voice on the phone seemed terrifyingly real: an American mother heard her daughter sobbing before a man took over the line and demanded a ransom. But the voice was an AI clone, and the kidnapping was a hoax. According to experts, the most dangerous aspect of AI is its potential to blur the line between reality and fiction, giving cybercriminals a cheap and effective means of spreading misinformation.
Fraudsters are employing remarkably realistic AI voice cloning technologies — publicly available online — to steal from victims by impersonating family members in a new style of scam that has alarmed US officials.
“Help me, Mom, please help me,” Jennifer DeStefano, a mother from Arizona, heard on the other end of the line.
DeStefano was “100 percent” certain it was her 15-year-old daughter in deep distress while away on a skiing trip.
“It was never a question of who is this? It was completely her voice… it was the way she would have cried,” DeStefano told a local television station in April.
“I never doubted for one second it was her.”
The scammer who took over the call, which came from an unfamiliar number, demanded as much as $1 million. The horrific episode, which is now being investigated by authorities, underscored the potential for fraudsters to abuse AI voice clones.
“AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively,” Wasim Khaled, chief executive of Blackbird.AI, told AFP.
A simple internet search turns up a plethora of voice cloning tools, many of them free, that can create an AI voice from a brief sample – often just a few seconds – of a person’s real speech, which can easily be lifted from content posted online.
“With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls,” Khaled said.
“Scammers can employ different accents and genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes.”
In a global survey of 7,000 people across nine countries, including the US, one in four said they had experienced an AI voice cloning scam or knew someone who had, US-based McAfee Labs reported. Seventy percent of respondents said they were not sure they could “tell the difference between a cloned voice and the real thing.” US officials have warned of a rise in the “grandparent scam,” in which an imposter poses as a grandchild in desperate need of money.
“You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in deep trouble — he wrecked the car and landed in jail. But you can help by sending money,” the US Federal Trade Commission said in a warning in March.
“It sounds just like him. How could it be a scam? Voice cloning, that’s how.”
The comments section beneath the FTC’s warning was filled with accounts of elderly people who had been deceived that way.
AI voice scams are gaining traction and spreading rapidly
Eddie, a 19-year-old in Chicago, was caught up in a similar incident when his grandfather received a call from someone who sounded exactly like him, claiming he needed money after a car accident. The ruse, recounted by McAfee Labs, was so convincing that his grandfather began scrambling for money and even considered re-mortgaging his house before the lie was uncovered.
“Because it is now easy to generate highly realistic voice clones… nearly anyone with any online presence is vulnerable to an attack,” Hany Farid, a professor at the UC Berkeley School of Information, told AFP.
“These scams are gaining traction and spreading.”
ElevenLabs, an artificial intelligence startup, admitted earlier this year that its voice cloning tool could be misused for “malicious purposes” after users shared a deepfake audio clip purporting to be actor Emma Watson reading Adolf Hitler’s “Mein Kampf.”
“We’re fast approaching the point where you can’t trust the things that you see on the internet,” Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.
“We are going to need new technology to know if the person you think you’re talking to is actually the person you’re talking to,” he said.
How to protect yourself against AI voice scams?
You can quickly determine whether a caller is impersonating a loved one by hanging up and calling the person back on a number you know is theirs; it is very unlikely that the scammers can intercept that call.
Ask a challenging question that only your loved one would know the answer to (avoid questions that could be answered from information available online or on social media).
The FCC advises against answering calls from unknown numbers; let them go to voicemail instead.
However, the power of faked calls is that they make it seem as though the caller is someone you know.
If at all possible, avoid posting your phone number online or on social media, and urge family members to do the same.