Experts warn of rise in scammers using AI to mimic voices of loved ones in distress
Jennifer DeStefano answered a phone call from an unknown number earlier this year and was horrified to hear what sounded exactly like the panicked voice of her oldest daughter, Briana, begging to be saved from kidnappers.
“I hear her saying, ‘Mom, these bad men have me. Help me. Help me. Help me.’ And even just saying it just gives me chills,” DeStefano told ABC News.
In reality, Briana was safe and sound. Scammers had allegedly used artificial intelligence to mimic Briana's voice to try to extort money from her terrified family.
The incident is just one example of an alarming trend. Check Point Technologies, one of the largest cybersecurity firms in the country, says it has seen a substantial increase in AI-based scams and attacks in just the last year. Phone and cyber scams cost Americans approximately $10 billion in 2022, according to the FBI's Internet Crime Complaint Center.
When DeStefano got that disturbing call on Jan. 20, her 15-year-old daughter Briana had been away on a ski trip.
“This man gets on and he says, ‘Listen here. I have your daughter. If you call anybody, you call the police, I'm gonna pump your daughter so full of drugs, I'm gonna have my way with her, I'm gonna drop her in Mexico, and you're never gonna see your daughter again,’” DeStefano said.
Then, DeStefano says, the scammer asked for $1 million.
“That’s when I went into panic mode. And I just opened up the door, put the phone on mute, and started screaming for help,” DeStefano said.
A nearby acquaintance overheard the commotion and called 911. The dispatcher said it sounded like DeStefano was being targeted by a common scam and asked whether she had spoken to her daughter directly.
DeStefano finally got through to her husband, who was also on the ski trip, and he confirmed that Briana was OK.
But some questions remained unanswered: who, or what, was actually on the other end of the line, and how were they able to impersonate Briana well enough to fool her own mother?
Experts warn that just a few seconds of audio pulled from social media can give scammers all they need to recreate someone's voice using artificial intelligence.
Reports of the elaborate scam have increased in recent months. In May, a Texas man told "Good Morning America" that his father was scammed out of $1,000 after receiving a distressed call from someone impersonating the older man's grandson, Christian, who claimed he had gotten into trouble in Mexico and needed money to get out of the situation.
Pete Nicoletti of Check Point Technologies advises that all family members adopt a "safe word" they can use to verify the identity of a loved one who has supposedly been kidnapped.
Former FBI special agent and ABC News contributor Rich Frankel says this kind of cybercrime is hard to stop. He recommends recording any type of suspicious call and then trying to reach loved ones directly.
“I would call law enforcement right away, because if it is a real kidnapping, you want law enforcement involved. And if it's a scam, you wanna know about it right away,” Frankel said.
DeStefano is now trying to protect others from falling victim to this type of scam, testifying at a Senate Judiciary subcommittee hearing on AI and human rights in June.
“Is this the future we are creating by enabling the use of artificial intelligence without consequence and without regulation?” DeStefano asked lawmakers.