Arizona mom Jennifer DeStefano recounted a harrowing experience: an AI-powered scam phone call from a stranger who claimed to have abducted her 15-year-old daughter.
“I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano told AZFamily.com. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”
“Then I hear a man’s voice say, ‘Put your head back. Lie down,’ and I’m like, ‘Wait, what is going on?’”
“This man gets on the phone, and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her, and I’m going to drop her off in Mexico.’ And at that moment, I just started shaking. In the background, she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling.”
DeStefano said the incident was all the more terrifying because one of the voices on the call sounded exactly like her daughter’s.
“It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”
The man initially demanded a $1 million ransom. “I’m like, ‘I don’t have a million dollars. Just don’t hurt my daughter!’” she begged. “Then he wanted $50,000.”
DeStefano didn’t know it at the time, but she had fallen victim to a virtual kidnapping scam.
In this scam, criminals use deep learning technology to generate an AI clone of a loved one’s voice, then claim to have kidnapped that person and demand a ransom for their safe return.
“You can no longer trust your ears,” explained Arizona State University computer science professor Subbarao Kambhampati.
“In the beginning, it would require a larger amount of samples. Now there are ways in which you can do this with just three seconds of your voice. Three seconds. And with the three seconds, it can come close to how exactly you sound,” Kambhampati told AZFamily.
“Most of the voice cloning actually captures the inflection as well as the emotion. The larger the sample, the better off you are in capturing those,” the AI expert continued. “Obviously, if you spoke in your normal voice, I wouldn’t necessarily be able to clone how you might sound when you’re upset, but if I also had three seconds of your upset voice, then all bets are off.”
Despite her initial panic, DeStefano was eventually able to confirm that her daughter was safe, with the help of other parents.
More from AZFamily:
DeStefano kept him talking. She was at her other daughter’s dance studio, surrounded by worried moms who wanted to help. One called 911. Another called DeStefano’s husband. Within just four minutes, they confirmed her daughter was safe. “She was upstairs in her room going, ‘What? What’s going on?’” DeStefano said. “Then I get angry, obviously, with these guys. This is not something you play around with.”
[…]
DeStefano hung up the phone. That’s when the wave of relief washed over her. “I literally just sat down and broke down crying,” she said. They were tears for all of the what-ifs. It all just seemed so real.
The panicked mother’s story underscores the dark side of AI and highlights the need for awareness of these scams, which are playing out across the United States and even in Canada.
One Newfoundland woman was scammed out of $10,000 last month after receiving a call from someone she thought was her son.
Last year, an NBC New York correspondent reportedly fell victim to a similar scam in which strangers claimed to have kidnapped his mother and demanded a ransom.
One Arizona man recorded a scam call in which strangers claimed to have taken his daughter hostage; the call featured a voice that sounded like his daughter sobbing uncontrollably.
Speaking to AZFamily, FBI assistant special agent in charge Dan Mayo said scammers who use voice cloning often comb through social media posts to lift voice samples.
“You’ve got to keep that stuff locked down. The problem is, if you have it public, you’re allowing yourself to be scammed by people like this because they’re going to be looking for public profiles that have as much information as possible on you, and when they get a hold of that, they’re going to dig into you,” Mayo said.
As AI grows increasingly sophisticated, it’s important to be wary of the various ways it can be used to defraud you of your hard-earned money.
Source: News Wars. Rephrased by: InfoArmed