
‘I’ve got your daughter’: Arizona mom warns of close call with AI voice cloning scam

By Susan Campbell


    SCOTTSDALE, Arizona (KPHO, KTVK) — The phone number that appeared on the screen was unfamiliar. Jennifer DeStefano almost let it go to voicemail, but her 15-year-old was out of town skiing. Maybe there had been an accident. “I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano recalled. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

In a split second, DeStefano’s confusion turned to terror. “Then I hear a man’s voice say, ‘Put your head back. Lie down,’ and I’m like, ‘Wait, what is going on?’” DeStefano said. “This man gets on the phone, and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her, and I’m going to drop her off in Mexico.’ And at that moment, I just started shaking. In the background, she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling.”

There was no doubt in DeStefano’s mind. Her daughter was in trouble. “It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

But the 15-year-old never said any of it. The voice on the phone was just a clone created by artificial intelligence. “You can no longer trust your ears,” said Subbarao Kambhampati, a computer science professor at Arizona State University specializing in AI. He says voice cloning technology is rapidly improving. “In the beginning, it would require a larger amount of samples. Now there are ways in which you can do this with just three seconds of your voice. Three seconds. And with the three seconds, it can come close to how exactly you sound,” Kambhampati told On Your Side. “Most of the voice cloning actually captures the inflection as well as the emotion. The larger the sample, the better off you are in capturing those,” he said. “Obviously, if you spoke in your normal voice, I wouldn’t necessarily be able to clone how you might sound when you’re upset, but if I also had three seconds of your upset voice, then all bets are off.”

Deep learning technology currently has very little oversight, and according to Kambhampati, it is becoming easier to access and use. “It’s a new toy, and I think there could be good uses, but certainly, there can be pretty worrisome uses, too,” he said.

Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office, says scammers who use voice cloning technology often find their prey on social media. “You’ve got to keep that stuff locked down. The problem is, if you have it public, you’re allowing yourself to be scammed by people like this because they’re going to be looking for public profiles that have as much information as possible on you, and when they get a hold of that, they’re going to dig into you.”

According to the Federal Trade Commission, scammers will often ask victims to wire money, send cryptocurrency or pay the ransom with gift cards. Once the money or gift card numbers are transferred, getting them back is almost impossible. “Just think of the movies. Slow it down. Slow the person down. Ask a bunch of questions,” Mayo said. “If they have someone of interest to you, you’re going to know a lot of details about them that this scam artist isn’t going to know. You start asking questions about who it is and different details of their background that are not publicly available, you’re going to find out real quick that it’s a scam artist.”

There are other red flags. “If the phone number is coming from an area code that you’re not familiar with, that should be one red flag,” Mayo added. “Second red flag: international numbers. Sometimes they will call from those as well. The third red flag: they will not allow you to get off the phone and talk to your significant other. That’s a problem.”

The person who had supposedly kidnapped DeStefano’s daughter demanded money. He started at a million dollars. “I’m like, ‘I don’t have a million dollars. Just don’t hurt my daughter!’” she begged. “Then he wanted $50,000.”

DeStefano kept him talking. She was at her other daughter’s dance studio, surrounded by worried moms who wanted to help. One called 911. Another called DeStefano’s husband. Within just four minutes, they confirmed her daughter was safe. “She was upstairs in her room going, ‘What? What’s going on?’” DeStefano said. “Then I get angry, obviously, with these guys. This is not something you play around with.”

It’s unknown how many people have received similar scam calls about a family emergency or fake kidnapping using a voice clone. “It happens on a daily basis, some of which are reported, some of which are not. I think a lot of people are kind of decompressing when they realize that it was a fake scam and probably just happy that it didn’t happen to them. However, there are some people who give in to these, and they end up sending the money to these individuals,” Mayo said. “Trust me, the FBI is looking into these people, and we find them.”

DeStefano hung up the phone. That’s when the wave of relief washed over her. “I literally just sat down and broke down crying,” she said. They were tears for all of the what-ifs. It all just seemed so real.



CNN Newsource
