Editor’s note: This article is available in Spanish here.
When Will Trapp’s parents called their son in a panic, convinced he had been in a horrible accident, he was at home studying.
Trapp, an interdisciplinary studies sophomore, was unaware that his family had just heard his panicked voice over the phone. On the call, Trapp's "voice" was followed by a "police officer" who told his parents that their son had been in a gory car accident that left a pregnant woman in the other car injured.
Trapp’s parents then said that a so-called “public defender” instructed them to withdraw over $15,000 to cover their son’s bail at the San Luis Obispo jail. A courier would come to their home to pick up the money, the public defender said.
“It was really scary hearing my mom like that,” Trapp said. “I was pretty troubled by it in the week following.”
Trapp’s parents were not wrong in identifying the audio as their son’s voice. But the car accident never happened, and they never actually spoke to a police officer or attorney.
The Trapp family nearly fell victim to a type of scam that is becoming increasingly common, which involves AI technology called “voice cloning.”
Cal Poly computer engineering professor Foaad Khosmood explained that the technology relies on a combination of voice recordings from the individual and a large catalog of learned information. Depending on the quality of the model the scammers used, the phone call the Trapp family received could have required only a few minutes of speech, according to Khosmood.
“You can get that from anywhere. People could be recording you just walking down the street. They could be Zoom recordings that you don’t know about,” he said. “Those companies that call and say it’s a tech support thing and record for quality, who knows what they’re going to do with it?”
However, Khosmood cautions people against becoming too paranoid about this technology, saying there are warning signs of voice cloning scams.
“They might be able to imitate your voice, but they can’t imitate your style,” Khosmood said. “It’s all about how you say something. People’s vocabularies are very different.”
He also encourages people to ask questions that the actual person would know, as well as call the individual at their normal phone number, as the Trapp family did.
Computer science senior Cameron Hardy believes it is important for the general public to become more aware of the potential of voice cloning technology.
“Once people are aware of it, I think it’s not as likely to work on everyone,” Hardy said. “I think the people that are most likely to be vulnerable to this, like a lot of other scams, are the older generations that aren’t as tech-savvy.”
Looking ahead, Trapp said his family has made some safety changes after this experience.
“My family and I came up with a code word they could ask the ‘fake me.’ That would have solved things a lot quicker,” Trapp said. “I just feel more aware and prepared.”