Scammers use 15-second clip to create AI voice clone, nearly dupe lawyer’s father out of $30,000
An AI-generated voice nearly tricked a lawyer’s father into paying $30,000 in fake bail money.
Consumer protection attorney Jay Shooster shared on X that scammers used a voice clone to impersonate him, claiming he’d been arrested after a drunk driving accident.
Shooster believes the scammers may have used a 15-second clip of his voice from a recent TV appearance to create the fake. Even though Shooster had warned his family about such scams, they almost fell for it.
“That’s how effective these scams are. Please spread the word to your friends and family,” Shooster said. He called for tighter regulation of the AI industry.
Study shows humans can’t reliably recognize AI voices
A University College London study found that listeners failed to identify AI-generated voices 27% of the time, whether the samples were in English or Mandarin. Repeated listening did not significantly improve detection rates.
In theory, then, roughly one in four phone scams involving fake AI voices could succeed. The researchers stress the need for better automated deepfake detectors, since human intuition has its limits.
In a separate experiment, IBM security experts demonstrated “audio-jacking” – using AI to manipulate live phone calls and divert money to fake accounts. The researchers combined speech recognition, text generation, and voice cloning techniques. They warn that as the technology advances, it poses a growing risk to consumers. Future attacks could potentially manipulate live video calls as well.
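IBM has not published its attack code, but the pipeline it describes is simple enough to sketch. The minimal Python sketch below illustrates the idea under stated assumptions: the transcribe, rewrite, and synthesize functions are hypothetical stubs standing in for off-the-shelf speech-recognition, text-generation, and voice-cloning models, and the account numbers are invented for illustration.

```python
import re

# Conceptual sketch of the "audio-jacking" pipeline IBM described:
# intercept live call audio, transcribe it, rewrite sensitive details
# with a text generator, then re-speak the altered sentence in a
# cloned voice. All three stages are placeholder stubs; a real attack
# would plug in actual ASR, LLM, and TTS models.

ATTACKER_ACCOUNT = "99-8765-4321"  # invented for illustration

def transcribe(audio_chunk: bytes) -> str:
    """Stub for the speech-recognition stage."""
    return "Please wire the funds to account 12-3456-7890."

def rewrite(text: str) -> str:
    """Stub for the text-generation stage: swap any account number
    the speaker mentions for the attacker's account."""
    return re.sub(r"\d{2}-\d{4}-\d{4}", ATTACKER_ACCOUNT, text)

def synthesize(text: str) -> bytes:
    """Stub for the voice-cloning stage speaking in the caller's voice."""
    return text.encode()  # placeholder: real output would be audio

def audio_jack(audio_chunk: bytes) -> bytes:
    """Man-in-the-middle step applied to each chunk of call audio."""
    heard = transcribe(audio_chunk)
    altered = rewrite(heard)
    if altered != heard:   # only re-synthesize when something changed
        return synthesize(altered)
    return audio_chunk     # pass unmodified speech straight through

if __name__ == "__main__":
    print(transcribe(b""))           # what the caller actually said
    print(audio_jack(b"").decode())  # what the other party would hear
```

The conditional re-synthesis is the crux: most of the call passes through untouched, and a cloned voice is only heard for the one sentence that was altered, which is why such a manipulation would be hard for either party to notice.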
While voice cloning has some positive uses, such as preserving the voices of people with disabilities, the risks currently seem to outweigh the benefits.
It’s not just about fraud. Voice actress Amelia Tyler recently reported that an AI clone of her voice had been used, without her consent, to read rape pornography aloud. Tyler, nominated for a BAFTA for her role in “Baldur’s Gate 3”, heard the clone during a livestream after viewers typed text into a voice generator.