Jennifer DeStefano, a mother from Arizona, claims that scammers used AI technology to clone her daughter's voice in a $1 million kidnapping scheme. She received a call from someone claiming to have kidnapped her 15-year-old daughter while the girl was on a ski trip. The caller used an AI simulation to mimic her daughter's voice and demanded a $1 million ransom, later lowered to $50,000. DeStefano said she heard her daughter crying and begging for help in the background, which convinced her the call was genuine. Fortunately, she was able to confirm her daughter's location and safety after contacting the police.
The technology used to clone the daughter's voice was an AI simulation that replicated her voice from brief soundbites. The incident has raised concerns about the misuse of AI and the potential for scammers to manipulate people in new and sophisticated ways. Subbarao Kambhampati, a computer science professor and AI authority at Arizona State University, explained that AI simulations can come close to replicating someone's voice with only three seconds of audio.
To avoid such scams, Dan Mayo, an assistant special agent in the FBI's Phoenix office, advises people to be cautious with their personal information on social media, as scammers can use it to dig into their lives and create a convincing story. Mayo recommends asking questions about loved ones that scammers would not know, in order to verify their identity.
This incident highlights the importance of staying vigilant and aware of the potential for AI technology to be used in scams.