VIRAL: Beware of AI-generated voice scams
From car accidents to kidnapping
It's happening... artificial intelligence (AI) is making its way into people's lives in an alarming way. Several victims have now shared viral posts about how AI was used to scam them and their loved ones.
In a now-viral TikTok post, user Eddie Cumberbatch shared that his grandparents received a call from "him" informing them that he'd been in a bad car accident and that they needed to send him money.
Screen capture from Cumberbatch's TikTok post
“I don’t even own a car,” he said in the video. “Obviously it’s not me, but they were close on getting a second mortgage out in order to pay the scammer because they thought it was me.”

Meanwhile, [a mother in Arizona](https://www.tiktok.com/@itvnews/video/7221144534820932870?q=AI%20voice%20scam&t=1683593501534) received a phone call from her “daughter” crying that she had been kidnapped. She even spoke with the supposed kidnappers, who demanded a million dollars in exchange for her daughter.

In both incidents, AI voice cloning was used to mimic the voices of the supposed accident and kidnapping victims in order to extort money from their loved ones. According to a report from Time Magazine, scammers take an audio clip of someone’s voice found online and upload it to a program or application that can replicate that voice.

This is what Cumberbatch warned about in his viral video. “This AI sh\*t is getting out of hand,” he said. “This is terrifying. If you ever uploaded a video on the internet, you and your family could be hit by this… because all they need is a short video clip and they can clone your voice.”

He recommended that people agree on a secret code with their loved ones so they can verify they are actually speaking to each other if they ever receive an emergency call like this.