Late one night, while mindlessly scrolling through YouTube, I stumbled upon a video that shed light on a disturbing scam that abuses voice AI platforms. It revealed how this technology can be misused in a practice known as virtual kidnapping. This article explores the concept behind virtual kidnapping, the methods employed, and the implications of such a scam.
Understanding virtual kidnapping
Virtual kidnapping is a scam that capitalizes on the fear and panic that arise when someone believes their loved one has been kidnapped. Rather than physically abducting the victim, the scammer aims to extort money or gain some other advantage by creating a convincing illusion of a kidnapping.
Traditional low-tech method
One of the more traditional approaches to virtual kidnapping involves spoofing the victim's phone number so that the call appears to come from the victim. The scammer calls a member of the victim's family or one of the victim's friends, staging chaotic background noise to make it seem as though the victim is in immediate danger, and then demands a ransom for the victim's safe return.
To enhance the credibility of the scam, perpetrators often use open-source intelligence (OSINT) to gather information about the victim and their associates. For example, by monitoring social media accounts, scammers can single out individuals who are known to be traveling or otherwise away from home, making the ruse far more plausible.
Read also: OSINT 101: What is open source intelligence and how is it used?
High-tech voice cloning
A more advanced and refined version of virtual kidnapping involves obtaining samples of the victim’s voice and using AI platforms to create a clone of it. The scammer can then call the victim’s family or friends, impersonating the victim and making alarming demands.
Feasibility of voice cloning
To demonstrate the feasibility of voice cloning, I decided to experiment with free AI-enabled video and audio editing software. By recording snippets of the voice of Jake Moore, ESET's Global Security Advisor, I attempted to create a convincing voice clone.
Using the software, I recorded Jake's voice from various videos available online. The tool generated an audio file and a transcript, which I then submitted to the AI-enabled voice cloning service. Although I was skeptical that the experiment would succeed, within 24 hours I received an email notification stating that the voice clone was ready for use.
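For readers curious about the mechanics, the preparation step described above (pulling a voice sample out of a video and producing a matching transcript) can be approximated in a few lines of Python. This is a minimal sketch, not the actual tooling used in the experiment: the file names are placeholders, and the choice of ffmpeg for audio extraction and OpenAI's open-source whisper library for transcription is an assumption made purely for illustration.

```python
# Sketch of the preparation step: extract an audio track from a
# publicly available video clip and transcribe it. File names are
# hypothetical; the article does not name the specific tools used.
import subprocess

import whisper  # OpenAI's open-source speech-to-text library

VIDEO_IN = "interview_clip.mp4"   # hypothetical downloaded video
AUDIO_OUT = "voice_sample.wav"

# Extract a mono, 16 kHz WAV track, a format speech models commonly expect.
subprocess.run(
    ["ffmpeg", "-y", "-i", VIDEO_IN, "-ac", "1", "-ar", "16000", AUDIO_OUT],
    check=True,
)

# Transcribe the sample so the audio file can be paired with its text.
model = whisper.load_model("base")
result = model.transcribe(AUDIO_OUT)

with open("voice_sample.txt", "w") as f:
    f.write(result["text"])
```

The two artifacts this produces, an audio file and its transcript, mirror what the experiment submitted to the voice cloning service.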
And here are the results:
AUDIO DOWNLOAD: Jake's AI-generated fake plea
Limitations and potential misuse
While the initial voice cloning attempt showed flaws in pacing and tone, as well as a limited vocabulary, the potential for nefarious use of this technology is evident. Criminals could exploit it for virtual kidnapping by sending voice messages that weave in personal information gathered through OSINT techniques, making the scam all the more convincing.
Moreover, high-profile individuals, such as managing directors of technology companies, could become targets for voice theft due to their public presence. By stealing their voices, scammers could manipulate employees within the organization into taking actions they otherwise never would. Combined with other social engineering tactics, this could become both a powerful tool and a challenging problem to combat as the technology improves.
A cause for concern?
This high-tech twist on virtual kidnapping, in which scammers create the illusion of a kidnapping without physically abducting anyone, is a worrying development in the realm of cybercrime. The abuse of voice AI platforms to clone people's voices raises serious ethical and security concerns.
As the technology progresses, it is crucial for individuals, organizations, and AI platform developers to stay vigilant about the potential misuse of voice cloning and similar technologies. Safeguarding personal information, being cautious about one's online presence, and employing robust security measures and awareness training can all help mitigate the risks associated with virtual kidnappings and protect against unauthorized voice cloning attempts.
Related reading: FBI warns of voice phishing attacks stealing corporate credentials