Tuesday, May 9, 2023

Experts Raise The Alarm Against AI Voice Cloning As Scams Reach An All-Time High

The world of AI is as good as it is bad, and a new report from security experts is raising the alarm about voice scams powered by AI technology.

These scams are both widespread and convincing, because the call sounds as though a loved one is on the other end. In reality, it is anything but. The in-depth report has unsettled many readers as it digs into how the cloning process works, how common these scams have become, and why falling for them is easier now than ever before.

It also sheds light on the average losses involved and explains how to protect yourself and steer clear of AI-powered voice scams.

Just last month, we witnessed some shocking and frighteningly well-crafted AI scams. One involved call spoofing, where a loved one's name and number appeared on the victim's device as the caller. Another used an AI voice clone to extort ransom money from a mother for the release of a daughter who had never actually been kidnapped.

Experts believe it may only be a matter of time before attackers combine call spoofing with AI voice clones, and that is exactly where the McAfee report comes in.

It examines AI voice scams to raise awareness of the threat and outlines several ways to prevent such attacks and stay protected. But how does AI voice cloning actually work?

As McAfee highlights, AI voice scams are a new spin on the imposter scams that have been around for years, only far more convincing: scammers use a clone of a person's voice to ask relatives for money in a supposed emergency, or to pretend they are holding that person for ransom.

AI voice cloning tools are cheap, readily available, fast, and simple to use, which is why so many malicious parties are now turning to them. All they need is sample audio, which is easy to come by since people share their voices across various social media apps. The more someone shares online, the easier it is for bad actors to find their voice and clone it.

Voice cloning may seem like something new, and the biggest real-world stories about these scams have only recently emerged, but McAfee's study shows they are steadily growing more common.

Across the countries surveyed, around 25% of respondents said they had either experienced an AI voice scam themselves or personally knew someone who had.

Some nations have been hit especially hard: in India, around 47% of respondents said they knew someone who had been affected.

As for accuracy, the clones can be up to 95% accurate, and many of the cases reported by the public involved voices that sounded exactly like the person being imitated. In one case, a cybercriminal demanded money over a kidnapping that never took place. Clearly, telling a real voice from a fake one is harder now than ever.

Sadly, 77% of those targeted by such scams said they lost money, with losses ranging from $1,000 to $15,000. In total, imposter scams of this kind accounted for a whopping $2.6 billion in theft.



Read next: AI Chatbots Are Taking Over The World As Downloads Reach Millions Across The App Store
by Dr. Hura Anwar via Digital Information World
