Vishing & AI: The Phone Threat We Can’t Hear Coming


When receiving a call from a friend or family member, could you tell the difference between their voice and an AI imposter? Sure, you might be tech-savvy enough to spot scam WhatsApp messages, but if a child or parent called you and asked for financial help in their unique voice, would you be as confident?

The rise of AI voice generator software makes it incredibly easy for anyone to replicate human voices from the comfort of their home. But as we begin to understand the scale of voice fraud — especially when AI meets the phenomenon of vishing — how do we defend ourselves from a threat we cannot see?

Key Takeaways

  • AI can clone a voice from as little as 15 seconds of audio taken from social media posts or voice messages.
  • Fake audio can spread misinformation and fake ads by celebrities.
  • AI voice fraud uses your loved one’s voice against you.
  • AI voice cloning can be used in scams asking family members for money.
  • Nearly all scams apply time pressure to push you into making an irrational decision.

AI Voice Fraud: A Growing Threat Highlighted by High-Profile Victims

Online scams have become much more sophisticated than phishing emails, robocalls, and suspicious calls from unknown numbers. AI voice cloning is now enabling scammers to impersonate the voices of those you trust most. To protect yourself, you will need more than alertness to telltale signs such as grammatical errors and robotic-sounding voices.

AI enables cloning voices from social media posts, phone messages, or leaked audio, making scams significantly more believable and difficult to detect. For example, OpenAI’s Voice Engine requires only a 15-second audio sample to generate natural-sounding speech that closely resembles the original speaker. It’s never been easier to replicate anyone’s voice.

Headlines from the first six months of 2024 suggest that this year will be the moment we take AI voice cloning seriously. In April, BBC presenter Liz Bonnin shared how somebody cloned her voice without her consent and used it for misleading advertisements. Just a month later, Scarlett Johansson accused OpenAI of using her voice.

Elsewhere, London Mayor Sadiq Khan shared how deepfake audio of him making inflammatory remarks before Armistice Day came very close to causing “serious disorder.”


In May, a Louisiana political consultant was indicted for orchestrating a fake robocall that impersonated President Biden. The voice scam was designed to urge voters not to cast their ballots in the January 2024 primary.

These high-profile events highlighted how celebrities and public figures are vulnerable to AI voice fraud. But as the months passed, it quickly became apparent that nobody was safe from AI voice cloning.

How ‘Thelma’ Highlights the Rise of AI Voice Scams

The portrayal of a voice scam in the sleeper hit movie Thelma has raised awareness of AI voice fraud. The film follows a 93-year-old grandmother who loses $10,000 to a scam call, reflecting real-world incidents in which voice cloning technology is used to impersonate loved ones.

AI voice scams increasingly target older people, exploiting fear and urgency in their unsuspecting victims to achieve lucrative results. 

It’s easy to hear about these stories and think you will be just fine because you never answer phone calls from unknown or hidden numbers. But when scammers have gathered vast amounts of data about you and your family and unanswered calls are followed by a message saying, “Mom, it’s me. Please pick up,” would you still ignore such an incoming call?

The case of The Cut’s financial columnist, Charlotte Cowles, is another excellent example of how even the most rational and intelligent user can be caught off guard and easily fall victim to voice fraud. Despite writing a weekly column in the “Business” section of the New York Times, interviewing hundreds of financial experts, and priding herself on never losing her head, she somehow found herself handing $50,000 to a stranger.

How to Protect Yourself from AI Voice Cloning Scams

Bear in mind that scammers can spoof caller ID, so even a call that appears to come from a familiar number is not proof of identity, and an unknown number claiming to be a friend or family member should be a huge red flag. Your next move should be to hang up and contact the person directly on a number you already have, or through another channel such as social media, to confirm their story.

Many security experts suggest that families could better protect themselves and each other by having a unique passcode or secret safe word to verify the caller’s identity. But the inability to remember our passwords might make this trickier than it sounds.

Whether it’s a primitive phishing email, a robocall, or a sophisticated AI voice clone, nearly every scam relies on time pressure to force you into making an irrational decision.

Never panic or allow a false sense of urgency to cloud your judgment.

Take a deep breath and assess the situation calmly before responding. These inherent human skills are the best way to protect yourself from becoming a victim of an online scam.

The Bottom Line

We are currently sleepwalking into a period of digital deception. The voices we trust most — those of our loved ones — can now be weaponized against us. The months ahead are not just about bolstering the defenses of your financial security but also rising to the challenge of how we communicate and trust each other online.

Although it’s easy to feel overwhelmed at how technology is encroaching into every aspect of our lives, your best defense lies in your uniquely human qualities: skepticism, patience, and the ability to pause and reflect.


Neil C. Hughes
Senior Technology Writer

Neil is a freelance tech journalist with 20 years of experience in IT. He’s the host of the popular Tech Talks Daily Podcast, earning a LinkedIn Top Voice designation for his influential insights in tech. Apart from Techopedia, his work can be found on INC, TNW, TechHQ, and Cybernews. Neil's favorite things in life range from wandering the tech conference show floors from Arizona to Armenia to enjoying a 5-day digital detox at Glastonbury Festival and supporting Derby County. He believes technology works best when it brings people together.