AI Voice Cloning Scams: The New Threat You Need to Know About
Published on August 5, 2025 by The ScamSpotter Team
Imagine getting a call from your child, spouse, or parent. They sound like they're in trouble and need money fast. But it isn't them. It's a scammer using AI to convincingly mimic their voice. This is the new reality of AI voice cloning scams.
How it Works
Scammers only need a few seconds of audio of a person's voice—often scraped from social media videos—to create a convincing clone. They then use this clone to call a family member and fabricate an emergency, like a kidnapping or a medical crisis, demanding immediate payment.
How to Protect Yourself
- Establish a "Safe Word": Agree on a secret word or phrase that only your family members know. If you receive a suspicious call, ask the caller for the safe word.
- Verify Independently: If you get a distressing call, hang up immediately. Then call the person back on a number you know is theirs to verify the story.
- Be Wary of Urgency: Scammers create a sense of panic to prevent you from thinking clearly. Any request for immediate, untraceable payment (like wire transfers or gift cards) is a major red flag.
Heard a suspicious voice message?
While our tool can't analyze audio yet, you can still check the accompanying text messages. Paste any suspicious texts into our AI Scam Checker for a free, instant analysis.
Check a scam now →