AI voice cloning is making scam calls much more convincing
AI voice cloning is making scam calls far more convincing, and experts warn that many Americans could be vulnerable as these scams become more widespread.
Scammers use short clips of your voice, often captured when you answer a silent call, to create realistic audio that can trick your friends, family, or even you.
How scammers are using AI to trick you
Scammers stay silent on calls so you'll say "hello."
That three-second clip is enough for AI tools to mimic your voice with surprising accuracy.
They can also grab speech samples from social media posts or voicemails, then use the cloned voice, usually as pre-recorded or quickly generated audio, to impersonate people you trust.
Fully interactive, real-time voice conversion is emerging but not yet widely used. Either way, a convincing clone makes it far easier for scammers to steal sensitive information or money.
The financial impact of these scams is staggering
Reports show vishing incidents have surged recently; experts say AI voice cloning and hybrid tactics helped drive that rise.
Losses have reached into the billions, and experts warn deepfake-enabled fraud could grow substantially.
Even large firms have been hit: reported cases of deepfake executive fraud have led to multi-million-dollar transfers.
Scam call volume has also risen in the US, and many people report receiving more spam and scam attempts than before.
How to protect yourself from these scams
If you get a silent call, hang up without responding. Don't give scammers any audio to work with.
Playing background noise can also degrade any recording they manage to capture.
Watch out for awkward pauses or weird static on calls, and set up safe words with family members just in case someone tries to impersonate them.
Turning on AI call-screening features on your phone adds another layer of protection.