
Criminals are using AI to mimic familiar voices in scam calls and texts, prompting FBI warnings and calls for new defense strategies.
At a Glance
- FBI warns of AI-generated voice scams impersonating U.S. officials
- Scammers use vishing (voice) and smishing (texts) to steal personal data
- Voice cloning tech mimics real people, including high-ranking officials
- Experts recommend secret code words and identity checks
- Deepfake audio scams can target anyone, not just public figures
Cloning Confidence
The FBI has issued a stark warning: criminals are now using voice-cloning AI to mimic the voices of government officials, family members, and trusted contacts to manipulate victims into giving up personal information or money. These fraudulent calls and texts, known respectively as vishing and smishing, are part of an ongoing campaign to breach accounts and gain access to sensitive data.
AI-generated audio is eerily convincing. Victims often hear voices they trust pressing for urgent action, only to realize later that they have been tricked. Scammers spoof numbers, fabricate scenarios, and imitate familiar speech patterns to build credibility.
Telltale Signs and Red Flags
FBI cybersecurity analysts say deepfake voice messages often include subtle abnormalities: unusual intonation, awkward pacing, or stilted pauses. Still, these signs can be hard to spot in real time—especially when a trusted voice is making an urgent appeal.
According to Lifehacker, these attacks are not limited to celebrities or public servants. As the FBI notes, "Contact information acquired through social engineering schemes could also be used to impersonate contacts to elicit information or funds."
Key warning signs include:
- Calls demanding immediate payment or personal info
- Unusual requests from known contacts
- Urgency, fear tactics, or secrecy instructions
- Links from unfamiliar numbers pretending to be officials
How to Protect Yourself
Security experts advise agreeing on "identity codewords" with family and close associates to help verify authenticity. If a call or message feels suspicious, no matter how familiar the voice, pause and verify through known, official channels.
Never click unknown links or give credentials via phone or text. And if someone claims to be from a government agency, remember the FBI’s reminder: “If you receive a message claiming to be from a senior U.S. official, do not assume it is authentic.”
With AI’s capabilities evolving, scams are becoming harder to spot—but that makes human caution more critical than ever. Knowledge, not technology, remains your best defense.