Stealing your voice? FTC warns of scams targeting families using AI-cloned voices

Scammers fake family emergencies using AI

Photo by Michael Marais on Unsplash

Scammers are using artificial intelligence to clone people’s voices and reach out to their loved ones, faking an emergency situation in an attempt to steal money.

The FTC is warning the public about a rise in cloned-voice scams, which particularly target the elderly. Scam artists use voice-cloning programs to recreate a familiar voice, then call a parent or grandparent, for example, faking an emergency and requesting that money be sent immediately.

Scammers need only a short audio file, which can be taken from a video or other content posted online, to clone a voice using AI. Once the voice is cloned, the AI version can be made to say nearly anything.

So, how do you know if an emergency call from a loved one is legitimate or a scam using a cloned voice? The FTC says, first and foremost: do not trust the voice.

“Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs,” officials said. “If you can’t reach your loved one, try to get in touch with them through another family member or their friends.”

Scammers will often request money in ways that “make it hard to get your money back,” the FTC says, such as telling someone to wire money, send cryptocurrency, or buy gift cards and provide the card and PIN numbers. “Those could be signs of a scam,” an FTC alert reads.

If you’re aware of a scam, or have been targeted by a scammer, you can report it on the FTC’s website.

An example of a family emergency scam call using an AI-cloned voice. Graphic courtesy of the FTC. (Federal Trade Commission)


About the Author:

Cassidy Johncox is a senior digital news editor covering stories across the spectrum, with a special focus on politics and community issues.