How to Avoid AI Voice and Deepfake Scams

Disclosure: This article may contain affiliate links from partners who may compensate us. However, the words, opinions, and reviews are our own. Learn how we make money to support our mission.

AI voice and deepfake scams can feel unsettling because they make fake messages sound and look real. A call may sound like your grandchild. A video may appear to show a celebrity endorsing an investment. A voicemail may seem to come from your boss. A message may include a familiar face, voice, or writing style.

That is what makes these scams different. They do not only ask you to trust a stranger. They may imitate someone you already trust.

In this guide, you’ll learn how AI voice and deepfake scams work, which warning signs to watch for, and how to verify requests before sending money, sharing information, or taking action.


TL;DR: Quick Decision Guide

  • If a loved one calls in a panic asking for money → hang up and call them back using a known number.
  • If a voice sounds familiar but asks for secrecy, money, gift cards, crypto, or wire transfers → verify before acting.
  • If a celebrity or influencer appears to promote an investment, giveaway, or miracle product → assume it needs independent verification.
  • If a boss, coworker, or official asks for urgent money or credentials by voice message or video → confirm through another trusted channel.
  • If a video looks slightly off, rushed, or emotionally manipulative → slow down and check the source.
  • If you already paid or shared information → save evidence, contact your financial institution, report the scam, and secure your accounts.


What Are AI Voice and Deepfake Scams?

AI voice and deepfake scams use artificial intelligence to imitate real people or create convincing fake audio, images, or videos.

A scammer may use:

  • A cloned voice
  • A fake video
  • A manipulated celebrity clip
  • An AI-generated photo
  • A fake livestream
  • A voice memo
  • A video call impersonation
  • A fake endorsement
  • A synthetic social media profile

The goal is usually to make a scam feel more believable. Instead of reading a suspicious message, you may hear what sounds like a real person asking for help or see what looks like a familiar face promoting something.

The FTC has warned that scammers can clone a loved one’s voice using a short audio clip found online, then use that voice in family emergency scams.

👉 Compare: Identity Protection Tools in the Marketplace


Step 1: Do Not Trust a Voice Alone

A familiar voice used to feel like proof. It is no longer enough by itself.

A scammer may use a cloned voice to say:

  • “Grandma, I’m in trouble.”
  • “I was in an accident.”
  • “I need bail money.”
  • “Please don’t tell anyone.”
  • “My phone is broken.”
  • “I need you to send money now.”
  • “I’m embarrassed, please keep this quiet.”

The voice may sound like a child, grandchild, spouse, friend, coworker, or boss. That does not mean the request is real.

The FTC says scammers use voice cloning to make requests for money or information more believable, including calls that sound like a boss or family member.

What to do:
Hang up and call the person directly using a number you already know. Do not use a number provided by the caller or included in the message.

Smile Money Tip: A familiar voice can create an emotional shortcut. Verification gives you a way back to clarity.


Step 2: Create a Family Safe Word

A family safe word is a simple way to verify emergency requests.

Choose a word or phrase that:

  • Is easy for family to remember
  • Is not posted online
  • Is not a pet name, street name, birthday, school, or public detail
  • Can be used calmly during a suspicious call
  • Can be changed if too many people learn it

For example, if someone calls claiming to be your grandchild in trouble, you can ask: “What is our family word?”

If they do not know it, stop the conversation.

What to do:
Create a safe word with parents, grandparents, adult children, teens, college students, and close family members. Make it part of your family fraud plan.

👉 Related: How to Avoid Romance Scams


Step 3: Watch for Urgency, Secrecy, and Unusual Payments

AI may make a scam sound real, but the scam pattern is usually familiar.

Watch for:

  • “Act now.”
  • “Don’t tell anyone.”
  • “Stay on the phone.”
  • “Send money immediately.”
  • “Use gift cards.”
  • “Send crypto.”
  • “Wire the money.”
  • “Use a payment app.”
  • “Withdraw cash.”
  • “A courier will pick it up.”
  • “Your bank or family cannot know.”

These are classic scam tactics. The technology may be new, but the pressure is not.

What to do:
If the request involves urgency, secrecy, or hard-to-reverse payments, stop and verify through another channel.

👉 Related: How to Lock Down Your Social Media Privacy Settings


Step 4: Be Skeptical of Celebrity and Influencer Deepfakes

Deepfake scams are not limited to family emergency calls. Scammers also use fake celebrity or influencer videos to promote:

  • Investment platforms
  • Crypto opportunities
  • Product giveaways
  • “Free money” programs
  • Miracle health products
  • Government relief scams
  • Fake apps
  • Rewards programs
  • Shopping deals

Some deepfake videos are created from real interviews or public appearances and altered to make it look like the person is endorsing something they never endorsed. Recent reporting has shown scammers using AI-generated celebrity deepfake ads on TikTok to promote fraudulent rewards programs and collect personal information.

What to do:
Do not invest, buy, or enter personal information because a famous person appears in a video. Go to the person’s official website or verified social media account. Search for the offer independently.

If the video sends you to a third-party site asking for personal information, payment, or account login, treat it with caution.


Step 5: Verify Work, Business, and Official Requests

AI voice and deepfake scams can also target workplaces, small businesses, and professionals.

A scammer may impersonate:

  • A CEO
  • A manager
  • A client
  • A vendor
  • A government official
  • A financial advisor
  • A lawyer
  • A payroll or HR representative

They may ask for:

  • Wire transfers
  • Bank account changes
  • Payroll changes
  • Gift card purchases
  • Login credentials
  • Confidential documents
  • Payment approvals
  • Vendor payments

The FBI has warned that malicious actors have used AI-generated voice messages and texts to impersonate senior U.S. officials, build rapport, and move targets toward platforms designed to steal credentials or access accounts.

What to do:
Use a second verification channel. If a request comes by voice memo, confirm by phone, secure company chat, known email, or in person. For financial transfers, use a written approval process and callback rule.


Step 6: Look for Signs a Video or Audio May Be Fake

Deepfakes are getting better, so visual clues are not always reliable. Still, some warning signs can help.

Watch for:

  • Mouth movements that do not match words
  • Unnatural blinking or facial expressions
  • Strange lighting or shadows
  • Robotic or flat voice patterns
  • Odd pauses or timing
  • Blurry edges around the face
  • A voice that sounds familiar but slightly off
  • A video that avoids natural interaction
  • A call that ends quickly when you ask specific questions
  • Refusal to answer a personal verification question

Do not rely only on spotting technical flaws. Some fakes look and sound convincing.

What to do:
Treat the request, not just the media, as the warning. If the request involves money, secrecy, credentials, or urgency, verify independently.


Step 7: Limit Public Audio and Personal Details

Scammers may use publicly available content to make scams more believable.

They may pull from:

  • Social media videos
  • Public speeches
  • Voicemails
  • Podcasts
  • Livestreams
  • Family videos
  • School or sports clips
  • Public posts about family relationships
  • Job titles and workplace details
  • Travel posts
  • Birthdays and locations

You do not need to erase your life online, but you can reduce what strangers can collect.

What to do:

  • Review privacy settings.
  • Limit public videos of children or older relatives.
  • Avoid posting real-time travel details.
  • Hide family relationship details when possible.
  • Keep friends lists private.
  • Avoid public posts that reveal routines or vulnerabilities.
  • Be careful with voice notes and public video clips.

The goal is not fear. It is to reduce the easy source material available to scammers.


What to Do Before You Send Money or Information

Use this verification checklist:

  1. Stop the conversation.
  2. Do not send money.
  3. Do not share codes, passwords, account numbers, or IDs.
  4. Call the person back using a saved number.
  5. Ask for the family safe word if it involves a loved one.
  6. Contact another trusted person if you cannot reach them.
  7. Verify business requests through a separate channel.
  8. Check official sources for celebrity, investment, or giveaway claims.
  9. Avoid payment methods that are hard to reverse.
  10. Save the message, audio, video, or account details.

If the person pressures you not to verify, that is your answer.


What to Do If You Were Targeted by an AI or Deepfake Scam

If you received a suspicious AI voice call, fake video, or deepfake message:

  • Stop communication.
  • Save voicemails, videos, screenshots, links, phone numbers, usernames, and payment instructions.
  • Report the account, ad, or message to the platform.
  • Warn the real person or organization being impersonated.
  • Report fraud to the FTC at ReportFraud.ftc.gov.
  • If money was sent online, report it to the FBI’s Internet Crime Complaint Center at IC3.gov.

The FBI’s 2025 Internet Crime Report found that cyber-enabled crime losses approached $21 billion, with cryptocurrency and artificial intelligence-related complaints among the costliest categories.


What to Do If You Already Sent Money or Shared Information

Act quickly.

If you sent money by bank transfer or wire:
Contact your bank immediately and ask whether the transfer can be stopped, recalled, or investigated.

If you paid by card:
Contact your card issuer and ask about disputing the charge.

If you used a payment app:
Report the transaction in the app and contact your linked bank or card issuer.

If you sent crypto:
Save the wallet address, transaction hash, platform details, and messages. Recovery may be difficult, but documentation matters.

If you shared a password or code:
Change the password immediately, turn on multi-factor authentication, and review account activity.

If you shared personal information:
Monitor accounts, check credit reports, consider a fraud alert, and freeze your credit if sensitive information was exposed.

If a loved one was impersonated:
Tell the family. Create or update a safe word and callback rule.


Common Mistakes to Avoid

  • Trusting a voice because it sounds familiar
  • Sending money before calling back directly
  • Believing a celebrity video without checking the source
  • Sharing one-time codes with someone who contacts you
  • Letting urgency override verification
  • Posting too much public family information
  • Ignoring requests for secrecy
  • Approving workplace payments from voice messages alone
  • Sending crypto or wire transfers under pressure
  • Feeling embarrassed and not reporting the scam

AI can make scams more convincing, but the safest response is still simple: pause, verify, then decide.


Avoid AI Voice and Deepfake Scams FAQs

  1. Can scammers really clone someone’s voice?

    Yes. The FTC has warned that scammers can use a short audio clip from online content to clone a loved one’s voice and use it in an emergency scam.

  2. What is a deepfake scam?

    A deepfake scam uses AI-generated or manipulated video, audio, or images to impersonate a real person, create a fake endorsement, or make a fraudulent request seem believable.

  3. How do I know if a voice call is real?

    Do not rely on the voice alone. Hang up and call the person back using a known number. For family emergencies, use a safe word or ask a question that cannot be answered from public information.

  4. Are celebrity investment videos real?

    Be careful. Scammers can create fake celebrity videos that appear to endorse investments, giveaways, or products. Verify through the celebrity’s official channels and independent sources.

  5. Where can I report AI voice or deepfake scams?

    Report fraud to the FTC at ReportFraud.ftc.gov. If the scam happened online or involved money, you can also report it to the FBI’s Internet Crime Complaint Center at IC3.gov.


Final Thought

AI voice and deepfake scams are designed to make trust feel automatic. They use familiar faces, familiar voices, and familiar emotions to rush your decision.

Your protection is not knowing every new technology. It is having a verification habit. Pause, call back, ask the safe word, check the source, and never let urgency make the decision for you.

Author Bio


Jason Vitug

Jason Vitug is the founder and CEO of phroogal. His writings explore the intersection of money, wellness, and life. Jason is a New York Times-reviewed author, speaker, world traveler, and Plutus Award-winning creator. He holds an MBA from Norwich University and a BS in Finance from Rutgers University.