Center for Practical AI
Public Education · AI Scams

"She heard her daughter's voice screaming 'Mom! I messed up!' A man's voice took over: 'Your daughter is with me. $50,000 or she gets hurt.'"

Jennifer DeStefano was at her other daughter's dance studio when her phone rang. She heard what she was absolutely certain was her 15-year-old daughter's voice — with her inflection, her way of crying — screaming for help. A man demanded ransom. It took four minutes, several phone calls, and a confirmation from her husband for DeStefano to realize her daughter was safe in a ski lodge, unaware any of this was happening. The voice was AI-generated from audio harvested from her daughter's social media. No technical skill was required. It took seconds. DeStefano testified before the U.S. Senate Judiciary Committee in June 2023.

Artificial intelligence has made it possible to clone someone's voice from three seconds of audio. In 2024, Americans lost $16.6 billion to internet fraud — a 33% increase from the year before.

The number behind this guide

$16.6 billion

lost to internet fraud in 2024.

A 33% increase from the year before. The call that felt like your daughter's voice cost nothing to make.

Try It

"The Call" — Scam Scenario Simulator

Walk through a virtual kidnapping call or a romance investment fraud scenario — making real-time decisions. Google Research found that interactive scenario-based "inoculation" significantly improves real-world scam recognition. Includes facilitator mode for group use.

Try the simulation →

What the simulator covers

  • Virtual kidnapping / voice clone call scenario
  • Romance investment fraud (pig butchering) scenario
  • Branching decisions with immediate outcome feedback
  • Warning sign review at the end of each scenario
  • Family safe word setup guide
  • Facilitator mode with psychological explanations
The Landscape

Five categories of AI fraud. All of them are operational now.

These are not experimental threats — they are documented, scaled operations causing billions in annual losses.

1

Voice Clone Scams

AI clones a family member's voice from 3–10 seconds of social media audio. Scammer simulates a kidnapping, arrest, or accident. Demands gift cards, Zelle, or wire transfer.

Documented case: a Canadian grandparent-scam ring in which 25 people were indicted for stealing $21M from Americans across 46 states.
2

Deepfake Investment Fraud

AI generates video of a trusted public figure (Musk, Buffett) endorsing a fake investment platform. Distributed via Facebook and TikTok ads. Early 'gains' are fake; withdrawal triggers invented fees.

Documented case: Steve Beauchamp, 82: lost his entire $690,000 retirement fund to a fake Elon Musk investment platform.
3

Pig Butchering / Romance Scams

Weeks or months of AI-automated emotional contact from a fake persona. Leads to a fake investment platform. Withdrawal attempts generate fees. Funds disappear.

Documented case: Abigail (California): lost $81,000 cash plus $200,000 in home equity — sold her condo to send more.
4

Deepfake Video Conference Fraud

Real-time AI deepfakes of company executives impersonate CFOs or CEOs in live video calls. Employees authorize fraudulent wire transfers.

Documented case: Arup Engineering (Hong Kong, 2024): $25 million transferred to scammers after a fake video meeting.
5

AI-Personalized Phishing

AI generates emails using your name, employer, and colleagues at near-zero cost. Click-through rate: 54% — vs. 12% for generic phishing.

Documented case: FBI IC3 phishing losses: $70 million in 2024 — a 374% increase from the prior year.
How It Works

Three seconds of audio. That's all it takes.

Modern open-source voice cloning tools require only 3–10 seconds of audio. Generation time: approximately 75 milliseconds per response.

3–10 sec

Minimum audio needed to clone a convincing voice using modern open-source tools.

Security researchers, 2024

70%

of people cannot confidently distinguish a real voice from a cloned one.

McAfee 'Artificial Impostor,' 2023

20×

Increase in AI-enabled scams from 2023 to 2025.

Microsoft AI for Good Lab via AARP

$16.6B

Total U.S. internet fraud losses in 2024 — a 33% increase year-over-year.

FBI IC3 Annual Report, 2024

Where scammers get audio

  • Social media videos (TikTok, Instagram Reels, school events posted online)
  • Voicemail greetings (scammers call repeatedly to capture the greeting)
  • Podcast or interview clips
  • Online conference recordings
  • School or church presentations posted online

The 5-phase methodology

1

Target selection

Data brokers legally sell name, age, address, family relationships, and income data. Scammers can build a family profile in under 10 minutes.

2

Audio harvest

3–5 seconds from a child's TikTok, a grandchild's Instagram Reel, or a voicemail greeting.

3

Clone generation

Audio is fed into a voice AI. A distress script is typed. Background crying and noise are layered in. Ready in seconds.

4

The call

The cloned voice plays. A human operator isolates the victim, prevents verification, and escalates panic.

5

Payment extraction

Gift cards, Zelle, wire transfer, or cash. Once sent, funds are nearly unrecoverable. Less than 5% of losses are ever returned.

Kai Zhuang

17 · Riverdale, Utah · December 2023

Virtual Kidnapping

Scammers from Hong Kong spent weeks cultivating a 17-year-old Chinese foreign exchange student, then convinced him to isolate himself in a tent in a remote canyon during freezing winter weather. They used recorded and AI-manipulated calls to convince his family in China he had been kidnapped. His family transferred $80,000 before police found him on December 31.

Outcome: Rescued, safe, reunited with family. The FBI confirmed calls originated from Hong Kong. The case illustrates how scammers weaponize both the victim's voice and their family's fear simultaneously.
Why It Works

Your intelligence does not protect you. Neither does your education.

The scam exploits something much older than either — and research confirms that credentials and intelligence predict almost nothing about susceptibility.

The amygdala hijack

When a call tells you your child is being held at gunpoint, your brain's fear response — mediated by the amygdala — can completely override the prefrontal cortex where rational analysis happens. This is not a character flaw. It is neurological.

"Research confirms: once emotionally overwhelmed, even highly intelligent people default to fast, emotional decision-making. People with high verbal ability are no more resistant to romance scams than anyone else."

Summarized from peer-reviewed fraud psychology research (PNAS Nexus, 2024; Frontiers in Psychiatry, 2021)

The six psychological tools

1. Urgency: "Act now or she gets hurt." Time pressure prevents verification.
2. Authority: Police, attorneys, courts, government agencies — all impersonated.
3. Isolation: "Do not call your husband. If you do, the deal is off."
4. Love as a weapon: Exploiting the specific, irreplaceable bond between parent and child.
5. Social proof: "Other families have cooperated."
6. Scarcity: "This offer expires in two hours."

8×

Increase in older adults losing $100,000+ to impersonation scams since 2020 ($55M → $445M).

FTC data, August 2025

$83K

Average loss per FBI IC3 complaint for adults 60+.

FBI IC3 2024 Annual Report

48%

of people say they would send money if they received a call claiming a family member was in a car accident.

McAfee 'Artificial Impostor,' 2023

54%

click-through rate for AI-personalized phishing — vs. 12% for generic phishing.

Peer-reviewed research, 2024-25

Who Is Targeted

Everyone. But not everyone equally.

AI scams are adapted for specific populations with specific vulnerabilities. Falling for one is not a sign of weakness.

Steve Beauchamp

82 · Retired

Deepfake Investment Fraud

AI deepfake videos of Elon Musk endorsing a fake investment platform appeared in his Facebook feed. He invested $27,000 initially, watched it appear to grow, and kept investing — paying fee after fee when he tried to withdraw. He lost his entire $690,000 retirement fund.

Outcome: Reported in the New York Times, August 2024. Beauchamp had no unusual risk factors — he was careful, logical, and had managed money successfully throughout his life. The scam exploited the appearance of legitimate media.

Older Adults (60+)

Adults 60+ filed 147,127 IC3 complaints in 2024 — a 46% increase from 2023 — and lost $4.885 billion. More than 7,500 lost over $100,000 each. Average loss per complaint: $83,000.

Important framing: falling for a well-crafted AI scam is not a sign of weakness or cognitive decline. These scams are professionally designed to work on intelligent, careful people.

Naum Lantsman

75 · Los Angeles restaurateur

Language-Targeted Fraud

Contacted via Telegram in his native Russian by scammers who impersonated a financial advisor. The language choice built immediate trust. Led through a fake investment platform with fabricated gains. Lost $340,000 life savings.

Outcome: The Lantsman case illustrates how AI voice cloning now works in dozens of languages — eliminating what was once a practical barrier to targeting non-English-speaking communities.

Immigrants

Scammers impersonate immigration authorities, triggering fears specific to immigration status. AI now works in dozens of languages.

People who are grieving

Obituaries are harvested to identify recent widows and widowers. Scammers claim shared grief to build a connection quickly.

Job seekers

Fake AI-generated job postings, fake video interviews, then an 'equipment deposit' ($1,500–$5,000). FTC job scam losses: $513M in 2024.

What Families Can Do

Four tiers of real protection.

From a single 2-minute action to long-term habits — built from documented scam defenses.

1

Set up a family code word. Today.

  • Choose a random word or phrase known only to your family. Not a pet's name. Not your street. Something genuinely random.
  • In any call claiming a family emergency, demand the code word. An AI cannot know it. A scammer cannot guess it.
  • An experienced scammer may respond 'I'm too scared to remember it' — if the code word isn't given, the call is a scam.
  • Test the system periodically. Make sure everyone knows it and knows when to use it.
  • "The code word idea is simple and nontrivial to subvert." — Hany Farid, UC Berkeley, Scientific American
2

Build the pause-and-verify reflex.

  • The 10-second rule: before taking any action on a frightening call, force a 10-second pause. Scammers create urgency specifically to prevent this.
  • Hang up and call back on a number you already have stored. Any legitimate emergency survives a 60-second delay.
  • Never call back a number given in the same conversation. Never click links in the same message chain.
  • Reach the family member through a completely different channel — text from another phone, call a friend, contact another family member.
3

Reduce the attack surface.

  • Social media videos containing children's voices are source material for voice clones. Set accounts to private or friends-only where possible.
  • Enable carrier-level spam protection (AT&T Call Protect, T-Mobile Scam Shield, Verizon Call Filter — all free).
  • Data broker opt-out: services like DeleteMe or manual opt-outs from Spokeo, Whitepages, BeenVerified reduce scammer targeting data. Do this annually.
  • No legitimate emergency service ever requests gift cards, Zelle, cryptocurrency, or cash delivered to a stranger.
4

Know the warning signs before you're in the moment.

  • Gift cards for emergencies: 100% of requests to buy gift cards for bail, fees, or government fines are scams. Every time.
  • 'Keep it secret': any instruction not to tell your spouse, children, or other family members is a manipulation tactic.
  • Too-good returns: investment platforms offering guaranteed high returns from an online contact are pig butchering operations.
  • Urgency: real emergencies allow time for verification. Scams do not.
  • Video call 'proof': seeing someone who looks like a public figure or your colleague on a video call is not proof of identity — deepfake video technology is real-time.
If It Happens

You are not foolish. These scams are professionally designed to work on intelligent, careful people.

1

Stop all payments immediately

Even if the scammer calls back demanding more. Payment does not guarantee safety — it guarantees more demands.

2

Call your bank now

Ask to freeze the account and reverse any pending transactions. For wire transfers, hours matter: call as soon as possible, and within 72 hours at the latest.

3

Gift card issuers

Call the gift card issuer immediately. There is a small window where unspent balances can sometimes be recovered.

4

Wire transfers

MoneyGram: 1-800-926-9400. Western Union: 1-800-448-1492. Contact them immediately.

5

Document everything

Screenshot conversations. Save the phone number. Write down the time, what was said, and what you paid.

6

Report — even if you feel ashamed

FTC: reportfraud.ftc.gov. FBI: ic3.gov. Shame is what keeps people from reporting. Reporting is how scammers are caught.

Recovery reality

Less than 5% of funds lost to sophisticated AI scams are ever recovered. Gift card payments are almost never recoverable. Cryptocurrency is nearly impossible to recover. Wire transfers have a window of hours. Credit card disputes offer the best chance. One rare success: Aleksey Madan, 69, had his full $140,000 returned after Massachusetts law enforcement action.

For Educators

Workshop facilitation guide and curriculum resources

For teachers, librarians, senior center staff, and community educators: facilitation guide for the scam simulator, discussion questions, case study handouts, and guidance on running workshops for older adults and families.

Go to the Educator Guide →

Want to bring this workshop to your community?

CPAI delivers family scam awareness training to libraries, senior centers, faith organizations, and schools.