Imagine getting a call from someone who sounds exactly like your wife, boss, or grandson. They are in trouble and need your help. But here's the catch: it isn't them. It's artificial intelligence (AI). According to the Identity Theft Resource Center (ITRC), AI-based scams have increased by 148 percent this year. What used to be a minor concern now affects ordinary people, CEOs, and entire companies, costing billions of dollars globally.
How Scammers Use AI to Steal:
The telltale signs that used to give scams away, such as robotic voice quality and obvious typos, are no longer reliable. In a study conducted by McAfee, 70% of participants were unable to tell a cloned voice from the real one. Below are a few avenues a threat actor can take to trick victims.
- Voice Cloning: Scammers can replicate someone’s voice using just a few seconds of audio.
- Deepfake Video Calls: Cybercriminals use AI-generated video to impersonate a manager or colleague and obtain money or confidential data. This is exactly how fraudsters stole $25 million from a multinational company in Hong Kong last year, using a deepfake video conference of its chief financial officer.
- Fake Websites and Social Media: Criminals can copy accounts and set up fake websites that look like real sites in less than a minute. They typically impersonate customer service representatives from well-known companies.
Why This Threat Is Significant:
According to the Federal Trade Commission, impersonation scams cost Americans nearly $3 billion in the most recent reporting year, and that number keeps climbing. With that growth showing no signs of slowing, impersonation remains one of the most expensive and widespread types of cybercrime.
- Consumers: Individuals are repeatedly targeted with voice-clone calls, phishing emails, and texts designed to trick them into sending money or handing over personal information. Scammers lean on fear and urgency to pressure victims into acting quickly. For example, a woman in California was swindled out of $430,000 when fraudsters used an AI-cloned voice to impersonate a soap opera star.
- Businesses: Companies are battling increasingly sophisticated fraud schemes such as CEO and vendor impersonation. Hackers also exploit trust within supply chains, inserting malware into hardware or software products to gain long-term access. One recent industry report found that deepfake attacks on executives increased from 34% to 41%. Beyond financial losses, companies face data leaks, regulatory penalties, lawsuits, and reputational damage.
- Seniors: Scammers often target older adults through schemes such as government impersonation, Medicare or Social Security fraud, and fake emergencies. Criminals exploit the fact that many older adults have a harder time keeping up with rapidly changing technology and recognizing new threats. A common ploy is pretending to be a grandchild who needs urgent financial help (National Council on Aging). The losses are particularly devastating because many seniors live on fixed incomes.
How to Spot & Stop AI Impersonation Scams:
AI-driven impersonation scams are becoming increasingly convincing, making it harder to distinguish real communication from fraud. Scammers often exploit fear, urgency, and trust to pressure victims into quick action. By following these steps, you can strengthen your defenses and protect yourself, your family, and your organization.
- Use a Safe Word with Family
Agree on a safe word or phrase for family members to use in an emergency. Choose something everyone will remember but an outsider could not guess. If someone calls claiming there's an emergency, ask for the safe word before responding.
- Check Through Verified Sources
Don't trust caller ID or alarming emails. Hang up and call back using a verified number or a contact method you know is legitimate.
- Wait Before Acting
Scammers rely on urgency and fear. Take a moment to pause, think, and verify before complying with any request.
- Turn On Multi-Factor Authentication (MFA)
Enable MFA on all accounts. Even if someone gains access to your voice or login details, MFA adds a critical layer of protection; a short sketch after this list shows how one common form of MFA works.
- Monitor Your Identity Online
Regularly search for your name and your business’s brand. Report and remove any fake accounts, impersonations, or fraudulent websites.
- Stay Educated
Awareness is one of the strongest defenses. Share knowledge about these scams with coworkers, friends, and family so they are prepared too.
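For readers who want to see why MFA holds up even against convincing impersonation, here is a minimal sketch of one common form of it: the time-based one-time password (TOTP) behind most authenticator apps. This is an illustrative demo using the open-source pyotp Python library, not the exact setup of any particular service.

```python
# Minimal TOTP (time-based one-time password) demo using the pyotp library.
# Install with: pip install pyotp
import pyotp

# In real life, the service generates this secret once during MFA enrollment
# and you store it in an authenticator app (usually by scanning a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Your app and the service each derive the same 6-digit code from the shared
# secret plus the current time; the code changes roughly every 30 seconds.
code = totp.now()
print("Current code:", code)

# The service checks the code you type in. A scammer who has cloned your voice
# or stolen your password still cannot produce a valid code without the secret.
print("Correct code accepted?", totp.verify(code))        # True
print("Guessed code accepted?", totp.verify("123456"))    # Almost certainly False
```

The point is not the code itself but the design: the one-time code depends on a secret the scammer never sees, which is why MFA blunts even a perfect voice clone or stolen password.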
In conclusion, AI impersonation scams are no longer just movie scenarios; they are real and becoming harder to identify. Don't believe everything you see or hear, whether it's a cloned voice, a deepfake video call, or a fake social media page. You can still trust the people in your life, but verify first. We can all stay one step ahead of these scams by being careful, using safe words, and staying a little skeptical.

