AI deepfake scams in 2026: how to spot one before it gets you

17 April 2026 - 9 min read
James, co-founder of Smash Your AI

Co-founder of Smash Your AI, middle leader at a large UK ed tech company, 18 years in education.

A warning guide to AI deepfake and voice cloning scams in 2026

A friend of mine got a phone call last month. It was her son. His voice. His exact way of speaking. He said he was in trouble and needed money transferred right now.

It was not her son. It was an AI voice clone.

She nearly sent £2,000 before something felt off and she hung up to ring him directly. He was at work, completely fine, had never made the call.

This is happening every single day now, in the UK and around the world. And the scams are getting better. So in this post I want to walk you through what a deepfake actually is, the main scam patterns to watch for, and a few simple habits that will protect you and your family.

What is a deepfake?

A deepfake is a fake video, image, or audio clip made using AI. The AI studies real footage of a person, learns how they look or sound, then creates something new that appears to be them.

There are two main types you need to know about:

  • Video deepfakes. A fake video of a real person saying or doing something they never did.
  • Voice cloning. A fake audio clip in a real person's voice. Modern tools can build a convincing copy from as little as three seconds of original audio.

Three seconds. That is shorter than most voicemail greetings. Anyone who has posted a video on TikTok, Instagram, or LinkedIn has enough audio out there for a scammer to work with.

How bad is this problem actually?

Worse than most people realise, and the reports keep coming in week after week.

This is not a "might happen one day" story. It is a "probably happening to someone you know this month" story.

The five scams to watch for right now

Almost every deepfake scam I have seen falls into one of these five buckets. If you can recognise the pattern, you are most of the way to spotting it.

1. The "help me, mum" voice clone

This is the one that got my friend. A scammer clones a child or grandchild's voice, phones a parent or grandparent, and says they are in some kind of trouble. A car accident. A mugging. An arrest abroad. They need money transferred urgently and they cannot talk long.

The goal is panic plus speed. If you are panicking, you are not thinking. That is exactly how they want you.

2. The fake video CEO

This one targets businesses. A finance team member gets a video call from someone who looks and sounds exactly like their CEO or finance director, asking them to urgently transfer money to a new supplier. One Hong Kong firm lost £20 million to this exact scam last year. It happened on a video call with four other "colleagues" on screen. Every single person on the call except the real employee was a deepfake.

3. The celebrity crypto advert

You scroll past a video on Facebook or TikTok. Martin Lewis, Elon Musk, or even the Prime Minister is telling you about a brilliant new investment that guarantees big returns. The face is real. The voice is real. The advice is fake, and so is the investment. These are currently one of the fastest-growing scams in the UK according to Action Fraud (the UK's national fraud reporting centre).

4. The romance scam upgrade

Romance scams used to rely on stolen photos. Now the scammer can join a short video call with you looking exactly like the stolen photos, because the deepfake runs in real time. Which means a lot of the old advice ("ask them to video call, scammers will not do that") no longer works.

5. The fake identity check

Some scams target banks, crypto exchanges, and government services by using deepfake video to pass "selfie" identity checks. If the scammer has your name, date of birth, and a photo from your social media, they can sometimes open accounts in your name.

LEARN AI THE RIGHT WAY

Stop worrying about AI. Start using it properly.

Our online course shows you exactly what AI can and cannot do, so you can spot scams and use the same tools for yourself. Perfect for complete beginners.

View the course

The 7 red flags that give a deepfake away

Deepfakes are getting better, but almost every scam I have seen has at least one of these warning signs. Teach these to your family and you will spot most of them.

1. Urgency and panic

"You must do this right now or something terrible will happen." Real family members and real companies do not work like that. Ever.

2. An unusual payment method

Bank transfer to a new account. Gift cards. Crypto. Anything that cannot be reversed once sent.

3. "Do not tell anyone"

Any request to keep the conversation secret from family, your bank, or your colleagues is a massive red flag.

4. Weird pauses or repetition

Voice clones sometimes repeat short phrases or pause awkwardly. The emotional tone can feel slightly off, like an actor reading a script.

5. Video that looks almost right

Blurry edges around the face, strange blinking, teeth that look fake, or hair that moves unnaturally. If something feels uncanny, trust that feeling.

6. A new phone number or email

"I am calling from a different number because my phone broke." That is the scammer hiding their real identity.

7. A story that involves crypto

If a celebrity, politician, or old friend is recommending a crypto platform, assume it is fake until proven otherwise.

A simple family code word that will save you

This is the single best thing I can teach you from this whole post. It costs nothing. It takes one minute.

Agree a family code word today.

Pick a random word that no one outside your family would know. Something like "pineapple" or "blue whale" or the name of a pet you had 20 years ago. The rule is simple. If any family member ever phones asking for money or for anything urgent, you ask for the code word. If they cannot say it, you hang up.

A voice clone cannot know your family code word. Even if it sounds exactly like your child. Even if the number looks right. No code word, no money.

This one tip alone would have stopped my friend losing her £2,000.

How to protect your business

If you run a small business, especially one where money gets moved around, the same rules apply with a few tweaks:

  • Have a "call back" rule. Any request to change bank details or send a payment must be verified by calling the person on a known, saved number. Not the number the email came from.
  • Use a callback code word between finance and the leadership team, the same as a family code word.
  • Two-person approval for any transfer over a set amount. The scammer can only fool one person at a time.
  • Train your team. Most finance scams work because the junior employee is too polite to push back on the "CEO". Tell your team that pushing back is not only allowed, it is expected.

It is worth making this part of your team induction. Not because AI is dangerous, but because so few people realise how convincing voice cloning has become.

What to do if it happens to you

If you think you have been targeted, or worse, if you have actually sent money:

  1. Call your bank immediately. If the money has only just left, they can sometimes freeze it.
  2. Report it. In the UK, Action Fraud on 0300 123 2040. In the US, the FBI's Internet Crime Complaint Center (IC3).
  3. Tell your family and contacts. If your voice or face has been cloned, it might be used to target people close to you next.
  4. Change your passwords. Especially email, banking, and social media.

And do not be embarrassed. These scams have fooled lawyers, accountants, and finance directors. They are designed to be believable. Reporting quickly is what matters.

Why this will get worse before it gets better

A year ago, making a realistic deepfake took technical skill and expensive software. Today, there are free apps you can download onto a laptop that will clone a voice from a 10-second TikTok clip.

That does not mean you should panic. It means you need to build a few simple habits that make you harder to scam:

  • Family code word. Agree one today.
  • Never move money because someone rushed you.
  • When in doubt, hang up and call back on a known number.
  • Be sceptical of any video or voice clip that wants you to do something, especially pay money.

The good news is that awareness is the single best defence. You reading this article makes you meaningfully harder to scam than the average person. Share it with anyone in your family who might be a target, especially older relatives who may not have seen this technology in action yet.

One more thing

If you take one thing from this post, please let it be this. Pick up the phone today, text your family WhatsApp group, and agree a code word. It will take you two minutes. It might save you or someone you love an enormous amount of money and heartbreak.

AI is an incredible tool. I use it every day and I believe everyone should learn how to use it. But the same technology that can help you write a better business plan can also be used against you. Knowing how it works is the best protection you can have.

If you want to understand AI properly so you can use it for yourself and recognise what it can and cannot do, take a look at our online course. It is built for complete beginners and covers the tools, the risks, and how to use AI without falling for the traps.

Want to learn AI properly?

Our online course covers everything from your first prompt to advanced techniques, with practical exercises designed for real-world tasks.

View the course