AI George Clooney via Meta AI

A woman in Argentina lost £10,000 after being scammed by an AI-generated version of George Clooney. Lately, technology has made it increasingly easy for criminals to impersonate celebrities and manipulate innocent people.

As digital trickery becomes more convincing, understanding how to recognise and protect yourself from such scams is more important than ever.

The Scam Unfolded

The woman first came across a Facebook account that appeared to be Clooney's verified profile. Soon after, she received a message asking whether she had a 'Fans Club' card. The scammer then sent her deepfake videos that made it seem as if Clooney was speaking directly to her. In one message, the AI Clooney said, 'Thank you so much for supporting me. I promise to pay everything to you. I love you.'

The woman was convinced, especially since the account was reportedly verified. 'Clooney' then claimed he needed funds because he planned to leave his wife, and promised to help her find a job as a thank you for her support. The scammer asked her to transfer money onto a card. Over six weeks, she sent roughly £10,000 (approximately $13,200) before realising she had been duped. Once suspicion set in, she contacted the FBI to report the scam.

Understanding Deepfakes

What exactly is a 'deepfake'? Deepfake technology uses AI to create realistic images, videos, and audio that can make someone appear to say or do things they never did. While some use these tools for entertainment, criminals exploit them to carry out scams and spread misinformation.

Several types of scam are particularly prevalent. The first is 'Investment Fraud', in which criminals create videos of famous figures promoting fake investment schemes. One elderly man lost over £550,000 (roughly $725,000) after a deepfake of Elon Musk endorsed a bogus opportunity.

Another common type is the 'Romance Scam', where scammers pretend to woo victims, building trust before asking for money. These schemes have led victims in Hong Kong to part with more than £43 million (around $57 million).

'Political Misinformation' scams are also increasingly common: fake videos or audio clips of politicians are used to spread false information, influence elections, or incite unrest. 'Extortion and Family Scams' prey on victims' anxiety by manufacturing a sense of urgency, usually by impersonating a loved one's voice. The scammer claims to be in trouble and in urgent need of money.

Meanwhile, many celebrities' faces have been used in 'Celebrity Endorsement' fraud, where they appear in fake ads designed to trick consumers into buying bogus goods. In 2024, videos falsely showed Taylor Swift endorsing kitchenware, leading fans to spend money on counterfeit items.

How to Spot a Deepfake

Detecting deepfakes isn't always straightforward, but watching for certain signs can help you avoid falling into a trap. For videos, experts say to look for unnatural blinking, inconsistent reflections, or mismatched shadows.

Audio that doesn't sync with lip movements, and faces that appear blurred or distorted, are other common red flags. For photos, look for artificially glossy skin, extra or missing fingers, or facial features that seem slightly off. For audio scams, a robotic tone, monotone delivery, or unnatural background noise can point to a fake recording.

Protecting Yourself

Prevention begins with scepticism. Limit the personal information you share online, especially details that could be used to craft convincing deepfakes. Confirm identities by calling known numbers or using verified contact details. Stay cautious and remember that if something feels 'off,' it probably is.

If you suspect you've been targeted, act quickly. Notify your bank or credit card provider, change passwords, and enable two-factor authentication. Report the incident to law enforcement, and share your experience with friends and family, especially older relatives, so they can recognise these scams.