What Is The Three-Finger Test? How A Deepfake Scammer Got Exposed During A Video Call
A straightforward gesture reveals the limitations of deepfake technology in real-time video calls

A deepfake scammer was exposed during a video call last week after failing the three-finger test, a straightforward challenge that has rapidly become a go-to method for spotting AI-generated fakes.
The incident involved a scammer using real-time face-swapping software to impersonate another person on screen. When asked to perform a simple hand gesture, the technology could not keep up, revealing the deception in seconds. The clip has since spread across social media platforms, raising awareness of how deepfake scams are being tackled in real time.
What is the Three-Finger Test?
The three-finger test asks the person on the video call to hold up three fingers directly in front of their face. The gesture introduces occlusion – the hand blocking part of the face – along with quick changes in lighting and spatial depth.
For a real person, the movement is natural and fluid. Deepfake algorithms, however, often generate the face and hands as separate layers. When they must interact in live conditions, the result is frequently imperfect. Fingers can appear to merge, warp or lose definition, while the overall image shows noticeable artefacts around the edges of the hand.
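The failure mode described above can be sketched with a toy example. This is a deliberate simplification, not a model of any real deepfake pipeline: it simply shows that if a face swapper pastes its synthetic output over the tracked face region without a separate mask for the hand, an occluding hand is painted over and vanishes.

```python
# Toy 6x6 frame: 0 = background, 1 = the caller's real face,
# 2 = a hand held up in front of the face (occlusion).
frame = [[0] * 6 for _ in range(6)]
for r in range(1, 5):
    for c in range(1, 5):
        frame[r][c] = 1          # face region
for r in range(2, 4):
    for c in range(2, 4):
        frame[r][c] = 2          # hand partially covering the face

# A naive face swap pastes a synthetic face (9) over the whole
# tracked face box each frame, with no separate mask for the hand.
swapped = [row[:] for row in frame]
for r in range(1, 5):
    for c in range(1, 5):
        swapped[r][c] = 9

# Every occluding hand pixel has been painted over by the overlay,
# the kind of instant, visible glitch the three-finger test provokes.
hand_visible = any(2 in row for row in swapped)
print(hand_visible)  # prints: False
```

Real systems blend layers far more subtly than this, which is why the failure shows up as warped, merging fingers and edge artefacts rather than an outright disappearance, but the underlying problem is the same: the hand and face are generated separately and must be reconciled live.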
The test has been popularised through demonstrations by cybersecurity professionals. It does not require any apps or software – just a verbal request during the call. Experts from Huntress Labs highlighted its effectiveness in a recent session, showing how even polished deepfakes can crumble under this specific demand.
How the Scammer was Exposed
In the now-viral recording, scam baiter Jim Browning was engaged in a video conversation with an operative from a suspected fraud ring. The caller was employing a deepfake to alter his appearance, likely to appear more trustworthy or to match a fabricated identity.
Browning posed the question in a matter-of-fact tone. 'Can you hold up three fingers in front of your face or anything?'
After a brief hesitation, the scammer attempted the gesture. The deepfake responded poorly. The fingers distorted and overlapped unnaturally, and the facial overlay shifted out of alignment. The glitch was immediate and unmistakable. Browning followed up calmly. 'I think that's a reasonable thing to ask.'
deepfake scammer getting exposed by the 3-finger test pic.twitter.com/T8TGTxTMJE
— non aesthetic things (@PicturesFoIder) March 30, 2026
The scammer's cover was blown. The call ended soon after, with the deception laid bare. The footage originated from a collaborative effort involving Browning and researchers at Huntress Labs. It has been viewed many times and shared widely, including by the popular X account @PicturesFoIder, which captioned it simply 'deepfake scammer getting exposed by the 3-finger test'.
Why the Test May Not Work Forever
While the three-finger test offers a valuable tool against current deepfake scams, it is not infallible. More sophisticated AI models are already being developed that can better manage complex gestures and occlusions. Cybersecurity firms warn that scammers will adapt. Nevertheless, the episode underscores a broader trend. Deepfake technology is increasingly used in targeted attacks, from romance scams to fake executive calls.
The incident serves as a practical lesson for anyone conducting business or personal conversations via video. Incorporating the test early in suspicious calls can prevent escalation. As deepfake capabilities advance, such low-tech solutions provide an accessible layer of protection.
The clip's rapid spread reflects growing public interest in self-defence against AI fraud. With reports of deepfake incidents multiplying, the three-finger test is likely to feature in future safety guidance issued by authorities and tech companies.
© Copyright IBTimes 2025. All rights reserved.