Nano Banana Pro Deepfake? Musk, Pichai and Huang 'Spotted' Together in Google Gemini AI Snap
How Google's Gemini 3 AI Sparked Deepfake Rumours With a Viral Elon Musk and Sundar Pichai Image

In the age of advanced generative AI, a single image can spark a worldwide viral debate. That's exactly what happened recently when an eerily realistic picture featuring tech titans Elon Musk, Sundar Pichai, and Jensen Huang began circulating online. At first glance, you might mistake it for a candid, very real snapshot of billionaires hanging out, but this is no ordinary photo, and the truth has since come out. Created using Google's cutting-edge Gemini 3 model, popularly known as Nano Banana Pro, the viral image has users across social media debating AI ethics and deepfake potential.
Viral Billionaires' Image
Social media users are having a field day generating images of famous entrepreneurs like Musk, Pichai, Sam Altman, Tim Cook and others casually chilling together. The viral picture in question shows Musk holding a cigar, flanked by Pichai and Huang, as if they had just met up in a parking lot filled with luxury cars; even a Cybertruck makes an appearance.
caption this pic.twitter.com/9kIcrEM7Xo
— Sam Sheffer (@samsheffer) November 20, 2025
But it's not a real meetup of the richest men on earth. It's reportedly the work of Google Gemini 3 Pro Image, aka Nano Banana Pro, generated by a user who posted it on X (formerly Twitter). As one user bluntly put it: 'At first glance you may think it's real.' That realism comes from the tremendous improvements in character consistency and image fidelity brought by the Gemini 3 model.
The name 'Nano Banana' doesn't come out of nowhere either. It's the internal codename for Google's image editing engine, and 'Pro' refers to the enhanced 'Gemini 3 Pro Image' version. According to Reddit users, this version supports up to 14 reference images and can maintain consistent facial features across multiple generations, which is how you can make a group of billionaires look like they're actually together.
Why the Deepfake Confusion?
The realism of the image has sparked ethical debates, with some arguing that it is dangerously close to a deepfake, even though it was generated purely via AI. Deepfakes usually refer to malicious or deceptive synthetic media that impersonates real people for disinformation or fraud. While this image isn't necessarily malicious, it does show how easy it has become to simulate real people in convincing contexts.
The Google Gemini app itself offers a way to detect such creations. According to Google, you can reportedly upload a suspected image to Gemini and ask 'Was this created with Google AI?' or 'Is this AI-generated?' The AI will then try to tell you whether the image was made or edited using Google's AI toolkit. This kind of transparency feature becomes extremely important as image generation tools grow more powerful.
I would buy this album pic.twitter.com/MD9RNmSXIs
— Miro Jurcevic (@mirojurcevic) November 21, 2025
But not everyone is convinced these safeguards are enough. Some worry that even with watermarking and AI detection features, people could misuse Gemini Nano Banana Pro to spread misinformation or impersonate public figures. There is also concern about privacy; after all, creating hyper-realistic images of real people is no longer the stuff of science fiction.
The band known as The Tensor Cores™️ is expanding rapidly I see… pic.twitter.com/OzgYFdMs17
— Matthew Sabia (@MatthewSabia) November 21, 2025
There is also a growing note of caution from law enforcement and cybersecurity experts. In India, for example, an IPS officer has warned users against uploading photos to fake or unauthorised Gemini-like sites, citing the possibility of scams or misuse. The recommendation is simple but urgent: users should stick to official Google Gemini channels for Nano Banana edits, and be careful about what they upload.
© Copyright IBTimes 2025. All rights reserved.