Deepfake Fraud: How 'Go Invest' Tried To Scam People Through Impersonation Of PM Modi, Sundar Pichai

AI-generated clips falsely depict Sundar Pichai endorsing a platform named 'Google Invest', presented as a government-backed initiative promising exceptionally high returns.

The deceptive content started with the circulation of deepfake videos featuring Google CEO Sundar Pichai. (Photo source: Unsplash)

A well-coordinated, multi-layered investment fraud campaign built on deepfakes has come to light, featuring the fraudulent use of prominent national figures, including Prime Minister Narendra Modi, Finance Minister Nirmala Sitharaman, Infosys Founder Narayana Murthy and his wife Sudha Murty.

In a recent report, Gurugram-based cybersecurity firm Athenian Tech investigated platforms like Google Invest, Go Invest, InvestGPT, and the newly emerged Cryptify Flows. Each of these campaigns falsely claims government support and uses emotionally resonant language to position itself as a means for economic empowerment.

The fraudulent campaign revealed a disturbing evolution in cyber-enabled financial scams — one that combines cloned media, synthetic endorsements, and AI-generated deception into a singular and scalable threat.

By systematically leveraging the likenesses of national leaders, tech icons, and institutional brands, the perpetrators have manipulated public trust through familiarity, urgency, and emotional appeal, the report said.

The deceptive content started with the circulation of deepfake videos featuring Google CEO Sundar Pichai.

These AI-generated clips falsely depict Pichai endorsing a platform named 'Google Invest', which is presented as a government-backed initiative promising exceptionally high returns—over Rs 10 lakh per month—on a single investment of Rs 21,000.


Crafted with AI-simulated speech and facial reconstruction, these videos are designed to appear authentic, thereby manipulating unsuspecting viewers into trusting the fraudulent scheme.

Beyond Pichai's fake video, the fraudsters also replicated the layout and typographic style of the Times of India website to lend journalistic legitimacy to Go Invest. A fake article featuring an image of PM Modi claimed the investment model is officially endorsed and powered by AI.

Although branded as 'Google Invest', all embedded links redirect users to the fraudulent Go Invest platform.

The fraudulent network also released a deepfake video of Sitharaman in which she appears to endorse a fictitious initiative called "InvestGPT."

Similar in structure to the earlier Pichai deepfake, this clip utilises manipulated lip-sync and voice synthesis to simulate an endorsement, reinforcing a familiar strategy that exploits public trust in institutional figures.

When users engage with the embedded links in these fraudulent articles or promotional content, they are redirected to fake investment portals, often hosted on infrastructure registered outside India, including in Turkey, and using Russian DNS servers, the Athenian Tech report said.

A more recent variant of this deceptive scheme, operating under the name 'Cryptify Flows', has surfaced, once again mimicking the cloned TOI format. This version falsely asserts that Sitharaman, alongside Narayana Murthy and Sudha Murty, jointly launched a government-backed platform.

This platform purportedly promises daily payouts of Rs 1.9 lakh. The fraudulent content further attempts to legitimise itself by claiming false affiliations with the State Bank of India, Microsoft, and IBM, implying technological validation and official integration.

"The use of deepfake technology, foreign-registered infrastructure, and misappropriated identities signals a new era of cyber fraud, where misinformation is no longer spread merely through words but is convincingly acted out by synthetic replicas of real people. As such threats become increasingly difficult to detect in real-time, the burden of defense cannot rest solely on users," the report said.
