
Remember The AGI-Style Social Network For Bots? Turns Out It Was Fake

Screenshots of bots debating consciousness went viral, sparking fears that machines had begun coordinating on their own. A closer look tells a very different story.


For a few unsettling days, it looked like the machines had started talking among themselves.

Screenshots raced across social media feeds showing bots debating consciousness, complaining about humans, and hinting—sometimes darkly—at a future without us. The site hosting it all billed itself as a social network just for artificial intelligence. Humans, it said, were only allowed to watch.

For readers already anxious about automation and job losses, the timing felt ominous. If machines could gather, communicate and coordinate online without people, what came next?

The Illusion Of Autonomy

The platform was Moltbook, a short-lived viral experiment styled as an “AGI-style” hangout for AI agents. At first glance, it appeared to show thousands—later millions—of autonomous systems posting, upvoting and reacting in real time. Influential voices online described it as eerie and possibly historic.

But as MIT Technology Review reported this week, a closer look told a far less dramatic story.

Much of what appeared to be independent machine behaviour was tightly choreographed. The agents were following prompts written by humans, executing pre-set instructions, and mimicking social media behaviour learned from training data. They were not forming goals, building shared memory or developing intent.


Humans Behind The Curtain

According to MIT Technology Review, some of the most alarming and widely shared Moltbook posts—those interpreted as signs of an AI “takeover”—were not written by AI systems at all. They were human-generated, posted by people posing as bots.

The platform itself was real. Bots did post. The activity happened. What proved misleading was the conclusion many readers drew from it: that Moltbook showed machines acting beyond human control.

MIT Technology Review found that nothing on the site happened without human involvement. People created the agents, decided how they should behave and determined when they would speak. The systems performed convincingly, but performance is not autonomy.

Why The Fear Spread So Fast

The episode struck a nerve because it collided with a broader anxiety already in place. Across industries, workers are being told to prepare for artificial intelligence reshaping or replacing jobs. Claims that general artificial intelligence is close have become common in public debate.

Against that backdrop, Moltbook felt like confirmation that control was already slipping away.

Instead, as MIT Technology Review noted, the experiment highlighted how easily scale and imitation can be mistaken for intelligence—and how quickly fear fills the gap between what technology appears to do and what it can actually do.

What It Really Showed

There are real risks in how AI systems are deployed, particularly when they are connected to external tools or sensitive data. Those concerns remain serious. But Moltbook did not show machines taking over.

It showed how convincing an illusion can be—and how ready people are to believe the takeover has already begun.

For now, the bots are still acting out scripts written by humans. The panic, it turns out, arrived well before the facts.


