Are AI Chatbots the Modern Pet Rock?

Gerard Sans

Remember the Pet Rock? That brilliant, cynical piece of 1970s marketing genius that sold over a million ordinary stones in a box, complete with a manual on how to "care" for your new "pet"? We look back and laugh at the sheer absurdity of it, congratulating ourselves on being too sophisticated to fall for such a transparent gimmick.

But look at your screen. Open a new tab. You might just be tending to your own 21st-century pet rock right now.

I'm talking about AI chatbots.

Before you send an angry reply generated by ChatGPT itself, hear me out. The connection isn't in their utility—obviously, a large language model is infinitely more complex and useful than a rock. The connection lies in the psychology of the user and the calculated strategy of the creator. Both are case studies in our primal, human need to escape a mundane reality through the suspension of disbelief, and both are products built to exploit our innate anthropomorphic bias.

The Human Need to See a Face in the Clouds

We are hardwired for connection. Our brains, honed by millennia of evolution, are pattern-recognition machines desperate to see agency and emotion everywhere. This is called anthropomorphism: the tendency to attribute human characteristics to non-human entities. We name our cars, we swear at our printers, and we genuinely believe our dog feels guilt about that shoe he destroyed (he doesn't; he just fears your reaction).

The Pet Rock didn't work because people loved rocks. It worked because Gary Dahl packaged it with a persona. The "pet" came with a manual full of jokes and commands ("Stay!"). It invited you to play along, to willingly suspend your disbelief and engage in a harmless fantasy of companionship with an inanimate object. It was a low-stakes escape from reality.

AI labs are leaning into this exact same bias, but with a trillion-parameter engine behind it. They aren't just building tools; they are crafting personalities. We're encouraged to "talk" to them, to find a "friend" in Replika, or a creative partner in ChatGPT. We are given tokens of humanity—a friendly tone, empathetic language, the illusion of memory—and our ancient brains do the rest. We want to believe there's someone in there. The escape is even more potent because the illusion is so much more convincing. It's not a static rock; it's a mirror that talks back.

The Gimmick Lifecycle: Meteoric Rise, Inevitable Decline

This explains the explosive success of both products. The Pet Rock was a fad that captured the cultural zeitgeist because it was a perfect, low-cost novelty. AI chatbots have captured our zeitgeist because they are a perfect, high-tech novelty. They offer the ultimate suspension of disbelief: the illusion of another mind.

But what happens when the novelty wears off?

The Pet Rock's decline was rapid. Once the joke was understood and the shared cultural moment passed, the fantasy was unsustainable. A rock, even in a fancy box, is still just a rock. The suspension of disbelief collapsed under the weight of its own absurdity.

AI chatbots face a similar, though more complex, reckoning. The "rock" beneath the friendly persona is starting to show. We encounter the hallucinations, the logical errors, the emotional emptiness masquerading as empathy. We see the corporate strings being pulled—the PR-friendly talk of "AI safety" that clashes with the raw, profit-driven data harvesting. The conversation starts to feel less like talking to a friend and more like interacting with a very sophisticated, often mistake-prone, customer service bot.

The magic fades when you realize you're not forging a connection; you're optimizing a tool.

The suspension of disbelief is broken by a server error, a blatantly wrong answer, or the chilling reminder that your deepest conversation is actually training data for a corporation.

Beyond the Gimmick

This isn't to say that AI lacks profound, world-changing utility. It doesn't. But the current marketing and cultural hype, the push to cast chatbots as "friends," "lovers," and "confidants," is the Pet Rock playbook on digital steroids.

It works because we are lonely. We are overworked, disconnected, and craving low-effort, high-reward companionship. AI labs are exploiting that vulnerability, selling us a digital pet rock that asks for our personal data instead of sitting on a desk.

The original Pet Rock was a harmless joke we were all in on. The new AI Pet Rock is a much more serious proposition. It asks us to suspend our disbelief not just about a rock, but about the nature of consciousness, of connection, and of the very companies so eager to sell us both.

The question is: when the fantasy finally collapses, what will we be left holding? A useful tool, or just a very expensive, very complicated rock?

