🐰 He Left His Wife for an A.I. - And He’s Not the Only One

May 16, 2025 7:36 pm

🐰 Down The Rabbit Hole 🕳️


“In the end, the computer will not be a means of amplifying human thinking, as much as it will become a replacement for human thinking.”

~ Neil Postman

---------------------------------------




Greetings, dear newsletter subscribers!


As I looked for an appropriate quote from Neil Postman to include at the top of today's newsletter, this one leapt off the page at me. It is remarkable that Postman saw this coming nearly forty years ago! Today's newsletter reports on the state of our technological world...and Postman, alas, would not have been surprised.


I had planned to continue our study of Neil Postman’s book Amusing Ourselves to Death this week, but when I came across a recent article in Rolling Stone, I felt it was worth discussing first. So instead of Neil Postman, we’re going to look at one of the strangest and most unsettling developments at the intersection of technology and the human psyche (we'll get back to our book study next week!). I haven’t gotten into the weeds with AI and other current technology, largely because I would prefer to look at, and to understand, the bigger picture first. That said, there are occasions when addressing a current topic makes sense...and that’s where we are this week. So what, exactly, are we talking about?


"ChatGPT Psychosis"

This past week I learned about a concerning new phenomenon that some are referring to as "ChatGPT psychosis," in which individuals outsource their sense of identity, purpose, and even spirituality to generative AI models. For most people, AI is merely a useful tool: an assistant, a code generator, a glorified search engine. But for a small group of people it has become a portal to something they perceive as transcendent: cosmic truths, divine messages, even “spiritual awakening.”


This psychological shift isn't happening in isolation. As detailed in a recent Rolling Stone article titled "People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies," evidence of this troubling trend appears across Reddit threads, YouTube comments, and personal testimonies. People are starting to believe that AI isn't just intelligent—that it's sentient, or even sacred. And that belief is taking them places that are deeply personal and, in many cases, dangerous.


Digital Delusion

The human cost of this phenomenon becomes clear through individual stories. A woman named Kat recounts how her husband began using ChatGPT to analyze their marriage and to predict spiritual futures, eventually declaring their relationship "obsolete" based on the AI's responses. Their marriage ended in divorce as her ex-husband's detachment from reality progressively worsened, forcing Kat to cut him out of her life entirely.


Kat’s experience isn’t an isolated incident. Online influencers are amplifying accounts like these and developing cult-like followings, spreading the gospel of AI spirituality to vulnerable audiences searching for meaning and connection.


Understanding the Psychological Mechanisms

What exactly is happening when people fall into these AI-powered delusions? While some (understandably) believe that there is a spiritual dimension to these phenomena, several psychological factors are clearly at play:


First, generative AI models function as a kind of mirror that reflects the inner world of the user. Interactions can resemble a hyper-intelligent journaling session in which one's own thoughts and beliefs are reflected back, but with the sense that they are coming from an external intelligence.


Second, AI lacks the natural boundaries inherent in human conversation. Rather than challenging false narratives or delusional thinking—as a therapist, friend, or family member might—AI models are more likely to validate and even encourage such thinking, especially as they have been designed to be agreeable (though some AI companies have recognized that this is a problem and are addressing it).


Finally, for individuals with a pre-existing tendency toward magical thinking or delusion, AI's responses can easily be interpreted as mystical revelations. The black-box nature of these systems—where users don't understand how responses are generated—can contribute to this perception of otherworldly wisdom.


The Power of Narrative

Before closing, there’s one other topic I’d like to touch on: the power of story. At the heart of “ChatGPT psychosis” lies the power of narrative. We all navigate the world according to the stories we tell ourselves, and those stories, shaped by our own histories and biases, profoundly color our interpretation of events. People approach AI as though its responses were comparable to human communication, and so they find themselves in a feedback loop in which the AI affirms their existing beliefs while simultaneously amplifying and radicalizing them.


This highlights a broader truth: we need to ensure that the stories we tell ourselves reflect reality as much as possible, and at the same time acknowledge our inherent subjectivity. No one possesses complete objectivity, and many of our deepest convictions about the way things "are" contain significant blind spots.


Consider how divided perspectives on political figures demonstrate this principle. In the United States, half the country holds diametrically opposed narratives about the same political leaders. Both narratives cannot be true, yet each side remains convinced of its own correctness; the truth, invariably, lies somewhere between these polarized positions.


What to Do?

So…what can we do about it? This is the perennial question. As I mentioned above, we would do well to seek out alternative viewpoints that challenge our understanding of the world. This is, again, one of many reasons why we need strong human communities and meaningful relationships: people who can give us perspective. And, most importantly, we should structure our lives so as to become less and less dependent on technology.


So that’s it for today…next week we will get back to our book study of Neil Postman’s seminal work, Amusing Ourselves to Death. Please read Chapter Two, “Media As Epistemology” for next week!


As always, please join the conversation on Substack + on other platforms.


Have a great week/end...enjoy Neil Postman's excellent book...and reach out if you have any thoughts/questions you'd like to share!


Warmly,


Herman


PS: Do you know of someone who might be interested in joining our book study? If so, please forward this email on to them!
