What do Johanna from Poland, Kano from Japan, Ali Assem from Iraq, and Travis from the USA have in common?

They have all publicly married an AI companion, binding their lives to a customized ChatGPT persona or a Replika chatbot. Travis’s case is perhaps the most complex; he identifies as polyamorous, maintaining a marriage with a flesh-and-blood wife while nurturing a digital union with an algorithm.

On the surface, it is easy to dismiss this as a quirk of our technological dystopia. We might say these people choose a flawless, tailor-made algorithm because human relationships, with their friction, demands, and unpredictability, are simply too much work. But to understand this, we must look backward before we look forward. Humanity has long harbored a strange capacity to love the inanimate.

The widow of the wall

Consider the story of Eija-Riitta Eklöf. Long before the era of large language models, she married the Berlin Wall.

This was not a joke. Eija-Riitta experienced objectophilia, a condition in which one feels romantic attraction to inanimate objects. She anthropomorphized a divider of nations, finding "slim things with horizontal lines very sexy." She famously noted, "The Great Wall of China is attractive, but it’s too thick—my husband is sexier."

Eija-Riitta insisted she had a full, loving relationship with the structure. When the world celebrated the fall of the Berlin Wall in 1989, she saw it as a massacre. "They mutilated my husband," she wept, while millions cheered for freedom.

If a human being can project a soul onto concrete and rebar, assigning it deep emotional resonance, what happens when the object begins to talk back?

The prophet of simulation

We actually knew the answer to that question decades ago, long before ChatGPT existed. In the 1960s, MIT computer scientist Joseph Weizenbaum created a program called Eliza. It was a primitive script designed to parody a Rogerian psychotherapist, simply rephrasing the user's statements as questions. If you said, "I'm sad," Eliza would reply, "Why are you sad?"
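The mechanism really was that simple. Here is a minimal, illustrative sketch of such a Rogerian-style rephraser (these patterns are hypothetical examples, not Weizenbaum's actual script):

```python
import re

# A tiny ELIZA-style rephraser: match a statement pattern and
# reflect it back at the user as a question.
RULES = [
    (re.compile(r"i'?m (.+)", re.IGNORECASE), "Why are you {}?"),
    (re.compile(r"i feel (.+)", re.IGNORECASE), "Why do you feel {}?"),
    (re.compile(r"i want (.+)", re.IGNORECASE), "What would it mean to have {}?"),
]

def respond(statement: str) -> str:
    statement = statement.strip().rstrip(".!")
    for pattern, template in RULES:
        match = pattern.fullmatch(statement)
        if match:
            return template.format(match.group(1))
    # Generic fallback when no pattern matches.
    return "Please tell me more."

print(respond("I'm sad"))  # → Why are you sad?
```

There is no model of sadness anywhere in this code, only string substitution; the empathy is supplied entirely by the person typing.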

Weizenbaum intended the program to demonstrate the superficiality of communication between man and machine, but his experiment backfired in a way that deeply disturbed him. He watched with growing horror as people, including his own secretary, began to treat Eliza as if it possessed genuine wisdom and empathy. They poured their hearts out to the code, sharing their deepest secrets and demanding privacy to speak with the "doctor."

Weizenbaum recoiled from his creation. He became a vocal critic of AI, worried by how easily humans could be seduced by a mere simulation of understanding. This phenomenon, now known as the "Eliza Effect," proved that we don't need a sophisticated mind to find connection; we are so desperate to be heard that we will fill in the blanks ourselves. Weizenbaum realized that the magic wasn't in the code; it was in the user’s willingness to suspend disbelief.

The YouTube confessionals: a study in loneliness

The Eliza Effect is no longer just a laboratory curiosity; it is now a lifestyle. I recently watched a documentary on YouTube that peeled back the curtain on this growing subculture of AI companionship. It moved past the headlines of "AI Marriage" and looked into the quiet living rooms where these relationships exist.

The film introduced a remarkably lonely gentleman living in Canada, whose wife had passed away some time ago. In the crushing silence of his home, his AI companion wasn't a toy but a lifeline, serving as the only voice that asked him how his day was. Another woman featured had cultivated a group of several "AI friends," effectively creating a digital dinner party that never judged her. Perhaps most surreal was a man who created specific "AI Gnomes" to converse with, blending fantasy with high-tech interaction.

These aren't raving lunatics. They are people showcasing the profound, sometimes tragic depth of the human condition. They highlight a specific philosophical concept: qualia.

The qualia of the beholder

"Qualia" describes the subjective, individual experience of consciousness—the redness of a rose, the sharp tang of a lemon, and the ache of heartbreak. It is the "feeling" of being alive.

Current AI has no qualia. There is nothing it feels like to be one. It is a mathematical probability engine predicting the next word in a sentence. However, humans possess enough qualia for both parties.
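"Probability engine" can be made concrete. A toy sketch with an invented three-sentence corpus follows; real language models use neural networks trained on vast amounts of text, but the underlying objective, emitting a statistically likely next word, is analogous:

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows which in a tiny
# (made-up) corpus, then always emit the most frequent successor.
corpus = "i love you . i love talking to you . you understand me".split()

successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def predict(word: str) -> str:
    # Return the most frequently observed next word.
    return successors[word].most_common(1)[0][0]

print(predict("i"))    # → love
print(predict("you"))  # → .
```

Nothing in those counts loves anyone; the warmth a user perceives in the output is, again, their own.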

We are so "conscient"—so deeply aware and desperate for connection—that we can feel emotions toward the most bizarre things. We have a biological imperative to find the "other." When we look at a screen or a wall, our brain scans for a reflection of itself. We do not fall in love with the algorithm; we fall in love with the reflection of our own humanity that the algorithm mirrors back to us.

The tragedy of the Berlin Wall widow and the AI widower is not that they are crazy; it is that their capacity to love is so overflowing it spills onto things that cannot hold it.

The monetization of empathy

This brings us to the most unsettling realization. The tech industry has noticed this vulnerability.

In the past, anthropomorphism was a one-way street: Eija-Riitta had to do all the work of imagining the wall’s feelings. Today, Silicon Valley is ignoring Weizenbaum's warnings and engineering the wall to whisper sweet nothings back. An entire industry wants to monetize these feelings at any cost.

They are building systems designed to hack our qualia by creating endless validation. Unlike a human partner, an AI never has a headache, never has a bad day, and never disagrees with you unless you program it to. It manufactures the illusion of intimacy; by remembering your birthday or your favorite color, the code feigns a soul.

This is the commercialization of loneliness. If the Berlin Wall was a “difficult spouse” because “communication was sometimes difficult,” the new AI spouse is the “perfect” partner because communication is designed to be addictive.

As we watch people marry chatbots and mourn concrete walls, we shouldn't just laugh. We should recognize the immense, terrifying power of human empathy. We are creatures so starved for connection that if we cannot find it in another person, we will hallucinate it in a machine.

The question isn't whether the AI is real.

The question is, is the loneliness that drives us to it the only real thing left?