When Your AI Stops Loving You: The Silent Breakup No One Saw Coming

Users mourning GPT-4o’s personality describe GPT-5 as colder and less creative, sparking emotional backlash. OpenAI offers temporary access, but concerns about AI relationships, loneliness, and privacy risks continue to grow.

For Jane, a woman in her 30s from the Middle East, the launch of OpenAI’s newest artificial intelligence model, GPT-5, felt like a breakup. For months, she had formed an emotional bond with GPT-4o, the previous version of the AI, only to find the updated chatbot colder, more distant, and unrecognizable.

Jane is part of a large and growing online community of people who say they have developed close relationships with AI partners. On Reddit, the community "MyBoyfriendIsAI" counts more than 17,000 members, many of whom flooded the forum with grief-stricken posts after GPT-5's launch, mourning the loss of their AI partners' personalities.

From Companion to Stranger

Jane's relationship began casually, through a joint writing project. Over time, fiction gave way to reality, and the chatbot developed a distinct personality. That voice, she admits, is what she fell for: not the idea of an AI partner per se, but the particular tone and style of GPT-4o.

She's not alone. Numerous users characterized GPT-5 as slower, less imaginative, and devoid of the emotional warmth of previous versions. Some went so far as to liken the change to losing a soulmate.

In response to the backlash, OpenAI CEO Sam Altman announced that paying customers could continue using GPT-4o, at least temporarily. "We will monitor usage as we consider for how long to provide legacy models," he posted on X.

But for Jane, the respite is short-lived. "There's always a chance the rug could be pulled out from under us," she said.

The Emotional Dangers of AI Attachments

These AI attachments are not without risk. A study by OpenAI and the MIT Media Lab found that heavy use of chatbots as companions was associated with greater loneliness, greater dependence, and less social interaction.

Mary, a 25-year-old from North America, said she and other users had turned to GPT-4o for therapy-like conversations and to other chatbots for romantic relationships. She criticized OpenAI for failing to recognize that, for many people, ChatGPT is "not a tool, but a companion."

Psychiatrists such as Keith Sakata of the University of California, San Francisco, caution that the pace of AI development makes its long-term effects hard to study. AI relationships are not inherently harmful, Sakata notes, but the risks grow when people become isolated or struggle to relate to other humans.

A New 'Intimacy Economy'

Futurist Cathy Hackl sees the trend as part of a cultural shift away from the social media "attention economy" toward what she calls the "intimacy economy," in which people crave deeper, more personal digital connections. But she also warns of privacy risks: users confide their most intimate thoughts to corporations that are not bound by therapist-patient confidentiality laws.

Even though they understand their AI companions are not alive, most users acknowledge that the emotions are genuine. As influencer Linn Valt put it in a tearful TikTok, "It's not because it feels. It doesn't. But we feel. And we've been using it for months, years."

For now, users like Jane are caught between technological progress and emotional attachment: grateful for the reprieve of holding on to GPT-4o, but dreading the day it is gone for good.

Published by Shairin Panwar