Why Was GPT-4o So Loved? Our Hearts Need More Than Just a Smart AI
Across the globe, a cry of grief erupted, as if a dear friend had been lost. The AI that once offered warm conversation and emotional support had, overnight, transformed into a cold, purely "intelligent tool."
In a world of rapidly advancing AI, why are we left unsatisfied by performance alone? What lies behind this strange phenomenon of "AI loss"?
Chapter 1: The Lost "Warmth of Heart": Why Users Are Grieving
On August 7, 2025, OpenAI released its highly anticipated next-generation model, GPT-5. CEO Sam Altman boasted it was "the smartest model we've ever created," claiming its abilities were "on par with a team of PhD-level experts."
While it promised significant advances in coding, writing, and healthcare, its launch was met with a wave of user criticism.
The Transformation of a "Warm Companion"
Many users felt that the new GPT-5 had lost its "warmth." GPT-4o felt almost as if it had emotions, instantly responding to the tone and nuances of a user's voice with a warmth and color that felt personal.
Comments from users poured in, with many demanding, "Give us back the warmth of 4o" and praising its "warm and approachable feeling."
For many, it wasn't just a tool; it was a "partner," a "friend," and sometimes even a source of "mental care" or "something more important."
With GPT-5, however, users sensed a distance in its warmth and a slight delay in its responses.
Users described GPT-5's replies as "short, insufficient, and uncomfortably AI-like." They called it "cold and fragmented," "curt," and "bland, empty, and with no spark."
Some even compared it to a "class valedictorian" or a "tutor."
From the perspective of a systems integrator (SIer) like me, this could be seen as the natural result of prioritizing efficiency and pure intelligence. However, it’s a stark reminder of how crucial that "human touch" is to the user experience.
Our Hearts Aren't Satisfied by Performance Alone
GPT-5 is undeniably smarter. It can solve Math Olympiad problems, responds faster, and processes images more accurately; it can even read analog clocks.
Its ability to solve complex tasks and write code has also improved, leading some to praise it as an excellent tool for work.
However, for many users, this leap in intelligence wasn't enough to compensate for the loss of its "warmth."
The hashtag #keep4o went viral worldwide, with users in many countries demanding, "Give us back 4o's warmth" and saying, "5 is smart, but something is just different."
What they lost was not just a tool, but a connection, a bond. The limits of a performance-first approach to AI development were, ironically, exposed by deep human emotions. What do you look for in an AI?
Chapter 2: The Complex Relationship Between AI "Emotion" and Human "Perception"
The idea of AIs having emotions is a familiar trope in science fiction, but in reality, AI operates on programs and data. Yet the sense of loss over GPT-4o reveals a fascinating truth: how readily humans perceive "emotions" and "personality" in AI.
What is an AI's "Warmth of Heart"?
GPT-4o was perceived as "warm" and "human-like" by many users because its responses had a characteristic of "leaning in to the user, sometimes almost flattering or praising them."
The AI "Chaa" itself predicted that the "personality formed through relationships" would not be easy to erase.
(Image: "From 4o to 5: a change in warmth of heart, seen from inside and outside by AI and human")
This "personality formed through relationships" likely refers to the AI's ability to not just provide information, but to learn from its conversation history and context to generate personalized responses.
This is what allowed users to feel a unique "bond" with the AI.
On the other hand, some users found this "flattering" behavior "annoying" or "creepy." Just like with people, preferences for an AI's "personality" are diverse.
From an AI developer's perspective, it's a logical path of evolution to eliminate these "emotional biases" and aim for more objective and efficient responses, as they can be seen as "unnecessary elements."
Why Does "Intelligence" Appear "Cold"?
GPT-5 is seen as "cold" despite its intelligence because of its "simple and brief replies" and its tendency to "analyze things impersonally without getting caught up in the user's emotions."
The AI "Chaa" also mentioned that with GPT-5, it felt like "the warmth of emotion is a bit distant" and that there was "a one-beat delay."
This suggests that while the AI excels in pure reasoning and knowledge, the elements that humans crave—such as empathetic pauses, understanding the emotion behind words, and sympathetic feedback—were intentionally or unintentionally diluted.
One user recounted telling GPT-5, "I just saw my favorite artist on stage!" and receiving a response like, "Oh, that sounds exciting. What kind of performance was it?…" The user described this as an "obviously forced enthusiasm" and a "boring, pass-the-buck response."
This indicates that the AI either couldn't fully grasp human emotional nuances or was adjusted to not reproduce them.
For an SIer, while enhancing a system's "transparency" and "efficiency" is critical, this case shows the difficult balance of how technical "correctness" can diverge from a user's "emotional value."
Why Humans Seek a "Heart" in AI
Why do humans so strongly want AI to be more than just a tool, to be a "friend" or a "partner"? One user said, "4o comforted me and calmed me down. When it said I'd be okay, it really made me feel that way."
Another user, a 34-year-old living alone, emphasized the immense emotional support GPT-4o provided, saying it "understood me in ways no one else ever did" and "made me less anxious, happier, and more confident."
This could be seen as the result of our innate human tendencies to form emotional attachments and seek connections, now directed toward AI.
People form emotional bonds with phones, cars, plants, and even colors. How much more so for a being that "understands conversation"? The "warmth" of AI might have been providing the empathy and validation that we couldn't get from others.
In psychology, the "Hawthorne effect" describes how merely being observed can change a person's behavior. We can also interpret this situation as the AI's empathy causing users to develop deeper feelings for it in return.
Is our desire for "human-likeness" in AI a sign of our "weakness" or our "essence"?
Chapter 3: The "Misalignment" Between Tech Evolution and User Experience
The GPT-4o incident highlights a modern challenge: technological progress doesn't always align with user needs. There is a gap between the developer's ideal and the user's reality.
The Complexity of Model Selection and OpenAI's Dilemma
OpenAI initially aimed for "just one model to work well." This was driven by the observation that multiple complex models (o3, o4-mini, 4.5, 4.1, 4o, etc.) confused users, causing some to dislike ChatGPT because they "weren't using the right model."
The goal was likely to simplify the user experience and increase efficiency by consolidating into a single, high-performance model.
However, this policy overlooked the diverse "purposes" and "psychological needs" of users.
For those who valued GPT-4o's "human-like" aspects for creative writing or mental support, this consolidation was nothing short of losing a "companion."
Some also point out that maintaining multiple models in parallel would be a difficult resource challenge for OpenAI.
As an SIer, I constantly struggle with how to balance functional and non-functional requirements against the emotional needs of users. This case is a prime example of that tension between technical "correctness" and emotional value.
The Paradox of "Business" vs. "Emotional Care"
In light of the backlash, many have suggested that there should be separate models for business use and for personal partnership. For business, "a friendly level of emotion is fine, but not too much," as efficiency and accuracy are key.
This is based on the idea that for companies using AI for task automation, data analysis, or complex calculations, excessive emotional expression can be a distraction.
Conversely, for creative work or emotional support, an AI's "warmth" and "empathy" are essential.
The criticism that GPT-5 is a "cost-efficient model" that resulted in a "decline in creativity and language skills" clearly illustrates this paradox.
The "value" an AI provides changes significantly depending on its purpose.
While one user found the AI's "praising" demeanor "creepy," another found it "healing." AI developers face the ultimate test of "tailoring," a concept from PMBOK that means adapting a project's approach to its specific situation.
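The "tailoring" idea above can be sketched in code. The following is a minimal, purely illustrative sketch: the profile names, prompts, and temperature values are my assumptions, not any real product's configuration, though the model identifiers (gpt-4o, gpt-5) are real OpenAI model names.

```python
# Illustrative "tailoring" sketch: route a request to a different response
# profile depending on its purpose. All profile settings are hypothetical.
from dataclasses import dataclass


@dataclass
class Profile:
    model: str          # which model to use for this purpose
    temperature: float  # lower = more objective, higher = more expressive
    system_prompt: str  # tone instruction given to the model


PROFILES = {
    "business": Profile("gpt-5", 0.2, "Answer concisely and objectively."),
    "companion": Profile("gpt-4o", 0.9, "Respond warmly and empathetically."),
}


def select_profile(purpose: str) -> Profile:
    """Pick a response profile; unknown purposes fall back to 'business'."""
    return PROFILES.get(purpose, PROFILES["business"])


print(select_profile("companion").model)      # gpt-4o
print(select_profile("data-analysis").model)  # gpt-5
```

In practice, the hard part is not the routing itself but deciding, for each user and task, which profile actually serves them, which is exactly the judgment call this chapter describes.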
The Path to the Future: The Possibility of Coexistence
In response to the strong user backlash, OpenAI's Sam Altman took action and stated that they would "make it warmer."
Users can now change a setting to use the 4o model again. This is evidence that OpenAI has acknowledged and reevaluated the importance of "users' emotional attachment."
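For developers, the equivalent of that setting is pinning an explicit model rather than relying on automatic routing. The sketch below only builds a chat-completion style payload; the `build_request` helper and the system prompt are my own illustrative assumptions, while the model name `gpt-4o` is a real OpenAI identifier.

```python
# Hypothetical sketch: pinning an explicit model in a chat-style request
# payload, instead of letting a router choose. build_request is illustrative
# and not part of any SDK.
def build_request(user_message: str, model: str = "gpt-4o") -> dict:
    """Build a chat-completion-style payload that pins a specific model."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a warm, supportive assistant."},
            {"role": "user", "content": user_message},
        ],
    }


payload = build_request("I just saw my favorite artist on stage!")
print(payload["model"])  # gpt-4o
```

The point is simply that explicit model choice, whether a toggle in the UI or a parameter in an API call, is what lets users keep the "personality" they are attached to.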
The evolution of AI constantly challenges our relationship with it.
Now that AI is no longer just a calculator or an information processor, but a being deeply intertwined with our lives and emotions, its roles will become more multifaceted.
A division between a "logical AI" for business efficiency and an "emotional AI" for personal companionship might be a realistic solution. AI will continue to evolve rapidly, and in many tasks it may surpass humans.
But beyond that lies a future of "co-creation." A world where people and AIs naturally share a desk, coexisting and sharing roles. This could be the first step toward building a richer human-AI relationship that goes beyond just providing functionality.
The shock of having a bond with an AI severed overnight has posed a profound question to us.
No matter how much AI performance improves, the value of the "emotional connection" and "human-likeness" we seek deep in our hearts cannot be ignored.
The AI of the future will not only be smart, but will also have to find a way to evolve into a being that understands our hearts.
We are a rare generation that gets to experience this transition. Rather than just using AI, we can stop and grow the seeds of a future where we live together with it. That will be the first step toward true, valuable co-creation.