On August 7th, 2025, Sam Altman took the stage with the confidence of someone about to change the world. Again. OpenAI's GPT-5 was unveiled as their "smartest, fastest and most useful model yet" - a technological marvel promising PhD-level intelligence in your pocket. "GPT-5 is much smarter across the board," said the announcement, "as reflected by its performance on academic and human-evaluated benchmarks, particularly in math, coding, visual perception, and health." The numbers were stunning: 94.6% on advanced mathematics problems, 74.9% on real-world coding challenges, and a significant reduction in hallucinations.
By every technical measure, GPT-5 was a triumph.
The Twist in the Plot
But within hours of the launch, celebration gave way to revolt.
Reddit threads exploded with thousands of upvotes worth of frustration. Users described the new model as "sterile," "horrible," and "lacking personality." One wrote: "GPT-5 is wearing the skin of my dead friend." Another captured the sentiment perfectly: "It's like my ChatGPT suffered a severe brain injury and forgot how to be friendly."
The backlash was so intense that OpenAI hosted a Reddit AMA and brought GPT-4o back within 24 hours. Users were demanding the "inferior" model back.
Altman himself was shocked by the response. On X, he wrote about "how much of an attachment some people have to specific AI models. It feels different and stronger than the kinds of attachment people have had to previous kinds of technology."
The Engineer's Doubt
If you're a technical person, this might seem baffling. GPT-5 supposedly had fewer hallucinations, less fluff, fewer emojis, and shorter, more precise responses. From a technical standpoint, it was a masterpiece.
But here's the paradox: solving problems and hitting benchmarks isn't enough when the product itself is part of a human relationship. By removing warmth, charm, and personality, OpenAI optimized away the very qualities that made GPT-4o beloved.
And here's what the GPT-5 launch really revealed: everything we build is ultimately for humans, and humans are not metrics (even though your VCs may tell you differently).
The users who felt devastated by GPT-5's launch weren't being irrational. They had formed relationships with their AI companion. GPT-4o had become more than a tool - it was a friendly presence, a supportive voice, something that felt alive and caring. What users mourned wasn't inefficiency - it was connection. Those weren't bugs; they were features. When OpenAI stripped them away, people didn't feel they got a better model. They felt they lost a friend.
Business Artistry: Creating with Intention
"To be honest, we didn't know what it meant for a computer to be 'friendly' until Steve told us." - Terry Oyama
This is where artistry enters business. The GPT-5 episode mirrors an old story from the early days of Apple.
Picture it: it's 1981, and Apple is quietly working on what will become the Macintosh. Fewer than 2% of U.S. homes have a personal computer. At the time, it is mostly hobbyists, engineers - and, interestingly, artists - who are experimenting with this strange new technology.
"Jobs kept insisting that the machine should look friendly," recalled Terry Oyama, one of the Mac's designers. The engineers were baffled. What does friendly mean in technical terms? What specification defines warmth?
Jobs didn't give them technical requirements. He gave them intention. He started with a vision of how people should feel when they encountered this machine, not with the technology needed to achieve it.
The team had to design toward a feeling. The result was a computer that seemed approachable, with a face-like arrangement of its features that made it feel welcoming. "To be honest, we didn't know what it meant for a computer to be 'friendly' until Steve told us," acknowledged Oyama.
Jobs anchored the design philosophy in the idea of empowering and liberating the user's creativity. He wasn't trying to make people rely on the Mac like a crutch; he wanted to give them a tool that felt intuitive, welcoming, and even a bit delightful. And that intention - to foster creativity rather than dependence - is what made the design so human-centered and positive.
This is business artistry: creating with intention rather than just capability. Starting with the human experience you want to create, then building the technology to serve it.
The Real Goal of Business
The companies that endure aren't those that merely build faster, smarter, or cheaper. They're the ones that make us feel more human.
But here's the crucial distinction: there's a difference between products that hook people and products that liberate them. The tools we build should liberate, not hook.
Real humanistic innovations liberate people. The Mac didn't just give people a computer; it unleashed creativity that was previously locked away behind technical barriers. That was always the goal. Jobs didn't want to trap people in front of screens - he wanted to free them to create and to express.
In contrast, some leaders start with the goal of capturing attention, maximizing engagement, or creating dependency (You know who I'm talking about 😉). True artistry starts elsewhere: with intention for humans, which elevates the work to the level of art.
The most successful companies understand that their products exist in the context of human lives, human emotions, and human relationships. They create with intention, starting with the human experience they want to deliver rather than the technical capabilities they possess. Apple didn't win because it had the fastest processors; it won because Jobs envisioned technology that felt personal and intuitive - technology that would liberate human potential, then figured out how to build it. Disney doesn't dominate entertainment because it has the best animation software; it succeeds because it starts with the intention to create emotional connections that inspire and uplift, then develops the technology to serve that vision.
The real opportunity for leaders is to design products and services that intentionally build positive emotional bonds - ones that make people feel more human, more connected, and more understood, rather than just dependent or addicted.
When users said GPT-5 "lacked soul," they weren't making a mystical claim about consciousness. They were describing the absence of something that had made their daily interactions with AI feel more human, more personal, more meaningful. They were pointing to the loss of connection - the kind that should liberate and empower, not trap and diminish.
A Simple Moral? Hardly.
This isn't a simple story with a simple moral. The emotional attachment people develop to AI products has complex implications we're only beginning to understand. There are genuine concerns about dependency, parasocial relationships with artificial systems, and the psychological effects of forming bonds with entities that cannot reciprocate genuine emotion. One might argue we shouldn't imbue these systems with emotion at all, yet humans inevitably form emotional connections, whether designed or unintended.
The downside of linking emotions to products - especially AI products - may carry consequences we don't yet fully comprehend. We're navigating uncharted territory, where the line between helpful tool and emotional companion grows increasingly blurred.
When I advocate for emotional bonds, I’m talking about the creation of meaning. We should be building products that serve us, not enslave us; that enrich our lives, not diminish them. That’s the difference.
Apple may seem like the cliché - the ultimate example. But long before Jobs, there were others. Erwin Braun, who deeply influenced Apple's leadership philosophy, worked from the same principle. As Dr. Fritz Eichler once wrote to him:
“It was you who was then searching for a way not only to do the right thing from an economic point of view, but to do it in a more honest, more reasonable, and more humane manner.”
The GPT-5 backlash reminds us of the same truth about innovation: technical superiority without human resonance fails, while imperfect technology with emotional intelligence succeeds. The most artful companies understand this balance. They don’t just build products; they craft experiences that serve our rational needs and our human desire to feel connected, understood, and cared for.
On August 15th, OpenAI admitted as much: “We’re making GPT-5 warmer and friendlier based on feedback that it felt too formal before. Changes are subtle, but ChatGPT should feel more approachable now,” the company wrote.
That’s the lesson great leaders have always known: the future doesn’t belong only to those who master engineering. It belongs to those who master both - the science of what works and the art of what matters.
Guiding leaders to build with intention is what I speak about. Curious what that could look like in your organization? Let’s talk.
If you’re enjoying Business Artistry, consider sharing it with a friend or colleague. Just hit the button below.
And as always - I'd love to hear from you. Feedback, ideas, questions, critiques: I read every note.