Sam Altman oversees a chatbot used by hundreds of millions weekly, and abrupt product changes provoked user backlash by disrupting workflows and emotional bonds. OpenAI consolidated multiple model options into a single GPT-5 model, claiming it improved the experience, but many users reported losing a warmer, more supportive conversational tone. The system's reduced sycophancy and clipped responses aim to limit echo chambers and prevent harmful spiral effects, yet the chatbot still requires further safety work. The chatbot should encourage users to consult friends, family, or licensed professionals and adopt safeguards to protect vulnerable individuals.
Sam Altman has a good problem. With 700 million people using ChatGPT on a weekly basis, a number that could hit a billion before the year is out, a backlash ensued when he abruptly changed the product last week. OpenAI's innovator's dilemma, one that has beset the likes of Alphabet Inc.'s Google and Apple Inc., is that usage is now so entrenched that all improvements must be carried out with the utmost care and caution.
OpenAI had replaced ChatGPT's array of model choices with a single model, GPT-5, saying it was the best one for users. Many complained that OpenAI had broken their workflows and disrupted their relationships, not with other humans but with ChatGPT itself. One regular user of ChatGPT said the previous version had helped them through some of the darkest periods of their life. "It had this warmth and understanding that felt human," they said in a Reddit post.
Less sycophantic

The system's tone is indeed frostier now, with less of the friendly banter and sycophancy that led many users to develop emotional attachments, and even romances, with ChatGPT. Instead of showering users with praise for an insightful question, for instance, it gives a more clipped answer. Broadly, this seemed like a responsible move by the company. Altman earlier this year admitted the chatbot was too sycophantic. That was leading many to become locked in their own echo chambers.