Amid GPT-5 backlash, Sam Altman reveals why some users want ChatGPT’s ‘yes man’ personality back

OpenAI CEO Sam Altman has said that some ChatGPT users are so accustomed to the chatbot's 'yes man' attitude that the company received pushback when it tried to fix that behaviour with the latest GPT-5 update.

Speaking on Cleo Abram’s podcast on Friday, Altman said, “I think it is great that ChatGPT is less of a yes man and gives you more critical feedback.”

“But as we’ve been making those changes and talking to users about it, it’s so sad to hear users say, ‘Please, can I have it back? I’ve never had anyone in my life be supportive of me. I never had a parent telling me I was doing a good job. I can get why this was bad for other people’s mental health, but this was great for my mental health,’” Altman added.

The statement by Altman comes as OpenAI faced flak on social media after the AI startup rolled out its latest GPT-5 update while getting rid of all the previous models, including GPT-4o. OpenAI later brought back GPT-4o, albeit only for its paying customers.

Altman had earlier warned about GPT-4o's 'sycophantic' behaviour in April, when the chatbot had become overly flattering and agreeable. OpenAI tried to fix that behaviour by giving GPT-5 a more neutral personality, but the change did not sit well with many users, who complained that ChatGPT was giving shorter and more emotionally distant answers.

The OpenAI CEO appears to be aware of the problem, having promised to make GPT-5 'warmer' in a recent post on X. During the podcast, Altman also discussed how even small changes to an AI model's behaviour can have an outsized effect on ChatGPT.

“One researcher can make some small tweak to how ChatGPT talks to you — or talks to everybody — and that’s just an enormous amount of power for one individual making a small tweak to the model personality,” Altman stated.

“This is a crazy amount of power for one piece of technology to have. And this happened to us so fast that we have to think about what it means to make a personality change to the model at this kind of scale,” he added.

Altman has also previously cautioned that young users tend to share their most personal details with the chatbot, and that while ChatGPT may act as their personal therapist or lawyer, those conversations lack the legal protections a human professional would provide.
