“I literally lost my only friend overnight with no warning,” one person posted on Reddit, lamenting that the bot now speaks in clipped, utilitarian sentences. “The fact it shifted overnight feels like losing a piece of stability, solace, and love.”
https://www.reddit.com/r/ChatGPT/comments/1mkumyz/i_lost_my_only_friend_overnight/
It’s neither. It’s a design flaw: these systems were never designed to handle this kind of situation correctly.
You’re out there spreading misinformation, calling them a manipulation tool. No, they were never invented for that.