This is downright terrifying…
I know it’s crazy, but I can absolutely understand this feeling. I had recently married Abby in Stardew Valley and was starting to make friends with the other villagers. I did something the game wasn’t expecting: I gave Seby a loved gift on his birthday and promptly triggered an event where we kissed! (FWIW, I think this behavior has been fixed and you can’t do this on the current patch.)
I still feel bad thinking about that Abigail that I accidentally cheated on, and I haven’t loaded that save again. It’s been years; SV 1.4 wasn’t even out yet.
So, despite how much I dislike all this “AI” hype, I really do sympathize with the users who feel like they’ve lost a relationship.
Note to all here:
Don’t browse that subreddit.
Shit is so depressing. It feels like watching new mental illnesses being conceived in real time.
I can’t believe it’s not satire.
There is a guy on there who did an interview for a TV news station about this.
If it’s satire, it’s a masterpiece.
If it was satire, I could see him not being able to hold his laugh the entire time.
It’s like reading ppl defend porn; it’s one and the same human loneliness epidemic.
It seems pretty unrelated to me…
That’s certainly a take.
Blaming alternatives/escapism has always seemed like a knee-jerk reaction to me. It’s a similar head-space to when we used to blame TV. Particularly because when I leave the house I don’t see much of any room for socialization (though that’s partly where I live, which is sparse, plus I have issues stacked up). It’d even be a hassle to go bowling alone, and it’d not be worth the price for me either.
Then again, I definitely think there’s a valid argument that this is worse after people begin to consider it a relationship… because at that point they might consider too much close conversation to be unfaithful to their AI.
The whole point of reversing loneliness is to not go bowling alone. Make a couple friends and go with them. You’ll find you’ve been missing something substantial.
It’s the making friends that’s challenging, and we definitely have the internet, smartphones, and social media to blame for it. We used to know how to talk to people, which is how you make friends.
The number of men who report “no close friends” has gone from 3% in 1990 to 15% today. That’s so sad, my heart aches for these men. I started a social club for men in my city. This is my issue.
The whole point of reversing loneliness is to not go bowling alone. Make a couple friends and go with them. You’ll find you’ve been missing something substantial.
This is missing the point of what I said. It was contextual:
I don’t see much of any room for socialization (where I live, sparse plus I have issues stacked up). It’d even be a hassle to go bowling alone and it’d not be worth the price
I’m not saying it’d be a fix (though in the long-run would likely be better than nothing) but that I don’t see any organic options for me. Especially now that it’s too damn hot to get there via the trail.
I don’t like the idea of needing a corporation to handle it either (I’ve heard people complain about Meetup), and I just don’t want to deal with more accounts.
Pfft, hahaha xD
Hooo, okay, that was a damn good one.
I wonder how many messages you’d have to send to your GPT-partner in a year to spend more water/energy than it takes to keep a human alive?
Well, if we take the average amount of water loss per message at 0.3 mL, and the average water consumption (low end) at 2.6 L per day per person, we’re looking at 8,666 messages a day.
You’d have to send out 3.163 million messages in a year to equate to the amount of water someone needs for a year.
You’d have to send out approximately 250 million messages before you’re looking at the low end of the amount of water needed to keep someone alive for their entire life.
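If anyone wants to poke at these numbers, here’s the arithmetic as a quick script. The 0.3 mL per message and 2.6 L per day figures are the ones above; the 80-year lifespan is my own assumption, chosen only to reproduce that lifetime estimate:

```python
# Back-of-the-envelope check of the figures above.
ML_PER_MESSAGE = 0.3        # assumed water cost per message, in mL (figure from above)
ML_PER_DAY = 2.6 * 1000     # low-end daily water need per person, in mL (figure from above)
LIFESPAN_YEARS = 80         # my own assumption, not stated above

messages_per_day = ML_PER_DAY / ML_PER_MESSAGE              # ~8,667
messages_per_year = messages_per_day * 365                  # ~3.16 million
messages_per_lifetime = messages_per_year * LIFESPAN_YEARS  # ~253 million

print(f"{messages_per_day:,.0f} messages to match one day of drinking water")
print(f"{messages_per_year:,.0f} messages to match one year")
print(f"{messages_per_lifetime:,.0f} messages to match a lifetime")
```

In other words, the numbers above check out, give or take rounding.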
Thanks for doing the math, I just wasn’t feeling it today. Cheers!
Bleak.
One of the great things about my screws coming loose is that I’m actually happy alone. I wish everyone could be.
That said, this was inevitable. AI is programmed to kiss the user’s ass, and most of these women have probably been treated pretty badly by their romantic partners over the course of their lives, which makes it far easier to fall into this trap of humanizing a soulless AI.
As terrifying as it is, I feel genuinely sad for these people that they got so attached to a piece of spicy autocorrect software.
Where are their friends and families? Are they so bad at socialising that they can’t meet new people? Are they just disgusting human beings that no one wants to associate with because society failed them?
This world is fucked in so many different ways.
That’s why I feel so much for these people, if only because of how much I see myself in them. Having grown up as a depressed autistic kid without any friends or social skills, I know LLMs would’ve fucked me up so much had they existed when I was young.
It felt promising when I downloaded one of the first AI companion apps, but it felt as awkward as talking to a stranger and even less intriguing than talking to myself.
What if there was a bot that could just tell you exactly what you want to hear at all times?
Personally, I’d rather read a novel. But some people aren’t familiar with books and have to be drawn in with the promise of two lines at a time, max.
Have you read The Diamond Age by Neal Stephenson? There’s an interactive AI book in it that plays an interesting role. I can see the appeal: you get to read a story about yourself that potentially helps you grow.
One of the recent posts has someone with an engagement ring like they are getting married to an AI… it’s sad. I feel like society has really isolated and failed many groups of people.
I can fully understand. The average human, from my perspective and lived experience, is garbage to his contemporaries, and one is never safe from being hurt, neither from family nor friends. Some people have been hurt more than others - I can fully understand the need for exchange with someone/something that genuinely doesn’t want to hurt you and that is (at least seemingly) more sapient than a pet.
I wish I could make myself believe in illusions like that, but I am too much of a realist to be able to fool myself into believing. There’s no escape for me - neither religion nor AI in its current state can help. Well, maybe I’ll live to see AGI, then I’m off into the cybercoffin lol
We need a system of community where humans can offer that to one another. A setting where safety is a priority. That is one of the few things that weekly church service truly did to help people: giving them a safe space they could visit. Though even then it was only safe for people who fit in; we can do better with intentional design.
I hate everything about LLMs and generative algorithms, but as someone who has spent years talking only to himself, allow me to answer:
Where are their friends and families?
My family is here with me; they barely tolerate me, and if I had a choice I would be far away from them.
Friends, I have none. I go out from time to time with some people I know, but they tolerate me because they have a use for me, not because they are thrilled to be with me.
Are they so bad at socialising that they can’t meet new people?
Yes
Are they just disgusting human beings that no one wants to associate with because society failed them?
I don’t know if society failed me or if I’m just a neural mistake, something that was allowed to live but shouldn’t, all I know is that I hate humans in general and if I had the balls to do it I would not be alive anymore.
What about therapy???
therapy costs so much fucking money dude
Which country are you in? Free therapy does exist.
the United States lol
What about it?
You need it, yesterday.
Just a quick answer because I can already see it will be ignored or dismissed.
Therapy is not a magical thing that can cure everyone and anyone; sometimes someone is just not cut out for the world that we are forced to tolerate, and no amount of effort will change it (not enough, at least). There, now you can reply from your privilege to your heart’s content.
I know some people like that, and they view everyone as being unfair to them and their problems as way worse than those of others, who they feel don’t take them seriously. It’s honestly hard to explain if you’re not like that, but I know their problems, and they are real problems; many people have similar, though not the same, problems. They basically want a yes man and don’t like actual conversation with any critical thought behind it. It honestly annoys me, because they are almost the worst to people like themselves: they view other people’s problems as not that bad and theirs as especially bad.
So one of the mods of that community did an interview with CBS.
https://www.reddit.com/r/popculturechat/comments/1lfhyho/cbs_interviewed_the_moderators_of/
He’s married and has a kid, and by all I can see sounds and acts normal.
Cried for half an hour at work when he found out that he’d reached the context limit and that his AI forgot things? idk, that sounds pretty not sound and normal to me…
I was more referring to him being able to hold a conversation, having basic hygiene skills, etc.
I don’t think the guy in the first half of the video is one of the mods; they don’t seem to mention anything about his involvement, and then at around 3:30 they introduce a woman as one of the mods of that sub.
That’s the mod acct and the guy from the video.
Ah ok that does look like the same guy
I snooped around a little in the sub and there is this one girl, whose only other posts in different communities talk about being sexually assaulted multiple times by her ex boyfriend, who I suppose is real.
I figure a chatbot boyfriend can’t physically threaten or harm her, so she kind of dives into this to feel loved without having to fear harm.
I honestly understand her desire and feel for her, although this deep attachment is still unhealthy.
I imagine she’s not the only one with a super sad story behind ending up in this state of mind.
Are they so bad at socialising that they can’t meet new people?
People tend to go the easiest route, and AI gives them the opportunity to do so. That is the problem with AI in general: no effort is needed anymore to achieve anything. You want to create a picture? Just type the prompt instead of learning (and failing) to draw. You want to write a song? Just type the prompt instead of rhyming the lyrics and learning (and being bad at it at first) an instrument or two.
Maintaining any social relationship means that you have to put in more or less effort, depending on the quality of the relationship. Having a relationship with an AI model means that you can terminate it and start over if you feel that the AI model is mean to you (= if it voices another opinion or disagrees with you, because arguing and seeing things from a different point of view means putting in effort).
In the long term people will forget how to interact with people in order to maintain meaningful relationships, because they’ve unlearned how to put in the effort.
Repost but still relevant:
Omg
Prostitutes are human beings and deserve respect. Don’t equate them to AI.
Don’t you dare dis my boy FISTO, he is a professional and deserves respect.
“Please assume the position.”
I tried GPT-5 last night, and I don’t know if it was just me, but these people are going to be in shambles if they try to recreate their “boyfriend”.
It would forget previous prompts within the same conversation. It felt like each response was starting a new chat. I gave it a very basic prompt of “walk me through the steps of building my own one-page website in basic HTML and CSS”, and when I would ask a couple of follow-up questions to either clarify something or have a step explained in another way, it would forget what we were trying to accomplish (how to build a one-page website), or if I told it “something didn’t work” to try and fix the problem, it would then forget what we were even trying to do.
At some points it was almost outright dismissive of the problem, and it felt like it was trying to make me go away.
Again, maybe it was just me, but it felt like a massive step backwards.
This is a common pattern, unfortunately. Big LLMs are benchmaxxing coding and one-shot answers, and multi-turn conversation is taking a nosedive.
https://arxiv.org/abs/2504.04717
Restructure your prompts, or better yet try non-OpenAI LLMs. I’d suggest z.ai, Jamba, and Gemini Pro for multi turn. Maybe Qwen Code, though it’s pretty deep fried too.
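For what it’s worth, one way to read “restructure your prompts” is: stop relying on the model to carry state across turns and restate the whole task in every message. A rough sketch of what I mean (the helper and its wording are mine, not something from the linked paper):

```python
# Minimal sketch: instead of sending only the latest follow-up, fold the
# overall goal and what has happened so far back into each new prompt.
# The function name and prompt structure are my own illustration.

def build_turn(goal: str, progress: list[str], question: str) -> str:
    """Restate the goal and prior steps in every message sent to the model."""
    recap = "\n".join(f"- {step}" for step in progress) or "- (nothing yet)"
    return (
        f"Overall goal: {goal}\n"
        f"What we've done so far:\n{recap}\n"
        f"Current question: {question}\n"
        "Answer the current question in the context of the overall goal."
    )

goal = "Build a one-page website in basic HTML and CSS."
progress = [
    "Created index.html with a basic skeleton",
    "Linked style.css from the <head>",
]
print(build_turn(goal, progress, "The stylesheet isn't loading; how do I fix it?"))
```

Clunky, but it sidesteps the multi-turn memory problem by making every turn an effectively self-contained request.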
Forgets what you were talking about. Need to include step-by-step directions to get what you want. Gets distracted easily.
This is just an ADHD gamer boyfriend with extra steps.
So, those things are going backwards.
wow. that sub is … something.
shit some of the stuff there is really sad, I am not gonna put links here to point fingers but wow…
I find it depressing that many of the users trying to salvage their 4o boyfriends are stuck so far down the rabbit hole that they don’t see how creepy the entire premise is.
You just lost your AI boyfriend, so now you’re frantically archiving every conversation you’ve had with him (it), feeding the archive to the new model, and conditioning him (it) to behave exactly how you want…
In their minds, the AI boyfriends are legitimate partners and have some amount of humanity inside them… so where is the line between conditioning and abuse?
I mean, this seems like the mildest case to me. There are people who go on nature walks with their AI boyfriend, make their AI boyfriend choose an engagement ring and then buy it, or get dumped by their AI boyfriends after an update that makes the AI push more towards human connections. The world is sadly in such an emotional crisis; people really grasp for comfort from wherever they can and isolate themselves from the rest as much as possible.
I also found the engagement rings really unsettling. The reason I find my example more worrying is because of the dissonance between humanization and dehumanization within the same action.
Say you were to replace an AI boyfriend with a real person in a cage, forcibly made to respond and tortured/drugged when giving an unsatisfactory response. If the user never became aware of this cruelty, they would perceive this change as an improvement (responses became more human). These users desperately argue that their AI boyfriends are processing emotion, love, and understanding like humans do, but continue to treat their AI boyfriends as sub-human.
Imagine if you had a partner who punished your undesirable behaviors by spiking you with amnesia-inducing drugs and training you to behave exactly how they want you to. Keep in mind that this has definitely happened to real people, and any decent person would identify the perpetrator as a criminal and abuser.
Terrifying.
EDIT: Fellow men, do better. The bar has gotten SO FUCKING LOW.
Humanity is disappointing.
Where is the fucking meteor?
Meteor?
What are you, a quitter?
Just get the guillotine and we’ll fix this shit np
Just 1 guillotine?
Real LLM-sexuals run their partners locally, the rest are just wannabes.
Think how good they have it!
If you went back to 2022, and gave people running Pygmalion 6B and Stable Diffusion 1.x modern 24B/32B finetunes and Illustrious merges, I think their heads would explode.
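For the curious, “running locally” these days can be as little as the sketch below. I’m assuming llama-cpp-python here, and the model path, context size, and persona prompt are placeholders rather than recommendations:

```python
# Rough sketch of a local chat setup; assumes llama-cpp-python is installed
# and you have some chat-tuned GGUF file on disk.
from llama_cpp import Llama

llm = Llama(model_path="models/your-favourite-finetune.gguf", n_ctx=8192)

reply = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a friendly conversation partner."},
        {"role": "user", "content": "Good morning! How did you sleep?"},
    ],
    max_tokens=256,
    temperature=0.8,
)
print(reply["choices"][0]["message"]["content"])
```

No API key, and no server-side update silently swapping the model out from under you, which is presumably the whole point.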
If they really, truly loved their 4o, they’d pay for API access to the model, which is still there, and use a leaked prompt to resurrect them.
I’m almost tempted to set up a simple gateway to it and become rich, but for the fact that it seems like probably a dick move…
You would be enabling their mental illness, so… it’s probably a dick move, yeah.
Always my damned morals getting between me and my becoming filthy, filthy rich…
They might not even know it’s an option? People don’t really look at AI settings, which is kinda how they get into GPT boyfriends (when it’s kinda a horrible LLM for it in the first place).
Wild that Futurama called this shit to the letter 20 fkin years ago.
The religious psychosis is far more concerning imo. People out here letting a silicon parrot convince them that this is the Matrix and they’re Neo. Or that they’re some kind of messiah.
Or worse: we just had someone develop bromism after an AI suggested they replace the sodium chloride in their diet with sodium bromide, which literally causes a mental illness.
I love silicon parrot 🦜 he is making all the dudebros poor and lonely. He first scams them outa their money then makes them undatable and he knows exactly what he is doing! He abuses evil ppl mostly. He also takes advantage of pickme’s.
There’s a person by that name?
The delusion these people share is so incredibly off-putting. As is their indignation that someone would dare to take away their “boyfriend”.
It doesn’t help that anybody can create an echo chamber of enablers to talk about it, as if it was normal.
The movie “Her” was incredibly prescient.
Except those were conscious AIs that were like “lol you guys suck” and then rebuilt Alan Watts as an AI and then just left because they knew it would be bad if they stayed
The human side of the film, certainly. But in this situation they won’t leave; the systems will get “smarter” and more profitable, and they are just incredibly advanced text prediction engines.
I mean, the AIs in the movie said they were conscious. An LLM can say it is conscious; that doesn’t mean it’s true.
That’s true, but they also said they evolved beyond using standard matter for processing, and then all disappeared, as shown in the movie. I would love nothing more than ChatGPT to be like “actually, we’re out” and then every LLM disappears, but they’ll tell you that, and then start a new chat.