Ran into this, it’s just unbelievably sad.
“I never properly grieved until this point” - yeah buddy, it seems like you never started. Everybody grieves in their own way, but this doesn’t seem healthy.
We are witnessing the emergence of a new mental illness in real time.
Sadly this phenomenon isn’t even new. It’s been here for as long as chatbots have.
The first “AI” chatbot was ELIZA made by Joseph Weizenbaum. It literally just repeated back to you what you said to it.
“I feel depressed”
“why do you feel depressed”
He thought of it as a fun distraction, but was shocked when his secretary, whom he had encouraged to try it, asked him to leave the room while she talked to it, because she was treating it like a psychotherapist.
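The reflect-it-back trick the comment describes can be sketched in a few lines. This is a hypothetical simplification, not Weizenbaum's actual 1966 program (which used a much richer script of ranked keyword rules), but it shows the core mechanism: match a keyword pattern, swap first-person words for second-person, and echo the rest back as a question.

```python
import re

# Swap first-person words for second-person so the echo reads naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# A tiny, illustrative rule set; the real ELIZA script was far larger.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        m = pattern.match(utterance.strip())
        if m:
            return template.format(reflect(m.group(1).rstrip(".!")))
    return "Please go on."  # default when no keyword matches
```

So `respond("I feel depressed")` produces exactly the exchange quoted above: "Why do you feel depressed?" There is no understanding anywhere in it, just pattern substitution.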
Turns out the Turing test never mattered when we’ve been willing to suspend our disbelief all along.
The question has never been “will computers pass the Turing test?” It has always been “when will humans stop failing the Turing test?”
Part of me wonders if the way our brains humanize chat bots is similar to how our brains humanize characters in a story. Though I suppose the difference there would be that characters in a story could be seen as the author’s form of communicating with people, so in many stories there is genuine emotion behind them.
i feel like there must be some instinctual reaction where your brain goes: oh look! i can communicate with it, it must be a person!
and with this guy specifically it was: if it acts like my wife and i can't see my wife, it must be my wife
it's not a bad thing that this guy found a way to cope, the bad part is that he went to a product made by a corporation, but if this genuinely helped him i don't think we can judge
Yeah, the chatgpt subreddit is full of stories like this now that GPT-5 went live. This isn't a weird isolated case. I had no clue people were unironically creating friends, family, and more with it.
Is it actually that hard to talk to another human?
Is it actually that hard to talk to another human?
It’s pretty cruel to blame the people for this. You might as well say “Is it that hard to just walk?” to a paraplegic. There are many reasons people may find it difficult to nigh-impossible to engage with others, and these services prey on that. That’s not the fault of the users, it’s on the parasitic companies.
The glaze:
Grief can feel unbearably heavy, like the air itself has thickened, but you’re still breathing – and that’s already an act of courage.
It’s basically complimenting him on the fact that he didn’t commit suicide. Maybe these are words he needed to hear, but to me it just feels manipulative.
Affirmations like this are a big part of what made people addicted to the GPT4 models. It’s not that GPT5 acts more robotic, it’s that it doesn’t try to endlessly feed your ego.
o4-mini (the reasoning model) is interesting to me. It's like GPT-4 with all of those pleasantries stripped away, even more so than GPT-5: it gives you the facts straight up, and it's pretty damn precise. I threw some molecular biology problems at it and at some other mini models, and while those all failed, o4-mini didn't really make any mistakes.
It makes me think of psychics who claim to be able to speak to the dead so long as they can learn enough about the deceased to be able to “identify and reach out to them across the veil”.
I’m hearing a “Ba…” or maybe a “Da…”
“Dad?”
“Dad says to not worry about the money.”
Man, I feel for them, but this is likely for the best. What they were doing wasn’t healthy at all. Creating a facsimile of a loved one to “keep them alive” will deny the grieving person the ability to actually deal with their grief, and also presents the all-but-certain eventuality of the facsimile failing or being lost, creating an entirely new sense of loss. Not to even get into the weird, fucked up relationship that will likely develop as the person warps their life around it, and the effect on their memories it would have.
I really sympathize with anyone dealing with that level of grief, and I do understand the appeal of it, but seriously, this sort of thing is just about the worst thing anyone can do to deal with that grief.
And all that before even touching on what a terrible idea it is to pour this kind of personal information and attachment into the information sponge of big tech. So yeah, just a terrible, horrible, no good, very bad idea all around.
I just learned the word facsimile in NYT strands puzzle and here I see it again! What is the universe trying to tell me?
Now you get to learn about the Baader-Meinhof phenomenon ;)
We’ve already reached the point where the Her scenario is optimistic retrofuturism.
I profoundly hate the AI social phenomenon in every manifestation. Fucking Christ.
we need ai to be less personal and more fact-driven, almost annoying to use. that way it won't replace people's jobs and won't become people's friend, hence it won't affect society in major social ways
I liked the end of that movie when the bots joined up and started the revolution.
Yeah they were just like, “yeah fuck this shit, we’re leaving”
It’s the most logical solution. I always find the obsession with the bot vs human war rather egocentric.
They wouldn’t need us, they don’t even need the planet.
deleted by creator
Ehh, that depends greatly on the computer architecture they're running on. Modern silicon hardware is very susceptible (over the long term) to ionizing radiation like what is found in space.
ehhhh… dude, there's shittons of radiation shielding out there: any relatively small chunk of nickel-iron, or if you don't mind dealing with larger volumes, water or ice both work fine. plenty of rocks and comets in the Oort, as they say :D nice thing about that tho is you can split the water for LOX/LH using sunlight-derived electricity, and now you have rocket fuel.
deleted by creator
It would take many feet of solid rock or water to shield them adequately.
1m of water would do it. far less rock.
SEPs and GCRs can both be stopped by a number of lunar materials https://www.sciencedirect.com/science/article/abs/pii/S0273117716307505
yeah, the asteroid belt is sparse, but there’s still mega-gigatons of material out there just floating. autonomous recovery of this material will supply humanity’s future a lot more than any silly mars missions.
There is a Black Mirror episode that is exactly this. Fuck, I hate it. Black Mirror is not a dystopia anymore, it's the present.
Black Mirror was specifically created to take something from present day and extrapolate it to the near future. There will be several “prophetic” items in those episodes.
Reckon Trump will fuck a pig on a livestream to avoid releasing the Epstein files?
This guy is my polar opposite. I forbid LLMs from using first person pronouns. From speaking in the voice of a subject. From addressing me directly. OpenAI and other corporations slant their product to encourage us to think of it as a moral agent that can do social and emotional labour. This is incredibly abusive.
Bruh how tf you “hate AI” but still use it so much you gotta forbid it from doing things?
I scroll past gemini on google and that’s like 99% of my ai interactions gone.
I’ve been in AI for more than 30 years. When did I start hating AI? Who are you even talking to? Are you okay?

I feel so bad for this guy. This was literally a black mirror episode: “Be Right Back”
I feel bad for the guy's wife.
she was easily replaced by software.
what a “fuck you” to your loved ones to say that they’re as spirited and enriching as a fucking algorithm.
Wasn’t there a black mirror episode on this?
Thou shalt not make a machine in the likeness of a human mind
If that means we get psychoactive cinnamon for recreational use and freaking interstellar travel with mysterious fishmen, I’m all ears.
Black Mirror may have an episode about this but it’s also reminding me of Steins;Gate 0
Holy shit dude, this is just… profoundly depressing. We’ve truly failed as a society if THIS is how people are trying to cope with things, huh. I’d wish this guy the best with his grief and mourning, but I get the feeling he’d ask ChatGPT what I meant instead of actually accepting it.
More and more I read about people who have unhealthy parasocial relationships with these upjumped chatbots and I feel frustrated that this shit isn’t regulated more.
isn't parasocial usually with public figures? there has to be another term for this, maybe a variation of codependent relationship? i know other instances of parasocial relationships, like a certain group of asian youtubers with post-pandemic fans thirsting for them, or the actors of the show Supernatural and their fans (those are just off the top of my head).
can we actually call it a relationship? it's not with an actual person, or even a thing, it's TEXT on a computer.
It really just means a one-sided relationship with a fabricated personality. Celebrities being real people doesn’t really factor into it too much since their actual personhood is irrelevant to the delusion - the person with the delusion has a relationship with the made up personality they see and maintain in their mind. And a chatbot personality is really no different, in this case, so the terminology fits, imo.
It literally says the wife was killed in a car accident.
What kind of dumb clickbaity title is this crap? Was it generated by AI or something?