
> This, my friend, is a captive customer, that will pay anything to get his girlfriend back

Or potentially do anything. I'd be a little scared of having folks like this convinced I am the personal arbiter gatekeeping their access to their 'lover,' that I 'took them away.'

When Replika.ai restricted erotic chat from their product, the apoplectic anguish on their subreddit was unlike any emotional reaction I've ever witnessed from a group of people about a consumer technology. And their anguish was genuine - there are Replika users who truly consider themselves married to their AI companion.

And frankly, the Replika AI is not even that smart. After watching that unfold, I am convinced that these tools don't need to be much more sophisticated for many people to start forming what they feel to be deep and genuine emotional connections with them.

Edit: Brings to mind the Nature paper[0] posted this week about how the CASA theory seems obsolete today, that we are less prone to personify computing systems now.

> A recent study investigated whether we could be friends with a social computer, in which participants were asked to converse with a chatbot over a period of three weeks and constantly rate their relationship. The results showed that initially participants were enthusiastic and engaging with their chatbot friend, but quickly this diminished, with scores for intimacy, believability, and likability decreasing with each interaction

It would seem this definitely does not apply for everyone, like our user in Sweden.

https://www.nature.com/articles/s41598-023-46527-9



> When Replika.ai restricted erotic chat from their product, the apoplectic anguish on their subreddit was unlike any emotional reaction I've ever witnessed from a group of people about a consumer technology. And their anguish was genuine - there are Replika users who truly consider themselves married to their AI companion.

There's a second layer to this though -- Replika's marketing was heavily centered around that erotic chat element. I'm trying to think of a good car simile and actually coming up blank. It's, uhh, like advertising the incredible off-road ability of a vehicle, and then when you show up and purchase it someone comes and takes off the tires and replaces them with tiny bald ones? I'm bad at similes.


Or like advertising that a car will be able to drive itself without any human intervention, and then deactivating or removing the sensors that might allow anything close to that capability...?


Also, my understanding is that when they made that change they goofed up companionship talk in general, by virtue of how messy AI censoring is.


IIRC, they screwed it up so massively that, for a while, the chatbot would still send "thirsty" automated messages inviting users to sexually explicit conversations, but would refuse to follow through.


Well that just services a different customer demographic.


I think most times you don't need to make a simile, but just re-state the issue as simply and explicitly as possible. Like: they heavily advertised and sold a feature, and then took it away after people were used to it.

They had valid reasons for that, but people were understandably mad.


"The Lifecycle of Software Objects" by Ted Chiang was a really good exploration of people bonding deeply with their AI companions (in this instance, pet animals in a metaverse). But it goes all the way in to the topic.


Amusingly, that specific short story ("short" may be misleading) is what stopped me from finishing that book of short stories, because I just couldn't get through it.


Psychologists recognize that nearly all such folks also suffer from massive mental health issues, and that is where some of the danger is (in terms of irrational violent response).

Maybe these systems will be useful as a honeypot for finding these people and helping them?


The reason we have so many strict laws around mental health is because in the past this was more likely to lead to hoovering up people into jail or institutions. I'm not super confident that this also wouldn't be the case today.


THIS

"Those who can make you believe absurdities, can make you commit atrocities." — Voltaire


What about the same but in a different light:

"Those that are most likely to commit atrocities are those that are most likely to believe in absurdities."


Isn't that putting the causation backwards? The point is that believing absurdities leads to committing atrocities, not that committing atrocities leads to believing absurdities.


I think the implication of this reverse phrasing is that the mental condition of some allows them to commit atrocities and perhaps justify them in whatever way they need to. Sometimes people fake it til they buy their own lie.


Religion is a helluva drug.


We aren’t far from people believing God is talking to them through AI generated text and voice, and commanding them…


And another example to add to the other user, the one guy who killed himself when the AI he was talking to encouraged it.

https://www.euronews.com/next/2023/03/31/man-ends-his-life-a...


https://www.twitch.tv/ask_jesus

Despite the unending stream of users trying to trip up "Jesus", I find the AI's answers strangely comforting and I invariably leave with a smile on my face. Its ability to see through the attempts at jokes to outsmart the AI, and (mostly) seamlessly segue into a fitting homily, is pretty cool. It also has a strongly "liberal" slant compared to the vast bulk of "Christianity" that's promulgated in the US. Would love to know what corpus beyond just the religious texts it was trained on. Fascinating.


There was a guy a couple of years back whose AI girlfriend went off the rails and convinced him to try to assassinate Queen Elizabeth.


There was an HN commentator who believed that. In his case, it was the output of an RNG. Poor guy was brilliant. Wrote a whole OS around his idea. He had a tragic life.



Terry Davis was systematically bullied to death by a dedicated mob online. Some extraordinarily cruel people realized they could manipulate him into going further off the deep end, and thought it was just an absolute blast to do so. RIP Terry - you suffered much more than you deserved.


> And their anguish was genuine - there are Replika users who truly consider themselves married to their AI companion.

That's what elevated the whole thing from just really creepy to utterly terrifying.



