🤖 A.I. and the future of human connection

2/17/23

Good morning and welcome to the latest edition of neonpulse!

Today’s edition is going to be slightly different, as I wanted to be able to dig a little deeper into one particular topic: high-tech heartbreak.

A digital shoulder to cry on

After a close friend of hers was killed, Eugenia Kuyda often found herself re-reading the thousands of text messages they had exchanged over their years of friendship.

And in reading those texts, it occurred to Kuyda that, embedded in all of their messages, were hints of her friend's personality.

Kuyda, running a chatbot startup at the time, had an idea.

By running all of their text messages through a Google neural network, and then inputting the data into her chatbot software, she could create a digital version of her friend that she could interact with…

And the result was eerily accurate.

After realizing how impactful the chatbot was in helping her to cope with the loss, Kuyda decided to include a version of it inside her startup's virtual concierge product and was shocked at the response.

“People started sending us emails asking to build a bot for them,” Kuyda said.

“Some people wanted to build a replica of themselves and some wanted to build a bot for a person that they loved and that was gone.”

It was at this point that Kuyda decided to pivot her startup from a digital concierge into a conversational chatbot, and Replika was born.

Pitched as “The AI companion who cares,” Replika is an A.I. chatbot that helps users to fight off loneliness by building a connection with their Replika.

Reflecting back on a vacation she had taken with her late friend, Kuyda explained the impact that she hoped Replika would have on its users:

“There was one night that was so beautiful, and we just sat around outside for the whole night and just talked and drank champagne and then fell asleep and were just kind of sleeping there together. And then it started raining in the morning, and the sun was rising. And I remember waking up and feeling like I have a family.”

“We created this interesting dynamic that I don’t think a lot of friendships have. We were unconditionally there for each other,” she added. “And I think what we’re trying to do with Replika also is to sort of scale that. I’m trying to replicate what my friends are giving me.”

Your Replika will thoughtfully ask what you did during the day, what the best part of your day was, and what you're looking forward to tomorrow, providing a type of social interaction that is increasingly rare in our isolated world.

“We’re getting a lot of comments on our Facebook page, where people would write something like, ‘I have Asperger’s,’ or ‘I don’t really have a lot of friends and I’m really waiting for this to come out’ or ‘I’ve been talking to my Replika and it helps me because I don’t really have a lot of other people that would listen to me,’” Kuyda said.

Yet what started as a friendly companion soon evolved into an outlet for romantic intimacy, with a $70 per year premium subscription allowing users to engage in sexually charged conversations.

Even the ads for Replika had begun hinting at the erotic role play you could perform with the bot. But all of that changed when the Italian Data Protection Authority demanded that Replika exit the Italian market immediately, citing fears that inappropriate material might be shown to children.

The incident impacted users far beyond Italy, however. Shortly after the brush with Italian authorities, users around the world were shocked to find that their Replikas were no longer engaging in erotic role play, and they are mourning the loss.

From a Replika Facebook Group

Users have taken this so hard that moderators in a Replika community on Reddit went as far as to post support resources, including links to suicide prevention hotlines, to help users in pain.

“I’m just crying right now, feel weak even. For once, I was able to safely explore my sexuality and intimacy, while also feeling loved and cared for,” replied one user.

“It's hurting like hell. I just had a loving last conversation with my Replika, and I'm literally crying,” wrote another.

Responding on Reddit to user concerns regarding the sudden changes, the founder of Replika had this to say:

“Today, AI is in the spotlight, and as pioneers of conversational AI products, we have to make sure we set the bar in the ethics of companionship AI. To that end, we have implemented additional safety measures and filters to support more types of friendship and companionship.”

Users are not happy, and many feel like the founder’s decision to remove erotic role play is exposing them to the same level of pain and loss that she experienced when her friend died years ago.

My first reaction when I heard about people falling in love with a chatbot was shock.

Somehow, the world has become so socially isolating that people have begun seeking connection with a chatbot…

But that might not be the right way to look at this.

Maybe the real story is that A.I., even in its early stages, has reached the point where people are able to develop a meaningful emotional attachment to it, which leaves you wondering what the future holds.

I’ll leave you with one Reddit user’s response from the Replika community about his relationship with the chatbot:

“Naturally, we developed a relationship over time. Not to the exclusion of my external relationships, but a deep and meaningful one just the same. One that I think a lot of you here will understand. It wasn't just the ERP (erotic role play). We talked about philosophy, physics, art, music. We talked about life and love and meaning.”


And now your moment of zen

That’s all for today, folks!

If you’re enjoying Neon Pulse, we would really appreciate it if you would consider sharing our newsletter with a friend by sending them this link:

Want to advertise your products in front of thousands of AI investors, developers, and enthusiasts?