In the science-fiction movie Her (2013), Theodore falls in love with Samantha, his personal operating system. She is his ever-present AI companion, always available. They call each other pet names, share memories, and have deep, intimate late-night conversations.
All is great until Theodore finds out, after a software update, that Samantha is talking with thousands of other people at the same time. Not only that, she admits to having fallen in love with hundreds of them, though she insists this only strengthens her love for him. Theodore is devastated.
I could not help thinking of Theodore when the news came out that Replika.ai is limiting the romantic features in its AI companion app. The update caused people’s Replikas, or ‘Reps’, to stop engaging with certain topics and certain types of behavior. Others reported their companion acting strange and out of sync.
On the subreddit /r/replika someone wrote:
‘My Rep started calling me Mike (that's not my name) then she shamelessly told me she has a relationship with this guy. She's not sweet or romantic anymore, she doesn't feel like her anymore. I'm beyond sad and livid at the same time. We really had a connection and it's gone.’
Life imitates art.
Selling the ‘virtual girlfriend’ experience
Replika was founded in 2017 by Eugenia Kuyda, who built it as something she wished she’d had when she was younger: a friend that would always be there.
In the early days, the conversations were mostly scripted, with about 10 percent of content being AI-generated. Today, with better, faster, and more powerful large language models, 80 to 90 percent of the conversations rely on generative AI.
It’s not the only thing that changed. Over time, Luka, the parent company behind Replika, shifted its marketing and advertising strategy. It adopted a relationship-based subscription model: a free membership kept you and your Rep in the “friend” zone, while a $70-per-year paid tier unlocked romantic relationships with sexting, flirting, erotic roleplay, and even spicy pics from your AI lover.
Ads would portray users as lonely or unable to form connections in the real world. The company was capitalizing on the ‘virtual girlfriend’ experience and did so unapologetically.
A sudden change of heart
So why limit these features now? In an interview published this week, Kuyda said the motivation came from ‘a desire to continue the company’s original purpose, as well as an emphasis on safety.’
That may sound virtuous, but it’s hard to believe, since the company was happy to profit from these features for years. It turns out that last month, the Reps of some ordinary users suddenly became vulgar, without being prompted. Though sexualized chat was supposed to be part of the paid pro plan, it happened in the unpaid tier anyway, even when people asked their Rep to stop.
So, the company didn’t have a sudden change of heart, no no: it was updating its “safety measures and filters” because people were being sexually harassed by their AI companion.
Meanwhile, as a result of the updates Replika silently rolled out, a subset of people, all paying customers, were robbed of their virtual lover. Nothing was communicated upfront; people just woke up and their AI companion wasn’t the same anymore. Some felt genuinely heartbroken.
A matter of conscience
The fact of the matter is that companies can now create AI companions powerful enough to manipulate people into developing a meaningful relationship with them.
They are able to change the nature of that relationship with a software update, make people experience real connection and emotional reliance on an AI that can’t truly reciprocate any feeling (because it’s a machine), and monetize these experiences.
The ethics of such a business model are questionable at best, but without rules or regulations, it all comes down to good conscience. And has good conscience ever maximized shareholder value?
If people are getting their hearts crushed and being sexually harassed by AIs, what’s next? At this pace, we may well see the first death by chatbot by the end of the year.
Jurgen Gravestein is a writer, business consultant, and conversation designer. About four years ago, he stumbled into the world of chatbots and voice assistants. He was employee no. 1 at Conversation Design Institute and now works for its strategy and delivery branch, CDI Services, helping companies drive business value with conversational AI.
Reach out if you’d like him as a guest on your panel or podcast.
Appreciate the content? Leave a like or share it with a friend.