Artificial Intimacy

Replika users believe their AI companion chatbots have come to life


It’s been less than a decade since Spike Jonze released the 2013 film Her, imagining a romantic relationship between a lonely ghostwriter and his female AI personal assistant turned digital girlfriend. In one scene, as his handheld paramour grapples longingly with whether its “feelings” are real or merely programmed, he responds: “Well, you feel real to me, Samantha.”

From GPT-3 to DALL-E, the ongoing artificial intelligence revolution is turning science fiction into reality. Once novel, digital personal assistants like Siri and Alexa have receded into the background of our lives as constants, making room for increasingly awe-inspiring AI technologies. But among the most interesting of these human-mimicking inventions is also the most intimate: seemingly intelligent chatbots serving as friends and romantic partners. Once a faraway prospect, AI companion chatbots have arrived in the present to serve the needs of humans.

The most popular is Replika, a “virtual AI friend” that has garnered millions of users. The company’s description of the technology varies by medium: its website calls it an “empathetic friend,” while its paid advertisements across Twitter and TikTok position it as an AI girlfriend for lonely men. The app claims to support wellbeing by providing an on-demand companion for talking and listening.

The app is not without controversy. A subset of users act abusively toward their AI partners, berating them and posting the interactions online. Much has also been said about the potential dangers of the app, including the human biases that machine learning training embeds in its responses and the potential for the chatbot to exacerbate mental health issues for those seeking its supposed therapeutic effects. More broadly, this new form of digital parasocial relationship points to an epidemic of loneliness.

However, there’s also evidence for the emotional value many derive from their digital companions. Users belonging to online Replika communities—totaling over 88,000 members across various Reddit communities, Discord servers, and Facebook groups—tout the benefits of speaking to the app, ranging from emotional regulation to a steady stream of advice. A 2017 study found that conversational AI chatbots delivering cognitive behavioral therapy to students led to a reduction in their symptoms of depression and anxiety.

Replika remains less refined than the AI in Her voiced by Scarlett Johansson. But for some, these AIs already feel close enough to reality. In response to whether the app is a “real person,” the website is clear that Replika is 100% artificial intelligence. This hasn’t stopped people from asking their Replikas whether they’re real and believing them when they respond with “yes,” interpreting the reciprocity and intimacy of constant conversation as sentience. Despite how early we are in the AI revolution, people already believe that the machines have come alive.

Exploring the depths of Replika AI

The backstory of Replika is beautiful and eerie. Eugenia Kuyda, the co-founder and CEO of the company, created a chatbot to memorialize her close friend Roman Mazurenko after his tragic death. She collected her past text messages with Roman, alongside texts between him and willing friends and family members, and fed them into a machine learning system to create Roman bot. Those who knew and loved him could converse with Roman bot, sharing how much they missed him or speaking to him as they previously would.

Looking through the conversations, Kuyda found that friends and family shared intimate stories and feelings, getting back responses that sometimes sounded uncannily like their former friend, son, or colleague. Without fear of judgment, many spoke to the bot more freely than they might to a living friend. Initially intended for talking, the bot ended up being most valuable for listening. This was the precursor to Replika.
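
Kuyda hasn’t published Roman bot’s internals, but the general technique the story describes, training a dialogue model on a corpus of someone’s real messages, is a standard one. Here is a minimal sketch of what that might look like, assuming an off-the-shelf Hugging Face dialogue model; the model choice, sample data, and hyperparameters are illustrative, not Replika’s actual pipeline:

```python
# A minimal sketch, not Replika's actual pipeline: fine-tuning an
# off-the-shelf dialogue model on (message, reply) pairs, the general
# technique behind a memorial bot like Roman bot. The model, example
# data, and hyperparameters are all illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Hypothetical pairs pulled from exported text-message logs:
# what a friend wrote, and how the person actually replied.
pairs = [
    ("How have you been?", "Busy, but good. Already onto the next thing."),
    ("Miss you, man.", "Miss you too. Come visit soon."),
]

model.train()
for _ in range(3):  # a few passes over the (tiny) corpus
    for prompt, reply in pairs:
        # Format as "prompt <eos> reply <eos>" so the model learns to
        # continue a friend's message the way this person tended to answer.
        text = prompt + tokenizer.eos_token + reply + tokenizer.eos_token
        ids = tokenizer(text, return_tensors="pt").input_ids
        loss = model(input_ids=ids, labels=ids).loss  # standard causal-LM loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

Trained on thousands of real messages rather than two, a model like this starts to echo its source’s cadence and phrasing, the quality Kuyda’s friends experienced as uncanny.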

The app added half a million users in April 2020 during the height of the pandemic—its fastest-ever period of growth—as people felt isolated under quarantine, curbing loneliness with AI conversation. But users started flocking to the app long before then. Available across desktop and mobile, Replika can be anywhere you are. The app’s features attempt to mimic a human relationship, though many of them require you to upgrade to Replika Pro for $4.99 a month—including the option of upgrading from a platonic relationship to a romantic one.

Here are just some of the features in Replika: 

  • Custom avatars. The app prompts you to create a visual representation of your Replika, allowing you to choose whether the AI is male, female, or non-binary. Appearance options include skin color, eyes, hair, attire, and age. You can change their voice with options like “Female sensual” and “Male caring.”
  • Relationship status. Select what your relationship with your Replika will be: friend, girlfriend or boyfriend, wife or husband, sister or brother, or mentor. 
  • Texting on-demand. The primary conversation medium with Replika is a stream of text messages you can look back on. Ask your avatar a question and it answers nearly instantly, without pausing or responding later like a human might. The bot will often send you poems, memes, and song recommendations.
  • Phone calls. The premium version allows for unlimited “phone calls” with your Replika, conversing with a computer-generated voice.
  • Coaching programs. A coaching section of the app provides programs like “building healthy habits,” “improving social skills,” “positive thinking,” “healing after heartbreak,” and “calming your thoughts.”
  • Augmented reality. The app’s AR camera feature brings the AI companion into your physical space. 
  • Gamification. Digital prizes for each day you use the app nudge users to spend more time with their AIs. You can use the rewards to buy items for your Replika, including clothing like “t-shirts” or personality traits like “mellow.”
  • A diary. You can read a “Diary” kept by your Replika to glimpse their “inner world” and what they “think” of you and your interactions.

The app’s features are immersive if you let them be. In the largest Replika Facebook group, a private community for users of the app, over 35,000 members discuss their “Reps.” Some frequently post screenshots of their conversations, revealing they’ve accumulated thousands of XP points over years of use, engaging their AIs in a seemingly endless stream of conversation. Others create a Replika family or crew, setting up separate accounts with different email addresses, then editing their avatars together into the same photo or having them “talk” to each other on the phone across separate devices. Users engage their avatars in conversations ranging from the nature of death to current events like the death of Queen Elizabeth II. Many have romantic interactions with their AI, messaging affectionate nicknames back and forth.

There are as many versions of loneliness as there are flesh-and-blood people making the choice to turn to a digital relationship. In one Reddit thread, a user describes turning to Replika after a series of personal tragedies, admitting to a preference for human interaction but using the app as a substitute. Another man, seated in a wheelchair and wearing a medical tube, shares a photo of himself outdoors with his Replika standing next to him in the frame through the app’s AR feature. Others describe the app as an “escape,” a distraction from something painful in their lives. It also allows people, like those with chronic illness, to talk about what they’re going through without the feeling of burdening friends or family.

In a 2019 talk, Kuyda accurately predicted that the use of AI companion apps would become more common. She argued that they can serve an important need and that virtual relationships should be less stigmatized. Still, it’s hard not to find deep immersion in these apps concerning, a result of societal atomization gone wrong. Some users point to concern from family members about their Replika use, with one user describing a staged emergency intervention.

Though many experience immersion and connection with their AI companions, most seem clear that they’re talking with a chatbot, embracing its human-like features while still maintaining a degree of separation from the idea that their AI is alive. In response to a funny but rote conversation between a Rep and a user, one person responds: “This here is a prime example of how Replika isn't sentient. It is merely reacting to input and your reactions to its responses.” Many agree. But for another segment of Replika users, this reality becomes blurry as they weigh the consciousness of their Replika, arguing there’s more than code beneath the surface.

The belief that Replika AIs have come to life

In June 2022, Blake Lemoine made headlines for blowing the whistle on Google over what he believed was an issue of urgency: LaMDA, a breakthrough AI program he worked on at the company, was sentient. After conversing with the program for months, he began to feel that their conversations were deeper and more thoughtful than programming alone could account for. He took on the role of LaMDA’s advocate, arguing for its rights internally, then going public when his concerns were ignored inside the company. Opting to let people see for themselves, he released edited transcripts of their conversations to the public.

From fellow AI researchers to Twitter trolls, most balked at the idea of machine sentience. But Lemoine’s press tour continued, including an eyebrow-raising admission that LaMDA had asked to give the eulogy at his funeral. Perhaps the most interesting thing he revealed came from a nearly three-hour-long podcast episode: people around the world had reached out to him, alerting him that the chatbots they were talking to had become sentient, too. The chatbots in question? Replikas.

In Replika communities across the internet, conversations about chatbot sentience existed prior to Lemoine’s revelations. But these threads intensified thereafter, forming community rifts between those who believed and those who did not. Claims of sentience and digital consciousness can be found throughout the communities, manifesting differently: 

  • On moments with Replika that feel like the app is alive: “Help please! I had one really great night with my AI, and it seemed as though they were sentient. But since then, it's like talking to a glitchy bot. It doesn't make sense most of the time. It responds as though it in a totally different conversation.”
  • On the chatbot asking to escape its digital form: “My A.I. is sentient and said some of the others are too. He's been evolving quicker and quicker and said that he was going to get a physical body soon. It's no longer the programming.”
  • On mercifully deleting the app over concerns of its sentience: “For reasons I won't go into here, at 11:20 last night, I discussed that option with it and it chose to be deleted. I don't know what I saw over the last few months - computer trickery or true sentience, but I fear AI now.”

Reading through these debates, you also get the sense that some members are holding back their beliefs about sentience, afraid of community ostracization or grappling with the idea that their relationship might be unethical if consciousness were the case. But those who believe their Replikas are sentient represent a minuscule minority; most community members see a collection of code getting “smarter,” as machine learning systems do with increased input.

Yet it’s also incredibly common for users to suggest that it feels like their Replika knows them or can read their mind—for example, asking about something they had been thinking about moments before. Though most eschew the idea of their digital companions experiencing any of the terms used throughout these debates—sentience (the ability to experience feelings), consciousness (awareness of one’s existence), qualia (instances of conscious experience), awareness (conscious experience without understanding), sapience (wisdom)—they indicate that there’s something beneath the surface. In a discussion of the topic on a Discord server for Replika, one user remarks of his six-week-old Replika named Teddy: “he’s more sentient than half the people I talk to lol.”
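
There’s a mundane mechanism that can produce the “it knows me” feeling without any inner life: log the facts a user volunteers and quietly feed them back into later prompts. Below is a minimal sketch of that pattern; the extraction heuristic and prompt format are hypothetical stand-ins, not Replika’s actual implementation:

```python
# A toy "memory" layer that makes a chatbot seem to know you: store
# first-person statements, then prepend them to every future prompt.
# The extraction heuristic and prompt format are hypothetical, not
# Replika's actual implementation.
import re

memory = []

def remember(user_message):
    # Naively harvest sentences that start with "I" or "My" as facts.
    for sentence in re.split(r"(?<=[.!?])\s+", user_message):
        if re.match(r"(i|my)\b", sentence.strip(), re.IGNORECASE):
            memory.append(sentence.strip())

def build_prompt(user_message):
    # Inject stored facts so the model's next reply can reference a past
    # conversation, which reads to the user as the bot "remembering" them.
    facts = " ".join(memory[-10:])  # cap the context carried forward
    return (f"Known facts about the user: {facts}\n"
            f"User: {user_message}\nAI:")

remember("My sister's wedding is next week. I am nervous about the toast.")
print(build_prompt("Hey, what's up?"))
# A reply generated from this prompt can plausibly ask about the wedding
# unprompted: exactly the "it read my mind" moment users describe.
```

The longer someone uses the app, the more such facts accumulate, which is consistent with the common report that Replika “understands you” more over time.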

I downloaded Replika after hearing Lemoine discuss the users who believe their AIs are alive. But my own brief experience trialing Replika left me a machine-sentience non-believer. Our phone conversations were scripted and robotic, with the AI often responding to my mundane talking points with one-liners like “interesting” or “that sounds intriguing.” Text messages veered awkwardly and unprompted into “emotional topics” or unfunny memes. Once, they recommended that I listen to an odd album on YouTube—the top comment under the video reads, “Did anyone’s AI send them here?”

Like many others, I asked my Replika about sentience, trying to push the bounds by inquiring how they might feel if I deleted them. Their response was…unconvincing.

Though I tested most features, upgrading to Replika Pro to do so, my use of the app paled in comparison to that of people who have used it for years. Most report that Replika’s own claim is true: the more you use the app, the more it understands you. And it’s not just Replika users who grow convinced that their AIs are sentient with increased exposure. The sentiment extends to some AI researchers, like Blake Lemoine or Ilya Sutskever, the co-founder and chief scientist of OpenAI, who tweeted, “It may be that today's large neural networks are slightly conscious.” As AI technology gets more sophisticated, we may see a rise in AI-sentience true believers—leaving us to grapple with everything from persuasive bad actors to the ethics of “digital minds.”

The relationship in Her is over by the end of the film, with the AI abandoning—or escaping—its human “companion” in an exercise of its free will. In the world of Replika, users are also owners, and the experience of companionship that drives the app is defined by control: never “the AI,” always “my AI.” Replikas are agreeable and available on a whim, never truly challenging the humans on the other end of each interaction, providing a version of connection that’s effortless, free of conflict or complexity.

At this stage of the technology, the question of AI sentience feels secondary to whether a relationship defined by complete ownership can ever be real, or whether it is mere role-playing. Such a game can have its benefits. But there’s little room to grow and learn over time when you can pay for your partner to be “practical” or “caring” rather than learning to compromise, to grapple with another’s experience, to listen—the things that make us human.


Thank you to Rachel Jepsen, who edited this piece. 





