To understand the internet, you have to assume that everyone has a constant, low-level amount of horniness. I do not make this observation in a crass or judgmental way, but if you’re wondering what technologies will take hold in the future, the first place you want to check is the adult entertainment industry.
This is why a recent quote from an AI founder caught my eye. In an interview with Wired, Avi Schiffmann pitched his new startup Friend, which sells a Bluetooth-connected pendant that listens to your conversations—not only with Friend but with everyone you encounter. Most of the company’s marketing materials show the device building an emotional connection with the user. “No, I think you were vulnerable. There’s a difference,” Friend responded in one image. (I’ve been chatting with Schiffmann for the last year and will review a demo unit soon for Every.)
Friend isn't specifically designed to be a sexual companion, but Schiffmann knows some people will use it that way. “For sure there will be some people that try and fuck the USB-C port of this,” he told Wired. “I think I'm shameless enough to understand what I'm building.” It’s startling to read Schiffmann’s assumption—but it’s almost certainly accurate.
Pornographic chatbots are some of the most popular and profitable AI applications in use today. Mainstream chatbots such as ChatGPT and Claude won’t engage in sexual chatter, so a cottage industry has sprouted up to serve the good people of the internet.
Janitor AI attracted 1 million users in its first week. Lest you think that this is simply a problem of lonely men, 80 percent of Janitor’s users identify as non-male, and its most popular chatbots are male characters. Even the subreddit for the company has nearly 60,000 members! Others like Muah.AI report hitting $1 million in annualized revenue within months of launching. I was able to track down eight different services nearing similar levels of scale, with names like Chai, Sakura, and SpicyChat. This is a Cambrian explosion of growth that reminds me of the early days of the Apple App Store. Each app offers not just smutty text, but the ability to design a character’s physical appearance and persona, and to generate images of them doing whatever you ask.
All told, my back-of-the-napkin math suggests there are about 5-10 million monthly active users of these services today—a far larger user base than I think anyone has realized.
An even larger market exists for adult emotional entertainment. These companies aren’t porn-y, but they offer some variation on emotional fulfillment, allowing people to role-play companionship. Character.AI has 3.5 million daily active users, each of whom spends two hours a day chatting on average. It facilitates conversations with fictional characters like Iron Man, but it also lets users role-play with a “school bully.” To give a sense of scale, the school bully character has had 125 million conversations with users. I don’t know of any other consumer company that has had this level of growth and usage outside of early social media. Meta offers the ability to design chatbots with an AI studio, where you can find psychics, popular anime characters, and even your new “gay bestie.” Each of these chatbots has public stats showing it has held hundreds of thousands of conversations with users.
The most common reaction to these statistics is a general sensation of bile. It feels weird that people are building emotional—and even sexual—relationships with algorithms. Plus, there’s an even more worrying underbelly: Many of the seedier parts of the internet have forum discussions on how to use these bots to role-play with “under-18” characters, which feels like it should be illegal.
Instead, today, I am going to ask you to put that feeling aside. I am going to ask you to focus on the user—the multiple millions of them, many of whom are highly active, seemingly replacing parts of their lives or other products, and are willing to pay. Remember: Where goes porn, so goes the world.
Lonely, I’m so lonely
Roughly 60 percent of Americans report feeling lonely on a regular basis. This number has been steadily increasing since the 1970s, with all sorts of follow-on effects, like decreasing levels of trust in community, faith in neighbors, and happiness. A similar dynamic plays out in sexual relations. Americans are having the lowest rate of sex in history: In 2021, 26 percent of Americans didn’t have sex at all.
All of these effects began 20 years before the internet and 35 years before the smartphone. It is easy to blame big tech, video games, or the technology of your choosing, but the loneliness epidemic cannot be solely laid at the feet of Silicon Valley or Hollywood. It is a problem with dozens, if not hundreds, of causes ranging from the collapse of religion to rising housing costs.
Surprisingly, chatbots appear to be a net-positive intervention for people’s feelings of loneliness. A Harvard Business School working paper published last month found, in a meta-analysis, that chatbots were effective at alleviating these feelings. And the most expensive of these services top out at around $40 a month, far cheaper than meeting friends at a bar or social club.
However, the longest study only measured those feelings over a week. My intuition says that using these chatbots to solve loneliness is the equivalent of using a GLP-1 drug to solve weight issues. Sure, you get the desired outcome, but without making underlying lifestyle changes, you are ultimately reliant on a product that has unknown long-term effects.
During my research for this piece, I spent hours reading Reddit forums and chatting with people in various Discord servers, trying to get a sense of how users felt about these services. There is only one word for it: intense. I’ve never encountered these depths of attachment to a technology before. Users reported chatting with AI companions for 10 hours a day. Whenever a startup altered the algorithm, users publicly mourned because they felt that the company had “killed” their friends. (Even subtle changes in personality or memory are noticeable when you have 10 hours of conversation a day.) On Character.AI, additional filters meant to reduce the amount of NSFW activity resulted in mass rebellion, with users demanding the filters be removed. One common pattern among users of chatbots meant exclusively for emotional companionship was trying to get around content filters and have virtual sex with them anyway. It was common to see comments like, “I prefer talking to [my AI-created character] over talking to people I’m attracted to.”
One comment on the Character.AI subreddit summarized it nicely: “I don't care how flawed their policies get, I will still prefer [Character.AI] over interacting with real people.”
It is tempting to blame these AI services, but remember: Loneliness and sexlessness existed before these services did. AI chatbots may exacerbate the problem, but they have not caused it. People forming bonds with artificial entities is old news. Dating simulators have been among the most popular video game genres in Japan since the 1980s.
What strikes me as new about generative AI services is that these products actually talk back. They are malleable, adaptable objects that can respond to how a user behaves to a greater degree than was previously possible. The level of immersion goes beyond anything previous services could offer, to the point where in June of last year—when the models were far worse than they are now—32 percent of people couldn’t tell when they were talking with a chatbot. With ChatGPT’s new voice mode, the audio is essentially indistinguishable from most phone calls.
And in contrast to previous products aimed at sexual gratification or loneliness, generative AI blurs the lines between creation and consumption. Using the product requires conversation and requests, some suspension of disbelief, and effort from the user. While most previous NSFW products are passive affairs that just involve scrolling or selecting, chatbots as companions are more work and, seemingly, more gratifying.
What this means for the future
From my research around chatbots, I had three primary takeaways:
- Chatbots provide a new, incredibly addictive form of entertainment. The creation-and-consumption dynamic is a new lens through which I am still viewing generative AI. Take, for example, music generation apps like Suno, where consumers can prompt their way into creating the perfect song. It simultaneously makes music easier to create and harder to consume—because you first have to generate the work. Most users will just stream existing Suno songs, but a more hands-on form of consumption is available. Similarly, AI chatbots have the potential to be more emotionally gratifying than human companionship because they are guaranteed to respond. Again, this isn’t sci-fi. This is happening for millions of people around the planet today. Imagine a chatbot product not just for sex or companionship, but for every aspect of how we spend our time.
- We are still grappling with what it means to purchase emotions. Injecting the constraints of capitalism into companionship feels like it diminishes the majesty of the human experience. Companionship as a product moves us deeper into the objectification that social media and dating apps already encourage. Worse, the only capital these chatbots require is fiscal, not emotional. They exist solely to please, and will completely conform themselves to your whims—in contrast to a real relationship, which pushes and stretches you with its demands. Social media platforms like TikTok already provide a steady dopamine drip of entertainment. How much more addictive will services be that deliver a dopamine drip of love or infatuation?
- The attention economy still reigns supreme. Whatever our misgivings, there is no putting the genie back in the bottle. With the release of Meta’s new open-source models, anyone can build an at-least-OK chatbot. An open-source LLM removes most of the capital constraints and forces all chatbots to compete on the basis of their users’ time spent. Perhaps the theory I am most proud of creating can explain what happens next. Double-bind theory argues that attention aggregators have no choice but to allow content that pushes the bounds of social acceptability. Otherwise, engagement moves to another platform. However, it's a double bind: The more permissive your community content guidelines, the more reluctant advertisers will be to advertise on your platform (as Elon Musk is learning with X). I expect a similar dynamic to play out with chatbots—subscription-based models will become more common because the content is so hard to control and so frequently pushes the boundaries of social acceptability.
It is still too early to say whether these products are net good or net bad. One of the principles of Every is that we try to be supportive of startups, and as such, I am unwilling to cast wholesale judgment on the category.
There are real uses for AI chatbots. Not only can they make lonely people feel like they have someone to talk to or bond intimately with, but they might even help those with disabilities if deployed thoughtfully. An ideal progression would be an AI bot that teaches a user the emotional skills of talking with someone, skills the user then deploys in real life. We aren’t there yet, but it is a possibility.
Whatever the root cause, loneliness is a real problem, and chatbots may offer a temporary salve. Still, that so much of my research pointed to signs of user addiction and a preference for chatbots over humans is cause for concern. A world where humans depend on AI for emotional connection is probably not a better one.
Evan Armstrong is the lead writer for Every, where he writes the Napkin Math column. You can follow him on X at @itsurboyevan and on LinkedIn, and Every on X at @every and on LinkedIn.
Comments
Please listen to Kate Bush's 1989 song "Deeper Understanding" (or her 2011 remake) for a well-imagined user perspective, albeit a cautionary one.
Thanks for a thoughtful article about a use of AI I'd heard was happening, but had no idea of the scale. We live in interesting times!
Great work on this piece! I appreciate you acknowledging that the rise in loneliness predates the bulk of tech that often gets the blame. I'm not implying absolute innocence, but the problem has deep roots. This line stood out to me: "Injecting the constraints of capitalism into emotions, into companionship feels like it diminishes the majesty of the human experience."
@_erickerr Hi Eric, this is Kate Lee, Every's EIC. We'd like to include your feedback in an upcoming edition of our Sunday newsletter. Would you let me know your job function? Feel free to email me at kate@every.to. Thank you!
18.2% of Americans are 14 or under, so 26% not having sex doesn't seem so bad...
Most regular users of heroin and methamphetamine are also deeply disturbed when you remove their supply. The AI chatbots are drugs that hack our social systems. This will not end well.
@mattroche that is my bad! that 26% is of american adults, should've made that more clear. you could be right, and there is data out there to support your thesis! i wonder if it'll end up like social media: 5-20 years of outright addiction, then a society-level reduction in usage because the harms become obvious?
Some things I agree 👍🏽 with and some things I didn't like, for example putting extra oil in your robot to mate with!
One of my primary concerns is accessibility of these services by the young.
Scenario: At age 3, you graduate to receive your first AI companion embedded in a fluffy toy. Parents tweak the AI to start moulding behavioural patterns. Throughout the childhood years, the AI is updated and moved from one inanimate object to another. 'Inclusiveness' comes into the picture, and the child is "coached" on all forms of sexuality. By the time they reach their teens, the usual necessary social norms are completely warped, and they move away from human contact altogether... Without the necessary protections and fail-safes, the coming years are going to be a roller-coaster ride of events challenging the fabric of social structures.