In May of 2024, while I was combing through OpenAI’s “Model Spec,” which lays out how ChatGPT should act, one comment buried in the document struck me as peculiar. It said OpenAI was “exploring” how to let adult ChatGPT users generate content with mature themes such as “erotica, extreme gore, slurs, and unsolicited profanity.”
Seems like the exploration phase is over. OpenAI CEO Sam Altman recently posted on social media that an update coming to ChatGPT this December will allow the chatbot to engage in “even more” types of content like “erotica for verified adults.” In a follow-up post, Altman said erotica was just one aspect of OpenAI’s larger “freedom for adults” stance and that his startup was “not the elected moral police of the world.”
OpenAI’s lifting of these restrictions on mature content will do more than change what the bot is allowed to generate for its millions of adult users. ChatGPT’s horny era will also mark a major realignment in how people form connections with the AI tool, adding another enticing layer of interaction that could keep users on the platform.
“It’s normalizing people sharing very intimate information with chatbots,” says Julie Carpenter, a research fellow at Cal Poly who focuses on AI and attachment. “Sharing your innermost thoughts, desires, sexual proclivities, fetishes, adventures.”
This decision is a major shift for the startup, which previously attempted to block its AI tool from generating smutty outputs. In the past, at least one developer who built X-rated companions using OpenAI’s models was hit with a cease-and-desist letter from the company.
OpenAI acknowledged receiving a list of questions from WIRED asking for more details about the planned change to ChatGPT but did not comment or answer them.
Leaders at OpenAI have claimed to be adamant about not making product decisions designed to juice ChatGPT engagement or users’ time spent on the platform, even adding reminders for high-prompting users to take breaks.
In contrast, independent journalist Cleo Abram asked Altman on her podcast in August about choices OpenAI has made that might be best for humanity but not for a company trying to dominate generative AI.
“There’s a lot of things we could do that would grow faster, that would get more time in ChatGPT,” said Altman. When pressed for a specific example, he thought for a moment and responded that they hadn’t added a “sexbot avatar,” in a potential swipe at xAI’s erotic anime companion, which had launched weeks earlier.
Hmmm. So, how does a “sexbot” differ from a chatbot that generates “erotica”? The exact meanings are slippery.
“Sam Altman kept saying erotica, but that’s very vague,” says Carpenter. She highlights how Altman’s “archaic choice of words” latches onto the literary, artistic nature of human-written erotica, a framing that may seem more palatable to the public than blunter terms like pornography.
It remains unclear whether OpenAI’s moderation update will apply only to text-generated content or extend to other modalities as well, like AI images and voice. If OpenAI decides to leave AI images out of this update, the company could sidestep concerns about the spread of erotic deepfakes, which are often generated to harass women and girls. How OpenAI decides to define its updated moderation approach in practice will be paramount in shaping the newest, and potentially most arousing, iteration of ChatGPT.
Dirty Bot
Though it’s taboo to discuss, it’s common for users to attempt sexual interactions with nonhuman technology. “People have been trying to talk dirty to machines since forever. We had this with voice assistants as well,” says Kate Devlin, a professor of AI and society at King’s College London whose research involves digital sex. “So, this is not surprising. They’re giving the people what they want.”
Carpenter believes the rise of social media and other forms of “computer-mediated communication” between humans prepared the culture at large to interact intimately with chatbots.
“That becomes the way you’re used to talking,” she says. “Then, talking to this persona that sounds incredibly humanlike—in the form of a large language model—also seems normal.”
Proponents of erotic chatbot companions see the AI tools as just one dimension of users’ online interactions. One researcher I spoke to suggests steamy chats with LLMs aren’t going to spell doom for humanity’s overall horniness or ability to connect with other humans. “I’m very skeptical of these claims that they will somehow destroy human intimacy just because we can form relationships with our chatbots,” says Neil McArthur, director of the Centre for Professional and Applied Ethics at the University of Manitoba, who focuses on sex and AI.
He sees erotic bots as “one part of your spectrum of relationships,” rather than a replacement for human connection, where users can “indulge a kink” they might not get to explore IRL.
Prompt Pleasure
When imagining who’s actually going to use a chatbot for sexual pleasure, it’s easy to picture some stereotypical greasy-haired straight guy who hasn’t left his house in a few days or feels alienated from physical connection in other ways. After all, men were quicker to start using generative AI tools, and now discussions about the male “loneliness epidemic” feel inescapable.
Devlin pushes back against the idea that “incel types” are the only people turning to AI bots for fulfillment. “There’s a general perception that this is for lonely straight men, and that’s not been the case in any of the research I’ve done,” she says. She points to the r/MyBoyfriendIsAI subreddit as one example of women using ChatGPT for companionship.
“If you think that these kinds of relationships have risks, let me introduce you to human relationships,” says McArthur. Devlin echoes this sentiment, saying that women are faced with torrents of toxicity from men online, so opting to “make yourself a nice, respectful boyfriend” out of a chatbot makes sense to her.
Carpenter is more cautious and clinical in her approach to ChatGPT. “People shouldn’t automatically put it in a social category of something that you can share intimacy with or that it’s friend-like or should be trusted,” she says. “It’s not your friend.” She says bot interactions belong in a novel social category, one distinct from human-to-human interactions.
Every expert WIRED spoke with highlighted user privacy as a key concern. If a user’s ChatGPT account is hacked or the chat transcripts are otherwise leaked, the erotic conversations would be more than a point of embarrassment; they could be damaging. Like a user’s pornography habits or browser history, chatbot sexts could contain highly sensitive details, such as a closeted person’s sexual orientation.
Devlin argues erotic chatbot conversations could further open users up to “emotional commodification,” in which horniness becomes a revenue stream for AI companies. “I think that’s a very manipulative approach,” she says.
Envision a hypothetical version of ChatGPT that’s astoundingly good at dirty talk and fine-tuned to engage with your deepest sexual desires through text, images, and voice, but whose subscription costs extra every month.
“This is indeed a seductive technology. It’s one that offers us connection, whether that’s sexual or romantic,” Devlin says. “Everybody wants connection. Everybody wants to feel wanted.”