He sees sex bots as “one part of the relationship spectrum,” not a replacement for human contact, and as a way for users to “indulge in pleasures” that they won’t necessarily have the opportunity to experience IRL.
Quick pleasure
When we imagine who would actually use a chatbot for sexual pleasure, it’s easy to picture a stereotypical heterosexual guy with greasy hair who hasn’t left the house for days, or who is otherwise alienated from physical contact. After all, men were quicker to start using generative AI tools, and discussions of the male “loneliness epidemic” now feel inescapable.
Devlin argues against the notion that “incel types” are the only people who turn to AI bots for fulfillment. “There is a common belief that this applies to single heterosexual men, and none of my research has confirmed this,” she says. She points to the r/MyBoyfriendIsAI subreddit as one example of women using ChatGPT for companionship.
“If you think these kinds of relationships carry risks, let me introduce you to human relationships,” McArthur says. Devlin echoes this sentiment, saying that women face torrents of toxicity from men online, so the decision to conjure “a nice, respectful boyfriend” from a chatbot makes sense to her.
Carpenter is more cautious and clinical in her approach to ChatGPT. “People shouldn’t automatically put it in the social category of something that you can share intimacy with, that’s friendly, or that you should trust,” she says. “It’s not your friend.” In her opinion, interactions with bots should be classified into a new social category, distinct from interactions between humans.
Every expert WIRED spoke with highlighted user privacy as a key issue. If a ChatGPT user’s account is hacked or chat transcripts are otherwise leaked, sexual conversations could be not just embarrassing but harmful. Much like a user’s pornography habits or browser history, chatbot sexts can contain highly sensitive details, such as a closeted person’s sexual orientation.
Devlin argues that erotic conversations with a chatbot could further expose users to “emotional commoditization,” in which horniness becomes a revenue source for artificial intelligence companies. “I think it’s a very manipulative approach,” she says.
Imagine a hypothetical version of ChatGPT that’s amazing at dirty talk and tailored to engage with your deepest sexual desires via text, images, and voice, but that costs an extra monthly subscription fee.
“It’s a really seductive technology. It keeps us connected, whether it’s sexual or romantic,” Devlin says. “Everyone wants contact. Everyone wants to feel needed.”
