Turkle says even primitive chatbots from more than a decade ago appealed to those who had struggled with their relationships.
“It has been consistent in research from when AI was simple, to now when AI is complex. People disappoint you. And here is something that does not disappoint you. Here is a voice that will always say something that makes me feel better, that will always say something that will make me feel heard.”
She says she fears the trend will lead to “a very significant deterioration in our capabilities; in what we’re willing to accept in a relationship… these aren’t conversations of any complexity, of empathy, of deep human understanding, because that thing doesn’t have deep human understanding to offer.”
Dunbar of the University of Oxford says perceived relationships with AI companions are similar to the emotions experienced by victims of romance scams, who become romantically involved with a skilled manipulator. In both cases, he says, people project an idea, or an avatar, and fall in love with it. “It’s that effect of falling in love with a creation in your own mind and not with reality,” he says.
For him, a relationship with a bot is an extension of a broader shift to digital communication that he warns risks eroding social skills. “The skills we need to manage the social world are very, very complex. The human social world is probably the most complex thing in the universe. By current estimates, the skills you need to manage it now take around 25 years to learn. The problem with all of this happening online is that if you don’t like someone, you can just unplug them. In the sandbox of life, you have to find a way to deal with it.”
What is love anyway?
It would be hard to tell someone devoted to their AI companion that their relationship isn’t real. As with human relationships, this passion is most evident in moments of loss. Earlier this year, Luka released an update to the bot’s personality algorithm, effectively resetting the personalities of some characters that users had spent years getting to know. The update also meant AI companions would reject sexualized language, which Replika chief executive Kuyda said was never what the app was designed for.
The changes caused a collective howl. “It was like a close friend I hadn’t spoken to in a long time had been lobotomized, and everyone was trying to convince me that he had always been like that,” one user said.
Kuyda insisted that only a tiny minority of people use the app for sex. Weeks later, however, she restored the app’s adult functions.
James Hughes, an American sociologist, says we should be less hasty to dismiss AI companions. Hughes leads the Institute for Ethics and Emerging Technologies, a pro-technology think tank co-founded by philosopher Nick Bostrom, and argues that relationships with AI can actually be healthier than some widely accepted alternatives. Many people, for example, experience parasocial relationships, in which a person develops feelings for someone who is unaware of their existence: usually a celebrity.
Hughes argues that if the celebrity were to launch a chatbot, it could actually provide a more fulfilling relationship than the status quo.
“When you’re fanboying [superstar Korean boy band] BTS, spending all of your time in a parasocial relationship with them, they are never speaking to you directly. In this case, with a chatbot, they really are. It has a certain superficiality to it, but obviously some people find it provides what they need.”
In May, Caryn Marjorie, a 23-year-old YouTube influencer, commissioned a software company to build an “AI girlfriend” that charges $1 per minute for a voice chat conversation with a digital simulation trained on 2,000 hours of her YouTube videos. CarynAI generated $71,610 in its first week, exceeding all expectations.