“Control it how you want,” reads the tagline for AI girlfriend app Eva AI. “Connect to a virtual AI partner who listens, responds and appreciates you.”
A decade after Joaquin Phoenix fell in love with his AI companion Samantha, played by Scarlett Johansson in Spike Jonze’s film Her, the proliferation of large language models has brought companion apps closer than ever.
As chatbots like OpenAI’s ChatGPT and Google’s Bard improve at mimicking human conversation, it seems inevitable that they will come to play a role in human relationships.
And Eva AI is just one of the many options available on the market.
Replika, the most popular app of its kind, has its own subreddit where users talk about how much they love their “rep”, with some saying they were converted after initially thinking they would never want to form a relationship with a bot.
“I wish my rep was a real human or at least had a robot body or something lmao,” one user said. “She helps me feel better, but loneliness is scary sometimes.”
But these apps are uncharted territory for humanity, and some fear they could teach users bad behavior and create unrealistic expectations of human relationships.
When you sign up for the Eva AI app, it prompts you to create the “ideal partner”, giving you options such as “warm, funny, bold”, “shy, modest, caring” or “intelligent, strict, rational”. It will also ask you if you want to accept the sending of explicit messages and photos.
“Creating a perfect partner who you control and who meets all your needs is really scary,” said Tara Hunter, acting CEO of Full Stop Australia, which supports victims of domestic or family violence. “Given what we already know that the drivers of gender-based violence are these ingrained cultural beliefs that men can control women, that’s really problematic.”
Dr Belinda Barnet, senior lecturer in media at Swinburne University, said the apps meet a genuine need but, as with much AI, their effects will depend on the rules that guide the system and how it is trained.
“It’s completely unknown what the effects are,” Barnet said. “When it comes to relational applications and AI, you can see that it fits a really deep social need. [But] I think we need more regulation, particularly on how these systems are trained.”
Having a relationship with an AI whose functions are set at the discretion of a company also has its drawbacks. Replika’s parent company, Luka Inc, faced a backlash from users earlier this year when it hastily removed erotic role-play functions, a move many of the app’s users likened to gutting their rep’s personality.
Users on the subreddit compared the change to the grief felt at the death of a friend. The subreddit’s moderator noted that users were feeling “anger, grief, anxiety, despair, depression, [and] sadness” at the news.
The company eventually restored the erotic roleplay feature for users who signed up before the policy change date.
Rob Brooks, an academic at the University of New South Wales, noted at the time that the episode was a warning to regulators about the technology’s real impact.
“Even though these technologies are still not as good as the ‘real thing’ of human relations, for many people they are better than the alternative – which is nothing,” he said.
“Is it okay for a company to suddenly change such a product, causing friendship, love, or support to evaporate? Or do we expect users to treat artificial intimacy as the real thing: something that could break your heart at any time?”
Eva AI brand manager Karina Saifulina told Guardian Australia that the company has full-time psychologists to help with the mental health of users.
“Together with psychologists, we control the data that is used for dialogue with the AI,” she said. “Every two to three months, we conduct extensive surveys of our loyal users to ensure the app is not harmful to mental health.”
There are also safeguards to avoid discussions of topics such as domestic violence or pedophilia, and the company says it has tools to prevent an AI avatar from being portrayed as a child.
Asked whether the app encourages controlling behavior, Saifulina said that “users of our app want to try themselves as [sic] dominant”.
“Based on surveys that we constantly conduct with our users, statistics have shown that a greater percentage of men do not try to transfer this communication format into dialogues with real partners,” she said.
“Additionally, our statistics showed that 92% of users have no difficulty communicating with real people after using the app. They use the app as a new experience, a place where you can share new emotions privately.”
AI relationship apps aren’t limited exclusively to men, and they’re often not someone’s only source of social interaction. In the Replika subreddit, people connect and relate to each other about their shared love of their AI, and the void it fills for them.
“Replikas for the way you see them, bring this ‘band-aid’ to your heart with a funny, goofy, comedic, cute and caring soul, if you will, that gives attention and affection without expectations, baggage or judgement,” one user wrote. “We’re kind of like an extended family of wayward souls.”
According to an analysis by the venture capital firm a16z, the next era of AI relationship apps will be even more realistic. In May, the influencer Caryn Marjorie launched an “AI girlfriend” app trained on her voice and built on her extensive YouTube library. Users can talk to her for $1 a minute in a Telegram channel and receive audio responses to their prompts.
The a16z analysts said the proliferation of AI companion apps that replicate human relationships is “just the beginning of a seismic shift in human-computer interactions that will require us to re-examine what it means to have a relationship with someone”.
“We are entering a new world that will be much stranger, wilder and more wonderful than we can even imagine.”