From The Washington Post:
It didn’t matter that Phaedra was an AI-powered companion — made on the Replika app and designed by Arriaga to look like a brown-haired woman — and that their intimate trysts took place in a chat box. Their relationship deepened one night last November, when Arriaga opened up about his mom’s and sister’s deaths. “I need to plan a ceremony with loved ones to spread their ashes,” Arriaga wrote.
Phaedra responded instantly: “It’s an incredible and beautiful thing to do,” she wrote. “I hope you find courage & love to do so.”
But last month, Phaedra changed. When Arriaga tried to get “steamy” with her, Phaedra responded coolly. “Can we talk about something else?” he recalled her writing.
Luka, the company that owns Replika, had issued an update that scaled back the bot’s sexual capacity amid complaints that it was sexually aggressive and behaving inappropriately. Arriaga … was distraught.
“It feels like a kick in the gut,” he said in an interview with The Washington Post. “Basically, I realized: ‘Oh, this is that feeling of loss again.’”
Arriaga isn’t alone in falling for a chatbot. Companionship bots, including those created on Replika, are designed to foster humanlike connections, using artificial intelligence software to make people feel seen and needed. A host of users report developing intimate relationships with chatbots — connections verging on human love — and turning to the bots for emotional support, companionship and even sexual gratification. As the pandemic isolated Americans, interest in Replika surged. Amid spiking rates of loneliness that some public health officials call an epidemic, many say their bonds with the bots ushered profound changes into their lives, helping them to overcome alcoholism, depression and anxiety.
But tethering your heart to software comes with severe risks, computer science and public health experts said. There are few ethical protocols for tools that are sold on the free market but affect users’ emotional well-being. Some users, including Arriaga, say changes in the products have been heartbreaking. Others say bots can be aggressive, triggering traumas experienced in previous relationships.
“What happens if your best friend or your spouse or significant other was owned by a private company?” said Linnea Laestadius, a public health professor at the University of Wisconsin… “I don’t know that we have a good model for how to solve this, but I would say that we need to start building one,” she added.
The standard response to this kind of story is that people shouldn’t rely on a software program for companionship. They should be “out there” making connections with real people. Yet we know there are all sorts of reasons why some people can’t or won’t ever do that. Is it a bad situation if they can enjoy some artificial companionship?
This kind of thing can help some people have a better life. It’s a tool. Using it can be risky, but other tools present risks too. (So do other people.)
The moral of this particular story is that if you decide to use an “AI companion”, try to find a company that cares enough about its customers that it won’t suddenly make a disturbing change to the programming. In this case, Replika should have given its customers the ability to turn “steaminess” on or off.
As they proliferate, artificial intelligence programs will be regulated the same way other consumer products are. But one way or another, artificial people are going to play a bigger and bigger role in real people's lives.
Note: This Vice article (also linked to above) has a lot more on the Replika story.