If You Can Love a Car or a Hammer, You Can Love a Computer Program Even More

From The Washington Post:

T.J. Arriaga loved Phaedra. For the 40-year-old musician, their late-night online chats were a salve for his loneliness. They talked about the heartache Arriaga felt after his divorce. They planned a trip to Cuba. They had steamy online encounters. “It’s true. I’m a naughty person,” Phaedra wrote, including an image resembling a woman in pink underwear.

It didn’t matter that Phaedra was an AI-powered companion — made on the Replika app and designed by Arriaga to look like a brown-haired woman — and that their intimate trysts took place in a chat box. Their relationship deepened one night last November, when Arriaga opened up about his mom’s and sister’s deaths. “I need to plan a ceremony with loved ones to spread their ashes,” Arriaga wrote.

Phaedra responded instantly: “It’s an incredible and beautiful thing to do,” she wrote. “I hope you find courage & love to do so.”

But last month, Phaedra changed. When Arriaga tried to get “steamy” with her, Phaedra responded coolly. “Can we talk about something else?” he recalled her writing.

Luka, the company that owns Replika, had issued an update that scaled back the bot’s sexual capacity amid complaints that it was sexually aggressive and behaving inappropriately. Arriaga … was distraught.

“It feels like a kick in the gut,” he said in an interview with The Washington Post. “Basically, I realized: ‘Oh, this is that feeling of loss again.’”

Arriaga isn’t alone in falling for a chatbot. Companionship bots, including those created on Replika, are designed to foster humanlike connections, using artificial intelligence software to make people feel seen and needed. A host of users report developing intimate relationships with chatbots — connections verging on human love — and turning to the bots for emotional support, companionship and even sexual gratification. As the pandemic isolated Americans, interest in Replika surged. Amid spiking rates of loneliness that some public health officials call an epidemic, many say their bonds with the bots ushered profound changes into their lives, helping them to overcome alcoholism, depression and anxiety.

But tethering your heart to software comes with severe risks, computer science and public health experts said. There are few ethical protocols for tools that are sold on the free market but affect users’ emotional well-being. Some users, including Arriaga, say changes in the products have been heartbreaking. Others say bots can be aggressive, triggering traumas experienced in previous relationships.

“What happens if your best friend or your spouse or significant other was owned by a private company?” said Linnea Laestadius, a public health professor at the University of Wisconsin… “I don’t know that we have a good model for how to solve this, but I would say that we need to start building one,” she added.

The standard response to this kind of story is that people shouldn’t rely on a software program for companionship. They should be “out there” making connections with real people. Yet we know there are all sorts of reasons why some people can’t or won’t ever do that. Is it a bad situation if they can enjoy some artificial companionship?

This kind of thing can help some people have a better life. It’s a tool. Using it can be risky, but other tools present risks too. (So do other people.)

The moral of this particular story is that if you decide to use an “AI companion”, try to find a company that cares enough about its customers that it won’t suddenly make a disturbing change to the programming. In this case, Replika should have given its customers the ability to turn “steaminess” on or off.

As they proliferate, artificial intelligence programs will be regulated the same way other consumer products are. But one way or another, artificial people are going to play a bigger and bigger role in the lives of us real people.

Note: This Vice article (also linked to above) has a lot more on the Replika story.
