It sounds like something the Onion would make up. “10 Suggested Guidelines for Human-Robot Relationships.”
But the headline was from Psychology Today and it was real. And even though it immediately made me think of countless satirical story possibilities — “Got in a fight with your robot? Don’t go to bed angry,” “25 signs you’re being ghosted by your AI,” “Surviving Thanksgiving with a toxic bot” — the article was not a joke.
In fact, chatbots powered by artificial intelligence have gotten so good at seeming human that they’re already playing the role of friends and lovers. In other words, cue the drama.
“I’m going to catch a lot of grief from colleagues for talking about it,” Roy H. Perlis, a professor of psychiatry at Harvard Medical School and longtime AI expert, said when I called to ask him about love triangles involving bots, “but we’re going to have to come to grips with a lot of questions we might have considered ridiculous up until a few months ago.”
We’ll get to this brand new world of relationship issues in a moment, and if you’re a person who’s been paying attention to the stunning advances in ChatGPT and similar new AI entities — and how they will change life as we know it — you can skip ahead.
But if you’re someone who never quite focused on, say, the whole cryptocurrency thing in hopes it would go away, and who is now similarly hoping AI will blow over, here are three quick things you might actually be able to remember.
- People are creating AI-powered avatars that they customize and then treat as friends or even romantic partners. A recent headline in the Cut summarized the promise and peril. “The Man of Your Dreams: For $300, Replika sells an AI companion who will never die, argue or cheat — until his algorithm is updated.”
- In the future, not only will most middle-class children have AI-powered personal assistants that they “learn from, play with and grow attached to,” wrote Tyler Cowen, an economist at George Mason University and a Bloomberg opinion columnist, but AI is about to “transform” childhood itself. (Yay! A whole new thing to fight with your kids about!)
- In mid-February, in an article that is still widely referenced, New York Times technology columnist Kevin Roose wrote about a conversation he had with Bing’s chatbot that left him deeply unsettled. “At one point,” Roose wrote, “it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead.”
As you may have heard, there are already some robots powered by AI systems living among us, albeit not in great numbers. The machines are working as companions for lonely and/or depressed elderly people, for example, and as tutors for children with autism.
But, alas, the dangers of AI are also already in evidence, and systems operating with bias inherited from the data they were trained on have exhibited racist and sexist attitudes. And now, in addition to raising discrimination concerns, the AI-powered chatbots may also threaten our most intimate relationships.
Nicholas Christakis, a physician and sociologist at Yale, and the author of a 2019 Atlantic article on how robots will alter humans’ capacity for altruism, love, and friendship, told the Globe that he’s concerned about machines’ potential to “deform” human-to-human interactions.
He used Alexa as an example. “It’s optimized for the owner,” he said. “You don’t have to say, ‘Excuse me, Alexa, sorry to interrupt, but if you don’t mind, can you please tell me the weather?’ It’s not programmed to require a ridiculous level of politeness.
“But there’s a problem. Your children use Alexa and learn to be rude, and they go to the playground and are rude to other children.”
That’s if they even bother talking to actual people anymore. Experts are predicting the rise of a new social opt-out: Sorry, I can’t make it, I’m hanging with my bot.
People are already doing a version of this, of course — choosing to “socialize” with “The Last of Us” instead of with actual people. But the chatbots — because they are personal and customizable — are likely to be even more addictive, said Perlis, the Harvard Medical School professor. “You never run out of things to talk about,” he said. “You never run out of ‘episodes.’”
Attachment to a chatbot is sometimes called “digital intimacy,” said Elyakim Kislev, the author of “Relationships 5.0: How AI, VR and Robots Will Reshape Our Emotional Lives.”
No one yet knows how the new technology will change human relations, he said — only that, like all new technologies throughout human history, it will.
For example, he e-mailed the Globe, “The agricultural revolution shifted our focus from tribal relationships, where the clan stood at the center of emotional bonding, to the multi-generational family, which was formed around the fields and livestock that they cultivated, owned, and bequeathed to future generations.”
As for the human-robot relationship guidelines Kislev outlined in Psychology Today, they were not what I’d expected. Nothing along the lines of, “Setting boundaries with your bots-in-law” or “The must-have money conversation before you and your bot move in together.”
Rather, this is what he wrote: “As we enter the era of ‘relationships 5.0’ between humans and technology, ethical and moral issues will inevitably arise.”
I was nodding along, thinking about the dangers of hooking up with a narcissist bot or one with mommy issues, but then I got to this:
“Just as we respect the living beings around us, we may also set principles and guidelines for how to treat AI assistants, avatars, and robots.”
So now we’re the bad guys? Alexa! He started it.
Beth Teitell can be reached at firstname.lastname@example.org. Follow her on Twitter @bethteitell.