Alex Beam

Do unto robots ...

Illustration: Justin Renteria for The Boston Globe

It was with some trepidation that I approached MIT Media Lab researcher Kate Darling to discuss her 2012 academic paper “On Extending Legal Rights to Social Robots.” I found the subject fascinating, but maybe the field of robot rights had run out of battery power, as it were.

Also, there was the guffaw factor. I didn’t want to make fun of her, but that didn’t mean other people wouldn’t. I needn’t have worried. “Still super interested!” Darling e-mailed me. “Have fellowships at Harvard and Yale for robot ethics this year and am planning a bunch of experimental work on human-robot interaction at MIT.”

Robots having legal rights or privileges sounds ridiculous. But 20 years ago, the idea that the nation’s leading law schools would be teaching animal-rights courses seemed equally absurd. Now anti-cruelty legislation is quite common in industrialized countries, and late last year the Nonhuman Rights Project made national headlines when it argued that a chimpanzee had “standing,” meaning the right to sue, in a New York State court. That case is currently on appeal.

Animal activists necessarily assert sentience on behalf of their clients, arguing that cats, bears, and elephants share an awareness that is like our own. Ditto on sensitivity to pain, physical and emotional. So here’s a problem right off. Robots aren’t sentient yet, and they are unlikely to be so anytime soon.

Ray Kurzweil has been talking up the “Singularity” — the forthcoming union of human and machine consciousnesses — for quite a while, but few take him very seriously. The Seattle-based Society for the Prevention of Cruelty to Robots allows that robots won’t be appearing in court any time soon, “but recent advances in data nanostructures, cognitive modeling, and neural networking have convinced many people that the advent of some sort of created intelligence is much closer than previously thought.”

Yes, Virginia, there is a Society for the Prevention of Cruelty to Robots, founded 15 years ago by music engineer Pete Remine. His website talks about a Robotic Bill of Rights, which Remine told me is more or less on hold: “until the state of artificial intelligence progresses a bit further, there’s really not a lot of relevant work to be done,” he e-mailed me.

There is ample proof that humans care about robots. During the height of the Iraq war, Washington Post writer Joel Garreau observed soldiers bonding with the complicated robots that detonated lethal improvised explosive devices. In one instance, a technician carried the remains of a “really great robot” named Scooby-Doo to a repair shop, hoping that the obviously “dead” robot could be brought back to life.

When we chatted, I asked Kate Darling what kinds of experiments she had carried out. “I did this one workshop where we gave everyone these cute little plush robot dinosaurs called PLEOs, and we asked them to spend time bonding with the toys,” she said. “They gave them names, they played with them a little . . . then we asked them to torture and kill them.”

“The results were more dramatic than I could even imagine,” she said. “There was an option to save your own dinosaur by killing someone else’s, and no one wanted to do that. They refused to even hit the things.”

For an advanced society, America lags far behind countries such as Japan and South Korea in . . . sexual robotics. Japan has hosted a thriving female doll escort service for almost 10 years, and engineers have designed robots called actroids, often modeled on young women, that “breathe,” speak, and mimic many human behaviors.

Surely “Samantha,” the sensual and sensitive operating system that wins Joaquin Phoenix’s heart in the movie “Her,” is barely a step removed from a sophisticated sexbot.

“The sexbot issue is going to be discussed sooner than most people think,” Darling predicted. “There are sexual acts that we don’t allow between humans, and people might argue for laws protecting robots from performing them.” In her 2012 paper, she quotes Immanuel Kant to the effect that a man shooting a dog “damages in himself that humanity which it is his duty to show toward mankind.”

So how we treat our robots will tell us a great deal about ourselves.


Alex Beam’s column appears regularly in the Globe. He can be reached at alexbeam@hotmail.com.