
Do sex dolls deserve respect?

A University of Washington professor of bioethics and philosophy suggests that treating sex dolls or robotic pets abusively could be a moral problem, and that humans should treat them with respect rather than see them as mere tools.

Do sex dolls or robotic pets deserve respect from their human masters?

With societies increasingly turning to androids and robots for companionship and emotional support, it's an issue we need to explore, says a researcher at the University of Washington School of Medicine.

"Can we wrong robots?" asks bioethicist Prof. Nancy Jecker in a recently published AI & Society journal paper.

The bioethics and philosophy professor argues the answer is 'yes.'

"I think we need to question the assumption that robots are simply machines to serve us. Manufacturers increasingly build social robots in forms familiar to humans: soft, touchable, recognizable vocalizations, responses that have some emotional intelligence," Jecker said. "We are designing them this way so that humans can form bonds. So we should be similarly thoughtful and deliberate in how we behave toward social robots."

In recent years, sex doll rentals have become headline fodder in British Columbia. In 2019, one such company to launch was Natrl Dolls, which offered a discreet way for customers to select a doll and pay online and then have it delivered without having to leave the comfort of their homes. It no longer appears to be running.

Jecker said that through the development of increasingly sophisticated sociable robots, robot-human relationships are being transformed.

"Not only can sociable robots furnish emotional support and companionship for humans, [but] humans can also form relationships with robots that they value highly," she wrote. "It is natural to ask, do robots that stand in close relationships with us have any moral standing over and above their purely instrumental value as means to human ends."

Jecker cites the example of Samantha, the sex robot. At a 2017 Austrian electronics festival, Samantha was reportedly molested: her breasts and other body parts were badly damaged and heavily soiled, and two of her fingers were broken.

Some believed the behaviour was not only shameful but also a violation of Samantha's moral rights.

But, Jecker added, "contemporary Western philosophy suggests the opposite view, that robots cannot be wronged but rather function merely as tools to realize human ends."

However, she argued the idea that Samantha was wronged is not easily dismissed.

Jecker noted that since Samantha lacks intrinsic capacities to suffer or have autonomous preferences, she might not meet requirements for personhood.

The problem with that view, she explained, is that disability critics have argued "that making the ability to deliberate and impose moral rules upon oneself essential for moral standing excludes too many people, including those with intellectual impairment."

And that could include severely handicapped people, seniors with dementia and most animals.

Jecker quotes author Kate Darling, who argues in Robot Law that if we perceive social robots as life-like things, our behaviour toward them should be regulated.

"She asserts that we have good reason to focus on human responses to robots irrespective of whether robots have 'the stuff' that moral standing is made of because human views toward robots could carry over to humans," Jecker wrote. "Better not to become desensitized to robots, or we might become desensitized to humans. For example, tolerating violence toward sex robots could lead to tolerating violence toward women."

What it may come down to, Jecker said, is that the emotions or attitudes robots evoke are hardly fixed.

"Instead, they depend on prior design decisions humans make," Jecker said. "If we make robots replete with backstories and names, soft and touchable, with adorable faces and sweet-sounding intonations, human users are apt to bond with, trust, and like them. If we design them to be less affable and purely functional, the response will be entirely different. What kinds of robot-human relationships do we aspire to?"

She said contemporary Western societies have tended to see robots as tools, perhaps even slaves, functioning only to help achieve human ends.

Jecker's article says that point of view stems from Abrahamic religions, which have taught humans they are superior to the rest of creation, and from philosophies suggesting that technologies, no matter how sophisticated, are simply instruments to help humans perform tasks better.

Jecker sees an opportunity for Western societies to broaden their definition of relationships that can positively influence human life. Indeed, she points to the Japanese Shinto belief system, founded on the concept that kami (spirits or gods) inhabit all things, including objects that Westerners consider inanimate, such as robots.

"In Japan, they consider robots valuable for their own sake," Jecker said. "But when Japanese manufacturers send robots to Western nations, they market them instead as 'useful.'"

Jecker suggested a philosophy inspired by the reverence the Sherpa of Nepal display toward the Himalayan mountain range and other features of nature.

"Think about how we relate to a majestic mountain or a vivid, star-filled sky," Jecker said. "When we open ourselves to think about a robot in a similar way, it can influence our behaviour in a manner that lends itself to positive robot-human relationships.

"If Western thought stays locked into seeing social robots as nothing more than tools or slaves, we do ourselves a disservice, closing off the adventure and possibility of forming highly valued social relationships with them," she said.