
Oklahoma City University professors consider ethics of using robots for eldercare

Oklahoma City University professors Ted Metzler and Susan Barnes recently hosted a workshop to explore the ethics of robot-human interaction, an increasingly relevant topic in eldercare.
BY KEN RAYMOND | kraymond@opubco.com | Published: August 29, 2011

The Andersons' robot is programmed to remind patients when to take pills. It's a straightforward task until a patient refuses to listen.

That scenario presents an ethical dilemma for the robot. Humans have free will; they can decide whether they want their medicine or not. But if they continue to refuse medications, their lives could be endangered.

“The robot is able, in this case, to weigh those factors and decide when to alert a physician or the human nurse charged with this person's care,” Metzler said. “This is the kind of thing that weighs patient autonomy against the welfare of the patient, the kinds of things that are in the domain of moral reasoning.”
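The weighing Metzler describes can be pictured as a small decision rule. Below is a minimal sketch in Python, not the Andersons' actual system; the duty weights, threshold and function names are hypothetical illustrations of how autonomy might be balanced against accumulating harm.

```python
# Hypothetical sketch of weighing patient autonomy against expected harm.
# The weights and threshold are illustrative assumptions, not the Andersons' model.

def decide_after_refusal(missed_doses: int, harm_per_missed_dose: float,
                         alert_threshold: float = 1.0) -> str:
    """Each refusal honors the patient's autonomy but adds expected harm.

    Once cumulative expected harm outweighs the value placed on free choice,
    the robot escalates to a human caregiver instead of deciding alone.
    """
    autonomy_weight = 1.0  # fixed value placed on respecting the refusal
    expected_harm = missed_doses * harm_per_missed_dose

    if expected_harm - autonomy_weight > alert_threshold:
        return "alert physician or nurse"  # welfare now outweighs autonomy
    return "accept refusal; remind again later"

# A low-risk supplement versus a critical heart medication:
print(decide_after_refusal(missed_doses=3, harm_per_missed_dose=0.1))  # accept refusal
print(decide_after_refusal(missed_doses=3, harm_per_missed_dose=1.5))  # alert
```

The point of such a rule is the one Metzler makes: the robot does not override the patient; it decides when the question must be handed back to a human.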

Metzler is no Luddite. He has worked on artificial intelligence applications for the Army, Navy and Border Patrol and has been a member of the Association for the Advancement of Artificial Intelligence for 20 years.

Even so, he and Barnes worry that interacting with robots could prove dehumanizing, changing how we view ourselves and our obligations to society.

“The technology has to serve human purposes,” Metzler said, “not the other way around.”

Putting a PARO, a therapeutic robot modeled on a baby seal, in the hands of a dementia patient concerns Barnes.

“You already have someone who is having difficulty retaining their perspective of the here and now,” she said. “If they're approached by a human artifact (a robot), does it push them further away from reality and decrease their personhood?”

The answer isn't clear. For most of our existence, humans have created items in our own image — baby dolls, for example, or GI Joes. Children can distinguish between real babies and fake ones, even those designed to wriggle and cry. It stands to reason that adults should be able to do the same.

In fact, people don't want to blur the line between human and machine. In 1970, Japanese roboticist Masahiro Mori noticed something surprising. People warmed to humanlike robots until they became too lifelike; then the machines were regarded as profoundly unsettling. Mori dubbed this disconnect the Uncanny Valley.

Researchers at New Zealand's University of Auckland have documented the Uncanny Valley phenomenon with regard to the elderly.

“They found out from doing focus groups that the elders were very receptive to the idea of the robot detecting falls and reporting them in an emergency situation,” Metzler said. “But they didn't want robots to have human faces. They preferred them to be not tall, roughly 4 feet in height.”

“Let's start with the three fundamental Rules of Robotics. ... We have: one, a robot may not injure a human being, or, through inaction, allow a human being to come to harm. Two, a robot must obey the orders given it by human beings except where such orders would conflict with the First Law. And three, a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.” — Isaac Asimov, “Runaround,” Astounding Science Fiction, 1942
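Asimov's laws form a strict precedence: each law yields to the ones above it. A minimal sketch of that ordering follows; the action representation and its fields are hypothetical, and real moral reasoning is of course nothing this tidy.

```python
# Illustrative sketch of the strict precedence in Asimov's Three Laws.
# The Action fields are hypothetical predicates, assumed known to the robot.

from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool           # would acting injure a human?
    inaction_harms_human: bool  # would NOT acting allow a human to come to harm?
    ordered_by_human: bool      # was the action ordered by a human?
    endangers_robot: bool       # does acting risk the robot's existence?

def permitted(a: Action) -> bool:
    # First Law: never injure a human, or through inaction allow harm.
    if a.harms_human:
        return False
    if a.inaction_harms_human:
        return True  # must act, overriding orders and self-preservation
    # Second Law: obey human orders once the First Law is satisfied.
    if a.ordered_by_human:
        return True
    # Third Law: self-preservation applies only when the higher laws are silent.
    return not a.endangers_robot

# A harmless order is obeyed even at risk to the robot:
print(permitted(Action(False, False, True, True)))   # True
# An order that would injure a human is refused:
print(permitted(Action(True, False, True, False)))   # False
```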

Of course, the prime mover behind the robotics industry is the military. Unmanned combat systems put fewer American lives at risk. (At least one member of the Defense Advanced Research Projects Agency, which is tasked with ensuring American troops have a technological advantage over all foes, attended the workshop.)

From Aug. 16-19, weapons makers showcased their wares at the Unmanned Systems North America exhibition in Washington.

The Wall Street Journal described one of the offerings like this: “One look at the unblinking electronic eye and dark contours of the Modular Advanced Armed Robotic System and it's hard not to think of Skynet, the fictional computer in the Terminator film that becomes aware of its own existence and sends robotic armies to exterminate humans.

“The brawny combat robot ... rolls on tank-like treads. It boasts day and night-vision cameras, a four-barrel grenade launcher and a 7.62 mm machine gun.”

Drone aircraft have been used over Pakistan, Yemen and Libya. A consulting firm mentioned in the Journal article estimated that worldwide spending on unmanned aerial vehicles will nearly double by the end of the decade.

Last year, South Korea posted a non-humanoid robot to stand guard over the border with North Korea. The robot, made by Samsung and equipped with a variety of audio and video sensors, is armed with a machine gun and grenade launcher. It can exchange passwords with soldiers.
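Samsung's actual protocol has not been published, but a password exchange of this kind is usually built as a challenge-response handshake. The sketch below is an assumption-laden illustration: the shared secret, nonce scheme and HMAC construction are all stand-ins, not details of the sentry robot.

```python
# Hypothetical challenge-response sketch; the secret and construction are assumptions.
import hmac, hashlib, os

SHARED_SECRET = b"issued-to-friendly-soldiers"  # hypothetical pre-shared key

def challenge() -> bytes:
    """Sentry sends a fresh random nonce so old replies can't be replayed."""
    return os.urandom(16)

def respond(nonce: bytes, secret: bytes) -> bytes:
    """The soldier's equipment answers with an HMAC of the nonce under the secret."""
    return hmac.new(secret, nonce, hashlib.sha256).digest()

def verify(nonce: bytes, reply: bytes) -> bool:
    expected = hmac.new(SHARED_SECRET, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, reply)

nonce = challenge()
print(verify(nonce, respond(nonce, SHARED_SECRET)))    # True: holder of the secret
print(verify(nonce, respond(nonce, b"wrong-secret")))  # False: the stray farmer below
```

The sketch also shows the failure mode that worries Metzler: anyone without the secret, hostile or not, fails the check identically.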

“This raises a number of situations where there might be, for example, a farmer straying into the area who doesn't know the password,” Metzler said. “He could get shot. This kind of situation is a little more stark in its call for responsible, moral behavior.”

The Andersons are troubled by tactical robots, as well.

“We have always said that if we're not comfortable that the robot can behave in an ethically acceptable fashion, then we don't think it should be put out there,” Susan Anderson said. “This includes killer robots.”

Clearly it's too late to interrupt the development of new robotic systems. But Metzler and the others want to make sure ethics are an integral part of technological advances.

“I'm trying personally to raise awareness of these issues,” Metzler said, “because if somebody doesn't, we're just going to slide into a different way of living without thinking about it and wake up someday saying, 'How did this happen?'”