Oklahoma City University professors consider ethics of using robots for eldercare

Oklahoma City University professors Ted Metzler and Susan Barnes recently hosted a workshop to explore the ethics of robot-human interaction, which is an increasingly relevant topic in elder care.
BY KEN RAYMOND kraymond@opubco.com Published: August 29, 2011

“Scientists are actually preoccupied with accomplishment. ... They never stop to ask if they should do something. They conveniently define such considerations as pointless. If they don't do it, someone else will. Discovery, they believe, is inevitable. So they just try to do it first. That's the game in science.” — Michael Crichton, “Jurassic Park”

On Aug. 8, a select group of 25 people met at a San Francisco hotel to discuss a topic that used to exist solely in the realm of science fiction.

The workshop, hosted by Oklahoma City University professors Ted Metzler and Susan Barnes and a colleague from New Hampshire, focused on robot ethics, particularly in regard to elder care.

“There are two sides to this,” said Barnes, OCU's chair of Transformative and Global Education. “One is, are we dealing with ethical questions when we assign robots to interact with humans who are elderly and vulnerable? The second is, can robots make ethical decisions, or decisions based on an ethical paradigm? ...

“We had two cohorts represented. One consisted of individuals who are currently being funded by research or commercial entities to build robots to help with elder care. They've bypassed the ethical question. The other group was saying we should stop a minute here and consider personhood and viewpoint and what happens to a person who interacts with a robot instead of a human.

“Can you use a robot as a replacement for human contact? The general answer to that is no.”

Robot-human interaction is an increasingly relevant topic in elder care. As baby boomers age, long-term care facilities, already struggling with insufficient staffing, are likely to reach a breaking point.

“Japan has the same situation,” said Metzler, OCU's director of the Darrell Hughes Program in Religion and Science Dialogue. “They have a very large proportion of the population in the upper age range, and they have a relatively low birthrate. ... So they and South Korea have led the way in the development of this technology.

“But it's a problem that is being recognized in other countries. In the U.S., it's been recognized for some time. With a shortage of nurses and an increasing demand for elder care, assistive robotics may be a solution. It is in the process, I would say, of becoming an industry.”

Consider PARO, for example.

PARO is a robotic baby harp seal that simulates the behavior of a real pet, providing the therapeutic benefits of pet ownership without the responsibilities.

The $6,000 robot is covered in soft fur and has exaggeratedly long eyelashes.

It coos, squeaks, moves and sleeps. It responds to touch and speech, knows when it's being held and pouts if it doesn't get enough attention.

So far, the automatons can't be found in Oklahoma facilities. That's likely to change.

Christine Hsu, a representative of PARO's manufacturer, told The Oklahoman in an email that more than 1,500 PARO robots have been sold in Japan and Europe since 2003.

“In (the) U.S.,” she wrote, “we started to introduce PARO since last year, and currently we have users in military retirement communities, Alzheimer associations, nursing homes, assisted living facilities, hospitals, school(s) for autistic children and individuals across the country.”

By most accounts, patients respond well to the robots. The Washington Post reported on an 81-year-old woman who cried and said, “I love her,” when a PARO was put in her lap. An Illinois newspaper, the Herald-News, said some nursing home residents who wouldn't respond to humans immediately played with a PARO; some mistook it for a real animal.

And therein lies the rub.

“To create an entity that seems designed to deliberately fool the person it's interacting with that it cares and has feelings that can be hurt is unethical,” said Susan Anderson, professor emerita of philosophy at the University of Connecticut.

“But what if there is no one human to interact with that person? In that circumstance, it may be the lesser of two evils.”

“Robots do not hold on to life. They can't. They have nothing to hold on with — no soul, no instinct. Grass has more will to live than they do.” — Karel Čapek, “R.U.R.”

Anderson and her husband, Mike Anderson, a computer science professor at the University of Hartford in Connecticut, attended the California workshop. They are the editors of a recent book called “Machine Ethics” and have endeavored to program a robot to behave ethically.

The couple work with an Aldebaran Nao, a mass-production humanoid robot that stands more than 2 feet tall and costs about as much as a new economy car. Working with an ethicist, the Andersons developed software that generalizes appropriate responses from a pool of specific cases, effectively giving their robot the ability to make decisions based on experience.

“From an artificial intelligence standpoint,” Mike Anderson said, “it's using machine learning techniques that permit you to get to this generalized principle.”
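For readers curious what “generalizing from a pool of specific cases” can look like in practice, here is a minimal, illustrative sketch. It is not the Andersons' actual software; the duty names, cases and labels are invented for this example, and it simply uses an off-the-shelf decision-tree learner (scikit-learn) to induce a readable rule from a few hand-labeled eldercare scenarios.

# Illustrative sketch only -- not the Andersons' system. The duties,
# cases and labels below are assumptions made up for this example.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each case scores how strongly notifying a caregiver would affect three
# prima facie duties (-2 = strong violation, +2 = strong satisfaction).
feature_names = ["benefit_to_patient", "avoid_harm", "respect_autonomy"]

# Hypothetical cases: should the robot notify a caregiver (1) or accept
# a patient's refusal of medication (0)?
cases = [
    [2, -2, -1],   # large benefit, serious harm if ignored -> notify
    [1, -2, -1],   # moderate benefit, serious harm         -> notify
    [1,  0, -1],   # small benefit, no real harm            -> accept refusal
    [0,  0, -1],   # negligible benefit, no harm            -> accept refusal
]
labels = [1, 1, 0, 0]

# A shallow tree keeps the learned "principle" human-readable.
model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(cases, labels)
print(export_text(model, feature_names=feature_names))

On these toy cases the learner settles on a single threshold over the harm score, a rough flavor of the kind of generalized principle Anderson describes, though the real research operates over far richer case descriptions.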
