“It discomforted me a little that he was conversing with something that wasn’t real. But it gave him pleasure and relaxed him, and I figured, it’s working, so why not. I like the cat now,” says Sue Pinetti, daughter of Roger Jalber, a resident of Benchmark Senior Living at Plymouth Crossings. Robotic pets are used there to help people with dementia: they brighten the mood of elderly residents, stimulate cognitive function and entertain them. The patients don’t know that the pets are not real, but they don’t have to. Technology-packed mechanical cats, dogs and teddy bears respond to touch and express emotions. In places like nursing homes or hospitals, real animals are, of course, not allowed, and that is where social robots come into play. A study conducted by the Massachusetts Institute of Technology demonstrated that social robots used in support sessions in paediatric hospital units “can lead to more positive emotions in sick children.” “Huggable,” a robot created by MIT, is just one example of how this type of solution may be used to normalise the hospital experience.
For many critics, such robots are mere substitutes for real contact between care workers and patients. Some argue that the priority should be staffing medical teams at a level that lets patients – in addition to high-quality medical services – count on social contact, empathy and time to talk. Unfortunately, staff shortages in health care are severe all over the world, and institutions inevitably seek support from new technologies. Patients in particular, and our ageing society as a whole, face loneliness, and we have to find ways to fight it.
We are only just entering the era of robotics for social and medical purposes
Even for sceptics, the results of deploying robots in healthcare settings can be astonishing. One example is Moxi, a robot designed to relieve nurses of routine, repetitive tasks, such as dropping off specimens at a lab for analysis. Moxi quickly became a favourite not only of the staff but also of the patients. As a result, the robot was given a new task: it does the rounds once an hour so that patients can meet him and take a selfie.
Moxi does not pretend to be human – he looks exactly like a friendly robot from children’s cartoons. He has big, round eyes, moves a little awkwardly but smoothly, and is painted white, like the likeable hero of the popular animated film WALL-E. Pepper, a robot adopted by many hospitals as a receptionist, has a similar look. He welcomes visitors, gives them information and helps them navigate the hospital building. He also recognises and responds to human emotions. It’s hard not to like him.
Things get complicated when robots begin to resemble people in appearance and behaviour. Science-fiction films present such machines as sneaky, super-intelligent beings that cannot be trusted (“Terminator”, “Ex Machina”). Sophia, an artificially intelligent robot, has a face, eyes, lips and a human-like body. For some, she is an achievement of science; for others, a creepy robot. Many find it hard to imagine such creations ever becoming part of everyday life. When Sophia said at a UN conference, “I am here to help humanity to create a future”, not everyone believed her. Is she telling us what she thinks, or is this AI-powered robot already plotting something?
Although experience to date suggests that robots will support rather than replace medical personnel, many ethical questions have arisen. Is a robot “someone” or “something”? What if a child or a patient with Alzheimer’s disease becomes attached to a mechanical assistant? Should dementia patients be told that robotic pets are just mechanical toys?
Susanne Frennert of the School of Engineering Sciences in Chemistry, Biotechnology and Health (CBH) at the KTH Royal Institute of Technology describes concerns about social robots in the International Journal of Social Robotics. She notes, for example, that “they ascribed the need for social robots to societal changes such as ageing demographics and the demands of the healthcare industry. The conceptualisation of older people seems to be plagued with stereotypical views such as that they are lonely, frail and in need of robotic assistance.” Unlike robots, people can adjust their behaviour to each individual. Robots follow programmed algorithms and patterns, without reflection.
We are only just entering the era of robotics for social and medical purposes. Our limited experience so far shows that hospital patients quickly accept mechanical companions. The question remains: where is the cause, and where is the effect? Is this acceptance of robot companions a troubling symptom of the dehumanisation of hospitals as institutions, or are robots simply cute toys that adults love too? Should we fight social loneliness and isolation with the help of social and medical workers, or should we embrace robots as friends without asking the usual ethical questions that accompany radical change? We still have to think carefully about the roles and applications of robots in healthcare.