Chris Atkeson: Care robots might not be far off
Robots could help caregivers with more demanding physical work, he says
They could also help with early diagnosis of diseases, he adds
Editor’s Note: Chris Atkeson is a professor at Carnegie Mellon University’s Robotics Institute and Human Computer Interaction Institute. This commentary is based on a 15-year collaboration between CMU and the University of Pittsburgh on care robots and Quality of Life Technology. The opinions are the writer’s own. For more on the future of technology, watch the upcoming GPS “Moonshots” special on December 28 at 10 a.m. and 1 p.m. ET.
Is the Disney movie “Big Hero 6,” about a boy whose closest companion is Baymax, an endearing inflatable robot that comforts the hurt and gently nags teenagers to take better care of themselves, another unrealistic portrayal of robotics, or a prediction of a realistic future?
True, the movie showed how a health-care robot could interact with humans, and provided a vision that will likely help drive interface design for a long time. But the problem is that these movies can also create expectations that are currently impossible to meet – think Data in “Star Trek: The Next Generation,” C-3PO from “Star Wars,” or any of the Terminators.
I admit Baymax is partly my fault. Back in 2011, one of the movie’s co-directors visited my lab at Carnegie Mellon University and saw our work on inflatable arms – technology we were exploring with the idea that “soft safe robots” might someday be able to feed, dress, and groom our parents when they get old.
But even though we might be years away from re-creating the physical abilities of Baymax, I believe that we are actually not all that far away from developing robots that might be able to play a significant role in our care. In fact, some elements of Baymax are already available, while others are just around the corner.
For example, current technology already offers useful sensing, diagnosis and cognitive assistance for adults, and we are close to making useful robot servants with traditional metal robotics that can help older adults and people with disabilities take more control over their lives.
One challenge that needs additional basic research is how to create robot companions that can touch and physically interact with people who need physical care. This would be of tremendous use as many professional caregivers are forced to quit due to the physical toll of moving patients.
Of course, any robot in the home would need to be designed in a way that reduced the chance of it injuring someone, which is why, like Baymax, they would need to be extremely lightweight and probably inflatable. But this sort of technology already exists in some respects – lightweight, inflatable devices are already strong enough and tough enough to lift cars and houses, as well as protect NASA probes landing on Mars.
As well as having a gentle touch, a care robot should be able to offer monitoring and diagnostic assistance. We are already helping that capability along through the growing popularity of wearable technology like Fitbit, Jawbone and other monitoring devices, which can record data such as heart rate, body temperature, steps taken and estimated calories burned, and can monitor sleep patterns, sleep quality and eating habits.
Interestingly, some of these devices go beyond reporting measurements and can “nudge” (or nag, depending on how you see it) their users to take breaks, drink more water or sleep more.
The reality, though, is that although wearable technology may be all the rage now, it probably won't be long before it is supplemented by swallowed or implanted devices that have access to body fluids, whether built into dental fillings and crowns (with access to saliva and breath) or ingested or placed under the skin (to allow for bloodwork).
Again, there is already a precedent for some of this – astronauts have taken pills that measure and radio out their core temperature (a concern during spacewalks), while camera pills that observe the intestinal tract are sometimes preferred to more invasive colonoscopies. In fact, implantable devices are being developed that can perform blood tests.
Collecting this degree of personal health data does, of course, raise privacy issues, and it will be important early on to establish the principle that users (not doctors, HMOs and other insurers, or the companies that sell the devices) own the data about themselves and control how it is used. We got this wrong with credit card, phone, communications and location data, all of which are sold and used in ways we aren't always aware of. These mistakes shouldn't be repeated.
Interestingly, though, I’ve found that few people object to sensors taking measurements if they are mounted on a robot, since robots need to sense to serve us.
All of this potential means a lot to me personally. My grandfather had ALS – an awful, progressive neurological disease that eventually destroys the nerve cells that allow voluntary muscle use. I remember how my grandmother was unable to help my grandfather up when he slid out of his chair or otherwise ended up on the floor. She would call my family, and I would drive over and be her robot – she provided the brains, and I provided the muscle. Eventually, I hope that family members in this position can be aided by actual robots.
Indeed, I hope that the kinds of technology we are working on might not only be able to relieve the physical load on caregivers and enable older adults to live in their own homes longer, but could provide new and easier opportunities for regular screening for diseases such as cancer and dementia, and drug efficacy, side effects and interactions.
The biggest challenge in building Baymax, unsurprisingly, is building a brain capable of useful human-robot interaction. Siri and similar question-answering agents demonstrate the recent progress in this area of artificial intelligence, and could be the basis of a real-life Baymax as well.
We can also take advantage of the patterns of our lives, which robots can currently learn. I expect that a human would learn to help the robot as much as the robot would be learning to help the human. And it’s also important to remember that quality human-robot interaction matters. My other grandmother, who had become blind, was uninterested in early reading machines because the voices were not gentle or soothing.
In the very short term, I see a future in which our phones or pocket computers communicate with sensors in our shoes, clothes, and even inside our teeth and bodies to track and improve our personal health on a minute-by-minute basis. But, ultimately, I see a world where personal health-care companions will not only help us take a step forward in preventive medicine, but also make some of the physical strains of getting older a little easier to bear for us and our loved ones.
This future can start with building a Baymax.