The machine, a PR2, taught itself in one day by analysing nearly 11,000 simulated examples of a robot putting a gown onto a human arm. Some of those attempts were flawless; others were spectacular failures, with the simulated robot applying dangerous forces to the arm when the cloth caught on the person’s hand or elbow. From these examples, the PR2’s neural network learned to estimate the forces applied to the human. In a sense, the simulations allowed the robot to learn what it feels like to be the person receiving assistance.
“People learn new skills using trial and error. We gave the PR2 the same opportunity,” said Zackory Erickson, a PhD student at the Georgia Institute of Technology in the US. “Doing thousands of trials on a human would have been dangerous, let alone impossibly tedious. But in just one day, using simulations, the robot learned what a person may physically feel while getting dressed,” Erickson said. The robot also learned to predict the consequences of moving the gown in different ways. Some motions made the gown taut, pulling hard against the person’s body. Other movements slid the gown smoothly along the person’s arm.
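The article does not include the team’s code or model details, but the training step it describes, learning from thousands of simulated trials to estimate the force a person feels, can be sketched roughly. The sketch below uses a generic PyTorch feed-forward network, random stand-in data, and made-up names such as `features` and `forces_on_person`; the actual architecture, inputs, and data format used by the researchers are assumptions here, not their published implementation.

```python
import torch
import torch.nn as nn

# Hypothetical dataset: ~11,000 simulated dressing trials. Each row encodes
# what the robot can sense about one short motion (end-effector positions,
# velocities, measured forces); the target is the force the cloth applied to
# the simulated person's arm. Random numbers stand in for the real logs.
N_TRIALS, FEATURE_DIM = 11_000, 32
features = torch.randn(N_TRIALS, FEATURE_DIM)
forces_on_person = torch.rand(N_TRIALS, 1)  # placeholder force labels

# A small feed-forward regressor; the team's actual model is not described
# in the article, so this choice is purely illustrative.
model = nn.Sequential(
    nn.Linear(FEATURE_DIM, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Standard supervised training loop: learn to map what the robot senses to
# an estimate of the force the person would feel.
for epoch in range(20):
    optimiser.zero_grad()
    loss = loss_fn(model(features), forces_on_person)
    loss.backward()
    optimiser.step()
```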
The robot uses these predictions to select motions that dress the arm comfortably. After succeeding in simulation, the PR2 attempted to dress people. Participants sat in front of the robot and watched as it held a gown and slid it onto their arms. Rather than relying on vision, the robot used its sense of touch to perform the task, drawing on what it had learned about forces during the simulations. For now, the robot puts a hospital gown onto one arm, a process that takes about 10 seconds. The researchers said that fully dressing a person is still many steps beyond this work.
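One way to read “uses these predictions to select motions” is as a simple search over candidate motions, keeping the one the learned model predicts will press least on the arm. The snippet below is a minimal sketch of that idea only; the motion encoding, the candidate sampling, and the stand-in `force_model` are assumptions for illustration, not the PR2’s actual controller.

```python
import torch
import torch.nn as nn

FEATURE_DIM = 32

# Stand-in for a trained force-prediction network (see the earlier sketch);
# in practice this would be the model learned from the simulated trials.
force_model = nn.Sequential(
    nn.Linear(FEATURE_DIM, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

def choose_motion(model: nn.Module, candidate_motions: list[torch.Tensor]) -> torch.Tensor:
    """Return the candidate motion with the lowest predicted force on the arm."""
    with torch.no_grad():
        scores = [model(m.unsqueeze(0)).item() for m in candidate_motions]
    return candidate_motions[scores.index(min(scores))]

# Illustrative use: sample a few candidate motions and keep the gentlest one.
candidates = [torch.randn(FEATURE_DIM) for _ in range(8)]
best_motion = choose_motion(force_model, candidates)
```

In the article’s terms, a motion predicted to pull the gown taut against the body would score high and be rejected, while one predicted to slide the fabric smoothly along the arm would score low and be chosen.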