Roboticists have long been working towards creating electronic skin (e-skin) that could act like our own giant sensory organ, relaying information about the environment. Getting there requires an electronic device that covers a wide area, is highly sensitive, and can relay information in the blink of an eye, and engineers from the University of Glasgow think they might have created just that.
The computational e-skin prototype, which enables robots to register pain, is presented in the journal Science Robotics and is reported to be a significant advance in touch-sensitive robotics, one that could even improve prosthetic limbs by giving them near-human sensitivity to touch.
Previous attempts at creating touch-sensitive robots have hit a snag with processing time: spread-out sensors can relay a large volume of data, but it then takes too long for a computer to translate that data into something meaningful.
The design was inspired by the human peripheral nervous system, which begins processing sensations at the point of touch and sends only the most important signals up to the brain. A similar approach in robotics would free up communication channels and stop the computer from getting bogged down with excessive volumes of sensory information.
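The paper's hardware achieves this with synaptic transistors rather than software, but the idea of filtering at the point of touch can be illustrated with a rough software analogy. In this hypothetical sketch (not the authors' actual algorithm), each sensor site pre-processes its own reading and forwards an event upstream only when the stimulus crosses a pain-like threshold, so the "central processor" sees a handful of events instead of the full sensor frame:

```python
# Hedged illustrative sketch, not the paper's implementation: each sensor
# site applies a local threshold and forwards only the readings that
# matter, mimicking peripheral-nervous-system-style pre-processing.

def local_filter(readings, threshold=0.8):
    """Return only the (index, value) events worth forwarding upstream."""
    return [(i, v) for i, v in enumerate(readings) if v >= threshold]

# 168 sensor sites, mirroring the 168-transistor grid described in the
# article; values are illustrative pressure readings in [0, 1].
frame = [0.05] * 168
frame[42] = 0.95   # a sharp, "painful" press at one site
frame[100] = 0.30  # a light touch, below the threshold

events = local_filter(frame)
print(events)  # only the strong stimulus reaches the central processor
```

The point of the design is in the numbers: instead of shipping 168 readings to a central computer every cycle, only the one event that crossed the threshold travels upstream.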
The key to unlocking this means of information processing was a grid of 168 synaptic transistors, made from zinc-oxide nanowires that can be spread across a flexible surface. These were deployed across a human-shaped “hand” equipped with skin sensors to create a robotic appendage capable of differentiating between light and heavy touch.
Making a robot feel pain might sound mean, but the intention is to enhance sensitivity in a way that benefits trial-and-error learning. For children, pain is a useful teacher as they learn things like “touching a hot iron: bad,” and a sense of touch can help robots learn from external stimuli in the same way.
“What we’ve been able to create through this process is an electronic skin capable of distributed learning at the hardware level, which doesn’t need to send messages back and forth to a central processor before taking action,” said Professor Ravinder Dahiya, who heads up the University of Glasgow’s Bendable Electronics and Sensing Technologies (BEST) Group, in a statement.
“Instead, it greatly accelerates the process of responding to touch by cutting down the amount of computation required. We believe that this is a real step forward in our work towards creating large-scale neuromorphic printed electronic skin capable of responding appropriately to stimuli.”
As well as creating robots that could learn to interpret their environment and avoid injury, the technology is projected to one day have applications in human prostheses, too.
“In the future, this research could be the basis for a more advanced electronic skin which enables robots capable of exploring and interacting with the world in new ways, or building prosthetic limbs which are capable of near-human levels of touch sensitivity,” Fengyuan Liu, a member of the BEST group and a co-author of the paper, added.