An electronic skin which can learn from feeling ‘pain’ could help create a new generation of smart robots with human-like sensitivity.

A team of engineers from the University of Glasgow developed the artificial skin with a new type of processing system based on ‘synaptic transistors’, which mimic the brain’s neural pathways in order to learn. A robot hand which uses the smart skin shows a remarkable ability to learn to react to external stimuli.

In a new paper published today in the journal Science Robotics, the researchers describe how they built their prototype computational electronic-skin (e-skin), and how it improves on the current state of the art in touch-sensitive robotics.

Scientists have been working for decades to build artificial skin with touch sensitivity. One widely explored method is spreading an array of contact or pressure sensors across the electronic skin’s surface to allow it to detect when it comes into contact with an object.

Data from the sensors is then sent to a computer to be processed and interpreted. The sensors typically produce a large volume of data which can take time to be properly processed and responded to, introducing delays which could reduce the skin’s potential effectiveness in real-world tasks.

The Glasgow team’s new form of electronic skin draws inspiration from how the human peripheral nervous system interprets signals from skin in order to reduce latency and power consumption.

As soon as human skin receives an input, the peripheral nervous system begins processing it at the point of contact, reducing it to only the vital information before it is sent to the brain. That reduction of sensory data allows efficient use of communication channels needed to send the data to the brain, which then responds almost immediately for the body to react appropriately.
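The idea of reducing sensory data at the point of contact can be illustrated with a minimal sketch. This is not the authors’ implementation; it assumes a simple threshold-based, event-driven filter that transmits a sensor reading only when it changes significantly, in the spirit of how the peripheral nervous system passes on only vital information:

```python
# Illustrative sketch (hypothetical, not the Glasgow team's code):
# event-driven reduction of raw pressure samples. Only readings that
# differ from the last transmitted value by more than a threshold are
# forwarded, analogous to peripheral pre-processing before the brain.

def reduce_events(readings, threshold=5):
    """Return (index, value) pairs only where the signal changes
    by more than `threshold` since the last transmitted value."""
    events = []
    last_sent = readings[0]
    for i, value in enumerate(readings[1:], start=1):
        if abs(value - last_sent) > threshold:
            events.append((i, value))
            last_sent = value
    return events

raw = [0, 1, 2, 20, 21, 22, 3, 2, 1]  # simulated pressure samples
print(reduce_events(raw))  # → [(3, 20), (6, 3)]
```

Here nine raw samples are reduced to two events (a press and a release), so far less data needs to travel to the central processor, which is the latency and power saving the researchers aim for.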
