We’ve made a lot of progress with robots over the years, but they still can’t feel anything.
Scientists at Cornell University are trying to fix that. They’ve created a stretchable fiber-optic glove that can detect pressure, bending, and other deformations. This tech could one day be used to give robots a sense of “touch,” or even improve virtual reality experiences for humans.
As the researchers put it: “This sensor could give soft robotic systems – and anyone using augmented reality technology – the ability to feel the same rich, tactile sensations that mammals depend on to navigate the natural world.”
The sensor is built from stretchable fiber-optic lightguides, essentially flexible tubes that carry light, and deformations are detected via wavelength shifts:
“This long tube contains a pair of polyurethane elastomeric cores. One core is transparent; the other is filled with absorbing dyes at multiple locations and connects to an LED. Each core is coupled with a red-green-blue sensor chip to register geometric changes in the optical path of light.”
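The decoding idea described above can be illustrated with a toy model: each dyed region absorbs certain color channels, so pressing or stretching that region attenuates those channels, and the RGB readout can be inverted to recover where and how much deformation occurred. The sketch below is purely illustrative, assuming hypothetical region names, made-up absorption coefficients, and a simple Beer-Lambert attenuation model; it is not the authors' actual calibration or algorithm.

```python
import math

# Hypothetical dyed regions along the lightguide. Each dye mainly absorbs
# one color channel; the coefficients are invented for illustration.
DYE_ABSORPTION = {                  # region -> (R, G, B) absorption coefficients
    "fingertip": (0.9, 0.0, 0.0),  # red-absorbing dye
    "knuckle":   (0.0, 0.9, 0.0),  # green-absorbing dye
    "base":      (0.0, 0.0, 0.9),  # blue-absorbing dye
}

def measure_rgb(deformations, baseline=(1.0, 1.0, 1.0)):
    """Simulate the RGB intensities seen by the sensor chip.

    Deforming a region lengthens the light path through its dye, which
    attenuates that dye's channels exponentially (a Beer-Lambert toy model).
    """
    rgb = list(baseline)
    for region, amount in deformations.items():
        coeffs = DYE_ABSORPTION[region]
        for c in range(3):
            rgb[c] *= math.exp(-coeffs[c] * amount)
    return tuple(rgb)

def decode_deformations(rgb, baseline=(1.0, 1.0, 1.0)):
    """Invert the toy model: attribute each channel's attenuation to the
    region whose dye dominates that channel."""
    attenuation = [-math.log(rgb[c] / baseline[c]) for c in range(3)]
    decoded = {}
    for region, coeffs in DYE_ABSORPTION.items():
        channel = max(range(3), key=lambda c: coeffs[c])
        decoded[region] = attenuation[channel] / coeffs[channel]
    return decoded

# Press only the hypothetical "fingertip" region, then recover it from RGB.
reading = measure_rgb({"fingertip": 0.5})
decoded = decode_deformations(reading)
```

The real sensor compares the dyed core against the transparent core to separate color change (where) from overall intensity change (how much); the sketch collapses that into a fixed baseline for brevity.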
The accompanying video demonstrates the sensor both detecting pressure and pinpointing exactly where that pressure occurred.
This means that, while a robot using this tech may not experience the actual sensation of touch, information about where a force or deformation occurred would be available in a way it wasn’t before. As the paper’s co-lead author Hedan Bai explains, “This skin is a way to allow ourselves and machines to measure tactile interactions in a way that we now currently use the cameras in our phones. It’s using vision to measure touch.”
For virtual reality, Bai provides an example of how such technology might be used:
“Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tire. If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualization could say, ‘Turn and then stop, so you don’t overtighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it.”
The team’s full paper was published in Science on November 13, 2020.