Robots looking for a soft touch
Friday, 16 December 2016

American engineers have devised a way for a soft robot to feel its surroundings internally, in much the same way humans do.

Robert Shepherd is an assistant professor of mechanical and aerospace engineering at Cornell University in New York State and principal investigator of the Organic Robotics Lab. He says most robots achieve grasping and tactile sensing through motorised means, which can be excessively bulky and rigid.

However, his group is using stretchable optical waveguides that act as curvature, elongation and force sensors in a soft robotic hand.

They employed a four-step soft lithography process to produce the core, through which light propagates, and the cladding, the waveguide's outer surface, which also houses the LED (light-emitting diode) and the photodiode.

The more the prosthetic hand deforms, the more light is lost through the core. That variable loss of light, as detected by the photodiode, is what allows the prosthesis to “sense” its surroundings.

“If no light was lost when we bend the prosthesis, we wouldn’t get any information about the state of the sensor,” Shepherd said. “The amount of loss is dependent on how it’s bent.”
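The sensing principle described above can be sketched in a few lines of code. This is a hypothetical illustration, not the group's actual method: the power-loss formula is the standard decibel definition, but the linear calibration constant (`db_per_degree`) and all numbers are invented for the example.

```python
import math

# Hypothetical sketch: inferring bend from light loss in the waveguide.
# The photodiode reading drops as the waveguide bends; a calibration
# curve maps the measured power loss (in dB) back to a bend angle.

def power_loss_db(baseline_mw: float, measured_mw: float) -> float:
    """Optical power loss relative to the unbent baseline, in decibels."""
    return 10 * math.log10(baseline_mw / measured_mw)

def estimate_bend(loss_db: float, db_per_degree: float = 0.1) -> float:
    """Map loss to a bend angle via an assumed linear calibration."""
    return loss_db / db_per_degree

# Half the light reaching the photodiode corresponds to ~3 dB of loss.
loss = power_loss_db(baseline_mw=1.0, measured_mw=0.5)
angle = estimate_bend(loss)
```

In a real device the loss-to-bend relationship would be measured empirically rather than assumed linear, but the idea is the same: more deformation means more light escapes the core, and the photodiode reading encodes the state of the sensor.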

The group used its optoelectronic prosthesis to perform a variety of tasks, including grasping and probing for both shape and texture. Most notably, the hand was able to scan three tomatoes and determine, by softness, which was the ripest.
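The ripeness test reduces to a simple comparison: press each tomato with the same force and see which one deforms the most. The sketch below is an invented illustration of that idea; the sensor values and names are hypothetical, not data from the experiment.

```python
# Hypothetical sketch of the softness-ranking idea: probe each tomato
# at a fixed force and compare the fingertip sensor's light loss.
# More loss means more deformation, indicating a softer, riper fruit.

readings = {          # assumed light loss (dB) at a fixed probe force
    "tomato_a": 1.2,
    "tomato_b": 2.7,
    "tomato_c": 0.9,
}

ripest = max(readings, key=readings.get)  # greatest loss = softest
```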

Optical waveguides have been in use since the early 1970s for numerous sensing functions, including tactile, position and acoustic. Fabrication was originally a complicated process, but the advent over the last 20 years of soft lithography and 3D printing has led to the development of elastomeric sensors that are easily produced and incorporated into soft robotic applications.

Future work on optical waveguides in soft robotics will focus on increased sensory capabilities, in part by 3D printing more complex sensor shapes, and by incorporating machine learning as a way of decoupling signals from an increased number of sensors.

“Right now, it’s hard to localise where a touch is coming from,” Shepherd said.

[Doctoral student Shuo Li shakes hands with an optoelectronically innervated prosthesis. Photo: Huichan Zhao/Cornell]