A collaboration at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has designed a robotic gripper with two flexible fingers that conform to the shape of any object with which they come into contact. A hollowed-out region inside each finger houses a camera and the system’s other sensory components. Professor Edward Adelson, leader of the Perceptual Science Group at CSAIL, and Sandra Liu, a mechanical engineering Ph.D. student, developed the robotic gripper with “GelSight Fin Ray” fingers. GelSight is a silicone gel sensing material.

A fin ray structure (fingers, in the case of the MIT gripper) can passively adapt to different shapes and therefore grasp a variety of objects, Liu said. The fin ray has become a popular element in soft robotics since it was discovered, roughly 25 years ago, that when a fish’s tail fin is pushed, the ray bends toward the applied force, almost embracing the pushing finger rather than tilting away from it. Popular as the design is, it lacks tactile sensitivity.

The team assembled the fingers of its gripper from flexible plastic materials made on a 3D printer. Where the fingers used in soft robotic grippers typically have supportive cross-struts running the length of their interiors, the team’s design devotes that space to the camera and sensory components. The camera is mounted to a semirigid backing at one end of the hollowed-out cavity, which is illuminated by LEDs. The camera faces a sensory pad of GelSight material glued to a thin acrylic sheet, which is attached to the plastic finger piece at the opposite end of the cavity.

Upon touching an object, the finger folds around it, melding to the object’s contours. By determining how the silicone and acrylic sheets deform during this interaction, the camera, with accompanying computational algorithms, can assess the general shape of the object, its surface roughness, its orientation in space, and the force applied by and imparted to each finger.

The GelSight Fin Ray gripper uses its tactile sensing to hold a glass Mason jar. The soft robotic gripper features an embedded camera and sensory capabilities. Courtesy of MIT CSAIL.

Liu and Adelson tested the robotic gripper in an experiment in which only one of the two fingers was “sensorized.” The device handled a screwdriver, a plastic strawberry, a tube of paint, a Mason jar, and a wine glass. While the gripper held the fake strawberry, for instance, the internal sensor detected the seeds on its surface. The fingers grabbed the paint tube without squeezing so hard as to breach the container and spill its contents.

The GelSight sensor could even make out the lettering on the Mason jar. The shape of the jar was ascertained from how the acrylic sheet bent when wrapped around it. A computer algorithm then subtracted that pattern from the deformation of the silicone pad; the remainder was the more subtle deformation due just to the letters.

Glass objects are challenging for vision-based robots because of the way they refract light, whereas tactile sensors are immune to this optical ambiguity. When the gripper picked up the wine glass, it could feel the orientation of the stem and make sure the glass was pointing straight up before slowly lowering it. When the base touched the tabletop, the gel pad sensed the contact. Proper placement occurred in seven out of 10 trials.
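The researchers’ actual processing code is not reproduced here, but the subtraction step described above can be illustrated with a minimal Python/NumPy sketch. It assumes the pad’s indentation has already been reconstructed from the internal camera image as a per-pixel depth map; the function name, the smoothing parameter, and the use of a low-pass filter to stand in for the jar’s overall shape are all illustrative assumptions, not the team’s published method.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def isolate_fine_texture(depth_map: np.ndarray, smoothing_sigma: float = 15.0) -> np.ndarray:
        """Separate fine surface detail (such as embossed lettering) from the
        coarse deformation of a pad wrapped around an object.

        depth_map: per-pixel indentation of the gel pad (H x W), assumed to be
        reconstructed beforehand from the internal camera image.
        """
        # Estimate the coarse, low-frequency bending of the acrylic/gel stack:
        # the pad conforming to the jar's overall cylindrical shape.
        coarse_shape = gaussian_filter(depth_map, sigma=smoothing_sigma)

        # Subtracting the coarse shape leaves the high-frequency residual:
        # subtle indentations such as raised letters.
        return depth_map - coarse_shape

    # Synthetic demonstration: a broad cylindrical bend plus a faint fine pattern.
    yy, xx = np.mgrid[0:200, 0:200]
    bend = 0.5 * np.exp(-((xx - 100.0) ** 2) / (2 * 60.0 ** 2))  # coarse wrap
    letters = 0.02 * ((xx // 12 + yy // 12) % 2)                 # fine detail
    residual = isolate_fine_texture(bend + letters)
    # 'residual' emphasizes the fine pattern while the broad bend is removed.

In practice the coarse pattern could come from a fitted model of how the acrylic sheet bends around the jar rather than a blur; the sketch only shows the subtract-and-keep-the-residual idea.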
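The article does not describe how contact with the tabletop is recognized. One plausible, simplified reading is a threshold on how far the pad’s current deformation departs from a baseline captured while the glass hangs freely; the function name and the threshold value below are hypothetical.

    import numpy as np

    def base_contact_detected(depth_map: np.ndarray,
                              baseline: np.ndarray,
                              threshold_mm: float = 0.05) -> bool:
        """Flag the moment the glass base meets the table: deformation anywhere
        on the pad exceeds the free-hanging baseline by more than threshold_mm."""
        return float(np.max(np.abs(depth_map - baseline))) > threshold_mm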
“Sensing with soft robots has been a big challenge, because it is difficult to set up sensors, which are traditionally rigid, on soft bodies,” said Wenzhen Yuan, an assistant professor in the Robotics Institute at Carnegie Mellon University who was not involved with the research. Yuan said the technology has the potential for wide use in robotic grippers operating in real-world environments.

Liu and Adelson are continuing to improve the device. They aim to make GelSight sensors that are compatible with soft robots devised by other research teams, and they plan to develop a three-fingered gripper.

The Toyota Research Institute and the U.S. Office of Naval Research funded the work, which was presented this month at the 2022 IEEE 5th International Conference on Soft Robotics: www.doi.org/10.48550/arXiv.2204.07146.