Wednesday, December 25, 2024

A robotic hand that mimics human touch

“I’ll have you eating out of the palm of my hand” is an unlikely statement to hear from a robot. Why? Most of them don’t have palms.

If you’ve kept up with this protean field, you’ll know that getting robots to grip and grasp more like humans has been an ongoing, Herculean effort. Now, a recent robotic hand design developed at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) rethinks the often-overlooked palm. The design uses advanced sensors for highly sensitive touch, helping the hand manipulate objects with finer, more detailed precision.

GelPalm features a flexible, gel-based sensor embedded in the palm, drawing inspiration from the soft, deformable nature of human hands. The sensor uses a color-illumination technique: red, green, and blue LEDs light the object pressed into the gel, and a camera records the reflections. Combining the three color channels yields detailed 3D surface models, enabling precise robotic interaction.
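
To make the sensing idea concrete, here is a minimal Python sketch of how a three-color, photometric-stereo-style reconstruction can work in principle. The LED directions, the Lambertian-surface assumption, and the naive gradient integration are illustrative assumptions for the sketch, not details drawn from the GelPalm pipeline.

```python
# Minimal sketch of gel-sensor-style tactile reconstruction via photometric
# stereo. Assumptions (not from the paper): three directional lights mapped
# to the R, G, and B channels with known directions, a roughly Lambertian
# gel surface, and a crude cumulative-sum integration of gradients.
import numpy as np

# Hypothetical unit light directions for the red, green, and blue LEDs.
L = np.array([
    [ 0.8,  0.0, 0.6],   # red LED
    [-0.4,  0.7, 0.6],   # green LED
    [-0.4, -0.7, 0.6],   # blue LED
])

def normals_from_rgb(image):
    """image: HxWx3 float array in [0, 1]; returns HxWx3 unit surface normals."""
    h, w, _ = image.shape
    intensities = image.reshape(-1, 3).T          # 3 x (H*W)
    g = np.linalg.solve(L, intensities)           # albedo-scaled normals
    n = g / (np.linalg.norm(g, axis=0) + 1e-8)    # normalize per pixel
    return n.T.reshape(h, w, 3)

def depth_from_normals(normals):
    """Integrate surface gradients (-nx/nz, -ny/nz) into a rough height map."""
    nz = np.clip(normals[..., 2], 1e-3, None)
    gx = -normals[..., 0] / nz
    gy = -normals[..., 1] / nz
    # Naive integration: cumulative sums along rows and columns, averaged.
    return 0.5 * (np.cumsum(gx, axis=1) + np.cumsum(gy, axis=0))

# Usage with a synthetic camera frame:
frame = np.random.rand(240, 320, 3)
depth_map = depth_from_normals(normals_from_rgb(frame))
print(depth_map.shape)  # (240, 320)
```

In a real sensor the light directions would be calibrated and the integration handled by a proper solver, but the flow of information is the same: per-pixel color, to surface normal, to depth.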

What would a palm be without fingers to complement it? The team also developed robotic fingers, called ROMEO (“RObotic Modular Endoskeleton Optical”), made of flexible materials and equipped with similar tactile sensing technology. The fingers exhibit “passive compliance,” meaning they adapt to external forces naturally, without motors or additional control. That serves a larger goal: increasing the surface area in contact with objects so they can be enveloped more completely. Manufactured as single, monolithic structures via 3D printing, the finger designs are cost-effective to produce.
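
As a back-of-the-envelope illustration of passive compliance (not a model of the actual ROMEO mechanics), the toy Python snippet below treats a finger joint as a torsional spring that settles wherever the contact torque balances the spring’s restoring torque, with no motor or controller involved. The stiffness, torques, and joint limit are made-up numbers.

```python
# Toy model of passive compliance: a spring-loaded joint with no actuator.
# The stiffness, torque values, and joint limit below are illustrative only.
def passive_joint_angle(contact_torque_nm, stiffness_nm_per_rad=0.5,
                        max_angle_rad=1.2):
    """Equilibrium angle where the torsional spring balances the contact torque."""
    angle = contact_torque_nm / stiffness_nm_per_rad
    return max(0.0, min(angle, max_angle_rad))   # respect the joint limit

# A firmer press (larger contact torque) wraps the finger further around
# the object, increasing contact area without any motor commands.
for torque in (0.1, 0.3, 0.6):
    print(f"contact torque {torque:.1f} N*m -> joint angle "
          f"{passive_joint_angle(torque):.2f} rad")
```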

In addition to increased dexterity, GelPalm offers safer interaction with objects, which is particularly useful for potential applications such as human-robot collaboration, prosthetics, and robotic hands with human-like sensing for biomedical uses.

Most previous robotic hand designs have focused on improving finger dexterity. Liu’s approach shifts the focus to the palm, creating a more human-like, versatile end-effector that interacts more naturally with objects and can perform a wider range of tasks.

“We draw inspiration from human hands, which have stiff bones surrounded by soft, compliant tissue,” says recent MIT graduate Sandra Q. Liu SM ’20, PhD ’24, lead designer of GelPalm, who developed the system as a CSAIL affiliate and a PhD student in mechanical engineering. “By combining rigid structures with deformable, compliant materials, we can better achieve the same kind of adaptability as our dexterous hands. The main advantage is that we don’t need additional motors or mechanisms to make the hand deform; its inherent compliance lets it adjust to objects automatically, just as our human hands do so deftly.”

The researchers put the hand design to the test. Liu compared the touch-sensing performance of two different lighting schemes, blue LEDs versus white LEDs, integrated into the ROMEO fingers. “In both cases, similar high-quality 3D tactile reconstructions were obtained after pressing objects into the gel surfaces,” says Liu.

But the most vital experiment, she says, was seeing how well different hand configurations could wrap around and stably grasp objects. The team got to work, literally, smearing paint onto plastic shapes and pressing them into four types of hands: rigid, structurally compliant, gel compliant, and dual compliant. “Visually, and by analyzing the painted contact surfaces, it was clear that the hand with both structural and material compliance provided a significantly better grip than the others,” says Liu. “It’s an elegant way to maximize the palm’s role in providing stable grasps.”
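
For readers curious how such painted contact patches might be quantified, the short Python sketch below segments paint-colored pixels in an image of the hand and reports the covered fraction. The paint color, tolerance, and synthetic test image are hypothetical; this is not the authors’ actual analysis pipeline.

```python
# Rough sketch of quantifying painted contact patches: segment pixels close
# to the paint color in an RGB image and report the covered fraction.
# Paint color, tolerance, and the synthetic test image are assumptions.
import numpy as np

def contact_area_fraction(rgb_image, paint_rgb=(200, 30, 30), tol=60.0):
    """Fraction of pixels whose color lies within `tol` of the paint color."""
    img = np.asarray(rgb_image, dtype=float)[..., :3]
    dist = np.linalg.norm(img - np.array(paint_rgb, dtype=float), axis=-1)
    return float((dist < tol).mean())

# Synthetic example: a 100x100 white image where a 40x40 patch is "painted".
photo = np.full((100, 100, 3), 255.0)   # white background
photo[30:70, 30:70] = (200, 30, 30)     # painted contact patch
print(contact_area_fraction(photo))     # -> 0.16, i.e. 16% coverage
```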

A notable limitation is the challenge of integrating enough sensing into the hand without making it bulky or overly complex. The team says the use of camera-based tactile sensors poses size and flexibility issues, because current technology cannot easily provide wide sensing coverage without compromising the design’s form and function. Addressing this could mean developing more flexible optical components, such as the mirrors, and improving sensor integration so that functionality is preserved without sacrificing practical usability.

“The palm is almost completely neglected in the development of most robotic hands,” says Columbia University associate professor Matei Ciocarlie, who was not involved in the paper. “This work is unusual in that it presents a purposefully designed, usable palm that combines two key features, articulation and sensing, that most robotic palms lack. The human palm is both subtly articulated and highly sensitive, and this work represents a significant innovation in that direction.”

“I hope we move toward more advanced robotic hands that combine soft and rigid elements with tactile sensitivity, ideally within the next five to ten years. It’s a complex field in which there is no clear consensus on the best hand design, which makes this work particularly exciting,” says Liu. “In developing GelPalm and the ROMEO fingers, I focused on modularity and portability to encourage a wide range of designs. Because the technology is cheap and easy to produce, more people can innovate and explore new solutions. As just one lab and one person working in this vast field, my dream is that sharing this knowledge can drive progress and inspire others.”

Ted Adelson, the John and Dorothy Wilson Professor of Vision Science in the Department of Brain and Cognitive Sciences and a member of CSAIL, is the senior author of a paper describing the work. The research was supported, in part, by the Toyota Research Institute, the Amazon Science Hub, and the SINTEF BIFROST project. Liu presented the research earlier this month at the International Conference on Robotics and Automation (ICRA).
