Tuesday, December 24, 2024

This soft robotic hand can pick up and identify a wide range of objects


Robots have many strengths, but gentleness is not traditionally one of them. Stiff limbs and fingers make it tough for them to grasp, hold, and manipulate a range of everyday objects without dropping or crushing them.

Recently, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) discovered that the solution might lie in a substance more commonly associated with modern construction and plastics: silicone.

At a conference this month, scientists from CSAIL director Daniela Rus’s Distributed Robotics Lab demonstrated a 3D-printed robotic hand made of silicone rubber. This hand can lift and carry objects as delicate as an egg and as thin as a compact disc.

Equally impressive, three of its fingers are equipped with special sensors that can estimate the size and shape of an object accurately enough to identify it from a set of many items.

“Robots are often limited in what they can do because it is difficult for them to interact with objects of different sizes and materials,” Rus says. “Grasping is an important step in performing useful tasks; in this work we set out to develop both soft hands and the control and planning systems supporting them that enable dynamic gripping.”

The paper, co-authored by Rus, her graduate student Bianca Homberg, PhD student Robert Katzschmann, and postdoc Mehmet Dogar, will be presented at this year’s International Conference on Intelligent Robots and Systems.

The Hard Science of Soft Robots

The gripper, which can also pick up objects like a tennis ball, a Rubik’s Cube, and a Beanie Baby, is part of a larger body of work from Rus’s lab at CSAIL that aims to show the value of so-called “soft robots” made from unconventional materials like silicone, paper, and fiber.

Scientists say soft robots have several advantages over “hard” robots, including the ability to handle irregularly shaped objects, squeeze into tight spaces, and recover easily from collisions.

“A robot with stiff arms will have much more difficulty with tasks such as picking up an object,” Homberg says. “This is because it needs to have a good model of the object and spend a lot of time thinking about exactly how it will execute the grasp.”

Soft robots offer an intriguing alternative. However, the downside of their added flexibility (“compliance”) is that they often have difficulty accurately gauging where an object is, or whether they have even managed to pick it up.

That’s where the CSAIL team’s “bending sensors” come in. When the gripper grabs an object, the fingers send out location data based on their curvature. Using that data, the robot can pick up an unknown object and compare it to existing clusters of data points that represent objects from the past. With just three data points from a single grip, the robot’s algorithms can tell the difference between similarly sized objects, like a cup and a lemonade bottle.
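To make the identification step concrete, here is a minimal sketch of how three bend-sensor readings could be matched against stored clusters, assuming each known object is represented by the mean readings from past grasps and the comparison is a simple nearest-centroid rule. The object names, numbers, and distance threshold are illustrative assumptions, not the paper’s actual algorithm.

```python
import numpy as np

# Hypothetical cluster centroids: mean bend-sensor readings (one value per
# sensed finger) collected from previous grasps of each known object.
# The object names and values below are illustrative only.
known_objects = {
    "cup":             np.array([0.62, 0.60, 0.58]),
    "lemonade_bottle": np.array([0.45, 0.47, 0.44]),
    "tennis_ball":     np.array([0.71, 0.69, 0.70]),
}

def identify(bend_readings, max_distance=0.15):
    """Return the closest known object, or None if nothing is close enough.

    bend_readings: three curvature values, one per sensor-equipped finger.
    """
    reading = np.asarray(bend_readings)
    best_name, best_dist = None, float("inf")
    for name, centroid in known_objects.items():
        dist = np.linalg.norm(reading - centroid)  # Euclidean distance to cluster center
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else None

print(identify([0.61, 0.59, 0.57]))  # -> "cup"
```

In practice the lab’s system would be trained on real sensor data rather than hand-picked centroids, but the idea is the same: a single grasp yields a small feature vector that is compared against clusters built from earlier grasps.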

“As a human, if you’re blindfolded and you pick something up, you can feel it and still understand what it is,” Katzschmann says. “We want to develop similar capabilities in robots—essentially giving them ‘sight,’ even though they can’t actually see.”

The team hopes that with further development of the sensors, the system will eventually be able to identify dozens of different objects and can be programmed to interact with them differently depending on their size, shape, and function.

How it works

The hand can grasp with two types of grips: “enveloping grips”, in which the object is completely contained within the gripper, and “pinching grips”, in which the object is held by the fingertips.
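As a rough illustration of how a controller might choose between the two grip types, the sketch below selects a grip based on whether the object can fit entirely inside the closed fingers. The finger-span value and the decision rule are assumptions made for illustration, not taken from the paper.

```python
def choose_grip(object_width_mm, object_thickness_mm, finger_span_mm=80):
    """Pick a grip type for the soft gripper.

    Assumption (not from the paper): objects small enough to sit fully
    inside the fingers get an enveloping grip; thin or oversized objects
    are pinched at the fingertips.
    """
    if object_width_mm < finger_span_mm and object_thickness_mm > 5:
        return "enveloping"   # object is completely contained within the gripper
    return "pinching"         # object is held by the fingertips

print(choose_grip(65, 65))   # e.g. a tennis ball -> "enveloping"
print(choose_grip(120, 1))   # e.g. a compact disc -> "pinching"
```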

Fitted to the popular Baxter manufacturing robot, the gripper significantly outperformed Baxter’s default gripper, which was unable to pick up a CD or a sheet of paper and was prone to completely crushing objects such as a soda can.

As with Rus’s previous robotic arm, the fingers are made of silicone rubber, chosen because it is relatively stiff yet elastic enough to expand under pressure from pistons. Meanwhile, the gripper interface and the external finger molds are 3D printed, which means the system should work on virtually any robotic platform.

Rus says the team plans to keep refining the gripper and to add more sensors that will allow it to identify a wider range of objects.

“If we want robots in human-centric environments, they need to be more adaptive and able to interact with objects whose shape and location are not precisely known,” Rus says. “Our dream is to develop a robot that, like a human, can approach an unknown object, large or small, determine its approximate shape and size, and figure out how to interact with it in one fluid movement.”

The work was performed at the Distributed Robotics Laboratory at MIT with support from The Boeing Company and the National Science Foundation.
