Wednesday, December 25, 2024

Letting robots manipulate cables


Manipulating thin, flexible objects such as ropes, wires, or cables can be tricky even for people. And if these tasks are hard for humans, they are nearly impossible for robots. As a cable slides between the fingers, its shape constantly changes, and the robot’s fingers must continuously sense and adjust the cable’s position and motion.

One could imagine such a system being used for both industrial and household tasks, one day enabling robots to help us tie knots, shape wire, or even perform surgical suturing.

The team’s first step was to build a novel two-fingered gripper. The opposing fingers are lightweight and quick-moving, allowing nimble, real-time adjustments of force and position. On the fingertips are “GelSight” vision-based touch sensors, made of soft rubber with embedded cameras. The gripper is mounted on a robot arm, which can move as part of the control system.

The team’s second step was to create a perception and control framework for cable manipulation. For perception, they used the GelSight sensors to estimate the position of the cable between the fingers and to measure the friction force as the cable slides. Two controllers run in parallel: one modulates the grip force, and the other adjusts the gripper pose to keep the cable within the gripper.
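To make that division of labor concrete, here is a minimal Python sketch of two proportional loops running in parallel, one correcting grip force and one re-centering the cable. The gains, the setpoint, and the `TactileEstimate` fields are illustrative assumptions for this article, not the team’s actual controller or API.

```python
from dataclasses import dataclass

# Illustrative gains and setpoint; real values would be tuned on hardware.
TARGET_FRICTION = 1.0   # desired sliding friction force, in newtons
FORCE_GAIN = 0.5        # proportional gain of the grip-force loop
POSE_GAIN = 2.0         # proportional gain of the gripper-pose loop


@dataclass
class TactileEstimate:
    """Quantities assumed to be estimated from the tactile images."""
    friction: float   # friction force on the sliding cable (N)
    offset: float     # cable's lateral offset from the finger center (m)


def control_step(est: TactileEstimate) -> tuple[float, float]:
    """One iteration of the two parallel loops.

    Returns (grip_force_correction, lateral_gripper_velocity): the force
    loop keeps the cable sliding smoothly, while the pose loop re-centers
    the cable so it does not escape at the edge of the fingertips.
    """
    force_correction = FORCE_GAIN * (TARGET_FRICTION - est.friction)
    lateral_velocity = -POSE_GAIN * est.offset
    return force_correction, lateral_velocity


if __name__ == "__main__":
    # Example: cable drifting 4 mm off-center with low measured friction.
    print(control_step(TactileEstimate(friction=0.6, offset=0.004)))
```

Running the two loops independently keeps the logic simple: the force loop only decides how hard to squeeze, and the pose loop only decides where to move.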

Once mounted on the arm, the gripper can reliably follow a USB cable starting from any grasp position. Then, in combination with a second gripper, the robot can move the cable hand over hand (just as a human would) until it finds the end of the cable. It can also adapt to cables of different materials and thicknesses.

In another demonstration of its capabilities, the robot performed something humans do routinely: plugging earphones into a cell phone. Starting with a free-floating earphone cable, the robot slid the cable between its fingers, stopped when it felt the plug touch its fingertips, adjusted the plug’s position, and finally inserted the plug into the jack.
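Read as a procedure, that demo is essentially a small state machine driven by tactile events. The sketch below is a hypothetical illustration of that flow; the phase names and the `plug_felt` and `plug_aligned` signals are invented for clarity and are not the team’s implementation.

```python
from enum import Enum, auto


class Phase(Enum):
    """Illustrative phases of the plug-insertion demo described above."""
    SLIDE = auto()    # follow the cable between the fingers
    ALIGN = auto()    # reorient the plug once it is felt at the fingertip
    INSERT = auto()   # push the plug into the jack


def next_phase(phase: Phase, plug_felt: bool, plug_aligned: bool) -> Phase:
    """Advance the demo from one phase to the next based on tactile events."""
    if phase is Phase.SLIDE and plug_felt:
        return Phase.ALIGN
    if phase is Phase.ALIGN and plug_aligned:
        return Phase.INSERT
    return phase


if __name__ == "__main__":
    phase = Phase.SLIDE
    phase = next_phase(phase, plug_felt=True, plug_aligned=False)
    phase = next_phase(phase, plug_felt=True, plug_aligned=True)
    print(phase)  # Phase.INSERT
```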

“Manipulating soft objects, such as handling cables, folding fabrics, and tying strings, is common in our daily lives,” says Yu She, a postdoc at MIT and lead author of a new paper on the system. “In many cases, we would like robots to help humans do this kind of work, especially when the tasks are repetitive, boring, or dangerous.”

Pull me along

Following a cable is challenging for two reasons. First, it requires controlling both the “grip force” (to enable smooth sliding) and the “grip pose” (to prevent the cable from falling out of the gripper’s fingers).

Second, this information is hard to capture with conventional vision systems during continuous manipulation, because it is usually occluded, costly to interpret, and sometimes inaccurate.

Moreover, this information cannot be directly observed with vision sensors alone, which is why the team turned to touch sensors. The gripper’s joints are also compliant, which protects them from potential impacts.
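As one illustration of what “observing with touch” can mean here, the sketch below fits a line to the contact patch in a binary, GelSight-style contact image to recover the cable’s offset and angle between the fingers. This is a generic PCA-based approach under assumed inputs, not necessarily the estimator the team used.

```python
import numpy as np


def estimate_cable_pose(contact_mask: np.ndarray) -> tuple[float, float]:
    """Estimate the cable's in-plane pose from a binary contact image.

    `contact_mask` is a 2-D boolean array marking pixels where the cable
    presses into the sensor gel. Returns (offset_px, angle_rad): the
    contact patch's offset from the image center along the vertical axis,
    and the cable's orientation, from a PCA line fit to the contact pixels.
    """
    ys, xs = np.nonzero(contact_mask)
    if xs.size < 2:
        raise ValueError("no cable contact detected")

    pts = np.column_stack([xs, ys]).astype(float)
    center = pts.mean(axis=0)

    # Principal direction of the contact patch approximates the cable axis.
    _, _, vt = np.linalg.svd(pts - center, full_matrices=False)
    direction = vt[0]
    angle = float(np.arctan2(direction[1], direction[0]))

    # Offset of the patch centroid from the image center, in pixels.
    img_center_y = contact_mask.shape[0] / 2.0
    offset = float(center[1] - img_center_y)
    return offset, angle


if __name__ == "__main__":
    # Tiny synthetic example: a shallow diagonal streak of contact pixels.
    mask = np.zeros((32, 32), dtype=bool)
    for i in range(32):
        mask[10 + i // 4, i] = True
    print(estimate_cable_pose(mask))
```

An estimate like this is what the pose controller in the earlier sketch would consume at each step.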

The algorithms also generalize to cables with different physical properties, such as material, stiffness, and diameter, as well as to different manipulation speeds.

When the team compared different controllers on their gripper, their control policy kept the cable in hand over longer distances than the three alternatives. For example, the “open loop” controller tracked only 36 percent of the cable’s total length; the gripper easily lost the cable when it curved, and multiple regrasps were needed to complete the task.

Looking to the future

The team observed that it was difficult to pull the cable back once it reached the edge of the finger, because of the convex surface of the GelSight sensor. They therefore hope to improve the shape of the finger sensor to boost overall performance.

In the future, they plan to study more complicated cable manipulation tasks, such as routing cables and inserting cables through obstacles, and eventually want to investigate autonomous cable manipulation tasks in the automotive industry.

Yu She wrote the paper with MIT students Shaoxiong Wang, Siyuan Dong, and Neha Sunil; Alberto Rodriguez, associate professor of mechanical engineering at MIT; and Edward Adelson, the John and Dorothy Wilson Professor in the Department of Brain and Cognitive Sciences at MIT.

This work was supported by Amazon Research Awards, the Toyota Research Institute, and the Office of Naval Research.
