Sunday, April 20, 2025

This robot helps you lift objects by looking at your biceps


We humans are very good at working together. For example, when two people carry a bulky object such as a table or sofa, they instinctively coordinate their movements, constantly recalibrating to keep their hands at the same height as the other person's. Our natural ability to make these kinds of adjustments allows us to collaborate on tasks large and small.

However, a computer or robot still cannot easily follow a human's lead. Typically we either program robots explicitly, using machine code, or train them to understand our words, as with virtual assistants like Siri or Alexa.

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) recently demonstrated that smoother robot-human collaboration is possible thanks to a novel system they developed, in which machines help people lift objects by monitoring their muscle movements.

Called RoboRaise, the system involves placing electromyography (EMG) sensors on the user’s biceps and triceps to monitor muscle activity. Its algorithms then continuously detect changes in the level of the person’s arm, as well as discrete up-and-down hand gestures the user can make for finer motor control.

Graduate student Joseph DelPreto says he can imagine people using RoboRaise to help in manufacturing and construction, or even as an assistant around the house.

“Our approach to lifting objects with a robot aims to be intuitive and similar to how you might lift something with another person: roughly copying the other person’s movements while inferring helpful adjustments,” says DelPreto, lead author of a new paper on the project co-authored with MIT Professor and CSAIL Director Daniela Rus. “The key is to use non-verbal cues that encode instructions for coordination, such as moving a little higher or lower. Using muscle signals to communicate almost makes the robot an extension of yourself that you can fluidly control.”

The project builds on the team’s existing system that lets users instantly correct a robot’s mistakes using brainwaves and hand gestures, now enabling continuous motion in a more collaborative way. “Our goal is to develop human-robot interaction in which the robot adapts to the human, not the other way around. This way, the robot becomes a smart tool for physical work,” says Rus.

EMG signals can be tricky to work with: they are often very noisy, and it can be hard to accurately predict limb movement from muscle activity. Even if you can estimate how a person is moving, how the robot itself should respond may still be unclear.

RoboRaise deals with this by putting the human in control. The team’s system uses non-invasive, on-body sensors that detect neuron firing as muscles tense and relax. Using wearables also sidesteps the occlusion and ambient-noise problems that can complicate vision- or speech-based approaches.

The RoboRaise algorithm then processes biceps activity to estimate how the person’s arm is moving, so the robot can roughly mimic it, and the person can slightly tense or relax the arm to move the robot up or down. If the user needs the robot to move farther away from their own position, or to hold a pose for a moment, they can simply gesture up or down for finer control; a neural network detects these gestures at any time from biceps and triceps activity.
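The continuous-control idea described above can be sketched in a few lines. This is a hypothetical illustration, not the actual RoboRaise code: the resting level, gain, and dead band are invented values, and `update_height` stands in for whatever mapping the real system uses between muscle activation and the robot's target height.

```python
def update_height(height, biceps_level, rest_level=0.2, gain=0.05, band=0.05):
    """Raise the target height when the biceps tenses above its resting
    level, lower it when the arm relaxes below, and hold it steady while
    the activation stays inside a small dead band around rest."""
    delta = biceps_level - rest_level
    if abs(delta) > band:
        return height + gain * delta  # tense -> move up, relax -> move down
    return height                     # within dead band -> hold position

# Simulate a user tensing, then fully relaxing the arm.
h = 0.5
for level in [0.6, 0.6, 0.6]:            # tensed well above rest
    h = update_height(h, level)
assert h > 0.5                            # robot moved up
for level in [0.0] * 6:                   # fully relaxed
    h = update_height(h, level)
assert h < 0.56                           # robot drifted back down
```

A discrete gesture classifier would sit alongside this loop, overriding the continuous estimate whenever an up or down gesture is detected.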

A new user can start using the system very quickly, with minimal calibration. After the sensors are attached, the user simply tenses and relaxes their arm a few times and then lifts a light weight to a few heights. The gesture-detection neural network is trained solely on data from previous users.

In the team’s tests, when participants received feedback from the robot, whether by watching it move or by lifting an object together with it, the heights they reached were far more accurate than when they received no feedback.

The team also tested RoboRaise on assembly tasks, such as lifting a sheet of rubber onto a base structure. The system was able to successfully lift both rigid and flexible objects onto the bases. RoboRaise was implemented on the team’s Baxter humanoid robot, but the team says it could be adapted to any robotics platform.

In the future, the team hopes that adding more muscles or different types of sensors will increase the system’s degrees of freedom, with the ultimate goal of performing even more complex tasks. Cues such as exertion or fatigue inferred from muscle activity could also help robots provide more intuitive assistance. The team tested one version of the system that uses biceps and triceps levels to tell the robot how stiffly a person is holding their end of an object; together, human and machine can smoothly drag an object or pull it rigidly taut.
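The stiffness cue described above relies on co-contraction: biceps and triceps both tensing at once indicates a stiff hold, while low combined activity indicates a compliant one. A hedged one-line sketch of that idea (the formula is an assumption for illustration, not the team's actual estimator):

```python
def stiffness(biceps, triceps):
    """Estimate grip stiffness in [0, 1] from normalized activations of
    the two antagonist muscles; both must be tense for a stiff hold."""
    return min(biceps, triceps)

assert stiffness(0.9, 0.8) > stiffness(0.9, 0.1)  # co-contraction is stiffer
assert stiffness(0.0, 0.0) == 0.0                 # relaxed arm -> compliant
```

The robot could then scale its own impedance to match, dragging loosely at low stiffness and pulling taut at high stiffness.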

The team will present their work at the International Conference on Robotics and Automation this week in Montreal, Canada. The project was partially financed by Boeing.
