Thursday, April 23, 2026

The wristband allows the user to control the robot with their own movements


Next time you’re scrolling through your phone, take a moment to appreciate this feat: a seemingly mundane activity is made possible by the coordination of 34 muscles, 27 joints, and more than 100 tendons and ligaments in the hand. Indeed, our hands are the most agile parts of our body. Imitating their many and varied gestures has long been a challenge in robotics and virtual reality.

Now engineers at MIT have designed an ultrasonic wristband that precisely tracks the user's hand movements in real time. As the hand moves, the band generates ultrasound images of the muscles, tendons, and ligaments of the wrist, and a connected artificial intelligence algorithm continuously translates those images into the corresponding positions of the hand and five fingers.

Scientists can train the wristband to learn the user’s hand movements, and then the device can communicate in real time with a robot or virtual environment.

In a demonstration, the team showed that a person wearing the wristband could wirelessly control a robotic hand. When the person gestures or points, the robot does the same. In a sort of wireless puppetry, the user can direct the robot to play a simple melody on a piano and shoot a small basketball into a hoop on the desk. Using the same wristband, the user can also manipulate objects on a computer screen, for example by pinching their fingers to enlarge or shrink a virtual object.

The team is using the wristband to collect hand movement data from many more users with different hand sizes, finger shapes, and gestures. They envision creating a large dataset of hand movements that could be used, for example, to train humanoid robots in dexterous tasks, such as performing specific surgical procedures. The ultrasonic band could also be used to grasp, manipulate, and interact with objects in video games, design applications, or other virtual settings.

“We believe this work has immediate impact on the potential replacement of hand-tracking techniques with portable ultrasonic wristbands in virtual and augmented reality,” says Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT. “It could also provide huge amounts of training data for dexterous humanoid robots.”

Zhao, Gengxi Lu, and their colleagues present the new wristband design in a paper published today. Their MIT co-authors include former graduate students Xiaoyu Chen, Shucong Li, and Bolei Deng; graduates SeongHyeon Kim and Dian Li; postdoctoral fellows Shu Wang and Runze Li; and Anantha Chandrakasan, MIT provost and Vannevar Bush Professor of Electrical Engineering and Computer Science. Other co-authors include graduate students Yushun Zheng and Junhang Zhang, Baoqiang Liu, Chen Gong, and Professor Qifa Zhou of the University of Southern California.

Seeing strings

Currently, there are many approaches to capturing and imitating the dexterity of the human hand in robots. In some, cameras record the hand movements of a person manipulating objects or performing tasks. In others, the person wears a glove fitted with sensors that record hand movements and transmit the data to a receiving robot. However, building a dedicated camera system for each application is impractical, and cameras are prone to visual occlusion. Sensor-laden gloves, meanwhile, can restrict the hand's natural movement and sense of touch.

A third approach uses electrical signals from muscles in the wrist or forearm, which scientists then correlate with specific hand movements. Scientists have made significant progress with this approach, but these signals are easily corrupted by environmental noise, and they are not sensitive enough to pick up subtle changes in movement. For example, they can detect whether the thumb and forefinger are pressed together or held apart, but not the precise distance between them.

Zhao’s team wondered whether ultrasound imaging could capture more dexterous and continuous hand movements. His group has been developing various forms of ultrasonic stickers: miniaturized versions of the probes used in doctor’s offices, bonded to a hydrogel material that can safely adhere to the skin.

In their new study, the team embedded an ultrasound sticker in a wearable wristband to continuously monitor the muscles and tendons of the wrist.

“The tendons and muscles in your wrist act like strings pulling the puppets that are your fingers,” Lu says. “So the idea is: every time you take a picture of the state of the strings, you know the state of the hand.”

Mapping manipulation

The team designed a wristband holding a smartwatch-sized ultrasonic sticker, paired with supporting electronics about the size of a cell phone. They attached the band to a volunteer’s wrist and confirmed that the device generated clear, continuous images of the wrist as the volunteer moved their fingers through various gestures.

The next challenge was to associate the black-and-white ultrasound images of the wrist with specific hand positions. As it turns out, the fingers and thumb have 22 degrees of freedom, or distinct ways of stretching and angling. The researchers found that they could identify specific regions in the ultrasound images of the wrist that correspond to each of the 22 degrees of freedom. For example, changes in one region correlate with thumb extension, while changes in another correlate with index finger movements.

To establish these connections, a volunteer wearing the wristband moved their hand through various positions while the researchers recorded the gestures with multiple cameras surrounding the volunteer. By matching changes in particular regions of the ultrasound images to the hand positions recorded by the cameras, the team could label regions of the wrist image with the corresponding degree of freedom. Performing this translation continuously and in real time, however, would be an impossible task for a human.

So the team turned to artificial intelligence. They used an algorithm that can be trained to recognize patterns in images and correlate them with specific labels, in this case the hand's different degrees of freedom. The researchers trained the algorithm on ultrasound images that they had carefully labeled, marking the regions of each image associated with a given degree of freedom. They then tested the algorithm on a new set of ultrasound images and found that it correctly predicted the corresponding hand gestures.
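The article does not specify the model the team used, but the core idea, regressing a 22-dimensional vector of joint angles from each wrist image, can be illustrated with the simplest possible learner. The sketch below uses ridge regression on synthetic stand-in data; the image size, frame count, and noise level are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the labeled training data: each "ultrasound
# frame" is a small grayscale image, and each label is a 22-dimensional
# vector of joint angles (the hand's 22 degrees of freedom).
n_frames, h, w, n_dof = 200, 16, 16, 22
X = rng.normal(size=(n_frames, h * w))        # flattened wrist images
W_true = rng.normal(size=(h * w, n_dof))      # hidden image-to-angle map
y = X @ W_true + 0.01 * rng.normal(size=(n_frames, n_dof))

# Ridge regression: solve (X^T X + lam*I) W = X^T y for the weights W
# that map pixels to the 22 joint angles. (The real system would use a
# learned neural network rather than a linear model.)
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(h * w), X.T @ y)

# Predict the joint angles for a new, unseen frame.
x_new = rng.normal(size=(1, h * w))
angles = x_new @ W
print(angles.shape)  # one 22-dimensional pose estimate per frame
```

Running the fitted model on each incoming frame is what makes real-time tracking feasible: inference is a single matrix product, even though building the labeled dataset required the camera rig described above.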

Once the researchers had paired the AI algorithm with the wristband, they tested the device on more volunteers. In the new study, eight volunteers with different hand and wrist sizes wore the wristband while performing various gestures and grips, including signing all 26 letters of the American Sign Language alphabet. They also held items such as a tennis ball, a plastic bottle, scissors, and a pencil. In each case, the band accurately tracked and predicted the position of the hand.

To demonstrate potential applications, the team developed a simple computer program that paired wirelessly with the wristband. As the user performed pinch-and-grab movements, the gestures zoomed an object on the computer screen in and out, and virtually moved and manipulated it in a smooth, continuous manner.
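One way to picture the pinch-to-zoom interaction is as a continuous mapping from the thumb-to-index fingertip distance to a zoom factor. The function below is a hypothetical sketch of such a mapping; the distance thresholds and zoom range are invented values, not ones reported by the team.

```python
import numpy as np

def zoom_from_pinch(thumb_tip, index_tip, d_min=0.01, d_max=0.12,
                    zoom_min=0.5, zoom_max=3.0):
    """Map thumb-index fingertip distance (meters) to a zoom factor.

    Hypothetical mapping: a closed pinch (distance <= d_min) gives the
    minimum zoom, and a fully open pinch (distance >= d_max) gives the
    maximum zoom; intermediate distances interpolate linearly.
    """
    d = float(np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip)))
    t = np.clip((d - d_min) / (d_max - d_min), 0.0, 1.0)  # normalize to [0, 1]
    return zoom_min + t * (zoom_max - zoom_min)

# Fingertips nearly touching: zoom stays near the minimum.
print(round(zoom_from_pinch([0, 0, 0], [0.012, 0, 0]), 3))  # 0.545
# Fingertips wide apart: zoom saturates at the maximum.
print(zoom_from_pinch([0, 0, 0], [0.15, 0, 0]))             # 3.0
```

Because the mapping is continuous rather than a discrete open/closed classification, it matches the smooth manipulation the demonstration describes.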

The researchers also tested the wristband as a wireless controller for a simple, commercial robotic hand. Wearing the wristband, a volunteer performed the motions of playing a keyboard. The robot, in turn, imitated the movements in real time to play a simple melody on a piano. The same robot was also able to imitate finger tapping to play a computer basketball game.
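Driving a robot hand from the wristband's predictions amounts to retargeting: each predicted human joint angle must be mapped into the range the corresponding robot joint can actually reach before being sent as a position command. A minimal sketch of that step, with invented joint limits for a hypothetical five-joint hand:

```python
import numpy as np

def retarget(predicted_angles, joint_limits):
    """Clip each predicted human joint angle (radians) into the matching
    robot joint's allowed [low, high] range, yielding a safe position
    command for that joint."""
    lo, hi = joint_limits[:, 0], joint_limits[:, 1]
    return np.clip(predicted_angles, lo, hi)

# Hypothetical robot hand with five joints, each limited to [0, 1.6] rad.
limits = np.array([[0.0, 1.6]] * 5)
human = np.array([-0.2, 0.5, 1.0, 1.7, 0.8])  # predicted human angles
cmd = retarget(human, limits)
print(cmd)  # -0.2 is clipped to 0.0 and 1.7 to 1.6
```

Real systems typically add per-joint scaling and smoothing on top of clipping, but the clip is the piece that keeps mismatches between human and robot ranges of motion from sending out-of-bounds commands.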

Zhao plans to further miniaturize the wristband and to train the artificial intelligence software on many more gestures and movements performed by volunteers with a wider range of hand sizes and shapes. Ultimately, the team is working toward a hand-tracking device that anyone can wear to wirelessly manipulate humanoid robots or virtual objects with great dexterity.

“We believe this is the most advanced way to track dexterous hand movements with body-worn wrist imaging,” Zhao says. “We believe these wearable ultrasonic wristbands can provide intuitive and versatile control in virtual reality and robotic hands.”

This research was supported in part by MIT, the U.S. National Institutes of Health, the U.S. National Science Foundation, the U.S. Department of Defense, and the Singapore National Research Foundation through the Singapore-MIT Alliance for Research and Technology.
