You’ve probably met someone who identifies as a visual or auditory learner, but others learn through a different modality: touch. The ability to understand tactile interactions is especially critical for tasks like learning delicate manual operations and playing musical instruments, but unlike video and audio, touch is difficult to record and transfer.
To address this challenge, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and elsewhere have developed an embroidered smart glove that can capture, play back, and relay haptic instructions. To complement the wearable device, the team also developed a simple machine-learning agent that adapts to how different users respond to haptic feedback, optimizing their experience. The new system could help teach people physical skills, improve responsive robot teleoperation, and aid in virtual reality training.
An open-access paper describing the work was published on January 29.
Will I be able to play the piano?
To create the smart glove, the researchers used a digital embroidery machine to seamlessly embed touch sensors and haptic actuators (devices that provide touch-based feedback) into the textile. The technology is already present in smartphones, where haptic responses are triggered by tapping a touchscreen: if you tap an app on your iPhone, you’ll feel a slight vibration coming from that particular part of the screen. In the same way, the new adaptive wearable sends feedback to different parts of the hand to indicate the optimal movements for performing different skills.
For example, the smart glove could teach users to play the piano. In a demonstration, an expert was tasked with recording a simple melody on a section of keys, using the smart glove to capture the sequence in which they pressed keys along the keyboard. A machine-learning agent then converted that sequence into haptic feedback, which was fed to the students’ gloves so they could follow the instructions. As their hands hovered over the same section, actuators vibrated on the fingers corresponding to the keys below. The system optimizes these directions for each user, taking into account the subjective nature of haptic interactions.
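To make the capture-and-playback idea concrete, here is a minimal sketch in Python of turning a recorded key-press sequence into timed, per-finger vibration cues. The `Press` record, the reaction-time buffer, and the cue format are illustrative assumptions, not the paper’s implementation.

```python
# Minimal sketch (not the authors' code): converting a recorded key-press
# sequence into per-finger vibration cues. `Press` and the cue fields are
# hypothetical names used only for illustration.
from dataclasses import dataclass

@dataclass
class Press:
    time_s: float   # when the expert pressed the key
    finger: int     # 0 = thumb ... 4 = pinky
    key: int        # MIDI-style key index

def to_haptic_cues(presses: list[Press], lead_s: float = 0.3) -> list[dict]:
    """Turn an expert's press sequence into timed vibration cues.

    Each cue fires slightly *before* the expert's press so the student
    has time to react (`lead_s` is an assumed reaction buffer).
    """
    cues = []
    for p in presses:
        cues.append({
            "time_s": max(0.0, p.time_s - lead_s),
            "finger": p.finger,   # which actuator to vibrate
            "amplitude": 1.0,     # later personalized per user
        })
    return sorted(cues, key=lambda c: c["time_s"])

# Example: a three-note melody played with thumb, index, and middle fingers
melody = [Press(0.0, 0, 60), Press(0.5, 1, 62), Press(1.0, 2, 64)]
print(to_haptic_cues(melody))
```

The fixed `amplitude` here is a placeholder: per the article, the adaptive agent’s job is precisely to adjust that intensity for each wearer.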
“Humans engage in a wide range of tasks, constantly interacting with the world around them,” says Yiyue Luo MS ’20, lead author of the paper, a doctoral candidate in MIT’s Department of Electrical Engineering and Computer Science (EECS) and a member of CSAIL. “We don’t typically share these physical interactions with others. Instead, we often learn by observing their movements, such as when they play the piano or dance.
“A major challenge in delivering haptic interactions is that everyone experiences haptic feedback differently,” Luo adds. “This hurdle inspired us to develop a machine-learning agent that learns to generate adaptive haptic feedback for each person’s glove, making it a more practical approach to learning optimal movements.”
The wearable system is customized to the user’s hand specifications using a digital manufacturing method. A computer creates a cutout based on the measurements of the person’s hand, and an embroidery machine then sews in the sensors and haptics. Within 10 minutes, the cushioned, fabric system is ready to wear. Initially trained on the haptics of 12 users, its adaptive machine-learning model needs just 15 seconds of data from a new user to personalize feedback.
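The paper’s model details aren’t reproduced here, but a plausible sketch of this kind of quick personalization is a short calibration pass that fits one sensitivity gain per finger from a few seconds of user responses. The least-squares scheme below is an assumed stand-in, not the authors’ method.

```python
# Illustrative sketch only: personalizing haptic amplitude per finger from a
# short calibration session. The real model is pretrained on 12 users and
# adapted from ~15 s of new-user data; this simple per-finger least-squares
# fit is an assumption for illustration.
import numpy as np

def fit_user_gains(amplitudes: np.ndarray, felt: np.ndarray) -> np.ndarray:
    """Fit one gain per finger so that gain * amplitude ~= perceived intensity.

    amplitudes, felt: arrays of shape (n_trials, 5), one column per finger.
    """
    # Per-finger least squares: gain_f = sum(a * y) / sum(a * a)
    return (amplitudes * felt).sum(axis=0) / (amplitudes ** 2).sum(axis=0)

def personalize(cue_amplitude: float, target: float, gains: np.ndarray) -> np.ndarray:
    """Scale a nominal cue so every finger perceives roughly `target`."""
    return np.clip(target / gains, 0.0, 1.0) * cue_amplitude

rng = np.random.default_rng(0)
a = rng.uniform(0.2, 1.0, size=(30, 5))          # ~15 s of calibration cues
true_gain = np.array([1.2, 1.0, 0.9, 0.7, 0.5])  # fingertips differ in sensitivity
y = a * true_gain + rng.normal(0, 0.02, a.shape) # user's reported intensity
gains = fit_user_gains(a, y)
print(np.round(gains, 2))                        # recovers ~true_gain
print(np.round(personalize(1.0, 0.5, gains), 2)) # per-finger cue scaling
```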
In two other experiments, users wearing the gloves were given timed haptic feedback while playing games on a laptop. In a rhythm game, players learned to follow a narrow, winding path to hit a goal, and in a racing game, drivers collected coins and kept their vehicles balanced on the way to the finish line. Luo’s team found that participants earned the highest scores with optimized haptics, as opposed to no haptics or unoptimized haptics.
“This work is the first step toward building personalized AI agents that continuously collect data about the user and the environment,” says senior author Wojciech Matusik, a professor of electrical engineering and computer science at MIT and head of the Computational Design and Fabrication Group at CSAIL. “These agents then help them perform complex tasks, learn new skills, and promote better behaviors.”
Bringing Realistic Experiences to Digital Environments
In the case of robot teleoperation, the researchers found that their gloves could transfer force sensations to robotic arms, helping them perform more delicate grasping tasks. “It’s a bit like trying to teach a robot to behave like a human,” Luo says. In one case, the MIT team used human teleoperators to teach a robot how to grasp different types of bread without deforming them. By teaching optimal grasping, humans could precisely control robotic systems in environments like manufacturing, where the machines could work more safely and efficiently alongside their human operators.
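One way to picture the force-relay step is a control loop that maps the human’s grip pressure to a cap on the gripper’s force, so delicate objects like bread aren’t crushed. In this hedged sketch, `read_glove_pressure`, `set_gripper_force`, and the linear mapping are all stand-ins for hardware interfaces that are not part of the published work.

```python
# Hedged sketch: relaying glove pressure readings as a force command for a
# robot gripper during teleoperation. The callables below stand in for
# hardware APIs; the linear pressure-to-force mapping is an assumption.
def teleop_step(read_glove_pressure, set_gripper_force,
                max_force_n: float = 5.0) -> float:
    """One control tick: map human grip pressure in [0, 1] to gripper force."""
    p = min(max(read_glove_pressure(), 0.0), 1.0)  # clamp sensor noise
    force = p * max_force_n                        # gentle ceiling for soft objects
    set_gripper_force(force)
    return force

# Toy usage with stub hardware functions
applied = teleop_step(lambda: 0.4, lambda f: None)
print(f"commanded {applied:.1f} N")
```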
“The technology behind the embroidered smart glove is a major innovation for robots,” says Daniela Rus, the Andrew (1956) and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT, director of CSAIL, and an author on the paper. “With its ability to capture tactile interactions in high resolution, similar to human skin, this sensor enables robots to perceive the world through touch. The seamless integration of tactile sensors into textiles combines physical actions with digital feedback, offering enormous potential for responsive robot teleoperation and immersive virtual reality training.”
Similarly, the interface could create more immersive experiences in virtual reality. Wearing smart gloves would add haptic feedback to digital environments in video games, where players could sense their surroundings to avoid obstacles. Additionally, the interface would provide a more personalized, touch-based experience in virtual training courses used by surgeons, firefighters, and pilots, where precision is key.
While these wearables can provide users with a more hands-on experience, Luo and her group believe they can extend their wearable technology beyond fingers. With stronger haptic feedback, the interfaces could target feet, hips, and other parts of the body that are less sensitive than the hands.
Luo also noted that with a more sophisticated AI agent, her team’s technology could help with more complex tasks, such as manipulating clay or guiding an airplane. Currently, the interface can only assist with simple movements, such as pressing a key or grasping an object. In the future, MIT’s system could take into account more user data and produce more conformal, tight-fitting wearables to better account for the impact of hand movements on haptic perception.
Luo, Matusik, and Rus wrote the paper with EECS professor Tomás Palacios, director of the Microsystems Technology Laboratories; CSAIL members Chao Liu, Young Joong Lee, Joseph DelPreto, Michael Foshey, and professor and principal investigator Antonio Torralba; Kiu Wu of LightSpeed Studios; and Yunzhu Li of the University of Illinois at Urbana-Champaign.
This work was supported in part by a fellowship from MIT Schwarzman College of Computing through Google and a GIST-MIT Research Collaboration grant, with additional support from Wistron, Toyota Research Institute, and Ericsson.