Wristband enables wearers to control a robot hand with their movements | MIT News

The next time you scroll through your phone, take a moment to appreciate the experience: This seemingly mundane action is made possible by a combination of 27 bones, 34 muscles, and more than 100 ligaments and tendons in your hand. Indeed, our hands are among the most dexterous parts of our bodies. Replicating their many nuanced movements has long been a challenge in robotics and virtual reality.
Now, MIT engineers have designed an ultrasound wristband that accurately tracks the wearer’s hand movements in real time. The wristband generates ultrasound images of wrist muscles, tendons, and ligaments as the hand moves, and is paired with an artificial intelligence algorithm that continuously translates the images into the corresponding positions of the five fingers and palm.
Researchers can train the wristband to learn a wearer’s hand movements, yielding a device that can communicate them in real time to a robot or virtual environment.
In demonstrations, the team showed that a person wearing the wristband can wirelessly control a robotic hand. As the person pinches or points, the robot does the same. In a kind of wireless marionette act, the wearer can steer the robot to play a simple song on a piano and shoot a small ball into a desktop hoop. With the same wristband, the wearer can also manipulate objects on a computer screen, for example by pinching their fingers together to zoom in and out of an image on the screen.
The team plans to use the wristband to collect hand-movement data from many more users with different hand sizes, finger shapes, and gestures. They envision building a large dataset of hand movements that could be used, for example, to train humanoid robots in dexterous tasks, such as performing certain surgical procedures. The ultrasound band could also be used to grasp, manipulate, and interact with objects in video games, design applications, and other virtual settings.
“We think this work will have an immediate impact on transforming hand-tracking techniques with wearable ultrasound bands in virtual and augmented reality,” says Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT. “It could also provide a large amount of data for training humanoid robots.”
Zhao, Gengxi Lu, and colleagues present the new wristband design in a paper appearing today in Nature Electronics. MIT co-authors are former postdocs Xiaoyu Chen, Shucong Li, and Bolei Deng; graduate students SeongHyeon Kim and Dian Li; postdocs Shu Wang and Runze Li; and Anantha Chandrakasan, MIT provost and Vannevar Bush Professor of Electrical Engineering and Computer Science. Other co-authors are graduate students Yushun Zheng, Junhang Zhang, Baoqiang Liu, and Chen Gong, and University of Southern California Professor Qifa Zhou.
Seeing the strings
There are currently several ways to capture and reproduce human hand movements in robots. One method uses cameras to record a person’s hand as they manipulate objects or perform certain tasks. Another involves having a person wear a glove studded with sensors that record the hand’s movements and transmit the data to a robot. But setting up a camera system is impractical for many applications, and cameras can only track what they can see, losing the hand whenever it is blocked from view. And gloves full of sensors can restrict a person’s natural hand movements and sense of touch.
A third method measures electrical signals from muscles in the wrist or arm that scientists associate with specific hand movements. Researchers have made great progress with this approach, but the signals are easily corrupted by environmental noise, and they are not sensitive enough to distinguish subtle changes in movement. For example, they may register that the thumb and index finger are pinched together or spread apart, but not the many positions in between.
Zhao’s team wondered whether ultrasound imaging could capture detailed, continuous hand movements. The group has been developing various ultrasound stickers: smaller versions of the transducers used in doctors’ offices, paired with hydrogel materials so they can be safely attached to the skin.
In their new research, the team adapted the ultrasound sticker design into a wearable band that continuously images the muscles and tendons in the wrist.
“The tendons and ligaments in your wrist are like the strings that pull the puppets, which are your fingers,” says Lu. “So the idea is: Each time you take a picture of the state of the strings, you know the shape of the hand.”
Mapping movements
The team designed a wristband incorporating an ultrasound sticker about the size of a smartwatch, along with supporting electronics roughly the size of a cell phone. They attached the band to a volunteer’s wrist and confirmed that the device produced clear, continuous images of the wrist as the volunteer moved their fingers through various gestures.
The challenge then was to correlate the black-and-white ultrasound images of the wrist with the specific positions of the hand. As it turns out, the fingers and thumb can move with 22 degrees of freedom, that is, distinct ways of bending, stretching, and spreading. The researchers found that they could identify specific regions in their wrist ultrasound images associated with each of these 22 degrees of freedom. For example, changes in one region correspond to extension of the thumb, while changes in another region track the motion of the index finger.
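To make the 22-degree-of-freedom representation concrete, here is a minimal sketch of one way such a hand pose could be encoded. The particular joint breakdown used here (four angles per finger, six for the thumb) is an illustrative assumption, not the parameterization from the paper.

```python
# Hypothetical 22-DOF hand-pose container; the per-joint layout is assumed.
from dataclasses import dataclass, field

import numpy as np

FINGERS = ("index", "middle", "ring", "pinky")

@dataclass
class HandPose:
    """Joint angles, in radians, for an assumed 22-DOF hand model."""
    # Each finger: MCP flexion, MCP abduction, PIP flexion, DIP flexion (4 x 4 = 16).
    fingers: dict = field(default_factory=lambda: {f: np.zeros(4) for f in FINGERS})
    # Thumb: CMC flexion/abduction/rotation, MCP flexion/abduction, IP flexion (6).
    thumb: np.ndarray = field(default_factory=lambda: np.zeros(6))

    def to_vector(self) -> np.ndarray:
        """Flatten to the 22-element vector a pose-regression model would predict."""
        return np.concatenate([self.fingers[f] for f in FINGERS] + [self.thumb])
```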
To establish this mapping, a volunteer wearing the wristband moved their hand through various gestures while the researchers recorded the motions with multiple cameras positioned around the volunteer. By matching changes in certain regions of the ultrasound images with the hand positions recorded by the cameras, the team could label regions of the wrist images with the corresponding degrees of freedom of the hand. But doing this translation continuously, and in real time, would be an impossible task for a human.
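As a rough sketch of this labeling step, the snippet below pairs each ultrasound frame with the camera-derived hand pose recorded closest in time. The array shapes, the nearest-neighbor timestamp matching, and the 22-angle label format are all assumptions for illustration, not the team’s actual pipeline.

```python
import numpy as np

def pair_frames_with_labels(us_times, cam_times, cam_poses):
    """Match each ultrasound frame to the nearest-in-time camera pose label.

    us_times:  (N,) capture times of ultrasound frames, in seconds
    cam_times: (M,) sorted sample times of the camera system, in seconds
    cam_poses: (M, 22) joint-angle labels derived from the camera recordings
    Returns an (N, 22) array of labels aligned to the ultrasound frames.
    """
    idx = np.searchsorted(cam_times, us_times)
    idx = np.clip(idx, 1, len(cam_times) - 1)
    # For each frame, pick whichever neighboring camera sample is closer in time.
    left_closer = (us_times - cam_times[idx - 1]) < (cam_times[idx] - us_times)
    idx = np.where(left_closer, idx - 1, idx)
    return cam_poses[idx]
```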
So, the team turned to artificial intelligence. They used an AI algorithm that can be trained to recognize patterns in images and associate them with specific labels, in this case the different degrees of freedom of the hand. The researchers trained the algorithm on ultrasound images that they had carefully labeled, defining the regions of each image associated with certain degrees of freedom. They then tested the algorithm on a new set of ultrasound images and found that it correctly predicted the corresponding hand gestures.
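The article does not specify the team’s model, but an image-to-pose regressor of this kind is commonly built as a small convolutional network. The sketch below, written in PyTorch as an assumption, maps a single grayscale ultrasound frame to 22 joint angles.

```python
import torch
import torch.nn as nn

class UltrasoundToPose(nn.Module):
    """Assumed architecture: a small CNN regressing 22 joint angles per frame."""
    def __init__(self, n_dof: int = 22):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse spatial dims to one feature vector
        )
        self.head = nn.Linear(64, n_dof)  # one regressed angle per degree of freedom

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) grayscale ultrasound frames
        return self.head(self.features(x).flatten(1))

model = UltrasoundToPose()
loss_fn = nn.MSELoss()  # trained against the camera-derived joint-angle labels
```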
Once the researchers had paired the AI algorithm with the wristband, they tested the device on more volunteers. In the new study, eight volunteers with different hand and wrist sizes wore the wristband while performing a variety of hand motions and grasps, including signing all 26 letters of the American Sign Language alphabet. They also handled objects such as a tennis ball, a plastic bottle, scissors, and a pencil. In each case, the wristband accurately tracked and predicted the position of the hand.
To demonstrate potential applications, the team created a simple computer interface that they wirelessly paired with the wristband. As the wearer pinched their fingers together and apart, an object on the computer screen zoomed in and out accordingly, tracking the gesture in a smooth and continuous manner.
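One way to picture the pinch-to-zoom interaction is as a simple mapping from the predicted thumb-to-index fingertip distance to a zoom level. The distance units, the zoom range, and the linear mapping below are hypothetical illustrations, not the team’s interface code.

```python
def zoom_factor(pinch_dist_mm: float,
                open_dist_mm: float = 80.0,   # assumed fully-open fingertip spacing
                min_zoom: float = 0.5,
                max_zoom: float = 3.0) -> float:
    """Map fingertip distance to zoom: fully pinched zooms in, fully open zooms out."""
    t = min(max(pinch_dist_mm / open_dist_mm, 0.0), 1.0)  # normalize to [0, 1]
    return max_zoom + t * (min_zoom - max_zoom)           # linear interpolation
```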
The researchers also tested the wristband as a wireless controller for a commercial robotic hand. While wearing the wristband, a volunteer played a keyboard, and the robot imitated the movements in real time to play a simple piano song. The same robotic hand was also able to mimic the volunteer’s taps and flicks to play a desktop basketball game.
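The teleoperation demo amounts to a loop that runs the model on each new ultrasound frame and streams the predicted joint angles to the robot hand. In the sketch below, the UDP transport, JSON message format, address, and frame source are all stand-in assumptions; a real robotic hand would be driven through its vendor’s SDK.

```python
import json
import socket
import time

import torch

ROBOT_ADDR = ("192.168.1.50", 9000)  # hypothetical robot-hand endpoint
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def stream_poses(model, get_ultrasound_frame, rate_hz: float = 30.0):
    """Forward each frame's 22 predicted joint angles to the robot hand.

    get_ultrasound_frame is a placeholder for the wristband's frame source,
    assumed to return a (1, 1, H, W) tensor.
    """
    period = 1.0 / rate_hz
    while True:
        frame = get_ultrasound_frame()
        with torch.no_grad():
            angles = model(frame).squeeze().tolist()  # 22 joint angles
        sock.sendto(json.dumps({"joints": angles}).encode(), ROBOT_ADDR)
        time.sleep(period)
```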
Zhao plans to miniaturize the wristband’s hardware and to train the AI software on more gestures and movements from volunteers with different hand sizes and shapes. Ultimately, the team is working toward a wearable hand tracker that anyone can wear to wirelessly manipulate humanoid robots or virtual objects.
“We believe this is the most advanced way to track dexterous hand movements, using wearable wrist imaging,” says Zhao. “We think these wearable ultrasound bands can provide precise and flexible control for virtual reality and robotic hands.”
This research was supported, in part, by MIT, the US National Institutes of Health, the US National Science Foundation, the US Department of Defense, and the Singapore National Research Foundation through the Singapore-MIT Alliance for Research and Technology.



