New AI work gets to grips with gripping

Modern prostheses allow amputees to regain hand function using muscles in the residual limb. Sensors pick up contractions to drive finger and wrist movement. But mastering the specific, complex muscle activations requires practice, a lot of practice. “These are movements the patient has to learn,” says Dr. Cristina Piazza, a professor of rehabilitation robotics at the Technical University of Munich (TUM).

Now Piazza’s team is drawing on a neuroscience principle called muscle synergies that could enable more instinctive control. When we grasp, our brain activates coordinated groups of muscles that work together unconsciously. The researchers want to mimic this coordination, simplifying the complex activations needed to drive fluid robotic movement.
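In the research literature, muscle synergies are often extracted from recorded muscle activity as a small set of building blocks via non-negative matrix factorisation (NMF). The sketch below is illustrative only, not the team's actual pipeline: it assumes a non-negative matrix of muscle-activity envelopes (channels × time) and factorises it into synergy weights and their activations using standard multiplicative updates.

```python
import numpy as np

def extract_synergies(emg, n_synergies=2, n_iter=200, seed=0):
    """Factorise a non-negative EMG envelope matrix (channels x time)
    into synergy weights W (channels x synergies) and time-varying
    activations H (synergies x time) via multiplicative-update NMF."""
    rng = np.random.default_rng(seed)
    n_ch, n_t = emg.shape
    W = rng.random((n_ch, n_synergies)) + 1e-6
    H = rng.random((n_synergies, n_t)) + 1e-6
    eps = 1e-9  # avoid division by zero
    for _ in range(n_iter):
        H *= (W.T @ emg) / (W.T @ W @ H + eps)
        W *= (emg @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy data: 8 channels generated from 2 hidden synergies.
rng = np.random.default_rng(1)
true_W = rng.random((8, 2))
true_H = rng.random((2, 100))
emg = true_W @ true_H

W, H = extract_synergies(emg, n_synergies=2)
err = np.linalg.norm(emg - W @ H) / np.linalg.norm(emg)
```

With exact rank-2 data the relative reconstruction error falls close to zero, showing that two synergies suffice to describe the eight channels; real recordings are noisier and the number of synergies becomes a modelling choice.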

Hand prostheses are the specialty of scientists Dr. Patricia Capsi-Morales (left), Prof. Cristina Piazza (center), and doctoral student Johanna Happold from the Technical University of Munich (TUM)

The key is recording hyper-detailed signals across the whole muscle area. Piazza uses a dense grid of 128 sensors: 64 on the inside of a sleeve worn on the arm and 64 on the outside. This detects the subtle patterns of many motor units during motion. Custom algorithms then analyse and translate these signals.
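A common first step when processing signals from a sensor grid like this is to turn each raw channel into a smooth activity envelope, so later algorithms can see which part of the muscle is active. The sketch below is an assumption about that kind of preprocessing, not the team's code: it computes a moving root-mean-square envelope over simulated 128-channel data, with window length and sampling rate chosen purely for illustration.

```python
import numpy as np

def rms_envelope(signal, win=200):
    """Moving RMS envelope of raw signals (channels x samples).
    win=200 samples ~ 100 ms at 2 kHz; both values are illustrative."""
    sq = signal.astype(float) ** 2
    kernel = np.ones(win) / win
    smoothed = np.apply_along_axis(
        lambda ch: np.convolve(ch, kernel, mode="same"), 1, sq)
    return np.sqrt(smoothed)

# Simulated grid: 128 channels of noise, plus a burst of activity
# on channels 40-43 (as if one muscle region contracted).
rng = np.random.default_rng(0)
raw = rng.normal(0, 0.05, (128, 4000))
raw[40:44, 1500:2500] += rng.normal(0, 1.0, (4, 1000))

env = rms_envelope(raw)
most_active = int(np.argmax(env.mean(axis=1)))
```

The channel with the largest mean envelope falls inside the simulated burst region, which is the basic idea behind mapping contractions back to specific muscle groups.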

Initial tests focused on bionic hand gestures such as waving and picking up objects. But the implications extend further. “The more sensors we use, the better we can record information from different muscle groups,” Piazza explains. This high-resolution mapping reveals the specific neural instructions behind intended motions.

Imagine controlling robots through virtual reality using only small finger twitches. Or flying a camera drone over Mars using slight forehead muscle flexes. When the system knows which subtle contractions link to which actions, commands become second nature. No joysticks, toggles or buttons are needed.

There are barriers to overcome before this muscle-meld with machines arrives. Machine learning algorithms must filter signal noise and adapt to each user’s unique patterns. And recordings shift if sensors slip on the skin. So the research system requires initial training and occasional recalibration.
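To make the recalibration idea concrete, here is a minimal sketch, under assumptions of my own rather than anything from the paper: gestures are decoded by a nearest-centroid classifier over feature vectors, and a `recalibrate` step blends a few fresh samples into the stored class means so the decoder can track drift when sensors slip.

```python
import numpy as np

class GestureDecoder:
    """Nearest-centroid gesture classifier over feature vectors.
    recalibrate() blends fresh samples into stored centroids so the
    model tracks slow drift, e.g. when sensors shift on the skin."""

    def __init__(self):
        self.centroids = {}

    def fit(self, features, labels):
        for lab in set(labels):
            self.centroids[lab] = features[labels == lab].mean(axis=0)

    def predict(self, x):
        return min(self.centroids,
                   key=lambda lab: np.linalg.norm(x - self.centroids[lab]))

    def recalibrate(self, features, labels, alpha=0.5):
        # Exponentially blend the new class means into the old centroids.
        for lab in set(labels):
            new = features[labels == lab].mean(axis=0)
            self.centroids[lab] = (1 - alpha) * self.centroids[lab] + alpha * new

# Two gestures in a toy 2-D feature space.
rng = np.random.default_rng(0)
open_hand = rng.normal([0, 0], 0.1, (20, 2))
close_hand = rng.normal([1, 1], 0.1, (20, 2))
X = np.vstack([open_hand, close_hand])
y = np.array(["open"] * 20 + ["close"] * 20)

dec = GestureDecoder()
dec.fit(X, y)

# Sensor slip: "close" features now appear near [1.5, 1.5].
drifted = rng.normal([1.5, 1.5], 0.1, (10, 2))
dec.recalibrate(drifted, np.array(["close"] * 10))
pred = dec.predict(np.array([1.5, 1.5]))
```

After the quick recalibration the drifted point is classified correctly; without it, the decoder's fixed centroids would slowly stop matching what the sensors record.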

Piazza sees promise in customising solutions for more seamless human-robot teaming. Her lab envisions helping limb amputees first. “Our goal is to find the right control solution for each patient,” she says. However, this may have larger implications for telepresence robotics.

This research inches us towards true machine connections, a virtuous cycle of reading and responding. And it moves prosthetics beyond crude on/off switches and into the realm of becoming a semi-natural extension of ourselves.

You can read Dr. Piazza’s paper in full here, via IEEE.

Matthew

Matthew has been writing and cartooning since 2005 and working in science communication his whole career. Matthew has a BSc in Biochemistry and a PhD in Fibre Optic Molecular Sensors and has spent around 16 years working in research, 5 of which were in industry and 12 in the ever-wonderful academia.
