Prosthetic limbs and hands, as well as gloves and tools for virtual and augmented reality, can be controlled with electrical signals from muscles in the wrist and forearm, picked up by electrodes on the skin in a technique called surface electromyography (EMG). But not every wrist is alike. Some people have more fat under the skin, others are lean, and some are hairy. Older people have thinner, less elastic skin, and their muscle signals can also differ from those of younger people in important ways. If prosthetics, VR and AR are to be accessible to the widest possible range of people, surface EMG technology must be able to cope with a wide range of body and skin types and adapt to individual users.
With support from Meta, an interdisciplinary team of researchers at the is working to address the problem. The group is led by Professor , of the Department of Neurobiology, Physiology and Behavior, and , assistant professor in the Department of Mechanical and Aerospace Engineering. They aim to develop wristband sensors that work for everyone with minimal calibration.
The team is recruiting 100 volunteers aged 18 to 90, representing a range of body types. They will use wrist EMG data from these people to develop algorithms that can reliably translate EMG signals into actions, whether of prosthetic or virtual limbs. A rough sketch of what such a decoding pipeline can look like follows below.
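As an illustration only, and not the team's actual method, the sketch below shows one common way wrist EMG is turned into discrete actions: sliding-window root-mean-square features are computed from multichannel EMG and fed to a simple classifier that maps each window to a gesture label. The channel count, window length, gesture set and synthetic data here are all assumptions made for the example.

```python
# Hypothetical EMG-to-gesture sketch. All parameters (8 channels, 200-sample
# windows, 3 gestures, synthetic signals) are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def rms_features(emg, window=200, step=100):
    """Root-mean-square features over sliding windows.

    emg: array of shape (n_samples, n_channels) of surface EMG.
    Returns an array of shape (n_windows, n_channels).
    """
    feats = []
    for start in range(0, emg.shape[0] - window + 1, step):
        segment = emg[start:start + window]
        feats.append(np.sqrt(np.mean(segment ** 2, axis=0)))
    return np.array(feats)

# Synthetic stand-in for recorded EMG: 8 channels, 3 "gestures", each with a
# different per-channel amplitude profile.
rng = np.random.default_rng(0)
feature_blocks, label_blocks = [], []
for label in range(3):
    amplitude = 1.0 + 0.5 * rng.random(8) + label
    emg = rng.normal(scale=amplitude, size=(4000, 8))
    feats = rms_features(emg)
    feature_blocks.append(feats)
    label_blocks.append(np.full(len(feats), label))

X = np.vstack(feature_blocks)
y = np.concatenate(label_blocks)

# Fit on alternating windows and test on the rest, a stand-in for a brief
# per-user calibration followed by everyday use.
clf = LogisticRegression(max_iter=1000).fit(X[::2], y[::2])
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```

In practice, making such a classifier work "for everyone with minimal calibration" means training and validating it across many users and body types rather than on synthetic data like this, which is exactly the role of the dataset the team is collecting.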
Media Resources
Making Prosthetics More Lifelike (In Focus feature and Unfold podcast)