CONIX Publication

A wearable electromyography-based hand gesture recognition system with real-time on-board incremental learning and classification

Authors: Ali Moin, Andy Zhou, George Alexandrov, Jan Rabaey, Simone Benatti, Alisha Menon, Abbas Rahimi, Senam Tamakloe, Jonathan Ting, Natasha Yamamoto, Yasser Khan, Fred Burghardt, Ana C. Arias, Luca Benini


Accurate, unobtrusive systems for hand gesture recognition are crucial to the functionality and comfort of smart prosthetics and other modern human-computer interfaces. Current wearable gesture recognition devices support very few gestures and rely on an external device (PC or smartphone) to perform pattern recognition. Many of these algorithms use electromyography (EMG) signals as inputs, but they often misclassify gestures performed in different situational contexts (changing arm position, reapplication of electrodes, etc.) due to changing signal properties. Here, we describe a wearable, wireless system for EMG-based gesture recognition that can be trained and updated on the fly to classify up to 21 finger gestures. Printed flexible electrodes and custom integrated circuits for high channel count recordings provide greater muscular coverage while enhancing wearability and extending battery life. Unlike traditional machine learning algorithms, our on-board learning and classification algorithm based on hyperdimensional (HD) computing enables computationally efficient updates to incrementally incorporate new data and adapt to changing contexts. Through experiments with multiple able-bodied subjects, we demonstrate 98.34% online classification accuracy on 13 individual finger flexion and extension gestures, with 5.47% degradation when expanding the model to 21 gestures and less than 5.25% degradation when subject to changing situational contexts.
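The incremental learning described above builds on hyperdimensional (HD) computing, where each training sample is encoded into a high-dimensional vector and simply accumulated into a per-gesture prototype, so adapting to a new context is just more accumulation rather than retraining. The following is a minimal sketch under common HD-computing assumptions (bipolar hypervectors, item/level encoding, prototype bundling); the class, names, and parameters are illustrative and not the authors' implementation.

```python
import numpy as np

D = 10000  # hypervector dimensionality (a typical choice in HD computing)

def random_hv(rng, d=D):
    """Random bipolar hypervector with +1/-1 entries."""
    return rng.choice([-1, 1], size=d)

class HDClassifier:
    """Illustrative HD classifier with cheap incremental updates."""

    def __init__(self, n_features, n_levels=16, d=D, seed=0):
        rng = np.random.default_rng(seed)
        self.d = d
        self.n_levels = n_levels
        # One random "item" hypervector per input channel/feature.
        self.ids = [random_hv(rng, d) for _ in range(n_features)]
        # One "level" hypervector per quantized feature value.
        self.levels = [random_hv(rng, d) for _ in range(n_levels)]
        self.prototypes = {}  # gesture label -> accumulated hypervector

    def encode(self, x):
        """Bind each feature's item HV with its level HV, then bundle."""
        q = np.clip((np.asarray(x) * self.n_levels).astype(int),
                    0, self.n_levels - 1)
        bound = [self.ids[i] * self.levels[q[i]] for i in range(len(x))]
        return np.sign(np.sum(bound, axis=0))  # majority vote -> bipolar

    def update(self, x, label):
        """Incremental learning: add the encoded sample to its prototype."""
        if label not in self.prototypes:
            self.prototypes[label] = np.zeros(self.d)
        self.prototypes[label] += self.encode(x)

    def predict(self, x):
        """Classify by cosine similarity to each gesture prototype."""
        hv = self.encode(x)
        sims = {c: np.dot(hv, p) / (np.linalg.norm(hv) * np.linalg.norm(p))
                for c, p in self.prototypes.items()}
        return max(sims, key=sims.get)
```

Because an update is a single vector addition, new samples from a changed context (e.g. after electrode reapplication) can be folded into the model on the device itself, which is the property the abstract highlights.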

Release Date: 09/01/2019