Abstract:
Smart ear-worn sensing technologies have made rapid strides in recent years. These "earable" sensors are ideal for ubiquitous computing because they attach naturally to the human head and provide information such as movement data and vital signs. However, several challenges impede the fast prototyping of earable applications. First, collecting labeled data is labor-intensive, and open-source earable datasets are scarce. Second, reproducing data processing algorithms requires considerable expertise and effort. Moreover, some existing algorithms are expensive in computation and memory, which prevents their use in real-time applications on resource-constrained devices. To address these challenges, we introduce AURITUS, an earable computing framework consisting of three layers: data acquisition, algorithm, and application. The first two layers of AURITUS handle the data collection, pre-processing, and labeling tasks for creating customized earable datasets using graphical tools. We also provide an open-source dataset comprising 34 head poses and 9 activities from 45 volunteers. In the algorithm layer, we systematically implement five lightweight hardware-in-the-loop machine learning classifiers for activity recognition and four filters for head pose tracking; all the algorithms are exposed as application programming interfaces (APIs) for immediate use. Finally, the application layer of AURITUS allows the implementation of any ubiquitous computing application using the information generated in the previous layer. We open-source the proposed framework so that researchers and developers can contribute to any of its layers or rapidly prototype their applications using our dataset and algorithms. Our sample application built with AURITUS recognizes activities with up to 98% test accuracy using real-time models as small as 6 kB. Our models are 98x smaller and 6% more accurate than the state-of-the-art.
We also estimate head pose with absolute errors as low as 5 degrees using 20 kB filters, achieving up to 2.4x precision improvement over existing techniques.
Release Date: 10/12/2021