Abstract
Motion sensors embedded in wearable and mobile devices allow for dynamic selection of sensor streams and sampling rates, enabling several applications, such as power management and data-sharing control. While deep neural networks (DNNs) achieve competitive accuracy in sensor data classification, DNN architectures generally process incoming data from a fixed set of sensors at a fixed sampling rate, and changes in the dimensions of their inputs cause considerable accuracy loss, unnecessary computations, or failure in operation. To address these limitations, we introduce a dimension-adaptive pooling (DAP) layer that makes DNNs flexible and more robust to changes in sensor availability and sampling rate. DAP operates on convolutional filter maps of variable dimensions and produces an input of fixed dimensions suitable for feedforward and recurrent layers. Further, we propose a dimension-adaptive training (DAT) procedure that enables DNNs equipped with DAP to better generalize over the set of feasible data dimensions at inference time. DAT comprises randomly selecting the input dimensions during forward passes and optimizing with gradients accumulated over several backward passes. Combining DAP and DAT, we show how to transform existing non-adaptive DNNs into a Dimension-Adaptive Neural Architecture (DANA) while keeping the same number of parameters. Compared to existing approaches, our solution provides better average classification accuracy over the range of possible data dimensions at inference time and does not require up-sampling or imputation, thus reducing unnecessary computations. Experimental results on seven datasets (four benchmark real-world datasets for human activity recognition and three synthetic datasets) show that DANA prevents significant losses in classification accuracy of state-of-the-art DNNs and, compared to baselines, better captures correlated patterns in sensor data under dynamic sensor availability and varying sampling rates.
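To make the two mechanisms concrete, the following is a minimal, hypothetical sketch of how a DAP-style layer and a DAT-style training step could be realized in PyTorch; it is not the authors' code. The class and function names, the fixed output size (4, 8), the use of `nn.AdaptiveAvgPool2d`, and the plain gradient-accumulation loop are illustrative assumptions rather than details taken from the paper.

```python
import random
import torch
import torch.nn as nn

class DimensionAdaptivePooling(nn.Module):
    """Sketch of a DAP-like layer: pools convolutional feature maps of
    variable size into a fixed-size tensor for the downstream layers."""
    def __init__(self, out_size=(4, 8)):
        super().__init__()
        # AdaptiveAvgPool2d accepts inputs of any spatial size and always
        # returns feature maps of shape out_size.
        self.pool = nn.AdaptiveAvgPool2d(out_size)

    def forward(self, x):
        # x: (batch, channels, n_sensor_axes, n_time_steps); the last two
        # dimensions may vary across calls, but the output shape never does.
        return self.pool(x)

def dat_step(model, loss_fn, optimizer, batch, labels, n_dims=4):
    """Sketch of a DAT-style update: accumulate gradients over several
    forward/backward passes, each on a randomly chosen subset of sensor
    axes and time steps, then apply a single optimizer step. The model is
    assumed to contain a DAP layer so it accepts variable input sizes."""
    optimizer.zero_grad()
    _, _, n_axes, n_steps = batch.shape  # batch: (B, 1, n_axes, n_steps)
    for _ in range(n_dims):
        a = random.randint(1, n_axes)              # random sensor availability
        t = random.randint(n_steps // 4, n_steps)  # random sampling-rate proxy
        sub = batch[:, :, :a, :t]
        loss = loss_fn(model(sub), labels) / n_dims
        loss.backward()                            # gradients accumulate in .grad
    optimizer.step()
```

As a quick check of the pooling behavior under these assumptions, the same layer maps inputs of different sizes to identical output shapes, e.g. `DimensionAdaptivePooling()(torch.randn(1, 64, 9, 128)).shape == DimensionAdaptivePooling()(torch.randn(1, 64, 3, 50)).shape == (1, 64, 4, 8)`.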