Abstract

Environmental context prediction is important for wearable robotic applications, such as terrain-adaptive control. System efficiency is critical for wearable robots, in which system resources (e.g., processors and memory) are highly constrained. This article addresses the system efficiency of real-time environmental context prediction for lower limb prostheses. First, we develop an uncertainty-aware frame selection strategy that dynamically selects frames for environment prediction according to lower limb motion and the uncertainty captured by Bayesian neural networks (BNNs). We further propose a dynamic Bayesian gated recurrent unit (D-BGRU) network to handle the inconsistent frame rate, a side effect of the dynamic frame selection. Second, we investigate how adding sensing modalities (e.g., GPS and an on-glasses camera) to the system affects the tradeoff between computational complexity and environment prediction accuracy. Finally, we implement and optimize our framework for embedded hardware and evaluate the real-time inference accuracy and efficiency of classifying six types of terrains. The experiments show that our proposed frame selection strategy reduces computation by more than 90% without sacrificing environment prediction accuracy, and extends readily to multimodality fusion. We achieve around 93% prediction accuracy while processing fewer than one frame per second. Our model contains 6.4 million 16-bit floating-point parameters and takes 44 ms to process each frame on a lightweight embedded platform (NVIDIA Jetson TX2).
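The core of the frame selection idea can be illustrated with a minimal sketch. It assumes Monte Carlo sampling (e.g., MC dropout) as the BNN approximation, and uses predictive entropy as the uncertainty signal; the function names, the motion-energy input, and the thresholds are illustrative placeholders, not values or interfaces from the paper.

```python
import numpy as np

def predictive_entropy(mc_probs):
    """Predictive entropy of averaged Monte Carlo softmax samples.

    mc_probs: array of shape (T, C) -- T stochastic forward passes of a
    BNN, each yielding C class probabilities (e.g., via MC dropout).
    """
    mean_probs = mc_probs.mean(axis=0)
    return float(-(mean_probs * np.log(mean_probs + 1e-12)).sum())

def should_process_frame(mc_probs, motion_energy,
                         entropy_thresh=0.5, motion_thresh=1.0):
    """Process a new camera frame only when the model is uncertain about
    the current terrain or lower limb motion suggests a context change.
    Thresholds here are illustrative, not tuned values from the paper."""
    return (predictive_entropy(mc_probs) > entropy_thresh
            or motion_energy > motion_thresh)

# Confident prediction and little motion -> the frame can be skipped.
confident = np.tile([0.97, 0.01, 0.01, 0.01], (10, 1))
print(should_process_frame(confident, motion_energy=0.2))  # False

# High predictive uncertainty -> trigger processing of a new frame.
uncertain = np.tile([0.3, 0.3, 0.2, 0.2], (10, 1))
print(should_process_frame(uncertain, motion_energy=0.2))  # True
```

Skipping frames whenever the classifier is already confident and the leg is nearly static is what allows most per-frame computation to be avoided without hurting accuracy.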
