Abstract

Despite the rapid deployment of automation solutions in manufacturing, humans remain key players in manufacturing environments, especially hazardous or otherwise unfavourable ones. For a collaborative robot or exoskeleton to support humans more safely and efficiently, understanding human behaviour is an essential enabling technology, allowing the machine to decide on optimal control strategies. This work introduces an algorithm for classifying human locomotion activities from inertial data captured with an inertial measurement unit (IMU), to support the control of robots, exoskeletons and many other applications. The proposed approach to recognising human locomotion activities and gait events comprises two main steps: 1) applying spectral analysis to the inertial signals to transform the data into a time-frequency representation; 2) classifying the time-frequency data, treated as an image, using convolutional neural networks (CNNs). Three methods for combining the spectrum representations of multiple sensor channels as CNN input are proposed and evaluated. Six activities are considered in the work. The highest classification accuracy, on a subset of three classes, is 99.6%, a promising result for real applications. [Submitted 27 September 2019; Accepted 4 January 2020]
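As a rough illustration of the pipeline's first step and the channel-combination idea, the sketch below computes a magnitude spectrogram for each IMU channel and stacks the per-channel spectrograms depth-wise, like colour planes of an image. The window length, hop size, and depth-wise stacking are assumptions for illustration; the abstract does not specify the actual spectral-analysis parameters or which of the three proposed combination methods this corresponds to.

```python
import numpy as np

def stft_magnitude(signal, win_len=64, hop=32):
    """Magnitude spectrogram of a 1-D signal via a sliding Hann window.
    (Window/hop values are illustrative, not from the paper.)"""
    window = np.hanning(win_len)
    frames = [signal[i:i + win_len] * window
              for i in range(0, len(signal) - win_len + 1, hop)]
    # One rFFT per frame -> (n_frames, win_len // 2 + 1) time-frequency grid
    return np.abs(np.fft.rfft(np.stack(frames), axis=1))

def stack_channels(imu, win_len=64, hop=32):
    """Combine per-channel spectrograms depth-wise into one multi-channel
    'image' suitable as CNN input (one plausible combination strategy)."""
    specs = [stft_magnitude(ch, win_len, hop) for ch in imu]
    return np.stack(specs, axis=-1)  # shape: (time, frequency, channels)

# Example: a 6-channel IMU burst (3-axis accelerometer + 3-axis gyroscope)
rng = np.random.default_rng(0)
imu = rng.standard_normal((6, 256))
img = stack_channels(imu)
print(img.shape)  # (7, 33, 6): 7 time frames, 33 frequency bins, 6 channels
```

The resulting array can be fed to a standard image-style CNN; alternative combination strategies (e.g. tiling the spectrograms side by side into one plane) would change only the stacking step.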
