Abstract

The Sussex-Huawei Locomotion-Transportation (SHL) recognition challenge, organized at the HASCA Workshop of UbiComp 2020, presents a large and realistic dataset covering different modes of locomotion and transportation. The goal of this human activity recognition challenge is to recognize eight modes of locomotion and transportation from 5-second frames of sensor data recorded by a smartphone carried at an unknown position. In this paper, our team (We can fly) summarizes our submission to the competition. We propose a one-dimensional (1D) DenseNetX model, a deep learning method for transportation mode classification. We first convert the sensor readings from the phone coordinate system to the navigation coordinate system. Then, we normalize each sensor with its own maximum and minimum and construct a multi-channel sensor input. Finally, the 1D DenseNetX model with a Gated Recurrent Unit (GRU) outputs the predictions. In the experiments, we trained our model on four internal datasets and achieved an average F1 score of 0.7848 on the four validation datasets.
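The abstract does not specify the network configuration, so the following is only a minimal sketch of the general idea of a 1D densely connected convolutional feature extractor followed by a GRU classifier. The channel count, growth rate, layer sizes, and the assumed 100 Hz sampling rate (500 samples per 5-second frame) are illustrative assumptions, not the authors' actual DenseNetX architecture.

```python
import torch
import torch.nn as nn

class DenseBlock1D(nn.Module):
    """Minimal 1D dense block: each layer's output is concatenated to its input."""
    def __init__(self, in_channels, growth_rate=16, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm1d(channels),
                nn.ReLU(),
                nn.Conv1d(channels, growth_rate, kernel_size=3, padding=1),
            ))
            channels += growth_rate
        self.out_channels = channels

    def forward(self, x):
        for layer in self.layers:
            x = torch.cat([x, layer(x)], dim=1)  # dense connectivity
        return x

class DenseNetGRUSketch(nn.Module):
    """Hypothetical 1D DenseNet-style extractor + GRU head for 8 transport modes."""
    def __init__(self, num_channels=9, num_classes=8):
        super().__init__()
        self.stem = nn.Conv1d(num_channels, 32, kernel_size=7, stride=2, padding=3)
        self.block = DenseBlock1D(32)
        self.pool = nn.MaxPool1d(4)
        self.gru = nn.GRU(self.block.out_channels, 64, batch_first=True)
        self.fc = nn.Linear(64, num_classes)

    def forward(self, x):                 # x: (batch, channels, time), e.g. 5 s at 100 Hz -> 500
        h = self.pool(self.block(self.stem(x)))
        h = h.transpose(1, 2)             # (batch, time, features) for the GRU
        _, last = self.gru(h)             # last hidden state summarizes the frame
        return self.fc(last.squeeze(0))   # class logits for the 8 locomotion/transport modes

# Example: one batch of 5-second multi-channel frames (assumed 9 sensor channels)
logits = DenseNetGRUSketch()(torch.randn(4, 9, 500))
print(logits.shape)  # torch.Size([4, 8])
```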
