Abstract

With advances in technology, people have an increasing variety of options for daily travel, and transportation mode detection has become important for refining human activity recognition and for supporting mobile intelligent services. Although much work has been done on transportation mode detection, accurate and reliable detection remains challenging. In this paper, we propose T2Trans, a novel transportation mode detection algorithm based on a temporal convolutional network (TCN) that employs multiple lightweight sensors integrated into a phone. Learning feature representations of the preprocessed multi-sensor data with a TCN improves detection accuracy and enhances learning efficiency. Extensive experiments demonstrate that our algorithm attains a macro F1-score of 86.42% on the real-world SHL dataset and 88.37% on the HTC dataset, with average accuracies of 86.37% and 89.13%, respectively. Our model identifies eight transportation modes (stationary, walking, running, cycling, car, bus, subway, and train) with better accuracy than other benchmark algorithms.
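To illustrate the core building block of a TCN (this is a minimal sketch for intuition, not the authors' T2Trans implementation): a causal dilated 1-D convolution, in which each output at time t depends only on the current and past inputs spaced `dilation` steps apart, so stacking layers with growing dilation covers long sensor windows cheaply.

```python
def causal_dilated_conv1d(x, weights, dilation=1):
    """Causal dilated 1-D convolution over a sensor signal.

    out[t] = sum_i weights[i] * x[t - i*dilation], with out-of-range
    (pre-sequence) inputs treated as zero, so out[t] never depends
    on future samples -- the causality property a TCN relies on.
    """
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(weights):
            idx = t - i * dilation
            if idx >= 0:
                acc += w * x[idx]
        out.append(acc)
    return out

# Toy accelerometer-magnitude signal (hypothetical values).
signal = [1.0, 2.0, 3.0, 4.0]

# A two-tap filter; dilation=2 skips every other past sample,
# widening the receptive field without extra parameters.
print(causal_dilated_conv1d(signal, [1.0, 1.0], dilation=1))  # [1.0, 3.0, 5.0, 7.0]
print(causal_dilated_conv1d(signal, [1.0, 1.0], dilation=2))  # [1.0, 2.0, 4.0, 6.0]
```

In a full TCN, several such layers (with dilations 1, 2, 4, ...) are stacked with nonlinearities and residual connections before a classifier head maps the learned features to the eight transportation modes.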
