Abstract
Highlights
• An indoor trajectory reconstruction model using limited reference points is proposed.
• An uncertainty prediction model combining a 1D-CNN and an LSTM is proposed.
• An enhanced training dataset containing both sampling and measurement errors is generated.
• An uncertainty region accounting for both sampling and measurement errors is established.

Modelling pedestrian movement uncertainty in complex urban environments is a meaningful and challenging task for advancing geospatial data mining and analysis. However, traditional uncertainty prediction models consider only movement distance or speed and cannot adapt well to time-varying measurement errors. In this paper, a deep-learning framework is proposed for modelling pedestrian movement uncertainty in large-scale indoor areas, in which a hybrid deep-learning model combining a one-dimensional Convolutional Neural Network (1D-CNN) with a long short-term memory (LSTM) network is proposed to enhance feature extraction and reduce time-correlation errors. The proposed framework takes human-motion-related measurement features into consideration: the moving step-length and heading information over a time period are reconstructed and modelled as the input to the deep-learning model. Compared with state-of-the-art algorithms on different real-world trajectory datasets, the proposed deep-learning approach demonstrates much better uncertainty-region prediction performance across several indexes (Euclidean error distance, completeness and density). This study provides an effective and practical framework for modelling pedestrian trajectory uncertainty in challenging urban environments, which is expected to benefit smart-city and spatial-perception related applications.
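To make the described architecture concrete, the following is a minimal sketch of a hybrid 1D-CNN + LSTM model that takes windows of step-length and heading features as input and regresses an uncertainty measure. It is written with the Keras API; the window length, filter counts, layer sizes and the scalar output parameterisation are all illustrative assumptions, not the authors' published configuration.

    # Minimal sketch (assumptions: window length, layer sizes, scalar
    # uncertainty output). Not the authors' published configuration.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    WINDOW = 50      # assumed number of time steps per input sequence
    N_FEATURES = 2   # step-length and heading, per the abstract

    model = models.Sequential([
        layers.Input(shape=(WINDOW, N_FEATURES)),
        # 1D convolutions extract local motion features from the sequence
        layers.Conv1D(32, kernel_size=5, padding="same", activation="relu"),
        layers.Conv1D(64, kernel_size=5, padding="same", activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        # LSTM models temporal correlation across the window
        layers.LSTM(64),
        # Regress a non-negative uncertainty-region radius (one possible
        # output; the paper's actual parameterisation is not given here)
        layers.Dense(1, activation="softplus"),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.summary()

Training such a model would require sequences labelled with ground-truth positioning error, which is consistent with the abstract's mention of an enhanced training dataset containing sampling and measurement errors.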