Abstract

Nowadays, a large number of sensors are deployed on mobile or stationary platforms, continuously generating geo-tagged and time-stamped readings (i.e., geo-sensory data) that contain rich information about the surrounding environment. These data have irregular space and time coordinates. There have been extensive research efforts to represent geo-sensory data as time sequences, grid-like images, and graph signals. However, a proper representation is still lacking that can describe both mobile and stationary geo-sensory data without information-losing discretization in the spatial and temporal dimensions. In this paper, we propose to represent massive geo-sensory data as spatio-temporal point clouds (STPC), and present STPC-Net, a novel deep neural network for processing STPC. STPC preserves the original irregular space-time coordinates, and STPC-Net captures intra-sensor and inter-sensor correlations from STPC. In this way, STPC-Net learns the key information in STPC and overcomes the challenges posed by data irregularity. Experiments on real-world datasets show that STPC-Net achieves state-of-the-art performance on different tasks over both mobile and stationary geo-sensory data. The source code is available at https://github.com/zhengchuanpan/STPC-Net.
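To make the representation concrete, the sketch below illustrates how geo-sensory readings could be assembled into a spatio-temporal point cloud, keeping each reading's original irregular space-time coordinates instead of discretizing them. The field layout, normalization, and example values are illustrative assumptions, not the actual data format used by STPC-Net.

import numpy as np

# Assumed illustration: each reading is a point with its original
# (longitude, latitude, timestamp) coordinates plus the measured value,
# rather than being snapped onto a grid or a fixed time sequence.
readings = [
    (116.39, 39.91, 1625097600.0, 35.2),   # stationary station
    (116.41, 39.93, 1625097632.5, 33.8),   # another station, different sampling time
    (116.45, 39.90, 1625097641.2, 40.1),   # mobile sensor on a moving platform
]

# Stack into an (N, 4) array: columns are [x, y, t, value].
stpc = np.asarray(readings, dtype=np.float64)

# Separate space-time coordinates from readings, and rescale the
# coordinates so space and time are on comparable ranges before
# feeding them to a point-cloud-style network (an assumed preprocessing step).
coords = stpc[:, :3]
span = coords.max(axis=0) - coords.min(axis=0)
coords = (coords - coords.min(axis=0)) / (span + 1e-9)
values = stpc[:, 3:]

print(coords.shape, values.shape)  # (3, 3) (3, 1)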
