Abstract

Image time series (ITS) are complex 3D (2D+t in practice) data now produced daily in various domains, from medical imaging to remote sensing. They contain rich spatio-temporal information that allows observing the evolution of a sensed scene over time. In this work, we focus on the classification of ITS, a task frequently encountered in remote sensing. An underlying challenge is to jointly consider the spatial and temporal dimensions of the data. We present Deep-STaR, a method that learns such features from ITS data in order to classify them. Instead of reasoning in the original 2D+t space, we investigate novel 2D planar data representations that contain both temporal and spatial information. These representations offer a new way to structure the ITS that is compatible with deep learning architectures. They are used to feed a convolutional neural network that learns spatio-temporal features with 2D convolutions, ultimately leading to a classification decision. To enhance the explainability of the results, we also propose a post-hoc attention mechanism, enabled by this new approach, which provides a semantic map offering insight into the decision taken. Deep-STaR is evaluated on a remote sensing application: the classification of agricultural crops from satellite ITS. The results highlight the benefits of this method compared to the literature, as well as its value in easing the interpretation of ITS to understand spatio-temporal phenomena.
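
The abstract does not detail how the 2D planar representations are built. As an illustration of the general idea only, the following is a minimal PyTorch sketch that assumes one simple construction (slicing the T×H×W data cube along image rows, so each plane mixes time with one spatial axis) and a toy 2D CNN whose per-plane scores are averaged. The shapes, layer sizes, and aggregation are assumptions for the sketch, not the Deep-STaR architecture.

```python
# Illustrative sketch (not the authors' exact construction): turn a 2D+t image
# time series into 2D planes mixing one spatial axis with time, then classify
# them with a small 2D CNN. Shapes and layer sizes are assumed for the example.
import torch
import torch.nn as nn

def planar_views(its: torch.Tensor) -> torch.Tensor:
    """its: (T, H, W) time series of single-band images.
    Returns (H, T, W): one 2D plane per image row, each mixing time with the
    horizontal spatial axis (an assumed, simplified planar representation)."""
    return its.permute(1, 0, 2)

class PlanarCNN(nn.Module):
    """Tiny 2D CNN applied to the planar views; per-plane scores are averaged
    to yield a single class prediction for the observed scene."""
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, planes: torch.Tensor) -> torch.Tensor:
        # planes: (N_planes, T, W) -> add a channel axis for 2D convolutions
        x = self.features(planes.unsqueeze(1)).flatten(1)  # (N_planes, 32)
        return self.classifier(x).mean(dim=0)              # (n_classes,)

its = torch.rand(12, 64, 64)  # e.g. 12 acquisition dates of 64x64 images
logits = PlanarCNN()(planar_views(its))
print(logits.argmax().item())
```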
