Abstract

The timely and accurate prediction of remote sensing data is of utmost importance, especially when the predicted data are used to provide insights into emerging issues such as environmental nowcasting. Significant research progress has been made to date in devising variants of neural network (NN) models to meet this requirement by improving their feature extraction and dynamic process representation power. Nevertheless, these existing NN models are built upon rigid structures that often fail to maintain a trade-off between bias and variance and, consequently, require substantial time to empirically determine the most appropriate network configuration. This article proposes a self-adaptive recurrent deep incremental network (SARDINE), a novel variant of the deep recurrent neural network with the intrinsic capability of self-constructing its network structure in a dynamic and incremental fashion while learning from observed data samples. Moreover, SARDINE can model the evolution of spatial features while scanning the data in a single-pass manner, which saves significant time when dealing with remote sensing imagery containing millions of pixels. Subsequently, we employ SARDINE in combination with a spatial influence mapping unit to accomplish the prediction. The effectiveness of the proposed model is evaluated by predicting a time series of normalized difference vegetation index (NDVI) data derived from Landsat Thematic Mapper (TM)-5 and Moderate Resolution Imaging Spectroradiometer (MODIS) Terra satellite imagery. The experimental results demonstrate that SARDINE-based prediction achieves state-of-the-art accuracy with significantly reduced computational cost.
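To make the "self-constructing, single-pass" idea concrete, the sketch below shows a toy incremental regressor that grows hidden units on the fly as it streams through data once. This is an illustration of the general technique only: the unit-growth criterion, Gaussian units, and all hyperparameter names here are assumptions for demonstration, not SARDINE's actual growing/pruning rules, recurrent dynamics, or spatial influence mapping.

```python
import numpy as np


class IncrementalNet:
    """Toy self-constructing regressor: hidden units are added on demand
    during a single pass over a data stream (illustrative only; this is
    not the SARDINE architecture from the paper)."""

    def __init__(self, width=0.5, grow_threshold=0.2, lr=0.1):
        self.centers = []            # hidden-unit centers, grown on demand
        self.weights = []            # one output weight per hidden unit
        self.width = width           # Gaussian unit width (assumed)
        self.grow_threshold = grow_threshold  # error level that triggers growth
        self.lr = lr                 # learning rate for weight adaptation

    def predict(self, x):
        if not self.centers:
            return 0.0
        acts = np.exp(-((np.array(self.centers) - x) ** 2) / self.width)
        return float(np.dot(self.weights, acts))

    def learn_one(self, x, y):
        """Process one sample: grow the structure if the sample is
        'surprising', otherwise adapt existing weights."""
        err = y - self.predict(x)
        if abs(err) > self.grow_threshold:
            # structure grows: new unit centered at the surprising sample,
            # weighted so the residual at x becomes zero
            self.centers.append(x)
            self.weights.append(err)
        elif self.centers:
            # otherwise a single gradient step on the output weights
            acts = np.exp(-((np.array(self.centers) - x) ** 2) / self.width)
            self.weights = list(np.array(self.weights) + self.lr * err * acts)
        return err


net = IncrementalNet()
stream = [(x, np.sin(x)) for x in np.linspace(0.0, 3.0, 60)]
for x, y in stream:                  # single pass over the data stream
    net.learn_one(x, y)
print(len(net.centers))              # number of hidden units grown
```

The key property mirrored here is that network capacity is not fixed in advance: the model starts empty and adds units only where the observed error demands it, so structure selection happens during the same single scan used for learning.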
