Abstract

Accurate crop phenology information is essential for precision farming and for improving agricultural productivity. In recent years, the deployment of in-situ equipment for crop phenology observation has expanded rapidly, generating high-quality, near-real-time imagery that captures vegetation phenological changes. However, because ground sites are few, local observations alone cannot measure crop phenology at large scales. As a complement, the freely available Sentinel satellites, with their high revisit frequency, offer an opportunity to map crop phenology accurately at an unprecedentedly fine spatial scale. Yet differences in viewing angle and spatial coverage cause inconsistencies between satellite- and ground-observed phenological stages. To bridge this gap, we developed a spatial-aware scheme that integrates SAR and optical time-series data for accurate crop phenology tracking. Specifically, we propose a new deep learning framework, Deep-CroP, to improve the alignment between satellite and ground observations of crop phenology. Experimental results on selected ground sites demonstrate that Deep-CroP accurately identifies crop phenology and narrows the discrepancy from more than 30 days to as little as a few days. In addition, we applied Deep-CroP to large-scale Sentinel time-series to map fine-resolution spatial patterns of phenology over two study areas (TA1 and TA2). Overall, the potential of satellite time-series for ground-level crop phenology observation is verified, and the consistency between satellite and PhenoCam observations is expected to improve further.
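The abstract does not detail Deep-CroP's architecture, so the sketch below illustrates only the general idea behind satellite-based phenology tracking: estimating a phenological transition date (green-up) from a vegetation-index time series and comparing it against a ground-observed date. All values, the revisit interval, the threshold fraction, and the function name are hypothetical illustrations, not the paper's method.

```python
import numpy as np

def greenup_doy(doy, ndvi, frac=0.5, win=5):
    """Threshold-based green-up estimate: first day-of-year at which the
    smoothed NDVI exceeds `frac` of its seasonal amplitude."""
    kernel = np.ones(win) / win
    smooth = np.convolve(ndvi, kernel, mode="valid")  # drops win//2 samples per side
    d = doy[win // 2 : -(win // 2)]
    level = smooth.min() + frac * (smooth.max() - smooth.min())
    return d[np.argmax(smooth >= level)]  # first crossing of the threshold

# Synthetic 10-day-revisit NDVI series with a logistic green-up near DOY 140
doy = np.arange(1, 361, 10)
ndvi = 0.2 + 0.6 / (1.0 + np.exp(-(doy - 140) / 15.0))

sat = greenup_doy(doy, ndvi)
ground = 141  # hypothetical PhenoCam-observed green-up date
print(sat, abs(sat - ground))
```

A deep learning approach such as Deep-CroP would replace this fixed-threshold rule with a learned mapping, which is what allows the satellite-to-ground discrepancy to shrink from tens of days to a few days.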

