Abstract

Greenhouses have revolutionized farming all over the world, and greenhouse mapping using high spatial resolution (HSR) remote sensing imagery is important for estimating vegetable yields. Although automatic greenhouse mapping methods have been proposed, they have mostly been applied to small areas (e.g., a parcel, a city, or a province). Large-scale (i.e., national-scale) greenhouse mapping faces three challenges: the diversity of greenhouses across regions, the difficulty of simultaneously extracting both the number and the area of greenhouses, and the dense spatial distribution of greenhouses. In this paper, to address the problem of large-scale greenhouse mapping, a novel data-driven deep learning framework is proposed, which we refer to as the dense object dual-task deep learning (DELTA) framework. The dual-task learning module simultaneously extracts the number and area of greenhouses through a greenhouse area extraction branch and a greenhouse number extraction branch. A high-density-biased sampler module is also proposed to select more training samples from densely covered areas, so that the trained model is more effective at extracting densely distributed greenhouses. Six regions in China were selected for evaluation, on which the proposed framework achieved a 1.8% improvement in mean average precision (mAP) over Faster R-CNN. Finally, the whole of China was taken as the study area, and 1-m spatial resolution remote sensing image tiles covering the country were obtained; the images were captured by different sensors and were either downloaded from open-source sites or purchased. The experimental results indicate that more than 13 million greenhouses were extracted across China.
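To make the two components named in the abstract concrete, the following minimal PyTorch sketch illustrates one plausible reading: a dual-task head pairing a greenhouse-area (segmentation) branch with a greenhouse-number (density/counting) branch. Everything here (the name DualTaskHead, the density-map counting formulation, and all layer sizes) is a hypothetical illustration under stated assumptions, not the paper's actual implementation; since the paper benchmarks against Faster R-CNN, its number branch may well be detection-based instead.

```python
import torch
import torch.nn as nn

class DualTaskHead(nn.Module):
    """Hypothetical dual-task head: an area branch predicting a per-pixel
    greenhouse mask, and a number branch regressing a non-negative density
    map whose spatial sum estimates the greenhouse count."""

    def __init__(self, in_channels: int = 256):
        super().__init__()
        # Area branch: per-pixel logits (greenhouse vs. background).
        self.area_branch = nn.Sequential(
            nn.Conv2d(in_channels, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(128, 1, kernel_size=1),
        )
        # Number branch: non-negative density map; its integral over the
        # tile approximates how many greenhouses the tile contains.
        self.number_branch = nn.Sequential(
            nn.Conv2d(in_channels, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(128, 1, kernel_size=1),
            nn.ReLU(inplace=True),  # keeps the predicted density non-negative
        )

    def forward(self, features: torch.Tensor):
        mask_logits = self.area_branch(features)   # (B, 1, H, W)
        density = self.number_branch(features)     # (B, 1, H, W)
        count = density.sum(dim=(2, 3))            # (B, 1) estimated count per tile
        return mask_logits, count
```

The high-density-biased sampler could likewise be sketched as weighted sampling over training tiles; the function below, including its name, the bias exponent, and the +1 offset, is again an assumption, not the paper's method.

```python
import random

def density_biased_sample(tiles, counts, k, bias=2.0):
    """Hypothetical high-density-biased sampling: draw k training tiles
    with probability proportional to (count + 1) ** bias, so tiles with
    many greenhouses are seen more often during training."""
    weights = [(c + 1) ** bias for c in counts]  # +1 keeps empty tiles drawable
    return random.choices(tiles, weights=weights, k=k)
```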
