Abstract

The two existing families of chlorophyll-a (chl-a) concentration retrieval procedures, analytical and empirical, are hindered by the complexity of the radiative transfer equation (RTE) and of the statistical analyses, respectively. Another promising approach is the use of artificial neural networks (ANNs). Most ANN-based retrievals operate pixel by pixel with a single-layer model, even though satellite instrumental errors and man-made objects in water bodies can affect the retrieval and should be taken into account. In this study, a mask-based neural structure, a convolutional neural network (CNN) model that takes both the target pixel and its neighborhood as input, is proposed to reduce the influence of these effects. The proposed model is an end-to-end multi-layer model that integrates band expansion, feature extraction, and chl-a estimation into a single structure, leading to an optimal chl-a concentration retrieval. In addition, a two-stage training scheme is proposed to address the shortage of in-situ samples that is common in practice. In the first stage, the model is trained on chl-a concentrations derived from the water product provided by the satellite agency; in the second stage, it is refined with the in-situ samples. Eight Sentinel-3 images from different acquisition times, together with coincident in-situ measurements over the waters of Laguna Lake, Philippines, were used for model training and testing. Based on quantitative accuracy assessment, the proposed method outperformed existing dual- and triple-band combinations in chl-a concentration retrieval.
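To make the described pipeline concrete, the following is a minimal sketch (not the authors' code) of a patch-based chl-a regression CNN with the two-stage training outlined above, written in PyTorch. The patch size, layer widths, band count, learning rates, and the data loaders `product_loader` and `insitu_loader` are illustrative assumptions, not the paper's exact configuration.

```python
# Hedged sketch of a patch-based chl-a retrieval CNN with two-stage training.
# All sizes and hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn

class PatchChlaCNN(nn.Module):
    """Maps a multispectral patch (target pixel + neighbors) to one chl-a value."""
    def __init__(self, n_bands: int = 21, patch: int = 5):
        super().__init__()
        self.net = nn.Sequential(
            # "band expansion": 1x1 convolutions mix spectral bands per pixel
            nn.Conv2d(n_bands, 64, kernel_size=1), nn.ReLU(),
            # "feature extraction": spatial convolutions over the neighborhood
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            # "chl-a estimation": regression head producing one concentration value
            nn.Linear(64, 1),
        )

    def forward(self, x):  # x: (batch, n_bands, patch, patch)
        return self.net(x).squeeze(-1)

def train_stage(model, loader, epochs, lr):
    """One training stage: MSE regression on whatever labels the loader yields."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for patches, chla in loader:
            opt.zero_grad()
            loss = loss_fn(model(patches), chla)
            loss.backward()
            opt.step()

# Two-stage training as described in the abstract:
# stage 1: pre-train on chl-a from the agency-provided water product (abundant labels)
# stage 2: fine-tune on the scarce in-situ measurements, typically with a smaller lr
model = PatchChlaCNN()
# train_stage(model, product_loader, epochs=50, lr=1e-3)  # stage 1 (hypothetical loader)
# train_stage(model, insitu_loader,  epochs=20, lr=1e-4)  # stage 2 (hypothetical loader)
```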
