Abstract

Kim, S.M.; Shin, J.; Baek, S., and Ryu, J.-H., 2019. U-Net convolutional neural network model for deep red tide learning using GOCI. In: Jung, H.-S.; Lee, S.; Ryu, J.-H., and Cui, T. (eds.), Advances in Remote Sensing and Geoscience Information Systems of Coastal Environments. Journal of Coastal Research, Special Issue No. 90, pp. 302-309. Coconut Creek (Florida), ISSN 0749-0208.

GOCI, launched in 2010, is a geostationary satellite sensor that monitors ocean color. It captures 8-band spectral satellite images of the northeast Asian region hourly, eight times a day, at a spatial resolution of about 500 m. GOCI can therefore monitor a large ocean area for events such as red tide occurrences, tidal movement changes, and ocean disasters. In this study, we propose a deep convolutional neural network model, U-Net, for automatic pixel-based detection of red tide occurrence from the spectral images captured by GOCI. We construct two training datasets from GOCI images and the corresponding red-tide index maps (RI maps) accumulated from 2011 to 2018. The RI maps indicate where red tides occurred and which red tide species were present. U-Net consists of five U-shaped encoder and decoder layers that extract spectral features related to red-tide species from GOCI images. We compared the performance of U-Nets trained on two datasets, (i) consisting of only four spectral bands and (ii) consisting of all six spectral bands. The RI maps predicted by the trained U-Nets showed spatial occurrence tendencies of three red tide species that closely matched the ground truths for the validation images. The mean target accuracy with the four-band dataset was 13% lower than that with the six-band dataset. The trained U-Net for pixel-wise red tide detection should be able to effectively inspect red tide occurrences across the vast area of water surrounding the Korean peninsula.
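To make the architecture described above concrete, the following is a minimal, illustrative PyTorch sketch of a U-Net-style encoder-decoder with five levels that maps a multi-band image to per-pixel class scores. It is not the authors' implementation; the channel widths, patch size, and class count (three red tide species plus background) are assumptions for demonstration only.

    # Minimal, illustrative U-Net sketch in PyTorch. NOT the authors'
    # implementation; widths, patch size, and the 4-class output
    # (3 red tide species + background) are assumptions.
    import torch
    import torch.nn as nn

    def conv_block(in_ch, out_ch):
        # Two 3x3 convolutions with ReLU, the basic U-Net building block.
        return nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    class UNet(nn.Module):
        def __init__(self, in_bands=6, n_classes=4,
                     widths=(32, 64, 128, 256, 512)):
            super().__init__()
            # Encoder: five levels of conv blocks, downsampled by max pooling.
            self.encoders = nn.ModuleList()
            ch = in_bands
            for w in widths:
                self.encoders.append(conv_block(ch, w))
                ch = w
            self.pool = nn.MaxPool2d(2)
            # Decoder: upsample, concatenate the skip connection, convolve.
            self.upconvs = nn.ModuleList()
            self.decoders = nn.ModuleList()
            rev = widths[::-1]
            for i in range(len(rev) - 1):
                self.upconvs.append(
                    nn.ConvTranspose2d(rev[i], rev[i + 1], 2, stride=2))
                self.decoders.append(conv_block(rev[i], rev[i + 1]))
            # 1x1 convolution produces per-pixel class scores.
            self.head = nn.Conv2d(widths[0], n_classes, 1)

        def forward(self, x):
            skips = []
            for enc in self.encoders[:-1]:
                x = enc(x)
                skips.append(x)
                x = self.pool(x)
            x = self.encoders[-1](x)  # bottleneck
            for up, dec, skip in zip(self.upconvs, self.decoders,
                                     reversed(skips)):
                x = up(x)
                x = dec(torch.cat([x, skip], dim=1))
            return self.head(x)       # (N, n_classes, H, W) logits

    # Usage: a batch of 6-band GOCI patches -> per-pixel class logits.
    model = UNet(in_bands=6, n_classes=4)
    patch = torch.randn(1, 6, 256, 256)  # hypothetical 256x256 tile
    logits = model(patch)                # shape: (1, 4, 256, 256)

Switching between the four-band and six-band experiments described in the abstract would amount to changing the in_bands argument; the skip connections from each encoder level to the matching decoder level are what give the network its U shape.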
