Abstract
Oil spills have detrimental effects on the marine environment and economy, so detecting and classifying them at sea is essential to reducing oil-induced pollution in seas and oceans. Synthetic-aperture radar (SAR) imaging is well suited to rapid oil spill detection: it covers a wide area, collects data at short intervals, and can acquire images in all weather conditions, day and night. Deep neural networks are attractive for this task because training on many images substantially improves segmentation accuracy. This study set out to segment oil spills in SAR images with U-Net and DeepLabV3, each trained independently, using as few images as possible while achieving the highest possible accuracy. Because the two networks segment images with different architectures, we evaluated them separately rather than combining them. Lacking the hardware (such as GPUs) needed to train dozens of networks, we limited the comparison to two accurate convolutional neural networks (CNNs), both among the best-known and most widely used segmentation networks. Our goal was to determine which network detects oil spills in SAR images more accurately. Because SAR oil spill images are scarce and CNNs require many training images, we expanded the input set to 9801 images through data augmentation. We then trained each network for 300 epochs with a batch size of 5, implemented in Python on the Google Colab server. Oil spill detection accuracy was 78.8% for the U-Net network and 54% for the DeepLabV3 network. We therefore conclude that U-Net identifies oil spills in SAR images most accurately.
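The pipeline the abstract describes (augmenting a small SAR dataset, then training a U-Net for 300 epochs with a batch size of 5) can be sketched roughly as below. This is a minimal illustration, not the authors' code: the 256x256 single-channel input, the simplified two-level U-Net, and the paired-flip augmentation are assumptions made for the sketch.

```python
# Minimal sketch (assumed, not the authors' implementation) of augmenting a
# small SAR dataset and training a U-Net-style binary segmentation model.
import tensorflow as tf
from tensorflow.keras import layers, Model

def augment_pair(image, mask):
    # Apply the SAME random flips to image and mask so they stay aligned;
    # random geometric transforms like this are how a small SAR set can be
    # expanded into thousands of training samples.
    if tf.random.uniform(()) > 0.5:
        image, mask = tf.image.flip_left_right(image), tf.image.flip_left_right(mask)
    if tf.random.uniform(()) > 0.5:
        image, mask = tf.image.flip_up_down(image), tf.image.flip_up_down(mask)
    return image, mask

def build_unet(input_shape=(256, 256, 1)):
    inputs = layers.Input(shape=input_shape)
    # Encoder: convolution + downsampling
    c1 = layers.Conv2D(16, 3, activation="relu", padding="same")(inputs)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(32, 3, activation="relu", padding="same")(p1)
    p2 = layers.MaxPooling2D()(c2)
    # Bottleneck
    b = layers.Conv2D(64, 3, activation="relu", padding="same")(p2)
    # Decoder with skip connections (the defining U-Net feature)
    u2 = layers.Concatenate()([layers.UpSampling2D()(b), c2])
    c3 = layers.Conv2D(32, 3, activation="relu", padding="same")(u2)
    u1 = layers.Concatenate()([layers.UpSampling2D()(c3), c1])
    c4 = layers.Conv2D(16, 3, activation="relu", padding="same")(u1)
    # Per-pixel oil / no-oil probability map
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(c4)
    return Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Hyperparameters from the abstract; `images` and `masks` are placeholder
# arrays of SAR chips and their ground-truth oil masks.
# ds = tf.data.Dataset.from_tensor_slices((images, masks)).map(augment_pair).batch(5)
# model.fit(ds, epochs=300)
```

For the comparison the abstract reports, a DeepLabV3 model (e.g., an off-the-shelf torchvision or Keras implementation) would be trained on the same augmented set under the same schedule, so that any accuracy gap reflects the architecture rather than the data.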