Abstract
In accordance with United Nations (UN) Sustainable Development Goal (SDG) 16: Peace, Justice, and Strong Institutions, this study explores ship monitoring with Synthetic Aperture Radar (SAR) for its potential economic and security applications. One method of extracting ships from SAR-derived imagery is to employ convolutional neural networks (CNNs). However, the extraction of small features remains a challenging task for CNNs. One way to improve performance in such cases is to use an appropriate loss function, which guides the CNN model during training. In this paper, Focal Combo (FC) loss, a recent loss function designed for extreme class imbalance, is investigated to analyze its effects when applied to ship extraction. In doing so, this paper also presents a thorough comparison of existing loss functions in their capability to segment and detect ships in SAR imagery. Using the U-Net model, our results demonstrate that FC loss yields an increase in segmentation performance of about 9% in terms of F3-score and a decrease in missed detections of about 17 ships (after post-processing) compared to cross-entropy loss. However, it also shows a significant drop in precision of about 35%, resulting in an additional 270 ships being incorrectly detected in the background. In future work, varying CNN models shall be tested to see whether the pattern persists, and several trials shall be conducted to assess consistency.
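To make the two key quantities in the abstract concrete, the sketch below illustrates (a) a focal-plus-Dice composite loss of the kind that "Combo"-style losses for class imbalance are built from, and (b) the F-beta score with beta = 3, which weights recall more heavily than precision. This is an illustrative sketch only: the exact Focal Combo (FC) formulation is defined in the cited work and may differ, and all function names and parameter defaults here (`gamma`, `alpha`, `weight`) are assumptions, not the authors' implementation.

```python
import numpy as np

def focal_loss(probs, targets, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss, averaged over pixels.

    Down-weights easy examples via the (1 - p_t)^gamma modulating factor,
    so rare foreground pixels (ships) contribute more to the gradient.
    """
    probs = np.clip(probs, eps, 1.0 - eps)
    pt = np.where(targets == 1, probs, 1.0 - probs)       # prob of the true class
    at = np.where(targets == 1, alpha, 1.0 - alpha)       # class-balancing weight
    return float(np.mean(-at * (1.0 - pt) ** gamma * np.log(pt)))

def dice_loss(probs, targets, eps=1e-7):
    """Soft Dice loss: 1 - Dice coefficient, an overlap-based term."""
    inter = np.sum(probs * targets)
    return float(1.0 - (2.0 * inter + eps) / (np.sum(probs) + np.sum(targets) + eps))

def focal_combo_loss(probs, targets, weight=0.5):
    """Weighted sum of a focal term and a Dice term (illustrative 'combo')."""
    return weight * focal_loss(probs, targets) + (1.0 - weight) * dice_loss(probs, targets)

def f_beta(precision, recall, beta=3.0):
    """F-beta score; beta = 3 treats recall as far more important than precision."""
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)
```

The beta = 3 choice explains the trade-off reported above: a loss that pushes the model toward recovering small ships can raise the F3-score even while precision falls, because F3 penalizes missed detections much more than false alarms.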
Published in: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences