Abstract

The shape, size, and distribution of grains play an important role in the quality of alloy materials. However, in actual production, grain boundaries are not always visible in metallographic images observed under a microscope. In this paper, we propose an end-to-end deep network named GF-RCF, based on an adversarial network and feature learning, to detect grain boundaries effectively. The network is guided by a novel multi-level loss through its three parts: the base network, the adversarial network, and the metric module, which perform boundary detection, boundary inpainting, and feature learning, respectively. Specifically, the base network builds on the richer convolutional features (RCF) architecture and performs basic boundary detection under a pixel-level loss. The adversarial network employs an image-level adversarial loss and a feature matching loss to repair invisible boundaries. The metric module applies a feature learning strategy that compares the features of unlabeled data with those of labeled data to improve the generalization of the network. The multi-level loss is the total of the pixel-level, image-level, and feature-level losses obtained from the three subnetworks, respectively. From a machine learning perspective, GF-RCF combines semi-supervised learning, transfer learning, and generative adversarial networks, which gives our model a more powerful learning ability. In experiments, GF-RCF achieves an F1 score of 0.8792, surpassing other methods and reaching the state of the art.
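The multi-level loss described above can be sketched as a weighted sum of a pixel-level term, an image-level adversarial term, and a feature-level matching term. The concrete loss forms and weights below are illustrative assumptions for exposition only; the paper does not specify them in the abstract:

```python
import numpy as np

def pixel_level_loss(pred, target, eps=1e-7):
    """Pixel-level term: binary cross-entropy between the predicted
    boundary-probability map and the ground-truth boundary map."""
    pred = np.clip(pred, eps, 1.0 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def adversarial_loss(disc_scores_on_fake, eps=1e-7):
    """Image-level term: non-saturating generator loss that pushes the
    discriminator's scores on generated boundary maps toward 1 (real)."""
    s = np.clip(disc_scores_on_fake, eps, 1.0 - eps)
    return float(-np.mean(np.log(s)))

def feature_matching_loss(feat_fake, feat_real):
    """Feature-level term: L1 distance between intermediate features
    extracted from generated and real boundary maps."""
    return float(np.mean(np.abs(feat_fake - feat_real)))

def multi_level_loss(pred, target, disc_scores, feat_fake, feat_real,
                     w_pix=1.0, w_adv=0.1, w_feat=10.0):
    """Total loss: weighted sum of the three levels.
    The weights w_pix, w_adv, w_feat are hypothetical defaults."""
    return (w_pix * pixel_level_loss(pred, target)
            + w_adv * adversarial_loss(disc_scores)
            + w_feat * feature_matching_loss(feat_fake, feat_real))
```

In a real training loop each term would come from the corresponding subnetwork (base network, adversarial discriminator, metric module) and be backpropagated jointly.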
