Abstract
Chip pad inspection is of great practical importance for chip alignment inspection and correction, and it is one of the key technologies for automated chip inspection in semiconductor manufacturing. When deep learning methods are applied to chip pad inspection, the main challenge is to maintain accurate detection of small pad targets while keeping the inspection model lightweight. Attention mechanisms are widely used to improve small-target detection accuracy by guiding the network toward salient regions. However, conventional attention mechanisms capture feature information only locally, which limits their ability to improve the detection of small targets against complex backgrounds. In this paper, an Object Convolution Attention Module (OCAM) is proposed that builds long-range dependencies between channel features and position features by modeling feature contextual relationships, thereby strengthening the correlation between features. Adding the OCAM module to the feature extraction layers of the YOLOv5 network effectively improves chip pad detection performance. In addition, a design guideline for the attention layer is proposed: the attention layer is adjusted through network scaling to avoid representational bottlenecks, to balance the number of network parameters against detection performance, and to reduce the hardware requirements of the improved YOLOv5 network in practical deployments. Extensive experiments on chip pad, VOC, and COCO datasets show that the proposed approach generalizes well and outperforms several state-of-the-art methods.
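The abstract does not specify OCAM's exact layer structure, so the following is only a minimal PyTorch sketch of one plausible reading of "channel plus position attention with long-range dependencies": a channel-attention branch combined with row/column (coordinate-style) pooling so each attention weight summarizes an entire row or column of the feature map. The class name, reduction ratio, and layer choices are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of a channel + position attention block in the spirit of
# the abstract; all design details here are assumptions, not the actual OCAM.
import torch
import torch.nn as nn


class OCAMSketch(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        mid = max(channels // reduction, 8)
        # Channel branch: global context pooled to per-channel weights.
        self.channel_mlp = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Position branch: pool along H and W separately so each attention
        # weight sees a whole row or column (a long-range dependency).
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # -> (B, C, 1, W)
        self.reduce = nn.Sequential(
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
        )
        self.attn_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.attn_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Channel attention: reweight channels with global context.
        x = x * self.channel_mlp(x)
        # Positional attention over rows and columns.
        feat_h = self.pool_h(x)                        # (B, C, H, 1)
        feat_w = self.pool_w(x).transpose(2, 3)        # (B, C, W, 1)
        y = self.reduce(torch.cat([feat_h, feat_w], dim=2))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.attn_h(y_h))                  # (B, C, H, 1)
        a_w = torch.sigmoid(self.attn_w(y_w.transpose(2, 3)))  # (B, C, 1, W)
        return x * a_h * a_w


# Usage: wrap a backbone feature map, e.g. one YOLOv5 stage output (assumed shape).
feat = torch.randn(1, 256, 40, 40)
print(OCAMSketch(256)(feat).shape)  # torch.Size([1, 256, 40, 40])
```

In this reading, such a block would be inserted after selected feature extraction stages of the YOLOv5 backbone, with the reduction ratio acting as the knob that the paper's scaling guideline would tune to trade parameters against detection performance.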