Abstract

Sheet metal stamping is widely used for high-volume production. Despite its wide adoption, it can introduce defects in the manufactured components that render their quality unacceptable. Because of the variety of defects that can occur on the final product, human inspectors are frequently employed to detect them. However, human inspection can be unreliable and costly, particularly at speeds that match the stamping rate. In this paper, we propose an automatic inspection framework for the stamping process based on computer vision and deep learning techniques. Its low cost, remote sensing capability and simple implementation mean that it can be easily deployed in an industrial setting. A particular focus of this research is accounting for the harsh lighting conditions and highly reflective products found in manufacturing environments, which hinder optical sensing techniques by making it difficult to capture the details of a scene. High dynamic range images can capture details of an environment under harsh lighting conditions and, in the context of this work, can capture the highly reflective metals found in sheet metal stamping manufacturing. Building on this imaging technique, we propose a framework including a deep learning model to detect defects in sheet metal stamping parts. To test the framework, sheet metal ‘Nakajima’ samples were pressed with an industrial stamping press. Optimally exposed, exposure-sequence, tone-mapped and high dynamic range images of the samples were then used to train convolutional neural network-based detectors. Analysis of the resulting models showed that the high dynamic range image-based models achieved substantially higher accuracy and produced minimal false-positive predictions.
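As an illustrative aside, the imaging step described above (merging a bracketed exposure sequence into a high dynamic range image and tone-mapping it before feeding it to a detector) could be sketched as follows. This is not the authors' implementation; the file names, exposure times and gamma value are placeholder assumptions, and the OpenCV routines shown are one common way to perform Debevec-style HDR merging and tone mapping.

```python
# Hypothetical sketch: merge a bracketed exposure sequence of a stamped part
# into an HDR radiance map and tone-map it for a CNN that expects LDR input.
import cv2
import numpy as np

# Bracketed exposures of one sample (placeholder file names and shutter times).
exposure_times = np.array([1/500, 1/125, 1/30, 1/8], dtype=np.float32)
images = [cv2.imread(f"part_exposure_{i}.jpg") for i in range(4)]

# Recover the camera response curve, then merge the sequence into an HDR image.
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, exposure_times)
merge = cv2.createMergeDebevec()
hdr = merge.process(images, exposure_times, response)

# Tone-map the HDR radiance map down to 8 bits for display or for detectors
# trained on low-dynamic-range input.
tonemap = cv2.createTonemap(gamma=2.2)
ldr = np.clip(tonemap.process(hdr) * 255, 0, 255).astype(np.uint8)
cv2.imwrite("part_tonemapped.png", ldr)
```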
