Abstract
In-process measurement of machining precision is of great importance to advanced manufacturing and is an essential technology for realizing compensation machining. Given the cost-effectiveness and repeatability of computer vision, replacing traditional manual measurement with computer vision measurement has become a trend. In this paper, an in-process measurement method is proposed to improve precision and reduce the cost of machining precision measurement. First, a universal feature-model framework for machined parts is established to analyze the CAD model and provide standard information on the machining features. Second, a window generator is proposed to adaptively crop the image of the machined part according to feature size. The edges of machining features are then detected automatically within regions of interest (ROIs) in the cropped image. Finally, machining precision is measured by applying a Hough transform to the detected edges. To verify the effectiveness of the proposed method, a series of in-process measurement experiments were carried out on machined parts with various features and on sheet metal parts, including dimensional accuracy, straightness, and roundness measurement tests under identical part conditions. The best measurement accuracies of this method for dimensional accuracy, straightness, and roundness were 99%, 97%, and 96%, respectively. For comparison, precision measurement experiments were conducted under the same conditions using the Canny, sub-pixel, and Otsu–Canny edge detection algorithms. Experimental results show that the proposed feature-model-based in-process measurement of machining precision using computer vision outperforms these alternative measurement methods.
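The final measurement step, fitting a line to detected edge pixels with a Hough transform and deriving a straightness value from the perpendicular deviations, can be sketched as follows. This is a minimal pure-Python illustration of the general technique, not the paper's implementation; the function names, the accumulator layout, and the quantization parameters are assumptions made for the sketch.

```python
import math

def hough_line_fit(points, theta_steps=180, rho_res=1.0):
    """Vote in (theta, rho) space and return the dominant line,
    parameterized as x*cos(theta) + y*sin(theta) = rho.
    Illustrative stand-in for the paper's Hough-transform step."""
    # Determine the rho range from the data so bins cover all points.
    max_r = max(math.hypot(x, y) for x, y in points)
    acc = {}  # sparse accumulator: (theta index, rho index) -> votes
    for ti in range(theta_steps):
        theta = math.pi * ti / theta_steps
        c, s = math.cos(theta), math.sin(theta)
        for x, y in points:
            rho = x * c + y * s
            ri = int(round((rho + max_r) / rho_res))  # shift rho to be >= 0
            acc[(ti, ri)] = acc.get((ti, ri), 0) + 1
    # Peak of the accumulator gives the best-supported line.
    (ti, ri), _ = max(acc.items(), key=lambda kv: kv[1])
    theta = math.pi * ti / theta_steps
    rho = ri * rho_res - max_r
    return theta, rho

def straightness(points, theta, rho):
    """Straightness indicator: maximum perpendicular deviation of the
    detected edge points from the fitted line."""
    return max(abs(x * math.cos(theta) + y * math.sin(theta) - rho)
               for x, y in points)
```

For example, edge pixels lying along a horizontal line y = 5 yield a fitted angle near pi/2, a fitted rho near 5, and a straightness value bounded by the rho quantization step. In practice the edge points would come from the ROI-based edge detection stage, and a finer angular and rho resolution would be used.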