Abstract

As an auxiliary assembly guidance method, augmented reality (AR) has gradually been applied in modern manufacturing. However, as product assembly develops toward customization and diversification, static AR assembly guidance instructions that rely mainly on position measurement can hardly meet the needs of complex assembly cases, because the positions of assembly elements such as tools and essential parts may change frequently. To solve this problem, this paper proposes a novel approach for detecting and annotating assembly elements with AR mobile devices in an offline environment. An improved deep learning object detection network, YOLOv4-Lite, is proposed; it is lighter than YOLOv4-Tiny while achieving higher detection accuracy, and runs well on offline AR mobile devices with limited computing power. A spatial 3D annotation algorithm is presented to calculate the location of the center of the target object and to generate markers for annotation. To verify the feasibility of the proposed approach, an offline detection system for assembly elements named ARODAS (Augmented Reality Offline Detection and Annotation System) is developed and deployed on HoloLens 2. Experimental results show that the proposed approach completes detection and annotation with reasonable accuracy, assists the operator in detecting key assembly elements, and has a significant advantage in operation speed. To verify the usability and usefulness of ARODAS, we performed quantitative and qualitative analyses through a user study in which participants searched for several kinds of small parts in a real environment. Compared with a traditional marker-based AR system, ARODAS shows better overall performance in the user test.
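The spatial 3D annotation step described above (locating the center of a detected object to place a marker) can be sketched with a standard pinhole-camera back-projection. The function name, intrinsics values, and depth source below are illustrative assumptions for a minimal example, not the authors' implementation:

```python
import numpy as np

def detection_to_3d_anchor(bbox, depth, fx, fy, cx, cy):
    """Back-project the 2D center of a detected bounding box into a
    3D camera-space point where an AR annotation marker can be placed.

    bbox  : (x_min, y_min, x_max, y_max) in pixels
    depth : distance to the object along the optical axis, in meters
            (e.g. sampled from the device's depth sensor; assumed here)
    fx, fy, cx, cy : pinhole camera intrinsics in pixels
    """
    u = (bbox[0] + bbox[2]) / 2.0  # pixel-space center of the box
    v = (bbox[1] + bbox[3]) / 2.0
    # Pinhole back-projection: pixel coordinates -> camera coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# A box centered exactly at the principal point lies on the optical
# axis, so its 3D anchor is (0, 0, depth).
anchor = detection_to_3d_anchor((300, 220, 340, 260), 1.5,
                                fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```

In a headset deployment, the resulting camera-space point would still need to be transformed into the world frame using the device's head pose before rendering the marker.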
