Abstract

This article presents the development of a multisensor user interface that facilitates the instruction of arc welding tasks. Hand-eye coordination skills are traditionally acquired through one-to-one instruction, in which trainees wear protective helmets and perform numerous practice welds. These approaches are inefficient because the harmful light emitted by the electric arc prevents close monitoring of the process; practitioners can observe only a small bright spot. To tackle these problems, recent training approaches have leveraged virtual reality to simulate the process safely and visualize the geometry of the workpieces. However, the synthetic nature of such simulation platforms reduces their effectiveness, as they fail to capture actual welding interactions with the environment, which hinders the trainees' learning process. To provide users with a real welding experience, we have developed a new multisensor extended reality platform for arc welding training. Our system is composed of: 1) an HDR camera that monitors the real welding spot in real time; 2) a depth sensor that captures the 3-D geometry of the scene; and 3) a head-mounted VR display that visualizes the process safely. Our platform provides users with a “bot trainer,” virtual cues of the seam geometry, automatic spot tracking, and performance scores. To validate the platform's feasibility, we conduct extensive experiments with several welding training tasks. We show that, compared with traditional training practice and recent virtual reality approaches, our automated multisensor method achieves better performance in terms of accuracy, learning curve, and effectiveness.
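To make the sensor arrangement concrete, the sketch below shows a hypothetical per-frame fusion loop combining the three components named in the abstract: an HDR frame of the arc, a depth frame of the workpiece, and a rendered VR view with seam cues and a tracked spot position. All class and method names (WeldingTrainerLoop, track_spot, project_cues, render) are illustrative placeholders, not the authors' implementation; the actual platform's tracking and cue-generation algorithms are not described in the abstract.

```python
# Hypothetical sketch of the per-frame multisensor loop; names are
# placeholders and do not correspond to the authors' code.
import numpy as np


class WeldingTrainerLoop:
    def __init__(self, hdr_camera, depth_sensor, headset):
        self.hdr_camera = hdr_camera      # monitors the welding arc (HDR video)
        self.depth_sensor = depth_sensor  # captures the 3-D scene geometry
        self.headset = headset            # head-mounted VR display

    def track_spot(self, hdr_frame: np.ndarray) -> tuple:
        # Naive stand-in for automatic spot tracking: pick the brightest
        # pixel of the (luminance) HDR frame.
        lum = hdr_frame if hdr_frame.ndim == 2 else hdr_frame.mean(axis=2)
        y, x = np.unravel_index(np.argmax(lum), lum.shape)
        return int(x), int(y)

    def step(self, seam_model) -> None:
        hdr_frame = self.hdr_camera.read()         # arc image, safe to view in VR
        depth_map = self.depth_sensor.read()       # per-pixel depth of workpiece
        spot_xy = self.track_spot(hdr_frame)       # automatic spot tracking
        cues = seam_model.project_cues(depth_map)  # virtual seam-geometry cues
        self.headset.render(hdr_frame, depth_map, cues, spot_xy)
```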
