Abstract
CAM systems are widely used in key industries and are essential for machining processes such as 5-axis milling. For toolpath planning, state-of-the-art CAM systems require the user to specify many abstract parameters, which makes these systems difficult to use. This paper presents a more intuitive approach to toolpath planning: convolutional neural networks interpret user gestures in a virtual reality environment and predict the values of these parameters. The results show that many of the parameters can be extracted from user gestures with high accuracy, which can greatly simplify toolpath planning.