Abstract
Accurately predicting the time-varying dynamic parameters of the workpiece during the milling of thin-walled parts is the foundation for adaptively selecting chatter-free machining parameters. Hence, a method for accurately and quickly predicting these time-varying dynamic parameters is proposed, based on a shell finite element method (FEM) and a three-layer neural network. The time-dependent dynamics of the workpiece are calculated with the FEM from the geometrical parameters of the arc-faced junctions within the discrete cells of the initial and machined workpiece, so the mesh of the thin-walled part does not need to be regenerated at each cutting position, which improves the computational efficiency of the workpiece dynamics. Compared with three-dimensional cube elements, the shell elements reduce the number of degrees of freedom of the FEM model by 74%, making the solution of the characteristic equation about nine times faster. Modal tests show that the maximum error of the shell FEM in predicting the natural frequency of the workpiece is about 4%. Furthermore, a three-layer neural network is constructed and trained using the shell FEM results as samples; its maximum prediction error is 0.409% when benchmarked against the FEM results. The three-layer neural network thus further improves computational efficiency while guaranteeing accuracy.
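The proposed surrogate is, in essence, a small feedforward network trained on shell-FEM samples. As a rough illustration only, the sketch below (Python/NumPy, with hypothetical input features, layer sizes, and placeholder training targets, since the paper's actual features and data are not reproduced here) shows how a three-layer network could map a cutting position and remaining wall thickness to a predicted natural frequency.

```python
import numpy as np

# Hypothetical inputs: normalized tool position along the feed direction and
# remaining wall thickness; output: first natural frequency of the workpiece.
# In the paper the targets come from the shell-FEM solutions; random
# placeholders are used here only so the sketch runs end to end.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))                        # (position, thickness)
y = (800.0 + 400.0 * X[:, 1] - 150.0 * X[:, 0]).reshape(-1, 1)  # placeholder "FEM" targets

# Three-layer network: 2 inputs -> 16 hidden units (tanh) -> 1 linear output.
n_in, n_hidden, n_out = 2, 16, 1
W1 = rng.normal(0.0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, n_out)); b2 = np.zeros(n_out)

# Normalize targets so plain batch gradient descent stays well conditioned.
y_mean, y_std = y.mean(), y.std()
y_n = (y - y_mean) / y_std

lr = 0.05
for epoch in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2
    err = y_hat - y_n                        # gradient of 0.5*MSE w.r.t. y_hat

    # Backward pass (batch gradient descent).
    dW2 = h.T @ err / len(X)
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)       # tanh derivative
    dW1 = X.T @ dh / len(X)
    db1 = dh.mean(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# Predict the natural frequency at a new cutting position (denormalized).
x_new = np.array([[0.3, 0.6]])
f_pred = (np.tanh(x_new @ W1 + b1) @ W2 + b2) * y_std + y_mean
print(f"predicted natural frequency: {f_pred[0, 0]:.1f} Hz")
```

Once trained on FEM-generated samples, such a network replaces the repeated eigenvalue solution at each cutting position with a single inexpensive forward pass, which is where the reported efficiency gain would come from.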