Abstract

In machine learning (ML) and deep learning (DL), hyperparameter tuning is the process of selecting the optimal combination of hyperparameters that gives the best performance; the behavior of many ML and DL algorithms depends largely on their hyperparameters. While there has been rapid growth in the application of ML and DL algorithms to additive manufacturing (AM) techniques, little to no attention has been paid to carefully selecting and optimizing the hyperparameters of these algorithms in order to investigate their influence and achieve the best possible model performance. In this work, we demonstrate the effect of a grid search hyperparameter tuning technique on a multilayer perceptron (MLP) model using datasets obtained from a fused filament fabrication (FFF) AM process. The FFF dataset was extracted from a MakerBot MethodX 3D printer using Internet of Things (IoT) sensors. Three (3) hyperparameters were considered: the number of neurons in the hidden layer, the learning rate, and the number of epochs. In addition, two different train-to-test ratios were considered to investigate their effect on the AM process data. The dataset consisted of five (5) dominant input parameters (layer thickness, build orientation, extrusion temperature, build temperature, and print speed) and three (3) output parameters (dimensional accuracy, porosity, and tensile strength). The root mean square error (RMSE) and the computational time (CT) were selected as the hyperparameter performance metrics. The experimental results reveal the configuration of hyperparameters that contributed to the best performance of the MLP model.
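The procedure described above can be illustrated with a minimal sketch, assuming a scikit-learn MLPRegressor, illustrative hyperparameter grids, and synthetic placeholder data standing in for the FFF dataset; it is not the authors' implementation.

    # Minimal sketch (not the paper's code): grid search over an MLP's hidden-layer
    # size, learning rate, and number of epochs, evaluated for two train-to-test
    # ratios and scored by RMSE and computational time (CT).
    # The data below are synthetic placeholders for the FFF dataset.
    import time
    from itertools import product

    import numpy as np
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.random((200, 5))   # layer thickness, orientation, temperatures, print speed
    y = rng.random((200, 3))   # dimensional accuracy, porosity, tensile strength

    for test_size in (0.2, 0.3):                       # two train-to-test ratios
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=test_size, random_state=0)
        best = None
        # Illustrative grid values; the paper's actual grid may differ.
        for n_neurons, lr, epochs in product((10, 20, 50),
                                             (1e-3, 1e-2, 1e-1),
                                             (200, 500, 1000)):
            start = time.time()
            model = MLPRegressor(hidden_layer_sizes=(n_neurons,),
                                 learning_rate_init=lr,
                                 max_iter=epochs,       # epochs proxy for the adam solver
                                 random_state=0).fit(X_tr, y_tr)
            ct = time.time() - start                    # computational time (CT)
            rmse = np.sqrt(mean_squared_error(y_te, model.predict(X_te)))
            if best is None or rmse < best[0]:
                best = (rmse, ct, n_neurons, lr, epochs)
        print(f"test_size={test_size}: RMSE={best[0]:.4f}, CT={best[1]:.2f}s, "
              f"neurons={best[2]}, lr={best[3]}, epochs={best[4]}")

Note that max_iter stands in for the number of training epochs when scikit-learn's default 'adam' solver is used, and RMSE here is averaged uniformly across the three output parameters.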
