Abstract

Metaheuristic algorithms have been widely used to solve structural optimization problems. Despite their powerful search capabilities, these algorithms often require a large number of fitness evaluations. Constructing a machine learning classifier to identify which individuals should be evaluated with the original fitness function is an effective way to reduce the computational cost. However, a thorough comparison of machine learning classifiers integrated into the optimization process is still lacking. This paper evaluates the efficiency of different classifiers in eliminating unnecessary fitness evaluations. For this purpose, the weight optimization of a double-layer grid structure comprising 200 members is used as a numerical experiment. The six machine learning classifiers selected for assessment are Artificial Neural Network, Support Vector Machine, k-Nearest Neighbor, Decision Tree, Random Forest, and Adaptive Boosting (AdaBoost). The comparison is made in terms of the optimal weight of the structure, the rejection rate, and the computing time. Overall, it is found that the AdaBoost classifier achieves the best performance.
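To make the idea concrete, the following is a minimal sketch of classifier-based pre-selection, assuming a scikit-learn AdaBoost classifier trained to label candidate designs as promising or not; the function names, the synthetic fitness function, and the median-based labeling rule are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch: a classifier filters which candidates receive the
# expensive fitness evaluation (all names and thresholds are illustrative).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

def true_fitness(x):
    # Stand-in for the expensive structural analysis (e.g., computing the
    # weight of the 200-member grid plus constraint penalties).
    return float(np.sum(x ** 2))

# Training data: designs already evaluated by the metaheuristic, labeled
# "promising" (1) if their fitness beats the population median, else 0.
X_train = rng.uniform(-1.0, 1.0, size=(200, 10))
f_train = np.array([true_fitness(x) for x in X_train])
y_train = (f_train < np.median(f_train)).astype(int)

clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# In the optimization loop, only candidates predicted "promising" are passed
# to the expensive evaluation; the rest are rejected without evaluation.
candidates = rng.uniform(-1.0, 1.0, size=(50, 10))
keep = clf.predict(candidates) == 1
evaluated = [true_fitness(x) for x in candidates[keep]]
rejection_rate = 1.0 - keep.mean()
print(f"Rejected {rejection_rate:.0%} of candidate evaluations")
```

The reported rejection rate in such a scheme measures how many expensive evaluations the classifier avoids, which is the trade-off the paper compares across the six classifiers.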
