Abstract

The extreme learning machine (ELM) is a neural network algorithm widely accepted in the scientific community because of the simplicity of the model and its good results in classification and regression problems; digital image processing, medical diagnosis, and signal recognition are some of the applications in the field of physics addressed with these networks. To obtain good results, the algorithm must be run with an adequate number of neurons in the hidden layer, and identifying that number is an open problem in the ELM field. Carrying out the search sequentially has a high computational cost, since the complexity of the calculations grows with the number of neurons. In this work, we use golden-section search and simulated annealing as heuristic methods to determine an appropriate number of hidden-layer neurons for an ELM; in the experiments, three real databases were used for the classification problem and one synthetic database for the regression problem. The results show that the search for the appropriate number of neurons is accelerated by a factor of up to 4.5 with simulated annealing and up to 95.7 with golden-section search, compared with a sequential method, on the highest-dimensional database.
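The sketch below illustrates the general idea (it is not the authors' code): an ELM with random sigmoid hidden units and a closed-form least-squares output layer, and a golden-section search over the hidden-layer size that minimizes validation error. The data, the search range, and all function names are illustrative assumptions.

```python
# Minimal sketch: golden-section search for the ELM hidden-layer size.
# Assumptions (not from the paper): sigmoid activations, synthetic regression
# data, search range [5, 300], and MSE on a held-out split as the criterion.
import numpy as np

def elm_val_error(n_hidden, X_tr, y_tr, X_val, y_val):
    """Train an ELM with n_hidden random sigmoid units, return validation MSE."""
    rng = np.random.default_rng(0)                    # fixed seed: deterministic in n_hidden
    d = X_tr.shape[1]
    W = rng.normal(size=(d, n_hidden))                # random input weights (never trained)
    b = rng.normal(size=n_hidden)                     # random hidden biases (never trained)
    H_tr = 1.0 / (1.0 + np.exp(-(X_tr @ W + b)))      # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H_tr, y_tr, rcond=None)  # output weights in closed form
    H_val = 1.0 / (1.0 + np.exp(-(X_val @ W + b)))
    return np.mean((H_val @ beta - y_val) ** 2)

def golden_section_neurons(err, lo, hi, tol=2):
    """Golden-section search for the integer n in [lo, hi] minimizing err(n)."""
    phi = (np.sqrt(5) - 1) / 2                        # inverse golden ratio, ~0.618
    a, b = float(lo), float(hi)
    c, d = b - phi * (b - a), a + phi * (b - a)
    fc, fd = err(round(c)), err(round(d))
    while b - a > tol:
        if fc < fd:                                   # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - phi * (b - a)
            fc = err(round(c))
        else:                                         # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + phi * (b - a)
            fd = err(round(d))
    return round((a + b) / 2)

# Illustrative regression data (a stand-in for the paper's synthetic database).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(600, 5))
y = np.sin(X.sum(axis=1)) + 0.05 * rng.normal(size=600)
X_tr, X_val, y_tr, y_val = X[:400], X[400:], y[:400], y[400:]

best_n = golden_section_neurons(
    lambda n: elm_val_error(n, X_tr, y_tr, X_val, y_val), lo=5, hi=300)
print("selected hidden-layer size:", best_n)
```

The appeal of golden-section search here is that it evaluates only O(log((hi - lo)/tol)) candidate sizes instead of scanning every size sequentially; it assumes the validation error is roughly unimodal in the number of neurons, which is why a stochastic alternative such as simulated annealing can also be used when that assumption is doubtful.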
