Abstract

ELM (Extreme Learning Machine) networks have been gaining attention due to their high speed, ease of implementation, and minimal human intervention compared to classical feedforward neural networks trained with iterative algorithms. Part of the research seeking improvements to ELM concerns the specification of the input weights and hidden-layer biases, which are traditionally randomized. This paper proposes a variant of the ELM network in which the hidden-layer biases are calculated analytically, simultaneously with the output weights, in the solution of the least-squares problem. The method is valid for SLFNs (single-hidden-layer feedforward neural networks) with sinusoidal activation functions. Experiments have shown that calculating the biases with the proposed method may reduce the number of hidden nodes required to solve a given problem. The proposed variant is designated CBTELM: Calculated (Hidden Layer) Biases Trigonometric ELM. Beyond standard neural-network applications, CBTELM can be used successfully for amplitude and phase detection of noisy signals composed of sinusoids with known frequencies, as verified in experiments.
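The abstract does not give the derivation, but one plausible reading of "biases calculated simultaneously with the output weights" for a sinusoidal activation is the angle-addition identity sin(w·x + b) = cos(b) sin(w·x) + sin(b) cos(w·x): stacking sin and cos columns in the hidden-layer matrix lets a single least-squares solve return two coefficients per hidden node, from which an output weight and a bias can be recovered in closed form. The sketch below illustrates this reading only; the function names and the recovery formulas are assumptions, not the paper's verified algorithm.

```python
# Hedged sketch of one possible bias-folding scheme for a sinusoidal SLFN.
# Assumptions (not from the paper): function names, single-output targets,
# and the trigonometric recovery beta = hypot(a, c), b = atan2(c, a).
import numpy as np

def cbtelm_fit(X, T, n_hidden, seed=None):
    """Fit a sinusoidal SLFN, obtaining biases from the least-squares solve.

    X: inputs of shape (N, d); T: target vector of shape (N,).
    Uses sin(w.x + b) = cos(b)*sin(w.x) + sin(b)*cos(w.x), so the design
    matrix [sin(XW) | cos(XW)] yields coefficients (a_j, c_j) per node,
    with a_j = beta_j*cos(b_j) and c_j = beta_j*sin(b_j).
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights, as in standard ELM
    Z = X @ W
    H = np.hstack([np.sin(Z), np.cos(Z)])            # 2*n_hidden columns
    coef, *_ = np.linalg.lstsq(H, T, rcond=None)     # one least-squares solve
    a, c = coef[:n_hidden], coef[n_hidden:]
    beta = np.hypot(a, c)                            # recovered output weights
    b = np.arctan2(c, a)                             # recovered hidden-layer biases
    return W, b, beta

def cbtelm_predict(X, W, b, beta):
    return np.sin(X @ W + b) @ beta

# Toy check (illustrative only): fit a noisy sinusoidal target.
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
T = np.sin(3.0 * X[:, 0]) + 0.05 * np.random.default_rng(0).standard_normal(200)
W, b, beta = cbtelm_fit(X, T, n_hidden=20, seed=0)
print(np.mean((cbtelm_predict(X, W, b, beta) - T) ** 2))
```

Under this reading, a single pseudoinverse (or lstsq) call yields biases and output weights together, consistent with the abstract's claim that the biases need not be randomized. It is also consistent with the amplitude/phase application: when a signal's sinusoid frequencies are known, the same sin/cos least-squares decomposition gives amplitude sqrt(a² + c²) and phase atan2(c, a) per frequency.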
