Abstract

Neural networks have drawn much attention in the modern machine learning community, having achieved success in applications such as image recognition, speech recognition, and system identification. According to the principle of parsimony, simpler neural models are preferable to more complex ones when they offer similar generalization performance. However, when building a neural network model, the number of neurons is often chosen arbitrarily or by trial and error. These practices can lead to over-complex networks with many redundant neurons and may therefore cause over-fitting. In this paper, a new approach is proposed for obtaining a simplified neural network with fewer neurons that still performs comparably to the initial full network. More specifically, an initial neural model of fixed size is built using the MATLAB toolbox. The orthogonal matching pursuit (OMP) method is then employed to select important neurons and discard redundant ones, yielding a more compact model of reduced size. Two simulation examples demonstrate the effectiveness of the proposed method.
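The abstract gives no implementation details, but the core selection step it describes can be illustrated with a minimal sketch of orthogonal matching pursuit applied to hidden-neuron activations. Everything below is an assumption for illustration: the function name `omp_select_neurons`, the activation matrix `Phi`, the target vector `y`, and the budget `k` are hypothetical, not taken from the paper.

```python
import numpy as np

def omp_select_neurons(Phi, y, k):
    """Greedy OMP over hidden-neuron activations (illustrative sketch).

    Phi : (n_samples, n_neurons) matrix of hidden-layer outputs
          of the initial, fully-sized network on the training set
    y   : (n_samples,) vector of training targets
    k   : number of neurons to keep (assumed k >= 1)

    Returns the indices of the selected neurons and the refitted
    output weights for that reduced subset.
    """
    residual = y.astype(float).copy()
    selected = []
    for _ in range(k):
        # Score each neuron by the correlation of its activation
        # with the current residual.
        scores = np.abs(Phi.T @ residual)
        scores[selected] = -np.inf  # exclude already-chosen neurons
        selected.append(int(np.argmax(scores)))
        # Orthogonalisation step: refit output weights by least
        # squares on the selected columns, then update the residual.
        w, *_ = np.linalg.lstsq(Phi[:, selected], y, rcond=None)
        residual = y - Phi[:, selected] @ w
    return np.array(selected), w
```

Under this reading, one would train the initial network, evaluate its hidden activations to form `Phi`, run the routine above to pick the `k` most important neurons, and rebuild a smaller network keeping only those neurons with the refitted output weights.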
