Abstract
In hardware implementations of neural networks, imperfections such as finite-precision error and thermal noise always exist and can be modeled as multiplicative noise. This paper studies the problem of training a radial basis function (RBF) network and selecting its centers under multiplicative noise. We devise a noise-resistant training algorithm based on the alternating direction method of multipliers (ADMM) framework and the minimax concave penalty (MCP) function. Our algorithm first uses all training samples to create the RBF nodes. We then derive a training objective function that tolerates the presence of noise, and add an MCP term to this objective. Finally, we apply the ADMM framework to minimize the modified objective function. During training, the MCP term drives some unimportant RBF weights exactly to zero; hence training and RBF node selection are carried out simultaneously. We call the proposed method the ADMM-MCP algorithm and present its convergence properties. Simulation results show that the ADMM-MCP algorithm outperforms many other RBF training algorithms under weight/node noise.
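The abstract describes an ADMM scheme in which a quadratic data-fitting step alternates with an MCP proximal step that zeroes out unimportant RBF weights. The sketch below is a minimal, hypothetical illustration of that generic structure (it omits the paper's noise-resistant objective and uses a plain least-squares fit): all training inputs serve as Gaussian RBF centers, and the function names (`rbf_design`, `mcp_prox`, `admm_mcp_rbf`) and parameter choices are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np

def rbf_design(X, centers, width):
    # Gaussian RBF design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 width^2)).
    d2 = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def mcp_prox(v, lam, gamma, rho):
    # Elementwise proximal operator of (1/rho) * MCP_{lam,gamma}; requires gamma*rho > 1.
    # |v| <= lam/rho -> 0;  lam/rho < |v| <= gamma*lam -> shrink;  |v| > gamma*lam -> identity.
    out = np.zeros_like(v)
    absv = np.abs(v)
    mid = (absv > lam / rho) & (absv <= gamma * lam)
    out[mid] = np.sign(v[mid]) * (absv[mid] - lam / rho) / (1.0 - 1.0 / (gamma * rho))
    big = absv > gamma * lam
    out[big] = v[big]
    return out

def admm_mcp_rbf(Phi, y, lam=0.1, gamma=3.0, rho=1.0, iters=200):
    # ADMM on: minimize (1/2)||Phi w - y||^2 + MCP(z)  subject to  w = z.
    n = Phi.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)                      # scaled dual variable
    A = Phi.T @ Phi + rho * np.eye(n)    # fixed matrix for the quadratic w-update
    b0 = Phi.T @ y
    for _ in range(iters):
        w = np.linalg.solve(A, b0 + rho * (z - u))  # data-fitting step
        z = mcp_prox(w + u, lam, gamma, rho)        # sparsifying MCP step
        u = u + w - z                               # dual update
    return z  # sparse weight vector; zero entries mark pruned RBF nodes
```

A usage sketch: fitting a 1-D sinc function with one RBF node per training sample, then reading the surviving (nonzero-weight) nodes as the selected centers. Because the MCP proximal map returns exact zeros below its threshold, node selection falls out of training directly rather than requiring a separate pruning pass.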