Abstract
A radial basis function (RBF) network has excellent generalization ability and approximation accuracy when its parameters are set appropriately. However, relying only on traditional methods, it is difficult to obtain optimal network parameters and to construct a stable model. In view of this, a novel radial basis neural network (RBF-MLP) is proposed in this article. By connecting the two networks so that they work cooperatively, the RBF parameters can be adjusted adaptively through the structure of the multi-layer perceptron (MLP), realizing the effect of backpropagating the error for updates. Furthermore, a genetic algorithm is used to optimize the network's hidden layer and automatically determine the optimal number of neurons (basis functions). In addition, a memristive circuit model based on the characteristics of spin memristors is proposed to realize the neural network's operations. It is verified that the network can adaptively construct a model with outstanding robustness and can stably achieve 98.33% accuracy on the Modified National Institute of Standards and Technology (MNIST) classification task. The experimental results show that the method has considerable application value.
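To make the cooperative RBF-MLP idea concrete, the following is a minimal sketch (not the authors' implementation) of a Gaussian RBF layer with trainable centers and widths feeding an MLP head, so that the backpropagated error adjusts the RBF parameters end to end. Class and parameter names (RBFLayer, num_centers, etc.) are illustrative assumptions; the genetic-algorithm selection of the number of basis functions and the memristive circuit realization are omitted here.

```python
import torch
import torch.nn as nn

class RBFLayer(nn.Module):
    """Gaussian RBF layer with trainable centers and widths,
    so backpropagation through the following MLP can adjust them."""
    def __init__(self, in_features, num_centers):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_centers, in_features))
        # widths stored on a log scale so they stay positive
        self.log_gamma = nn.Parameter(torch.zeros(num_centers))

    def forward(self, x):
        # x: (batch, in_features) -> activations: (batch, num_centers)
        dist_sq = torch.cdist(x, self.centers).pow(2)
        return torch.exp(-self.log_gamma.exp() * dist_sq)

class RBFMLP(nn.Module):
    """RBF layer followed by an MLP head; training end-to-end lets the
    backpropagated error update the RBF centers and widths adaptively."""
    def __init__(self, in_features=784, num_centers=64, hidden=128, num_classes=10):
        super().__init__()
        self.rbf = RBFLayer(in_features, num_centers)
        self.mlp = nn.Sequential(
            nn.Linear(num_centers, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x):
        return self.mlp(self.rbf(x))

# Usage: one training step on a dummy MNIST-sized batch (28*28 = 784 inputs).
model = RBFMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
opt.step()
```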