Abstract

This paper introduces the fully-connected higher-order neuron and the sparsified higher-order neuron, investigates the mapping capability of fully-connected higher-order neural networks, and proves that any Boolean function defined on {0,1}^N can be realized by a fully-connected higher-order neural network. On this basis, and in order to simplify the network architecture, a pruning algorithm that eliminates redundant connection weights is proposed; it can be applied to the implementation of sparsified higher-order neural classifiers and other networks. Simulation results show the effectiveness of the algorithm.
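The abstract does not give the construction or the pruning criterion, but the idea can be illustrated with a minimal sketch: a fully-connected higher-order neuron over {0,1}^N carries one weight per product (monomial) of input components, so any Boolean function can be encoded in those weights, and pruning amounts to zeroing weights judged redundant. The threshold activation, the magnitude-based pruning rule, and all names below are illustrative assumptions, not the paper's definitions.

```python
from itertools import combinations
import numpy as np

def monomials(x):
    """All products of input components, one per subset of indices.

    For a binary input x in {0,1}^N this enumerates the 2^N monomials
    prod_{i in S} x_i; the empty subset gives the constant term 1.
    """
    n = len(x)
    feats = []
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            feats.append(np.prod([x[i] for i in subset]) if subset else 1.0)
    return np.array(feats)

def higher_order_neuron(x, weights, threshold=0.0):
    """Fully-connected higher-order neuron: hard-threshold the weighted
    sum of all monomials of the binary input vector."""
    return 1 if weights @ monomials(x) > threshold else 0

def prune(weights, tol=1e-6):
    """Illustrative pruning step: zero out connection weights whose
    magnitude falls below a tolerance (a stand-in for the paper's
    redundancy criterion, which the abstract does not specify)."""
    pruned = weights.copy()
    pruned[np.abs(pruned) < tol] = 0.0
    return pruned

# Example: realize XOR on {0,1}^2 with weights over the monomials
# [1, x0, x1, x0*x1], i.e. y = step(x0 + x1 - 2*x0*x1).
w = np.array([0.0, 1.0, 1.0, -2.0])
for a in (0, 1):
    for b in (0, 1):
        print((a, b), higher_order_neuron(np.array([a, b]), w))
```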
