Abstract

We investigate the storage capacity and generalization ability of two types of fully connected layered neural networks with non-monotonic transfer functions; random patterns are embedded in the networks by a Hebbian learning rule. One is a layered network in which the non-monotonic transfer function of the even layers differs from that of the odd layers. The other is a layered network with intra-layer connections, in which the non-monotonic transfer function of the inter-layer connections differs from that of the intra-layer connections, and the inter-layer and intra-layer neurons are updated alternately. We derive recursion relations for the order parameters of these layered networks by the signal-to-noise ratio method. We show that the storage capacity and the generalization ability of these layered networks are enhanced in comparison with those of networks with a conventional monotonic transfer function when the non-monotonicity of the transfer functions is chosen optimally. We also point out that chaotic behavior appears in the order parameters of the layered networks as the non-monotonicity of the transfer functions increases.
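For concreteness, here is a minimal sketch of the ingredients named above, assuming the piecewise-constant non-monotonic transfer function commonly used in this literature; the abstract does not fix the functional form, so the shape of $f_{\theta}$ and the non-monotonicity parameter $\theta$ are assumptions, while the Hebbian couplings $J_{ij}$ follow the standard rule for embedding $p$ random patterns $\xi^{\mu}$ over $N$ neurons:

% Assumed forms, not specified in the abstract: a commonly used
% non-monotonic transfer function with non-monotonicity parameter \theta,
% and Hebbian couplings embedding p random patterns \xi^{\mu} in N neurons.
f_{\theta}(u) =
\begin{cases}
\operatorname{sgn}(u),  & |u| \le \theta, \\
-\operatorname{sgn}(u), & |u| > \theta,
\end{cases}
\qquad
J_{ij} = \frac{1}{N} \sum_{\mu=1}^{p} \xi_i^{\mu}\, \xi_j^{\mu}.

In the limit $\theta \to \infty$, $f_{\theta}$ reduces to the conventional monotonic $\operatorname{sgn}(u)$, so $\theta$ interpolates between the monotonic regime and the strongly non-monotonic regime whose storage and generalization properties the paper compares.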
