Abstract

In most neural network applications, the network outputs are required to be binary, i.e. to correspond to a vertex of the output hypercube. It is shown how, in the presence of positive self-feedback, binary outputs can be guaranteed even with a finite sigmoid slope or with asymmetric connection matrices. An expression is derived that gives a lower bound on the sigmoid slope ensuring that equilibrium points corresponding to nonbinary solutions are unstable.
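
The setting can be illustrated with a minimal simulation sketch, not taken from the paper: a small continuous-time network with a finite-slope sigmoid, an asymmetric connection matrix, and positive self-feedback on the diagonal, integrated until the outputs settle. The matrix T, the slope value lam, and the dynamics du/dt = -u + T g(u) are illustrative assumptions; the paper's derived lower bound on the slope is not reproduced here.

import numpy as np

def sigmoid(u, lam):
    """Finite-slope sigmoid mapping activations to outputs in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-lam * u))

def simulate(T, lam, u0, steps=5000, dt=0.01):
    """Forward-Euler integration of du/dt = -u + T @ sigmoid(u)."""
    u = np.array(u0, dtype=float)
    for _ in range(steps):
        v = sigmoid(u, lam)
        u += dt * (-u + T @ v)
    return sigmoid(u, lam)

# Asymmetric connection matrix with positive self-feedback on the diagonal
# (values chosen only for illustration).
T = np.array([[ 2.0, -0.5,  0.3],
              [ 0.8,  2.0, -0.4],
              [-0.6,  0.2,  2.0]])

lam = 4.0                       # assumed sigmoid slope
rng = np.random.default_rng(0)
v_final = simulate(T, lam, rng.normal(scale=0.1, size=3))
print(np.round(v_final, 3))     # outputs typically settle near 0 or 1, i.e. near a vertex

With these assumed values, the interior (nonbinary) equilibrium is destabilized by the positive self-feedback, and the trajectory saturates toward a vertex of the output hypercube despite the finite slope; with a much smaller slope the outputs can instead remain at intermediate values.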
