Abstract

The back-propagation model for neural network simulation is a simple and popular model for solving real-world problems. It suffers, however, from one major drawback: the time taken for the network to adjust its weights in order to respond to an input. This paper proposes a computer architecture, specifically optimised for neural network simulation, that is capable of unlimited expansion without significant degradation in unit performance. It uses a high-speed conventional processor loosely coupled to a number of very simple processor-and-memory nodes, each with very limited functionality.
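To make the cost the abstract refers to concrete, the following is a minimal sketch (not the paper's proposed architecture) of back-propagation-style training for a single sigmoid neuron in plain Python. The function names and the AND-gate training set are illustrative choices, not taken from the paper; the point is that every training step must touch every weight, which is the per-pattern cost a parallel node design would aim to hide.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(samples, epochs=2000, lr=0.5):
    """Gradient-descent training of one sigmoid neuron (illustrative only)."""
    random.seed(0)
    w = [random.uniform(-1, 1) for _ in range(2)]  # two input weights
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = (y - target) * y * (1.0 - y)      # delta rule for a sigmoid unit
            # Weight adjustment: O(number of weights) work per input pattern.
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Hypothetical example: learn the logical AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
print(sigmoid(w[0] + w[1] + b) > 0.5)
```

Even in this two-weight toy, the inner loop serialises over all weights for every pattern; scaling to realistic networks multiplies that cost, which motivates offloading the updates to many simple processor-and-memory nodes.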

