Abstract

In this paper, a sequence-based neural network approach called feedforward sequential learning (FSL) is proposed to extend the range of feasibility for feedforward networks in three areas: architecture, training, and generalization. The extension is enabled by a spatio-temporal indexing scheme that decomposes the task into a sequence of simpler subproblems, each solved by a separate weight state. The trained weight states are then combined into a continuous final weight-state sequence to enable smooth generalization. FSL can be used to train mappings of analog or discrete I/O with underlying continuity for pattern association or classification. The implementation of FSL is illustrated and tested on the 2-spirals problem and an extended 4-spiral version. Training is found to be faster and more robust than with its single-state counterpart, and the generalization obtained indicates that the underlying patterns are classified more smoothly with FSL. Overall, the results suggest that FSL is a feasible approach for complex, decomposable tasks.
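The decompose-train-interpolate scheme the abstract describes can be sketched as follows. This is an illustrative interpretation only, not the paper's actual FSL implementation: the segmentation by arc-length parameter, the per-segment logistic classifier standing in for a "weight state", and all function names are assumptions introduced here.

```python
import numpy as np

def two_spirals(n_per_arm=200, turns=1.5):
    """Generate a 2-spirals dataset: two interleaved arms, labels 0 and 1."""
    t = np.linspace(0.25, turns * 2 * np.pi, n_per_arm)
    x0 = np.c_[t * np.cos(t), t * np.sin(t)]     # arm 0
    x1 = -x0                                     # arm 1 (rotated 180 degrees)
    X = np.vstack([x0, x1])
    y = np.r_[np.zeros(n_per_arm), np.ones(n_per_arm)]
    return X / np.abs(X).max(), y, np.r_[t, t]   # t = position along each arm

def train_state(X, y, steps=1000, lr=1.0):
    """Fit one logistic 'weight state' on a single subproblem (illustrative)."""
    Xb = np.c_[X, np.ones(len(X))]               # append bias column
    w = np.zeros(3)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)        # gradient descent on log-loss
    return w

# Spatio-temporal indexing (assumed form): split the task along the arms'
# arc-length parameter t into K segments, train one weight state per segment.
X, y, t = two_spirals()
K = 8
edges = np.linspace(t.min(), t.max(), K + 1)
states = np.array([
    train_state(X[(t >= edges[k]) & (t <= edges[k + 1])],
                y[(t >= edges[k]) & (t <= edges[k + 1])])
    for k in range(K)
])                                               # shape (K, 3)

def weights_at(s):
    """Continuous weight-state sequence: interpolate for fractional index s."""
    i = int(np.clip(np.floor(s), 0, K - 2))
    a = s - i
    return (1 - a) * states[i] + a * states[i + 1]
```

Each segment pairs a short arc of one arm with the opposing arc of the other, so the locally simple subproblem is easy to solve; linearly interpolating between adjacent states then gives the continuous final weight-state sequence the abstract refers to.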
