Abstract
As an important class of classifiers, feedforward neural networks (FNNs) have been used extensively in pattern recognition. Since the inputs to FNNs are usually vectors, while many data are naturally presented as matrices, such matrices must be vectorized before an FNN can be applied. A drawback of this approach is that important information about the correlations among elements of the original matrices is lost. Unlike traditional vector-input FNNs, the algorithm proposed in this paper, two-dimensional back-propagation (2D-BP), extends FNNs to matrix inputs and classifies matrix data directly, using incremental gradient descent to fully train the extended networks. Such FNNs preserve the matrix structure of the 2D input features, which benefits image recognition. Promising experimental results on handwritten-digit and face-image classification demonstrate the effectiveness of the proposed method.
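To make the contrast in the abstract concrete, the following is a minimal sketch of the difference between a vectorized input and a matrix-preserving layer. The bilinear form `U @ X @ V` used here is an illustrative assumption (a common way to act on rows and columns separately), not necessarily the paper's exact 2D-BP formulation; all names and shapes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def flatten_forward(X, w, b):
    # Conventional FNN input: the matrix is flattened into a vector,
    # discarding the row/column arrangement of its elements.
    return np.tanh(w @ X.reshape(-1) + b)

def matrix_forward(X, U, V, b):
    # Matrix-input layer (assumed bilinear form): left weights U act on
    # rows and right weights V on columns, keeping the 2D structure.
    return np.tanh(U @ X @ V + b)

X = rng.standard_normal((28, 28))      # e.g. a handwritten-digit image

# Flattened: each hidden unit needs a 28*28 = 784-element weight vector.
w = rng.standard_normal(784)
h_vec = flatten_forward(X, w, 0.0)     # scalar activation

# Matrix form: U (5 x 28) and V (28 x 4) use only 5*28 + 28*4 weights.
U = rng.standard_normal((5, 28))
V = rng.standard_normal((28, 4))
H_mat = matrix_forward(X, U, V, 0.0)   # a 5 x 4 matrix of activations
```

Note the parameter saving: the two small projection matrices replace one long weight vector per unit, and both factors can be trained by gradient descent on the same loss.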
More From: Engineering Applications of Artificial Intelligence