Abstract

A generalized formalism for feedforward neural networks is presented. This generalized architecture is shown to be capable of mapping many common neural network paradigms into a single architecture. Using an intrinsically iterable element, neural networks can be used to compute common preprocessing techniques, including Karhunen-Loeve reduction, Fourier and Gabor spectral decomposition, and some wavelet techniques. The generalized architecture is applied to a problem in tactical target image segmentation.

1. A Generalized Neural Network Algorithm

A generalized formulation of a feedforward network node can be represented by the equation:

$z_k = f_h(X^T A X + W^T X + \theta_k)$    (1)

where $A$, $W$, and $\theta_k$ are propagation constants or weights, and $z_k$ is the output of a particular node. This formulation, by not specifically addressing training, allows for simple implementation of many common neural network paradigms, as well as many common preprocessing techniques, including Karhunen-Loeve reduction and Fourier and Gabor spectral decomposition, in a connectionist network architecture. Cybenko showed that one hidden layer is sufficient for any multivariate function approximation.[3] Oxley et al. proved that the output of the hidden layers can be not only a sigmoid non-linearity but also a negative exponential.[9] While Cybenko and others
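As a minimal sketch of how a single node of Eq. (1) might be evaluated, the snippet below computes the second-order term $X^T A X$, the first-order term $W^T X$, and the offset $\theta_k$, then applies an activation $f_h$. The function name `generalized_node`, the sigmoid choice for $f_h$, and the example values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sigmoid(u):
    # One possible f_h; the paper notes a negative exponential is also admissible.
    return 1.0 / (1.0 + np.exp(-u))

def generalized_node(x, A, w, theta, f_h=sigmoid):
    """Evaluate one node of the generalized formulation (Eq. 1):
        z_k = f_h(x^T A x + w^T x + theta_k)
    x     : (n,) input vector
    A     : (n, n) second-order propagation weights
    w     : (n,) first-order weights
    theta : scalar offset for this node
    """
    quadratic = x @ A @ x   # x^T A x, second-order term
    linear = w @ x          # w^T x, standard first-order term
    return f_h(quadratic + linear + theta)

# Example: with A = 0 the node reduces to an ordinary first-order (perceptron-style) unit.
x = np.array([0.5, -1.0, 2.0])
A = np.zeros((3, 3))
w = np.array([0.1, 0.4, -0.2])
print(generalized_node(x, A, w, theta=0.3))
```

Setting $A$ to zero recovers a conventional weighted-sum node, which is one way the single formulation can subsume several common network paradigms.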


