Abstract

We discuss the multi-dimensional exclusive-OR (EOR) problem and its extension, the general classification problem. These problems can be solved systematically by multi-layer neural networks trained with back-propagation (BP) learning. However, the approach is practical only for small dimensions, up to the 6th, where the number of training patterns is 2^6 = 64. For the 7th dimension and above, no effective approach has been available: it is very hard to find a convergence path toward the global minimum in the general classification case. To break this limitation, we propose a partial stepwise learning scheme for BP, derived from a kind of symmetric character found in the teaching data set. This symmetry is not clear-cut but indistinct: it holds for most elements of the set, though not for a small remainder. We use this ambiguous symmetry to obtain an initial guess for the connections among neurons. The resulting stepwise learning scheme solved EOR problems below the 11th dimension and generalized classification problems below the 10th.
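To make the baseline concrete, the sketch below trains a plain one-hidden-layer network with back-propagation on the n-bit parity (multi-dimensional EOR) problem. This is not the authors' partial stepwise method, only the standard BP setup the abstract describes as struggling in higher dimensions; the layer sizes, learning rate, and seed are illustrative assumptions.

```python
import numpy as np

def parity_dataset(n):
    """All 2^n binary patterns; label = XOR (parity) of the n bits."""
    X = np.array([[(i >> b) & 1 for b in range(n)] for i in range(2 ** n)],
                 dtype=float)
    y = X.sum(axis=1) % 2          # exclusive-OR generalizes to parity
    return X, y.reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(n=3, hidden=8, epochs=2000, lr=0.5, seed=0):
    """Plain full-batch back-propagation on n-bit parity (illustrative only)."""
    rng = np.random.default_rng(seed)
    X, y = parity_dataset(n)
    W1 = rng.normal(0.0, 1.0, (n, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    losses = []
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)               # forward pass: hidden layer
        out = sigmoid(h @ W2 + b2)             # forward pass: output layer
        losses.append(float(np.mean((out - y) ** 2)))
        d_out = (out - y) * out * (1 - out)    # backward pass: MSE + sigmoid
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)
    return losses

losses = train_bp()
print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The number of patterns doubles with each added input bit (2^n), which is one reason plain BP becomes impractical as the dimension grows; the abstract's stepwise scheme addresses this by using the data set's approximate symmetry to seed the initial weights.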
