Abstract

In this paper, we suggest basing the development of classification methods on traditional techniques but approximating them, in whole or in part, with artificial neural networks (ANNs). Compared to a direct ANN approach, the underlying traditional method is often easier to analyse. Classification failures can be better understood and corrected for, while at the same time faster execution can be obtained due to the parallel ANN structure. Furthermore, such a two‐step design philosophy partly eliminates the ‘guesswork’ associated with the design of ANNs. The expected gain is thus that the benefits of both the traditional field and the ANN field can be obtained by combining the best features of each. We illustrate our approach by working through an explicit example, namely a Nearest Neighbour classifier applied to a subset of the MNIST database of handwritten digits. Two different approaches are discussed for translating the traditional method into an ANN. The first approach is based on a constructive implementation which directly reflects the original algorithm. The second approach uses ANNs to approximate the whole, or part of, the original method. An important part of the approach is to show how improvements can be introduced. In line with the presented philosophy, this is done by extending the traditional method in several ways, followed by ANN approximation of the modified algorithms. The extensions are based on a windowed version of the nearest neighbour algorithm. We show that the improvements carry over to the ANN implementations. We further investigate the stability of the solutions by modifying the training set. It is shown that the errors do not change significantly. This also holds true for the ANN approximations, providing confidence that the two‐step strategy is robust.
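The traditional method referred to above is the standard 1-nearest-neighbour rule. As a minimal illustrative sketch (not the paper's implementation, and using toy vectors rather than MNIST images), it can be written as:

```python
# Minimal 1-nearest-neighbour classifier sketch. Illustrative only:
# the paper applies this rule to a subset of MNIST; here we use
# toy 2-D feature vectors in place of digit images.

def squared_distance(a, b):
    # Squared Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_neighbour(train, query):
    """Return the label of the training example closest to `query`.

    `train` is a list of (feature_vector, label) pairs.
    """
    best_label, best_dist = None, float("inf")
    for features, label in train:
        d = squared_distance(features, query)
        if d < best_dist:
            best_dist, best_label = d, label
    return best_label

# Two toy class prototypes standing in for digit templates.
train_set = [((0.0, 0.0), "0"), ((1.0, 1.0), "1")]
print(nearest_neighbour(train_set, (0.9, 0.8)))  # → 1
```

It is this decision rule that the paper's two-step strategy then either implements constructively as an ANN or approximates with a trained ANN.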
