Abstract

The relationships among multi‐dimensional data (such as medical examination data) that exhibit ambiguity and variation are difficult to explore. The traditional approach to building a data classification system requires the formulation of rules by which the input data can be analyzed, and formulating such rules is very difficult for large sets of input data. This paper first describes two classification approaches using a back‐propagation (BP) neural network and a Mahalanobis distance (MD) classifier, and then proposes two approaches for multi‐dimensional feature selection. The first is a feature selection procedure based on the trained BP neural network. The basic idea of this procedure is to compare the products of the weights between the input and hidden layers and between the hidden and output layers; to simplify the structure, only the weight products with large absolute values are used. The second approach is the Mahalanobis‐Taguchi system (MTS), originally suggested by Dr. Taguchi. The MTS applies Taguchi's fractional factorial design with the Mahalanobis distance as a performance metric. We combine automatic thresholding with the MD classifier so that it can handle a reduced model, which is the focus of this paper. Two case studies are used as examples to compare and discuss the complete and reduced models employing the BP neural network and the MD classifier. The implementation results show that the proposed approaches are effective and powerful for classification.
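
The following is a minimal sketch (not the authors' code) of the two ideas summarized above: ranking input features by the products of a trained BP network's input-hidden and hidden-output weights, keeping only the products with large absolute values, and scoring samples by their Mahalanobis distance to a reference group with a thresholding step. All function names, the keep_fraction parameter, and the fixed threshold value are illustrative assumptions; the paper derives its threshold automatically.

```python
import numpy as np

def bp_weight_feature_scores(W_ih, W_ho, keep_fraction=0.2):
    """Score each input feature by summing |w_ij * v_jk| over hidden and output
    units, using only the largest-magnitude products (keep_fraction is an
    assumed simplification parameter)."""
    # products[i, j, k] = weight(input i -> hidden j) * weight(hidden j -> output k)
    products = np.einsum('ij,jk->ijk', W_ih, W_ho)
    cutoff = np.quantile(np.abs(products), 1.0 - keep_fraction)
    pruned = np.where(np.abs(products) >= cutoff, np.abs(products), 0.0)
    return pruned.sum(axis=(1, 2))        # one importance score per input feature

def mahalanobis_distances(X, reference):
    """Scaled Mahalanobis distance of each row of X to the reference group."""
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
    diff = X - mu
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return d2 / X.shape[1]                # divide by the dimension, as is common in MTS

# Example usage with random stand-in data (no real medical data implied).
rng = np.random.default_rng(0)
W_ih, W_ho = rng.normal(size=(10, 6)), rng.normal(size=(6, 2))
print(bp_weight_feature_scores(W_ih, W_ho))

healthy = rng.normal(size=(50, 10))
samples = rng.normal(loc=0.5, size=(5, 10))
d = mahalanobis_distances(samples, healthy)
threshold = 3.0                           # assumed fixed cutoff for illustration only
print(d > threshold)                      # True = flagged as abnormal
```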
