Abstract
In pattern recognition problems, convergence of the backpropagation training algorithm for a multilayer perceptron is slow when the classes concerned have a complex decision boundary. To improve performance, we propose a technique that first cleverly picks samples near the decision boundary without actually knowing the boundary's position. To choose the training samples, a larger set of data with known class labels is considered. For each datum, its k neighbours are found. If the datum is near the decision boundary, then not all of these k neighbours will come from the same class. A training set generated using this idea results in quicker and better convergence of the training algorithm. To obtain more symmetrically placed neighbours, the nearest centroid neighbourhood (Chaudhuri, Pattern Recognition Lett. 17 (1996) 11–17) is used. The performance of the technique has been tested on synthetic data as well as speech vowel data in two Indian languages.
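The selection criterion in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses plain Euclidean k-nearest neighbours as a stand-in for the nearest centroid neighbourhood, and the function name and toy data are our own.

```python
import numpy as np

def boundary_samples(X, y, k=5):
    """Return indices of samples whose k nearest neighbours are
    not all of one class, i.e. samples near the decision boundary.

    Plain Euclidean k-NN is used here in place of the paper's
    nearest centroid neighbourhood. X is (n, d), y is (n,) labels.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    np.fill_diagonal(d2, np.inf)          # exclude each point itself
    idx = np.argsort(d2, axis=1)[:, :k]   # k nearest neighbours per point
    mixed = np.array([len(set(y[nb])) > 1 for nb in idx])
    return np.where(mixed)[0]             # indices of mixed-neighbourhood points

# Toy 1-D data: two clusters, with two points sitting near the boundary.
X = np.array([[0.0], [0.1], [0.2], [0.45], [0.55], [0.9], [1.0], [1.1]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(boundary_samples(X, y, k=2))  # → [3 4]
```

Only the two points straddling the class boundary are selected; interior points, whose neighbourhoods are pure, are discarded, shrinking the training set to the region where the perceptron's decision surface must actually be shaped.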