Abstract

Until recently, the conventional way to deploy a neural network was to fix its architecture and then train it. The latest research in this field, however, has revealed that networks set up and configured in this way exhibit considerable redundancy, so an additional step is to eliminate that redundancy by pruning connections in the network's architecture. Among the many approaches to eliminating redundancy, the most promising is the combined application of several methods whose cumulative effect exceeds the sum of the effects of applying each of them separately. We have performed an experimental study of the effectiveness of combining iterative pruning with pre-processing (pre-distortion) of the input data for the task of recognizing handwritten digits with a multilayer perceptron. It is shown that input-data pre-processing regularizes the training procedure, thereby preventing overfitting. The combined application of iterative pruning and input-data pre-distortion yielded a smaller handwritten-digit recognition error, 1.22 %, than pruning alone (the error decreased from 1.89 % to 1.81 %) or pre-distortion alone (the error decreased from 1.89 % to 1.52 %). In addition, regularization by pre-distortion makes it possible to obtain a monotonically increasing number of pruned connections while keeping the error at 1.45 %. The learning curves obtained for the same task but starting from different initial conditions take different values both during training and at its end, which demonstrates the multi-extremal character of the quality function, the recognition accuracy.
The practical implication of the study is our proposal to train a neural network multiple times and select the best result.
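The two ingredients of the method, iterative magnitude pruning and pre-distortion of the input images, can be sketched as follows. This is a minimal illustration in NumPy, not the authors' implementation: the distortion parameters (small pixel shifts plus additive noise) and the pruning fraction are assumed for the example.

```python
import numpy as np

def pre_distort(images, noise_std=0.1, max_shift=2, rng=None):
    """Apply random pre-distortions to a batch of flattened 28x28 images:
    a small random pixel shift followed by additive Gaussian noise.
    The distortion types and magnitudes here are illustrative assumptions."""
    rng = rng or np.random.default_rng(0)
    batch = images.reshape(-1, 28, 28).copy()
    for img in batch:
        dx, dy = rng.integers(-max_shift, max_shift + 1, size=2)
        img[:] = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    batch += rng.normal(0.0, noise_std, size=batch.shape)
    return batch.reshape(images.shape)

def prune_smallest(weights, fraction):
    """One step of iterative pruning: zero out the given fraction of the
    still-active weights with the smallest magnitude. In the iterative
    scheme this step alternates with further training of the network."""
    active = np.abs(weights[weights != 0])
    if active.size == 0:
        return weights
    threshold = np.quantile(active, fraction)
    return weights * (np.abs(weights) >= threshold)
```

In use, each training epoch would draw freshly distorted copies of the inputs (the regularizing effect described in the abstract), while `prune_smallest` would be applied between post-training rounds so that the number of disconnected weights grows monotonically.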

Highlights

  • Deep neural networks are a powerful tool for addressing a wide range of tasks in the fields of image processing, control of unmanned objects, disease diagnostics, recognition, voice signal generation, etc.

  • In order to reduce the redundancy of neural networks while improving their operational quality, it seems appropriate to study approaches that differ from each other in their principle of operation and that admit combined application. One such direction is distortion of the input signal used jointly with conventional pruning [3] when training a multilayer perceptron

  • The aim of this study is to examine the effectiveness of the combined application of data pre-distortion and an iterative algorithm for reducing connections and post-training a multilayer perceptron, using handwritten digit recognition on the MNIST set as an example



Introduction

Deep neural networks are a powerful tool for addressing a wide range of tasks in the fields of image processing, control of unmanned objects, disease diagnostics, recognition, voice signal generation, etc. [1]. The efficiency of the practical use of neural networks improves as their dimensionality increases. A practical constraint, however, is the limited computational power of the processors used, especially when a mobile phone or microcontroller serves as the computational platform. This circumstance has initiated a large body of research that identified the redundancy of fully connected neural networks [2] and defined basic approaches to reducing it. Further research aimed at reducing the redundancy of neural networks while maintaining, or even improving, their operational quality remains a relevant task.

Literature review and problem statement
The aim and objectives of the study
Parameters for the applied neural network and input data
Findings
Conclusions