Abstract

Handwritten numeral recognition using convolutional neural networks (CNNs) is an active research topic. However, CNNs demand substantial hardware resources and can be slow at prediction time, while traditional machine learning classifiers usually rely on features hand-crafted from manual experience. This paper proposes a new handwritten numeral recognition algorithm based on a GPU multi-stream concurrent and parallel model. The algorithm builds a programming environment on the CUDA architecture and implements the convolutional neural network in CUDA/C++. The CNN is applied to the handwritten numeral recognition problem with an appropriate network model and parameter settings, and the GPU's high concurrency accelerates training of the network. A comparison of the GPU and CPU implementations verifies that CUDA-parallelized training and recognition of the convolutional neural network is both feasible and effective. Experiments show that the GPU-implemented CNN converges quickly, achieves a high recognition rate, and recognizes digits rapidly. The diverse features extracted by the CNN improve the efficiency of handwritten numeral recognition and remove the dependence on manually engineered features and prior knowledge found in traditional approaches.
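
The multi-stream concurrency mentioned above can be illustrated with a short CUDA/C++ sketch. This is an assumption-laden illustration rather than the paper's actual implementation: the kernel name conv_forward, the stream count, the chunk size, and the kernel body (which merely copies data and stands in for a real convolution layer) are all placeholders. The sketch only shows the general pattern by which transfers and kernel launches issued on separate CUDA streams can overlap on the GPU.

// Illustrative sketch only: names, sizes, and the kernel body are assumed,
// not taken from the paper's implementation.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void conv_forward(const float *in, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i];  // placeholder for a real convolution layer
}

int main() {
    const int kStreams = 4;        // number of concurrent streams (assumed)
    const int kChunk   = 1 << 20;  // elements processed per stream (assumed)

    float *h_in, *h_out, *d_in, *d_out;
    // Pinned host memory is required for truly asynchronous copies.
    cudaMallocHost((void **)&h_in,  kStreams * kChunk * sizeof(float));
    cudaMallocHost((void **)&h_out, kStreams * kChunk * sizeof(float));
    cudaMalloc((void **)&d_in,  kStreams * kChunk * sizeof(float));
    cudaMalloc((void **)&d_out, kStreams * kChunk * sizeof(float));

    cudaStream_t streams[kStreams];
    for (int s = 0; s < kStreams; ++s) cudaStreamCreate(&streams[s]);

    // Each stream copies its slice in, runs the kernel, and copies the result
    // back; copies and kernels from different streams can overlap on the GPU.
    for (int s = 0; s < kStreams; ++s) {
        size_t off   = (size_t)s * kChunk;
        size_t bytes = kChunk * sizeof(float);
        cudaMemcpyAsync(d_in + off, h_in + off, bytes,
                        cudaMemcpyHostToDevice, streams[s]);
        conv_forward<<<(kChunk + 255) / 256, 256, 0, streams[s]>>>(
            d_in + off, d_out + off, kChunk);
        cudaMemcpyAsync(h_out + off, d_out + off, bytes,
                        cudaMemcpyDeviceToHost, streams[s]);
    }
    cudaDeviceSynchronize();

    for (int s = 0; s < kStreams; ++s) cudaStreamDestroy(streams[s]);
    cudaFree(d_in); cudaFree(d_out);
    cudaFreeHost(h_in); cudaFreeHost(h_out);
    printf("done\n");
    return 0;
}

In practice, the same pattern would partition the training batches (or the per-layer work) across streams so that data transfer for one batch overlaps with computation on another, which is the source of the speed-up the abstract attributes to GPU concurrency.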
