The Backpropagation Neural Network (BPNN) is a feedforward artificial neural network, inspired by biological neural networks, that is trained with the backpropagation algorithm. Popularized in the 1980s, the BPNN quickly became a focal point of neural network research because of its strong learning capability and adaptability. The network consists of input, hidden, and output layers; each neuron's mathematical model maps a weighted sum of its inputs, plus a bias, through an activation function to produce an output. Training adjusts the weights and biases with optimization algorithms such as gradient descent, propagating the output error backward through the layers. BPNNs are widely applied in image recognition, speech processing, natural language processing, and financial forecasting. To improve training, researchers continue to experiment with metaheuristic optimizers, including the Grey Wolf Optimizer, Genetic Algorithms, Particle Swarm Optimization, and Simulated Annealing, as well as hybrid strategies and improved gradient descent variants. As deep learning continues to develop, the BPNN and its descendants are poised to remain central to tasks such as image recognition and speech processing.
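The forward pass, backward error propagation, and gradient-descent weight updates described above can be sketched in a few lines of NumPy. This is a minimal illustrative example, not a production implementation: the single hidden layer, sigmoid activations, mean-squared-error loss, learning rate, and XOR dataset are all assumptions chosen to keep the sketch small.

```python
import numpy as np

def sigmoid(x):
    """Logistic activation function used in classic BPNNs."""
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Toy dataset: the XOR function (chosen for illustration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

lr = 0.5  # learning rate (an assumed hyperparameter)
for _ in range(10000):
    # Forward pass: propagate inputs layer by layer.
    h = sigmoid(X @ W1 + b1)      # hidden-layer activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: propagate the output error back through
    # the layers (derivative of MSE through sigmoid units).
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)

    # Gradient descent: adjust weights and biases.
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # predictions approach [0, 1, 1, 0]
```

The same three-step loop (forward pass, error backpropagation, weight update) underlies BPNN training at any scale; the metaheuristics mentioned above are typically used to tune the initial weights or hyperparameters rather than replace this loop.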