This work addresses the problem of low accuracy in recognizing badminton shuttlecock trajectories. Focusing on the visual system of badminton robots, it performs side-view detection and tracking of the flying shuttlecock in two-dimensional image-plane video streams. The cropped video frames are then fed frame by frame into a convolutional neural network, where an added attention mechanism helps identify the shuttlecock's movement trajectory. Finally, to address the challenge of detecting the flying shuttlecock as a small target in video streams, the one-stage deep learning detection network Tiny YOLOv2 is improved in terms of both its loss function and its network structure, and is combined with the Unscented Kalman Filter to predict the shuttlecock's trajectory. Simulation results show that the improved algorithm outperforms existing algorithms in tracking and predicting shuttlecock trajectories. The proposed method achieves an average tracking accuracy of 91.40% and a recall of 84.60%. Across four simple and complex badminton flight video scenarios, the measured trajectories attain an average precision of 96.7%, a recall of 95.7%, and a frame rate of 29.2 frames/second, all superior to other classic algorithms. These results indicate that the proposed method provides strong support for badminton trajectory recognition and helps improve the accuracy of badminton movement recognition.
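As a rough illustration of how a per-frame detector can be coupled with an Unscented Kalman Filter for trajectory prediction, the sketch below fuses detections into a smoothed image-plane track. It is not the authors' implementation: the constant-velocity motion model, the noise settings, the use of the filterpy library, and the `detect_shuttlecock` stub (standing in for the improved Tiny YOLOv2 detector) are all assumptions made for illustration.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 1.0 / 30.0  # assumed video frame interval (~30 fps)

def fx(x, dt):
    """Constant-velocity motion model in the image plane.
    State: [u, v, du, dv] (pixel position and velocity).
    A more realistic model would add gravity/drag terms."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    return F @ x

def hx(x):
    """Measurement model: the detector observes only the pixel position."""
    return x[:2]

points = MerweScaledSigmaPoints(n=4, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=4, dim_z=2, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.zeros(4)                 # initial state (position/velocity unknown)
ukf.P *= 500.0                      # large initial uncertainty
ukf.R = np.diag([5.0**2, 5.0**2])   # assumed detector noise (pixels)
ukf.Q = np.eye(4) * 0.1             # assumed process noise

def track(frames, detect_shuttlecock):
    """Fuse per-frame detections (or None on misses) into a smoothed track.

    `detect_shuttlecock(frame)` is a hypothetical stand-in for the improved
    Tiny YOLOv2 detector; it should return a (u, v) pixel center or None.
    """
    trajectory = []
    for frame in frames:
        ukf.predict()                      # propagate the motion model
        z = detect_shuttlecock(frame)
        if z is not None:                  # correct only when the detector fires
            ukf.update(np.asarray(z, dtype=float))
        trajectory.append(ukf.x[:2].copy())  # predicted/updated position
    return trajectory
```

Because the filter keeps predicting through frames where the small, fast-moving shuttlecock is missed by the detector, the resulting track stays continuous, which is the motivation for pairing the detector with the UKF.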