Laser active detection technology utilizing the cat-eye effect provides rapid response, precise positioning, and long detection range. However, current research mainly focuses on active detection within a single visible or near-infrared band and lacks quantitative analysis of the echo spot. In this paper, a four-interval theoretical model of dual-band cat-eye target echo detection is constructed using matrix optics theory and the Collins diffraction integral method. Dual-band echo detection experiments were conducted with a 10.6 μm far-infrared laser and a 532 nm visible-light laser, and the power, radius, and miss distance of the echo spots were measured and quantitatively compared with the theoretical results. The results indicate that, owing to the diffraction limit's effect on the distribution of the echo field, the echo power in far-infrared detection is lower than in visible-light detection. Positive and negative defocus of equal magnitude affect the echo spot asymmetrically, with positive defocus having a weaker effect than negative defocus. A small positive defocus exists that minimizes the echo-spot radius and maximizes the echo power, and its value differs between detection bands. The incident angle of the detection laser is linearly related to the deviation of the echo spot. These findings provide a foundation for extracting the working band of cat-eye targets, predicting the motion trajectory of moving cat-eye targets, and achieving real-time tracking and recognition during laser active detection.
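For reference, echo-field models of this kind are commonly built on the standard Collins diffraction integral, in which propagation through each interval of the optical path is summarized by the elements A, B, and D of that interval's ABCD ray-transfer matrix; the specific four interval matrices of the model described here are not reproduced, and the exact sign convention depends on the chosen time dependence:

\[
E_2(x_2,y_2)=\frac{i}{\lambda B}\,e^{-ikL}\iint E_1(x_1,y_1)\,
\exp\!\left\{-\frac{ik}{2B}\Big[A\big(x_1^{2}+y_1^{2}\big)-2\big(x_1x_2+y_1y_2\big)+D\big(x_2^{2}+y_2^{2}\big)\Big]\right\}\mathrm{d}x_1\,\mathrm{d}y_1 ,
\]

where \(E_1\) and \(E_2\) are the fields on the input and output planes, \(\lambda\) is the wavelength, \(k=2\pi/\lambda\), and \(L\) is the on-axis optical path length of the interval; multiplying the transfer matrices of successive intervals gives the overall matrix of the round-trip cat-eye path, from which the echo-spot power, radius, and deviation can be evaluated.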