Visual ranging technology holds great promise in fields such as autonomous driving and robot navigation, but complex dynamic environments pose significant challenges to its accuracy and robustness. Existing monocular visual ranging methods suffer from scale uncertainty, while binocular visual ranging is sensitive to changes in lighting and texture. To overcome the limitations of relying on a single ranging modality, this paper proposes a fusion method for monocular and binocular visual ranging based on an adaptive Unscented Kalman Filter (AUKF). The proposed method first uses a monocular camera to estimate an initial distance from the pixel size of the target, and then applies the triangulation principle with a binocular camera to obtain an accurate depth measurement. On this basis, a probabilistic fusion framework is constructed in which the AUKF dynamically fuses the monocular and binocular estimates. The AUKF performs nonlinear recursive filtering to estimate the optimal distance and its uncertainty, and introduces an adaptive noise-adjustment mechanism that updates the observation noise from the fusion residuals, thereby suppressing outlier interference. In addition, an adaptive fusion strategy based on depth-hypothesis propagation is designed to autonomously adjust the AUKF's noise prior by combining current environmental features with historical measurements, further enhancing the algorithm's adaptability to complex scenes. To validate the proposed method, comprehensive evaluations were conducted on large-scale public datasets such as KITTI as well as on complex scene data collected in real-world settings. The quantitative results show that the fusion method significantly improves the overall accuracy and stability of visual ranging, reducing the average relative error within an 8 m range by 43.1% and 40.9% relative to monocular and binocular ranging, respectively. Compared with traditional methods, the proposed method achieves markedly higher ranging accuracy and stronger robustness to factors such as lighting changes and dynamic targets. A sensitivity analysis further confirms the effectiveness of the AUKF framework and the adaptive noise strategy. In summary, the proposed fusion method effectively combines the advantages of monocular and binocular vision and significantly broadens the applicability of visual ranging in intelligent driving, robotics, and other fields while maintaining accuracy, robustness, and real-time performance.
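To make the fusion scheme concrete, the sketch below illustrates one way a UKF can combine a monocular and a binocular range estimate of the same target with a residual-driven observation-noise update. It is a minimal illustration only: the constant-velocity range model, the filterpy-based implementation, and the exponential-forgetting noise-inflation rule are assumptions for exposition, not the paper's exact AUKF formulation or its depth-hypothesis-propagation strategy.

```python
# Minimal sketch: fusing monocular and binocular range measurements with a UKF
# whose observation-noise matrix R is adapted from the fusion residuals.
# Assumptions (not from the paper): filterpy UKF, constant-velocity range model,
# simple exponential-forgetting residual-based R update.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

DT = 0.1  # assumed frame interval in seconds


def fx(x, dt):
    """Constant-velocity motion model for the target range [distance, range-rate]."""
    return np.array([x[0] + dt * x[1], x[1]])


def hx(x):
    """Both channels observe the same range: [monocular estimate, binocular estimate]."""
    return np.array([x[0], x[0]])


points = MerweScaledSigmaPoints(n=2, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=2, dt=DT, fx=fx, hx=hx, points=points)
ukf.x = np.array([5.0, 0.0])             # initial range guess (m) and range-rate
ukf.P = np.diag([1.0, 0.5])              # initial state covariance
ukf.Q = np.diag([1e-3, 1e-2])            # process noise
ukf.R = np.diag([0.30**2, 0.10**2])      # prior: monocular channel noisier than binocular


def fuse(mono_range, stereo_range, forget=0.95):
    """One predict/update step with residual-based adaptation of R."""
    ukf.predict()
    ukf.update(np.array([mono_range, stereo_range]))
    # Illustrative adaptive step: blend the previous R with the (diagonal of the)
    # residual outer product, so a channel with persistently large residuals has
    # its noise inflated and is down-weighted in subsequent fusion steps.
    innov = np.outer(ukf.y, ukf.y)
    ukf.R = forget * ukf.R + (1.0 - forget) * np.diag(np.diag(innov))
    return ukf.x[0], ukf.P[0, 0]


# Usage example: a stream of noisy paired measurements for a target near 5 m.
rng = np.random.default_rng(0)
for _ in range(50):
    z_mono = 5.0 + rng.normal(0.0, 0.3)
    z_stereo = 5.0 + rng.normal(0.0, 0.1)
    d_hat, d_var = fuse(z_mono, z_stereo)
print(f"fused range: {d_hat:.2f} m (variance {d_var:.4f})")
```

In this toy setup the fused estimate naturally leans toward the lower-noise binocular channel, while the residual-driven update of R mimics, in simplified form, the outlier-suppression behavior the abstract attributes to the adaptive noise-adjustment mechanism.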