People with visual impairments face many difficulties in everyday life, including communicating, accessing information, and navigating independently and safely. Using auditory alerts, our study aims to improve the lives of visually impaired individuals by warning them of objects in their path. In this research, a video-based smart object detection model named Smart YOLO Glass is proposed for visually impaired persons. A Paddling-Paddling Squeeze and Attention YOLO Network model is trained on multiple images to detect outdoor objects and assist visually impaired people. To estimate the distance between a blind person and obstacles while moving from one location to another, the proposed method also incorporates a distance-measuring sensor. This system benefits the visually impaired by providing information about surrounding objects and assisting with independent navigation. Recall, accuracy, specificity, precision, and F-measure were among the metrics used to evaluate the proposed approach. Owing to its low time complexity, the system conveys the surrounding environment to the user in real time. Compared with Med glasses, DL smart glass, and DL-FDS, the overall accuracy of the proposed technique improves by 7.6%, 4.8%, and 3.1%, respectively.
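For reference, the evaluation metrics named above follow their standard confusion-matrix definitions. The sketch below is a generic illustration of those formulas, not the paper's evaluation code; the function name and the example counts are illustrative assumptions:

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute the standard metrics from confusion-matrix counts.

    tp/fp/tn/fn are true/false positive/negative counts
    (illustrative helper, not the paper's implementation).
    """
    precision = tp / (tp + fp)                  # correct detections among all detections
    recall = tp / (tp + fn)                     # correct detections among all actual objects
    specificity = tn / (tn + fp)                # correctly rejected negatives
    accuracy = (tp + tn) / (tp + fp + tn + fn)  # overall correctness
    f_measure = 2 * precision * recall / (precision + recall)
    return {
        "precision": precision,
        "recall": recall,
        "specificity": specificity,
        "accuracy": accuracy,
        "f_measure": f_measure,
    }


# Example with made-up counts: 90 TP, 10 FP, 80 TN, 20 FN
metrics = classification_metrics(tp=90, fp=10, tn=80, fn=20)
```

With these counts, precision is 0.90, accuracy is 0.85, and recall is about 0.82.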