This paper presents a practical method for analyzing drivers’ eye movements, providing a valuable tool for understanding driver behavior during driving simulations. The method, based on an image-processing technique, addresses the challenge of classifying glances when the driver’s attention falls on points for which no image-depth information is available and the on-screen image changes or moves with the simulation. It allows the gaze position to be identified relative to the road, determining whether a glance falls inside or outside the roadway. This is achieved by transforming the RGB images (frames) collected by the eye-tracker’s video camera into binary edge images with the Canny filter, which identifies object contours by evaluating intensity changes across their surfaces. A window is then applied to these edge images to extract information about the gaze position in the real world. The method was tested on a sample of four drivers. The findings reveal variations among drivers and a disparity between driving in curved and rectilinear segments: in curved sections the gaze typically remains inside the road, whereas in rectilinear sections it frequently falls outside.
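The edge-detection and window steps described above can be sketched as follows. This is a minimal illustrative sketch, not the authors’ implementation: a crude intensity-gradient threshold stands in for the Canny filter (a real pipeline would typically use OpenCV’s `cv2.Canny`), and the window rule, function names, and parameters (`edge_map`, `classify_gaze`, the window half-width) are all assumptions for illustration.

```python
def edge_map(gray, thresh=50):
    """Binary edge image from a grayscale frame (list of rows of 0-255).
    Marks a pixel as an edge when the intensity change to its right or
    lower neighbour exceeds `thresh` (a crude stand-in for Canny)."""
    h, w = len(gray), len(gray[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = abs(gray[y][x + 1] - gray[y][x])  # horizontal gradient
            gy = abs(gray[y + 1][x] - gray[y][x])  # vertical gradient
            if max(gx, gy) > thresh:
                edges[y][x] = 1
    return edges

def classify_gaze(edges, gaze_x, gaze_y, half=2):
    """Count edge pixels in a (2*half+1)^2 window around the gaze point.
    Illustrative rule only: any road-marking edge inside the window is
    taken as evidence that the glance falls on the roadway."""
    h, w = len(edges), len(edges[0])
    hits = 0
    for y in range(max(0, gaze_y - half), min(h, gaze_y + half + 1)):
        for x in range(max(0, gaze_x - half), min(w, gaze_x + half + 1)):
            hits += edges[y][x]
    return "inside" if hits > 0 else "outside"

# Synthetic 10x10 frame: dark road surface (20) with one bright lane
# marking (200) running down column 5.
frame = [[200 if x == 5 else 20 for x in range(10)] for y in range(10)]
edges = edge_map(frame)
print(classify_gaze(edges, gaze_x=5, gaze_y=4))  # near the marking -> inside
print(classify_gaze(edges, gaze_x=1, gaze_y=4))  # uniform area -> outside
```

In practice the window size and edge thresholds would have to be tuned to the simulator’s screen resolution and the eye-tracker’s accuracy; the point here is only the two-stage structure (edge image, then a local window around the gaze point).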