Mapping drivers’ thoughts directly to the control of a mobility system would make driving more intuitive, as if the mobility system were an extension of the driver’s own body. Such a system would also allow patients with motor disabilities to drive, as it would not require any physical movement. In this paper, we therefore propose a brain-controlled mobility system that analyzes real-time neural signals elicited by motor imagery, the imagined movement of different body parts. Because such asynchronous brain-computer interfaces (BCIs) are prone to error, our system incorporates shared control capabilities that combine continuously updated information about the surrounding environment with electroencephalogram (EEG) signals, improving navigation performance without requiring precise and accurate control from the driver. Using a wheelchair equipped with light detection and ranging (LiDAR) and inertial measurement unit (IMU) sensors, we conducted a comparative study in which participants drove the wheelchair in a physical environment, with and without our shared control approach, using either our brain-controlled system or a keyboard. The experimental results show that, of the five participants, the three who failed the driving task with the asynchronous BCI-based system alone were able to complete it successfully with our shared control approach. Furthermore, our approach narrows the gap between driving with neural signals and driving with a widely used interface in terms of both elapsed time and safety. These results demonstrate not only the potential of brain signals for driving, but also the applicability of BCIs to real-life situations.