Thanks to advances in electric wheelchair design, persons with motor impairments due to diseases such as amyotrophic lateral sclerosis (ALS) have tools to become more independent and mobile. However, an electric wheelchair generally requires considerable skill to learn to operate. Moreover, some persons with motor disabilities cannot drive an electric wheelchair manually, even with a joystick, because they lack the physical ability to control their hand movements (as is the case for many people with ALS). In this paper, we propose a novel system that enables a person with a motor disability to control a wheelchair via eye gaze and that provides continuous, real-time navigation in unknown environments. The system comprises a Permobil M400 wheelchair; eye-tracking glasses; a depth camera to capture the geometry of the ambient space; a set of ultrasound and infrared sensors to detect nearby obstacles outside the depth camera's field of view; a laptop placed on a flexible mount for maximized comfort; and a safety switch to turn the system off whenever needed. First, a novel algorithm is proposed to support continuous, real-time target identification, path planning, and navigation in unknown environments. Second, the system utilizes a novel N-cell grid-based graphical user interface that adapts to input/output interface specifications. Third, a calibration method for the eye-tracking system is implemented to minimize calibration overhead. A case study with a person with ALS is presented, and interesting findings are discussed. The participant showed improved performance in terms of calibration time, task completion time, and navigation speed for navigation trips between the office, dining room, and bedroom. Furthermore, debriefing the caregiver also yielded promising results: the participant enjoyed a higher level of confidence driving the wheelchair and experienced no collisions throughout the experiment.