This paper presents an Assistive Navigation System (ANS) for a Robotic Wheelchair (RW) relying on a Brain–Computer Interface (BCI) as the Human–Machine Interface (HMI). A two-layer collaborative control approach is proposed to steer the RW, taking into account both user and machine commands. The first layer, a virtual-constraint layer, is responsible for enabling/disabling user commands based on context. More specifically, user commands are enabled in situations requiring a user decision, namely bifurcations, multiple possible directions caused by new obstacles in the environment, and deadlocks. The second layer, a user-intent matching layer, is responsible for determining the steering command that best fits the user's selection, taking into account the user's competence in steering the wheelchair and situational awareness of the potential directions at a given location. A P300-based BCI allows the selection of commands to steer the RW. Experimental results using RobChair (Pires and Nunes (2002) [7], Lopes et al. (2007) [42]) are presented, showing the effectiveness of the proposed methodologies. The ANS was validated with ten able-bodied participants and one participant with cerebral palsy in two different scenarios: a structured known environment, and a structured unknown environment with moving objects. All participants were able to operate the device successfully, demonstrating a high level of robustness of both the BCI system and the navigation system.
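The following is a minimal illustrative sketch of the two-layer collaborative control idea summarized above, not the paper's actual RobChair implementation; all names (`Situation`, `user_commands_enabled`, `match_intent`) and the nearest-heading matching rule are assumptions introduced here for clarity.

```python
# Sketch of the two-layer collaborative control: a virtual-constraint layer
# that gates user input, and a user-intent matching layer that maps the
# P300-selected command onto a feasible direction. Hypothetical names.
from enum import Enum, auto


class Situation(Enum):
    FREE_CORRIDOR = auto()  # single feasible direction: machine steers alone
    BIFURCATION = auto()    # path splits: user decision required
    NEW_OBSTACLE = auto()   # new obstacle opens multiple directions
    DEADLOCK = auto()       # no progress possible without user input


def user_commands_enabled(situation: Situation) -> bool:
    """Layer 1 (virtual constraints): enable user commands only where a
    decision is genuinely required (bifurcations, new obstacles, deadlocks)."""
    return situation in (
        Situation.BIFURCATION,
        Situation.NEW_OBSTACLE,
        Situation.DEADLOCK,
    )


def match_intent(selected_heading: float, feasible_headings: list[float]) -> float:
    """Layer 2 (user-intent matching): pick the feasible direction closest
    to the P300-selected heading (angles in degrees; wrap-around is ignored
    here to keep the sketch short)."""
    return min(feasible_headings, key=lambda h: abs(h - selected_heading))


# Example: at a bifurcation the user selects "left" (90 deg) while the
# planner offers two feasible corridors at 80 deg and -85 deg.
if user_commands_enabled(Situation.BIFURCATION):
    command = match_intent(90.0, [80.0, -85.0])
    print(f"steering toward {command} deg")  # -> 80.0
```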