Background: With the arrival of the new wave of artificial intelligence, new human-robot interaction technologies continue to emerge. The brain–computer interface (BCI) offers a pathway for state monitoring and interaction control between humans and robots. However, an unstable mental state reduces the accuracy of brain intent decoding and consequently degrades the precision of BCI control.

New methods: This paper proposes a hybrid BCI-based shared control (HB-SC) method for brain-controlled robot navigation. The hybrid BCI fuses electroencephalography (EEG) and electromyography (EMG) for mental state monitoring and interactive control, outputting human perception and decisions. The shared control, based on multi-sensory fusion, integrates the special obstacle information perceived by the human with the regular environmental information perceived by the robot. In this process, valid BCI commands are screened by mental state assessment and output to a layered costmap for fusion.

Results: Eight subjects participated in a navigation experiment with dynamically changing mental state levels to validate the effect of the hybrid BCI under two shared control modes. The results show that the proposed HB-SC reduces collisions by 37.50%, improves the success rate of traversing obstacles by 25.00%, and produces navigation trajectories more consistent with expectations.

Conclusions: The HB-SC method can dynamically and intelligently adjust command output according to different brain states, helping to reduce errors made by subjects in an unstable mental state and thereby greatly enhancing the system's safety.
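
To make the command-screening step more concrete, the following minimal Python sketch illustrates one way mental-state-gated fusion into a layered costmap could be organized. It is not the authors' implementation; the threshold value and names such as `mental_state_score`, `STATE_THRESHOLD`, and `HumanObstacleLayer` are hypothetical placeholders introduced only for illustration.

```python
# Illustrative sketch (assumptions, not the paper's code): BCI commands are
# accepted only when an estimated mental-state score is high enough, and the
# human-reported obstacle is then fused with the robot's costmap layer.
import numpy as np

STATE_THRESHOLD = 0.6  # assumed minimum mental-state score for a valid command


def mental_state_score(eeg_features: np.ndarray, emg_features: np.ndarray) -> float:
    """Toy EEG/EMG fusion into a [0, 1] reliability score (placeholder only)."""
    return float(np.clip(0.5 * eeg_features.mean() + 0.5 * emg_features.mean(), 0.0, 1.0))


class HumanObstacleLayer:
    """Hypothetical costmap layer holding obstacles reported by the human."""

    def __init__(self, size: int = 100):
        self.grid = np.zeros((size, size), dtype=np.uint8)

    def mark_obstacle(self, x: int, y: int, cost: int = 254):
        self.grid[y, x] = cost


def fuse_costmaps(robot_layer: np.ndarray, human_layer: np.ndarray) -> np.ndarray:
    """Combine robot-sensed and human-reported costs cell-wise (maximum)."""
    return np.maximum(robot_layer, human_layer)


def handle_bci_command(cmd_xy, eeg, emg, human_layer: HumanObstacleLayer) -> bool:
    """Forward the command only if the estimated mental state is reliable."""
    if mental_state_score(eeg, emg) >= STATE_THRESHOLD:
        human_layer.mark_obstacle(*cmd_xy)
        return True   # valid command passed to the shared controller
    return False      # command suppressed while the mental state is unstable
```

In this sketch, suppressed commands simply leave the robot's autonomous layer in control, which mirrors the abstract's idea of adjusting command output according to the current brain state.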