Abstract

Current assistive devices that help disabled people interact with their environment are complicated and cumbersome. Our approach addresses these problems by developing a compact, unobtrusive wearable device that measures signals associated with human physiological gestures and thereby generates useful commands for interacting with the smart environment. Our innovation uses machine learning and non-invasive biosensors placed on top of the ears to identify eye movements and facial expressions. With these identified signals, users can control different applications, such as a cell phone, a powered wheelchair, a smart home, or other IoT (Internet of Things) devices, with simple and easy operations. Combined with a VR headset, the user can apply our technology to control a camera-mounted telepresence robot and navigate the environment from a first-person view (FPV) using eye movements and facial expressions. This enables a highly intuitive mode of interaction that is entirely hands-free and touch-free.
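To illustrate the kind of pipeline the abstract describes, here is a minimal sketch of classifying short windows of ear-mounted biosensor signals into gesture labels and mapping each label to a device command. This is not the authors' implementation: the sensor channels, window length, feature set, SVM classifier, gesture labels, and command mapping are all assumptions for illustration.

```python
# Hypothetical sketch: biosensor windows -> gesture label -> device command.
# All names, labels, and model choices below are illustrative assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

GESTURES = ["blink", "look_left", "look_right", "smile", "neutral"]
COMMANDS = {"blink": "select", "look_left": "turn_left",
            "look_right": "turn_right", "smile": "go_forward",
            "neutral": "idle"}

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel statistics for a (samples x channels) window."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.abs(np.diff(window, axis=0)).mean(axis=0)])

# Toy training data: 200 windows of 100 samples x 2 channels, random labels.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 100, 2))
labels = rng.integers(0, len(GESTURES), size=200)

X = np.array([extract_features(w) for w in windows])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, labels)

# At run time: classify an incoming window and emit the mapped command.
new_window = rng.standard_normal((100, 2))
gesture = GESTURES[int(clf.predict(extract_features(new_window)[None, :])[0])]
print("gesture:", gesture, "-> command:", COMMANDS[gesture])
```

In a real system, the predicted command would be forwarded to the target application (phone, wheelchair controller, smart-home hub, or telepresence robot) rather than printed.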
