Abstract

In recent years, assistive wearable technologies based on Internet of Things (IoT) platforms for amyotrophic lateral sclerosis (ALS) patients have attracted broad interest. Nevertheless, the user privacy leakage caused by the scene camera installed on wearables to analyze environmental information hinders their further successful use by ALS patients. To address this issue, this article presents a smart human-environment interaction (HEI) system under the IoT framework that combines eye motion detection, radio-frequency identification (RFID), and speech feedback techniques. The users’ intentions are first interpreted by eye motion classification, and then the target smart devices are reported and the desired operations are confirmed by the RFID and speech feedback system in a hand-shaking manner. A high average accuracy of 93.2% is experimentally achieved, demonstrating that the proposed method obtains satisfactory performance while avoiding potential privacy leakage.
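
The three-step flow summarized above (eye-motion intent classification, RFID-based device identification, speech-feedback confirmation) can be pictured as a simple control loop. The sketch below is only an illustration of that hand-shaking sequence; every function name, label, and return value is a hypothetical placeholder, not the authors' implementation.

```python
from typing import Optional

def classify_eye_motion() -> Optional[str]:
    """Stand-in for the eye-motion classifier; returns an intent label or None."""
    return "turn_on"  # dummy value for illustration

def read_nearest_rfid_tag() -> Optional[str]:
    """Stand-in for the RFID reader; returns the ID of the nearest tagged device."""
    return "lamp_01"  # dummy value for illustration

def speech_confirm(prompt: str) -> bool:
    """Stand-in for the speech-feedback handshake (announce, then await confirmation)."""
    print(f"[TTS] {prompt}")
    return True  # assume the user confirms, for this sketch

def hei_cycle() -> None:
    intent = classify_eye_motion()        # step 1: interpret the user's intention
    if intent is None:
        return
    device = read_nearest_rfid_tag()      # step 2: identify the target device via RFID, not a camera
    if device is None:
        return
    # step 3: confirm the desired operation through speech feedback before acting
    if speech_confirm(f"Apply '{intent}' to '{device}'?"):
        print(f"[ACTUATE] {intent} -> {device}")

if __name__ == "__main__":
    hei_cycle()
```

Replacing the scene camera with RFID tags in this loop is what removes the visual channel through which private environmental imagery could leak.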
