Abstract

In recent years, assistive wearable technologies based on Internet of Things (IoT) platforms for amyotrophic lateral sclerosis (ALS) patients have attracted broad interest. Nevertheless, the risk of user privacy leakage, owing to the scene camera installed on such wearables to analyze environmental information, hinders their further successful use by ALS patients. To address this issue, this article presents a smart human-environment interactive (HEI) system, integrating eye motion detection, radio-frequency identification (RFID), and speech feedback techniques under the IoT framework. The user's intentions are first interpreted through eye motion classification; the target smart devices are then identified via RFID, and the desired operations are confirmed through the speech feedback system in a handshaking manner. A high average accuracy of 93.2% is experimentally achieved, demonstrating that the proposed method attains satisfactory performance while avoiding potential privacy leakage.
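The abstract only outlines the interaction loop at a high level; the short Python sketch below is a hypothetical illustration of such a handshake (eye-motion intent, then RFID device identification, then spoken confirmation, then action). All class names, tag IDs, intent labels, and stub functions are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the eye-motion -> RFID -> speech-feedback handshake.
# Device names, tag IDs, and intent labels are illustrative placeholders only.

from collections import deque
from dataclasses import dataclass
from typing import Optional


@dataclass
class SmartDevice:
    rfid_tag: str      # RFID tag attached to the appliance
    name: str
    commands: tuple    # operations the device accepts


# Hypothetical registry of RFID-tagged smart devices in the user's environment.
DEVICE_REGISTRY = {
    "TAG-001": SmartDevice("TAG-001", "ceiling lamp", ("on", "off")),
    "TAG-002": SmartDevice("TAG-002", "fan", ("on", "off")),
}

# Demo input queue standing in for real eye-motion classifications.
_DEMO_INTENTS = deque(["on", "confirm"])


def classify_eye_motion() -> str:
    """Stub for the eye-motion classifier; would return an intent label
    (e.g. 'on', 'off', 'confirm', 'cancel') inferred without a scene camera."""
    return _DEMO_INTENTS.popleft() if _DEMO_INTENTS else "cancel"


def read_nearest_rfid_tag() -> Optional[str]:
    """Stub for the RFID reader; would return the tag ID of the device
    the user is currently addressing."""
    return "TAG-001"


def speak(text: str) -> None:
    """Stub for the speech-feedback (text-to-speech) module."""
    print(f"[speech] {text}")


def handshake_cycle() -> None:
    """One interaction cycle: interpret the intent, announce the RFID-identified
    device, and act only after a second, confirming eye-motion input."""
    intent = classify_eye_motion()
    tag = read_nearest_rfid_tag()
    device = DEVICE_REGISTRY.get(tag) if tag else None
    if device is None or intent not in device.commands:
        speak("No matching device or operation detected.")
        return

    speak(f"Turn the {device.name} {intent}? Please confirm.")
    if classify_eye_motion() == "confirm":
        speak(f"Sending '{intent}' to the {device.name}.")
    else:
        speak("Operation cancelled.")


if __name__ == "__main__":
    handshake_cycle()
```

The two-step structure mirrors the handshaking idea in the abstract: the system never acts on the first eye-motion input alone, but first reports the RFID-identified device by speech and waits for a confirming input.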
