Abstract

Eye movements are essential to comprehending the environment. Eye gaze can direct a user's attention and thereby enhance Human–Computer Interaction (HCI), and gaze estimation can make HCI more natural in a non-intrusive way. Possible applications of eye tracking include tiredness detection, biometric identification, illness diagnosis, activity recognition, alertness estimation, and gaze-contingent displays. Although the technology has been available for decades, it has not been widely adopted in consumer applications, mainly because of its high cost and the lack of consumer-level applications. For Activities of Daily Living (ADL), assistive robots can restore essential levels of independence to elderly and disabled people. Although people can communicate their wishes in numerous ways, such as through linguistic patterns, bodily expressions, or actions, gaze-based implicit communication of intentions remains underdeveloped. In this study, a new implicit, nonverbal communication paradigm for HCI based on eye gaze is introduced. Conventional gaze-detection systems rely on infrared lights and cameras with high-resolution sensors to achieve outstanding performance and robustness, but they require specialized hardware; as a result, they are confined to laboratory research and difficult to apply in the real world. Here, we propose tracking the gaze with a webcam. Using a webcam to obtain 2D coordinates, we present a practical, model-based visual monitoring framework. The platform is intended to ease HCI and thereby increase the usefulness of technology while preserving users' privacy in their daily lives. The experimental results indicate that implicit human gaze patterns on visualized objects can be used efficiently to communicate people's intentions. The studies also show that gaze-based communication is simple to understand and use. We further propose that subject-dependent eye parameters be calculated using a specific reference system. Finally, an implicit communication system for monitoring and interpretation has been developed. After inferring the user's implicit intention from eye gaze, the system can identify the activities and needs of elderly users within the home environment, and the inferred intention can then be used to direct caregivers to deliver the appropriate service.

Keywords: Activities of daily living (ADL); Eye-gaze tracking; Gaze-based communication; Human–computer interaction (HCI); Implicit intention inference; Webcam
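The abstract describes obtaining 2D coordinates from a consumer webcam as the input to a model-based gaze-monitoring framework. As a loose illustration of that setup only, and not the authors' framework, the sketch below uses OpenCV Haar cascades (an assumed tooling choice, not mentioned in the paper) to pull rough 2D eye-centre coordinates from a single webcam frame; these are the kind of low-level image features a model-based gaze estimator could map to an on-screen point of regard.

```python
# Minimal sketch, assuming OpenCV and its bundled Haar cascades are available.
# It extracts approximate 2D eye-centre coordinates from one webcam frame;
# a gaze model (not shown) would map such features to a gaze point.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)          # default webcam
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[fy:fy + fh, fx:fx + fw]          # search eyes inside the face
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            # 2D eye-centre coordinates in image space (pixels).
            cx, cy = fx + ex + ew // 2, fy + ey + eh // 2
            print("eye centre:", cx, cy)
cap.release()
```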
