Abstract

In this study, a human-computer interface was developed in C# to allow individuals with physical mobility impairments, such as ALS patients, to express their wishes. The system analyzes pupil movements and conveys the patient's requests both visually and audibly. For pupil tracking, the patient's face is detected autonomously from the camera image, and an adaptive IR LED light source was designed to illuminate the user's eye region. Pupil motion was detected with purpose-developed image processing algorithms, and the position of the detected pupil was mapped to commands on the user interface through which the patient expresses his or her wishes. As an application study, a prototype of the controlled patient bed was produced with a 3D printer. The result is contactless, camera-based pupil motion detection: the algorithm allows the patient to communicate using no movement other than the eyes. This study thus yields a uniquely developed algorithm for pupil-tracking systems for individuals with physical movement disabilities such as ALS.
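The core idea described above (threshold the IR-illuminated eye image so the dark pupil stands out, locate the pupil, and map its displacement to a command) can be sketched roughly as follows. This is a minimal illustrative Python sketch, not the authors' C# implementation; the function names, threshold values, and dead-zone parameter are all assumptions introduced for illustration.

```python
# Hypothetical sketch of the pupil-tracking idea from the abstract:
# pixels darker than a threshold are treated as the pupil, their centroid
# gives the pupil position, and the displacement from the image centre is
# classified into a directional command. All names and thresholds here are
# illustrative assumptions, not the paper's actual algorithm.

def pupil_centroid(gray, dark_threshold=50):
    """Return (row, col) centroid of pixels darker than the threshold."""
    row_sum = col_sum = count = 0
    for r, line in enumerate(gray):
        for c, v in enumerate(line):
            if v < dark_threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return row_sum / count, col_sum / count

def movement_command(gray, dead_zone=1.0):
    """Classify pupil displacement from the image centre into a command."""
    pos = pupil_centroid(gray)
    if pos is None:
        return "no pupil"
    cy = (len(gray) - 1) / 2
    cx = (len(gray[0]) - 1) / 2
    dy, dx = pos[0] - cy, pos[1] - cx
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "centre"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Tiny synthetic 5x5 "eye image": bright background, one dark pixel
# standing in for the pupil, shifted to the right of centre.
frame = [[200] * 5 for _ in range(5)]
frame[2][4] = 10
print(movement_command(frame))  # "right"
```

In a real system the thresholding step would operate on camera frames (e.g. via an image-processing library) and the resulting commands would drive the interface and the bed controls; the dead zone suppresses jitter when the pupil is near the centre.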
