Abstract

This study presents the design and development of iRiter, a system that assists paralyzed people in writing on a screen using only their eye movements. It relies on precise detection of eye-pupil movement using the reflections of near-infrared (NIR) signals from an external illuminator source. The NIR signals are synchronized with the refresh rate of the camera, and the bright/dark reflections are tracked over time. For calibration, a deep multilayer perceptron (DMLP) uses four calibration points, five virtually generated points in the first hidden layer, and sixteen virtually generated points in the second hidden layer. The four calibration points (actual points) are the detected pupil positions. The areas formed by the four actual points, the five MLP-generated points, and the sixteen DMLP-generated points constitute the pupil areas, while a multi first-order polynomial transformation maps the pupil areas to screen areas. Through the position vector of a pupil area, the pupil coordinates are mapped to the corresponding gaze position on the screen. There are sixteen pupil areas with sixteen corresponding geometric transformations and screen areas. Data from the camera is processed by a microcontroller, which passes it to the computer. The openFrameworks toolkit was used to design the graphical user interface and to display eye-movement tracks.
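The abstract does not give the exact form of the per-area first-order polynomial mapping, so the following is a minimal C++ sketch of how such a mapping could look: one affine (first-order) coefficient set per pupil area, selected by the area that contains the detected pupil position. All names (AffineMap, mapPupilToScreen) and the placeholder coefficients are illustrative assumptions, not the paper's implementation.

```cpp
#include <array>
#include <cstdio>

// Hypothetical per-area first-order mapping from pupil coordinates
// to screen coordinates: one coefficient set per pupil area.
struct AffineMap {
    // screen_x = a0 + a1*px + a2*py
    // screen_y = b0 + b1*px + b2*py
    double a0, a1, a2;
    double b0, b1, b2;
};

struct Point { double x, y; };

// Apply the first-order transformation of the pupil area that
// contains the detected pupil position (areaIndex in 0..15).
Point mapPupilToScreen(const Point& pupil,
                       const std::array<AffineMap, 16>& maps,
                       int areaIndex)
{
    const AffineMap& m = maps[areaIndex];
    return { m.a0 + m.a1 * pupil.x + m.a2 * pupil.y,
             m.b0 + m.b1 * pupil.x + m.b2 * pupil.y };
}

int main() {
    // Identity-like placeholder coefficients purely for illustration;
    // in the described system they would come from the DMLP-based calibration.
    std::array<AffineMap, 16> maps{};
    for (auto& m : maps) { m = {0.0, 1.0, 0.0, 0.0, 0.0, 1.0}; }

    Point gaze = mapPupilToScreen({120.0, 80.0}, maps, 5);
    std::printf("screen position: (%.1f, %.1f)\n", gaze.x, gaze.y);
    return 0;
}
```

In the system described, the sixteen coefficient sets would be produced by the DMLP-based calibration rather than hard-coded as above.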
