Inspiration

Users with physical impairments are limited in their ability to work on computers using conventional mouse- and keyboard-based interaction. Existing assistive technologies still have usability issues, require extensive training, and are imprecise.

What it does

We present a gaze gesture-based interaction paradigm that lets users with physical impairments operate a computer using only their eye movements, which are tracked by an eye tracker. To perform an action such as minimizing or maximizing an application, opening a new tab, scrolling down, or refreshing a page in a browser, the user moves their eyes to trace a predefined gesture. The system recognizes the gesture and executes the corresponding action.
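At its core, this is a dispatch from a recognized gesture name to an action. A minimal sketch of that mapping, in Python for illustration (the gesture names and action labels here are hypothetical, not the project's actual bindings):

```python
# Hypothetical table binding recognized gaze gestures to commands.
ACTIONS = {
    "swipe_down": "scroll_down",
    "circle": "refresh_page",
    "zigzag": "open_new_tab",
}

def execute(gesture_name):
    """Look up the command bound to a recognized gesture; unknown
    gestures fall through to a no-op so misrecognitions are harmless."""
    return ACTIONS.get(gesture_name, "no_op")
```

In the real system each entry would trigger an OS- or browser-level command rather than return a label.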

Users with speech impairments can also use the system to produce quick spoken phrases by performing gestures. This is crucial when a user with a speech impairment is interacting with someone who does not know sign language.

How I built it

The application is built in C# and uses gesture recognition algorithms that match the user's gaze path against template paths to identify the intended command.
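The writeup doesn't specify the matching algorithm, so here is a sketch of one plausible resample-and-compare approach in the style of the $1 unistroke recognizer, shown in Python for brevity rather than the project's C#; all function names and parameters are illustrative:

```python
import math

def path_length(points):
    """Total Euclidean length of a polyline."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def resample(points, n=32):
    """Resample a polyline to n evenly spaced points so two paths
    can be compared point-by-point."""
    interval = path_length(points) / (n - 1)
    pts = list(points)
    out = [pts[0]]
    accum = 0.0
    i = 1
    while i < len(pts) and len(out) < n:
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and accum + d >= interval:
            t = (interval - accum) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            accum = 0.0
        else:
            accum += d
        i += 1
    while len(out) < n:       # guard against floating-point shortfall
        out.append(pts[-1])
    return out

def avg_distance(a, b):
    """Mean point-to-point distance between two equal-length paths."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(gaze_path, templates, n=32):
    """Return the name of the template path closest to the gaze path."""
    sampled = resample(gaze_path, n)
    return min(templates,
               key=lambda name: avg_distance(sampled, resample(templates[name], n)))
```

A production recognizer would also normalize position and scale before comparing; this sketch assumes gaze and templates share the same coordinate frame.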

Challenges I ran into

Gesture recognition is hard with gaze points because the eyes move very quickly, so the system captures only a few samples while a gesture is performed. I had to interpolate the gaze path to add more points so it could be matched against a template and recognized. Adding audio responses for gestures was also difficult, since I was running into a lot of false recognitions.
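The interpolation step described above can be sketched as simple linear interpolation along each segment of the sparse gaze path (Python for illustration; the function name and `step` parameter are assumptions, not the project's actual code):

```python
import math

def densify(points, step=5.0):
    """Insert linearly interpolated samples so consecutive points are
    roughly `step` units apart, turning a sparse gaze path into a dense
    one that template matching can handle."""
    out = [points[0]]
    for p, q in zip(points, points[1:]):
        d = math.dist(p, q)
        n = max(1, math.ceil(d / step))  # sub-segments for this span
        for k in range(1, n + 1):
            t = k / n
            out.append((p[0] + t * (q[0] - p[0]),
                        p[1] + t * (q[1] - p[1])))
    return out
```

For example, two gaze samples 100 px apart become a run of points spaced about 10 px apart with `step=10`.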

Accomplishments that I'm proud of

The ability to recognize gaze gestures and map each one back to a command or a spoken response.

What I learned

It was quite fun to work long hours without feeling tired. I enjoyed the experience of developing a prototype in 24 hours and creating the video demo and presentation.

What's next for Gaze Gesture-Based Paradigm for Accessibility in HCI

I plan to integrate more gestures into the system and improve the feedback. Currently, the only feedback is the movement of the cursor on the screen, which is hard to see. While a gesture is being performed, the cursor should change to a distinct visual object that makes the gesture clearly apparent.
