What it does

This project is a communication application built around the Tobii Eye Tracker 4C. It is designed to help people with communication- and movement-related disabilities express basic needs. Users can make their feelings and assistance needs heard simply by gazing at on-screen objects with their eyes.

How I built it

I built the application in Unity, using the Tobii Pro SDK and C# scripts.

Challenges I ran into

Microsoft Visual Studio did not recognize the SpeechSynthesizer class (from .NET's System.Speech.Synthesis namespace) for text-to-speech functionality. So unfortunately, there is currently no sound when a button is activated (hopefully this will be fixed!).

Accomplishments that I'm proud of

The user can "activate" on-screen objects by gazing at them for 4 seconds straight. For example, gazing at the "Assistance" button for 4 seconds activates it and opens a new menu screen with buttons for various assistance needs.
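As a rough illustration, dwell-based activation like this can be written as a small Unity script attached to each button object, alongside the Tobii GazeAware component (whose HasGazeFocus property reports whether the user is currently looking at the object). This is a minimal sketch, not the project's actual code; the class name DwellButton, the dwellSeconds field, and the Activate method are illustrative names I chose.

```csharp
using UnityEngine;
using Tobii.Gaming;

// Sketch: attach to a button GameObject that also has the
// Tobii GazeAware component. Accumulates gaze time and fires
// an action once the dwell threshold is reached.
public class DwellButton : MonoBehaviour
{
    public float dwellSeconds = 4f;  // hold gaze this long to activate
    private GazeAware gazeAware;
    private float gazeTimer;

    void Start()
    {
        gazeAware = GetComponent<GazeAware>();
    }

    void Update()
    {
        if (gazeAware.HasGazeFocus)
        {
            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellSeconds)
            {
                gazeTimer = 0f;
                Activate();
            }
        }
        else
        {
            gazeTimer = 0f;  // looking away resets the countdown
        }
    }

    void Activate()
    {
        // Hypothetical action: e.g. swap in the "Assistance" sub-menu here.
        Debug.Log(name + " activated by gaze");
    }
}
```

Resetting the timer whenever gaze leaves the object is what makes the "4 seconds straight" requirement work, so brief glances do not trigger accidental activations.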

What I learned

How to design a user interface out of 3D objects that carry the Tobii GazeAware component.

What's next for Virtual Communication Chart

I would like to add SpeechSynthesizer text-to-speech functionality to this project. I also think an on-screen keyboard would make for a less restrictive communication design.

Built With

C#, the Tobii Pro SDK, and Unity.
