Coming into this project, I looked at what was available on the market. What I saw were braille interfaces that let blind users control their computers, and eye-tracking software for those who are paralyzed or cannot physically interact with their device. There is nothing on the market right now that serves as an accessibility solution for people who are both blind and paralyzed and would like to interact with a tablet or computer: braille readers don't work for those who are paralyzed, and eye trackers don't work for those who are blind.
What we made
We developed a keyboard, built as an iOS app, that lets the user type with facial expressions. A user alternates between smiling and relaxing their face, and this data, combined with blinks, is used to type or control the device. An efficient algorithm turns the facial data into Morse code; then, depending on which eye is blinked, the user either types a letter or performs an action such as playing a song or opening a web browser. For this project I also wrote my own face detection and facial expression detection algorithms.
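To make the smile-to-Morse idea concrete, here is a minimal sketch of the decoding step. All names, thresholds, and event shapes here are illustrative assumptions, not the app's actual code: it assumes an upstream classifier has already turned camera frames into discrete events (a smile with a duration, or a left/right blink), that a short smile is a dot and a long smile is a dash, and that a left blink commits the buffered symbols as a letter while a right blink routes them to an action instead.

```python
# Hedged sketch of smile/blink -> Morse typing. Names and thresholds are
# assumptions for illustration; the real app's pipeline may differ.

MORSE_TO_LETTER = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

DASH_THRESHOLD = 0.5  # seconds a smile must be held to count as a dash (assumed)


def smile_to_symbol(duration: float) -> str:
    """Map one smile's duration to a Morse dot or dash."""
    return "-" if duration >= DASH_THRESHOLD else "."


def type_from_events(events):
    """Consume (kind, value) event pairs and build typed text.

    ("smile", duration)   -> append a dot/dash to the current buffer
    ("blink_left", None)  -> commit the buffer as a typed letter
    ("blink_right", None) -> treat the buffer as an action code instead
    """
    text, buffer, actions = [], [], []
    for kind, value in events:
        if kind == "smile":
            buffer.append(smile_to_symbol(value))
        elif kind == "blink_left":
            text.append(MORSE_TO_LETTER.get("".join(buffer), "?"))
            buffer.clear()
        elif kind == "blink_right":
            actions.append("".join(buffer))  # e.g. a code for "play a song"
            buffer.clear()
    return "".join(text), actions
```

For example, four short smiles followed by a left blink commit "...." as the letter H, and two more short smiles plus a left blink add an I:

```python
events = [("smile", 0.1)] * 4 + [("blink_left", None)] \
       + [("smile", 0.1)] * 2 + [("blink_left", None)]
type_from_events(events)  # -> ("HI", [])
```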
Why it matters
Accessibility has been on the minds of tech developers around the world in recent years. Most of us have seen breakthroughs such as eye-tracking technology for people who are physically impaired. With our Facial Detection Keyboard we close a gap in accessibility when it comes to communication: giving a blind and paralyzed individual the ability to type freely using nothing but their face opens new doors.
What we learned
We actually built two separate solutions for this hackathon. In our first attempt, we built a Python app that let a user type based on brainwaves, working with the NeuroSky MindWave Headset. With two working solutions in hand, we tested both and decided that facial expressions were the better approach, since the typing speed at optimal accuracy was much higher with facial expressions.