In an age when texting, captions, and comments make up the bulk of young people's social interaction, dyslexic people are at an even greater disadvantage. We're connected to people with dyslexia, and after seeing its effects up close, we felt it was important to work toward viable ways of mitigating the symptoms of dyslexia while reading.
What it does
Zelixa is a HoloLens app that helps dyslexic people learn to read better over time. It uses focus points, typefaces, and colors to help people with dyslexia concentrate on the correct portions of words. We take a picture through the HoloLens and send it via POST request to our server, which uses the Microsoft Computer Vision API to analyze the text and its positioning; from that, we determine where the focus points need to be.
The app takes those computed locations and renders the focus points in the HoloLens environment, so users can gradually train themselves to focus on the right parts of words and end up reading more comfortably over time.
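As a rough sketch of the placement step described above: given per-word bounding boxes from the OCR step, a focus point can be anchored over each word's leading letters. The function name, box format, and the "one third of the way in" heuristic are our illustrative assumptions, not the exact production logic.

```javascript
// Hypothetical helper: given OCR word bounding boxes, compute a focus point
// near the start of each word, where the reader's eye should land.
// Assumed box format: { x, y, width, height } in image pixels.
function computeFocusPoints(wordBoxes) {
  return wordBoxes.map(box => ({
    // Anchor roughly a third of the way into the word, vertically centered,
    // so the point sits over the word's leading letters.
    x: box.x + box.width / 3,
    y: box.y + box.height / 2,
  }));
}

// Example: two words on one line.
const points = computeFocusPoints([
  { x: 10, y: 20, width: 60, height: 18 },
  { x: 80, y: 20, width: 90, height: 18 },
]);
// points[0] → { x: 30, y: 29 }
```

In the real app these 2-D image coordinates would still need to be projected into the HoloLens's 3-D scene before the points are rendered.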
How we built it
The project has two portions. First is the HoloLens portion: C# code captures a picture of what the wearer is viewing and sends it to our server via POST request. The second portion, the server, is built with Node.js and Express.js and uses the Microsoft Computer Vision API to return the text found in the picture, along with its position.
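On the server side, the parsing step might look like the sketch below. It assumes the classic Computer Vision OCR response shape (regions → lines → words, each word carrying a `"left,top,width,height"` bounding-box string); those field names come from the public API documentation, while the helper itself and its flattened output format are our own illustration.

```javascript
// Sketch: flatten a Computer Vision OCR response into a list of words with
// numeric bounding boxes, ready for focus-point placement.
function extractWords(ocrResponse) {
  const words = [];
  for (const region of ocrResponse.regions || []) {
    for (const line of region.lines || []) {
      for (const word of line.words || []) {
        // boundingBox is a comma-separated "left,top,width,height" string.
        const [x, y, width, height] = word.boundingBox.split(',').map(Number);
        words.push({ text: word.text, x, y, width, height });
      }
    }
  }
  return words;
}

// Example with a minimal fake response:
const sample = {
  regions: [{
    lines: [{
      words: [
        { boundingBox: '10,20,60,18', text: 'read' },
        { boundingBox: '80,20,90,18', text: 'better' },
      ],
    }],
  }],
};
const words = extractWords(sample);
// words[0] → { text: 'read', x: 10, y: 20, width: 60, height: 18 }
```

The Express route would simply call the Computer Vision endpoint with the uploaded image, run the response through a helper like this, and return the word list to the HoloLens.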
Challenges we ran into
Those of us working on the HoloLens portion weren't familiar with C#, and those of us working on the web server weren't that familiar with Node/Express. It took a while to get up to speed with those languages and implement some of the trickier functionality.
Accomplishments that we're proud of
It was everyone's first time building a HoloLens app! We also did a huge amount of research and gathered a lot of first-hand input from our communities.
What we learned
A lot about dyslexia, and about C#, Node, and Express. We did a ton of research on dyslexia beforehand and ended up with almost 10 pages of notes on it.
What's next for Zelixa
With more time, we would speed up the processing of images and their delivery to the HoloLens, extend the app to other reading difficulties, and build a better UX.