Human-Centered Design: raising awareness of the user browsing experience
Our Neural Network in Action
A member of our team shared a touching story about a family member with a visual disability who struggles to use the internet and computers daily. Accessibility is something most of us take for granted, so we wanted to create a whole new way to interact with the web. This led us to build a voice-based extension, which we later improved by adding AI that learns from our browsing behavior to better predict what we want to see.
What it does
Once the extension is activated in the browser, it waits until it receives a voice command and executes it. A neural network then analyzes our usage patterns and trains a model on our online behavior.
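The loop described above can be sketched as a small command router: match a recognized transcript to a handler, run it, and log the interaction so a model can later train on usage patterns. This is only an illustrative sketch; all names here are assumptions, not the extension's actual API.

```javascript
// Hypothetical sketch of the command loop: dispatch a spoken transcript
// to a registered handler and log it as future training data.
class VoiceCommandRouter {
  constructor() {
    this.commands = new Map(); // phrase -> handler
    this.history = [];         // logged interactions for later training
  }

  register(phrase, handler) {
    this.commands.set(phrase.toLowerCase(), handler);
  }

  // Called with the transcript from a speech recognizer
  // (e.g. the browser's Web Speech API).
  dispatch(transcript) {
    const phrase = transcript.trim().toLowerCase();
    const handler = this.commands.get(phrase);
    if (!handler) return false;
    handler();
    this.history.push({ phrase, time: Date.now() }); // training data
    return true;
  }
}

// Usage: register a navigation command and dispatch a transcript.
const router = new VoiceCommandRouter();
router.register("open notifications", () => console.log("navigating..."));
router.dispatch("Open Notifications"); // matches case-insensitively
```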
How we built it
Challenges we ran into
- Image processing
- Communicating between different programming languages and various APIs
- Reverse-engineering the Facebook front end (DOM)
- Difficulties integrating the Facebook API
Accomplishments that we're proud of
We're proud that in 36 hours we managed to build a reinforcement learning neural network from scratch and create a revolutionary voice interface that changes the way one can interact with the web.
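For readers unfamiliar with reinforcement learning, the core idea can be illustrated with the textbook tabular Q-learning update. This is only the standard update rule, not our from-scratch network; the states and actions below are invented for illustration.

```javascript
// Illustrative tabular Q-learning update (textbook rule, not the team's code):
// Q(s,a) += alpha * (reward + gamma * max_a' Q(s',a') - Q(s,a))
function qUpdate(Q, state, action, reward, nextState, alpha = 0.1, gamma = 0.9) {
  const key = `${state}|${action}`;
  const current = Q.get(key) ?? 0;
  // Best estimated value of the next state over all known actions.
  let bestNext = 0;
  for (const [k, v] of Q) {
    if (k.startsWith(`${nextState}|`)) bestNext = Math.max(bestNext, v);
  }
  Q.set(key, current + alpha * (reward + gamma * bestNext - current));
  return Q.get(key);
}

// Usage: reward a predicted action taken after a voice command
// (state/action names here are made up for the example).
const Q = new Map();
qUpdate(Q, "on_facebook", "open_feed", 1, "reading_feed"); // → 0.1
```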
What we learned
What's next for Hi-Oid!
While our prototype only navigates certain Facebook functions, our extension could potentially control any website by voice. Our future vision is to create an open-source wrapper where other developers can code their own voice interactions in various ways, not just for the web but also for different hardware.
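One way such a wrapper's plugin API might look: developers register voice interactions, and the wrapper routes recognized phrases to whichever plugin claims them. Every identifier below is an assumption sketching the idea, not a committed design.

```javascript
// Hypothetical plugin API for the proposed open-source wrapper:
// plugins declare what phrases they match and what to do with them.
function createVoiceWrapper() {
  const plugins = [];
  return {
    use(plugin) {
      plugins.push(plugin);
    },
    handle(transcript) {
      const phrase = transcript.trim().toLowerCase();
      for (const plugin of plugins) {
        if (plugin.matches(phrase)) {
          return plugin.run(phrase);
        }
      }
      return null; // no plugin claimed the phrase
    },
  };
}

// Usage: a plugin need not target the web at all; it could drive hardware.
const wrapper = createVoiceWrapper();
wrapper.use({
  matches: (p) => p.startsWith("turn on"),
  run: (p) => `hardware: ${p.replace("turn on ", "")} -> ON`,
});
wrapper.handle("Turn on the lamp"); // → "hardware: the lamp -> ON"
```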