We are interested in making a positive impact with technology. In this hack, we were motivated by the idea of helping all individuals, especially the physically impaired, interact more effectively with technology. This was what inspired us to create Pure Interaction.

Pure Interaction is, at its heart, an inclusive Human-Computer Interaction system. Through eye-gaze estimation, computer vision, and speech recognition, we demonstrate a product that allows individuals to interact more effectively with computers. We show how combining these three modalities of interaction can lead to a more immersive experience with technology.

To build Pure Interaction, we extended the eye-gaze algorithms produced by xLabs, and combined this input modality with human speech, physical action, and emotion sensing, using the Microsoft CNTK framework. Machine learning algorithms were developed in Python, and eye-gaze estimation was implemented in JavaScript. Developing a Chrome plugin to extend the eye-gaze algorithms was certainly challenging at points, but we eventually resolved the issues!
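The core idea of fusing modalities can be sketched in Python: gaze fixes the on-screen target, and speech supplies the command to apply to it. This is a minimal illustrative sketch, not code from the Pure Interaction codebase; all names, element layouts, and coordinates are hypothetical.

```python
# Hypothetical sketch: gaze selects the target, speech selects the action.
# All element names and bounding boxes below are illustrative assumptions.

def element_at(gaze_x, gaze_y, elements):
    """Return the name of the element whose bounding box contains the gaze point."""
    for name, (x, y, w, h) in elements.items():
        if x <= gaze_x <= x + w and y <= gaze_y <= y + h:
            return name
    return None

def fuse(gaze, command, elements):
    """Combine a gaze estimate (x, y) with a recognized speech command
    into a (command, target) action, or None if gaze hits no element."""
    target = element_at(gaze[0], gaze[1], elements)
    if target is None:
        return None
    return (command, target)

# Example screen layout: name -> (x, y, width, height)
elements = {
    "search_box": (10, 10, 300, 25),
    "submit_button": (100, 200, 80, 30),
}

print(fuse((120, 215), "click", elements))  # gaze lands on the submit button
```

In the real system the gaze estimate would come from the xLabs Chrome plugin and the command from a speech-recognition model, but the fusion step reduces to this kind of lookup: resolve the gaze point to a target, then apply the spoken command to it.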

We're most proud of how Pure Interaction combines different input modalities to create natural HCI, and we believe the software has great potential to help physically impaired individuals.
