Hearing stories from people who, because of physical disabilities, have not been able to take full advantage of the wonders of technology, we recognized the need for an easier way to navigate the web. A cross-platform touchscreen lets users surf the web with ease no matter what platform they are on.

What it does

Empowers the physically challenged to browse the web with a touchscreen/pad that recognizes many gestures and uses Watson, triggered by those gestures, to transform the data on the webpage into something more easily understood.
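As a rough illustration of the gesture-triggered behavior described above, a recognized gesture can be mapped to a page action through a simple dispatch table. The gesture names and actions below are assumptions for illustration, not the project's actual gesture set:

```python
# Hypothetical gesture-to-action mapping; names are illustrative only.
GESTURE_ACTIONS = {
    "swipe_up": "scroll_up",
    "swipe_down": "scroll_down",
    "two_finger_tap": "summarize_page",  # e.g. would trigger a Watson transform
    "pinch_out": "zoom_in",
}

def dispatch(gesture: str) -> str:
    """Return the page action for a recognized gesture, or a no-op."""
    return GESTURE_ACTIONS.get(gesture, "noop")
```

Keeping the mapping in a plain table like this would also make it easy to let users add or remove gestures later.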

How we built it

A Python & Node.js server, a Chrome extension written in HTML & JS, and Firebase.
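In an architecture like this, the Chrome extension and the backend need an agreed-on wire format for gesture events. A minimal sketch of such a format, with field names that are assumptions rather than the project's actual schema:

```python
import json

# Hypothetical wire format for gesture events sent from the Chrome
# extension to the backend; field names are illustrative assumptions.
def encode_gesture_event(gesture: str, x: int, y: int) -> str:
    """Serialize a gesture event as JSON for transmission."""
    return json.dumps({"type": "gesture", "name": gesture, "pos": [x, y]})

def decode_gesture_event(payload: str) -> dict:
    """Parse a gesture event, rejecting non-gesture messages."""
    event = json.loads(payload)
    if event.get("type") != "gesture":
        raise ValueError("not a gesture event")
    return event
```

JSON keeps the extension (JS) and the Python/Node.js server sides trivially interoperable.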

Challenges we ran into

Interpreting touchscreen data efficiently: the touchscreen stores its readings in 2-dimensional arrays, which makes data extraction inefficient if not handled carefully.
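One way to keep per-frame extraction cheap is to compute what you need from the 2-D pressure matrix in a single pass instead of re-scanning it. A minimal sketch, assuming a list-of-rows frame and an illustrative pressure threshold (not the project's actual values):

```python
def touch_centroid(frame, threshold=10):
    """Pressure-weighted centroid of a 2-D touch frame (list of rows).

    A single pass over the matrix avoids repeated re-scans when frames
    arrive at a high sample rate; the threshold filters sensor noise.
    Returns (x, y) in cell coordinates, or None if nothing is pressed.
    """
    total = sx = sy = 0
    for y, row in enumerate(frame):
        for x, p in enumerate(row):
            if p > threshold:
                total += p
                sx += x * p
                sy += y * p
    if total == 0:
        return None
    return (sx / total, sy / total)
```

Feeding a stream of centroids (rather than raw matrices) into gesture recognition keeps the downstream logic simple.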

Accomplishments that we're proud of

Recognizing gestures and manipulating the webpage accordingly.

What we learned

Quickly accessing and interpreting data stored in matrices, building a Chrome extension and designing its user interface and interactions, and integrating the Synaptic and Watson APIs.

What's next for Chrome Touch

Implement Bluemix text and image recognition for any webpage, to make an impact on the lives of those with physical disabilities. Implement more gestures and features, giving users the option to add or remove gestures tailored to their needs.
