Usability testing is expensive and inaccessible, so we aim to replace large testing labs with AI.

What it does & How we built it

It watches a tester navigate a website, listening to their spoken thoughts and tracking their facial expressions and gaze. It uses speech-to-text (Azure Speech API), LUIS (Azure Language Understanding), sentiment analysis (Azure Text Analytics API), and estimated demographics and facial expressions (Azure Face API) to gather data on how the test is going, then stores each frame in Azure Cosmos DB. Clicks, cursor position, and browser navigation are all tracked with Selenium and pynput.
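The per-frame documents written to Cosmos DB could be modeled roughly like this. This is a minimal sketch: the `Frame` class and its field names are our own illustration, not the project's actual schema.

```python
from dataclasses import asdict, dataclass, field
import json
import time


@dataclass
class Frame:
    """One captured moment of a usability session (hypothetical schema)."""
    timestamp: float
    transcript: str = ""        # speech-to-text output for this window
    sentiment: float = 0.5      # 0.0 = negative .. 1.0 = positive
    emotion: str = "neutral"    # dominant facial expression
    cursor: tuple = (0, 0)      # last known cursor position
    url: str = ""               # page the tester is currently on
    clicks: list = field(default_factory=list)

    def to_document(self) -> dict:
        """Shape the frame as a JSON-serializable document for storage."""
        doc = asdict(self)
        doc["cursor"] = list(doc["cursor"])  # tuples aren't native JSON
        return doc


frame = Frame(timestamp=time.time(),
              transcript="where is the login button?",
              sentiment=0.21, emotion="confused",
              url="https://example.com/home")
print(json.dumps(frame.to_document()))
```

Each tracker (speech, face, input) fills in its slice of the frame, and the whole document is written out once per capture interval.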

All of this data can then be accessed from an admin dashboard (written in React and Node.js/Express) with data visualization (Chart.js), sorting, and transcripts interleaved with navigation events.
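A chart like "average sentiment over time" needs a small aggregation step between the raw frames and the plot. Here is a hedged sketch: the frame shape and per-minute bucketing are assumptions for illustration, not the dashboard's actual API.

```python
from collections import defaultdict


def sentiment_by_minute(frames):
    """Bucket per-frame sentiment scores into one average per minute,
    the kind of series a line chart on the dashboard could plot."""
    buckets = defaultdict(list)
    for f in frames:
        buckets[int(f["timestamp"] // 60)].append(f["sentiment"])
    return {minute: sum(vals) / len(vals)
            for minute, vals in sorted(buckets.items())}


frames = [
    {"timestamp": 5,  "sentiment": 0.9},
    {"timestamp": 30, "sentiment": 0.7},
    {"timestamp": 65, "sentiment": 0.2},
]
print(sentiment_by_minute(frames))  # one averaged point per minute
```

The same grouping idea extends to clicks per page or emotion counts per session for the other dashboard views.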

Challenges we ran into

The Azure Python SDK had a lot of module-compatibility issues and wouldn't work on one team member's computer regardless of the virtual-environment settings. Making all of the APIs, Selenium, and input tracking run in parallel was also difficult to manage.
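One way to tame that kind of parallelism is a producer/consumer layout: each tracker runs on its own thread and pushes events into a shared queue, and a single consumer drains it so storage writes never race each other. This is a simplified sketch of the pattern, not the project's actual code.

```python
import queue
import threading

events = queue.Queue()


def tracker(name, samples):
    # Stand-in for a real event source (Selenium polling, a pynput
    # listener, an Azure API call): each just pushes into the queue.
    for sample in samples:
        events.put((name, sample))


threads = [
    threading.Thread(target=tracker, args=("clicks", [(10, 20), (30, 40)])),
    threading.Thread(target=tracker, args=("speech", ["where is checkout?"])),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The single consumer sees an interleaved but thread-safe event stream.
collected = []
while not events.empty():
    collected.append(events.get())
print(len(collected))
```

Because `queue.Queue` is thread-safe, the trackers never need to share state directly, which sidesteps most of the coordination headaches.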

Accomplishments that we're proud of

Getting informative data about the user testing the site and being able to analyze it with Azure APIs. We learned what Azure is, and how to interface with it from Python.

Built With

azure, chart.js, express.js, node.js, python, react, selenium
