Muse is an awesome technology that grants access to how we think. Wouldn't it be great to get brain data when testing interface designs, program features, how a movie feels, or a game's learning curve?
What it does
Shows brain-activity data alongside a screen capture to analyse the behavior of users testing an app, a video game and more, using the Muse headband.
How I built it
We used a Python framework (TurboGears) as a backend to deliver the data streamed from the Muse and a desktop app to a web dashboard interface.
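The streaming path can be sketched as a producer/consumer pipeline: samples come in from the headband side and are serialized into JSON frames for dashboard clients. This is a minimal stdlib-only sketch; `read_sample()` is a hypothetical stand-in for whatever the Muse SDK actually provides, and the channel names and units are assumptions.

```python
import asyncio
import json
import random
import time

def read_sample():
    """Hypothetical stand-in for a Muse SDK read: one sample for the
    four EEG channels (TP9, AF7, AF8, TP10), in microvolts."""
    return {
        "timestamp": time.time(),
        "eeg": [random.uniform(-100.0, 100.0) for _ in range(4)],
    }

async def stream_samples(queue, n_samples):
    """Producer: push samples into the queue, as the backend would
    push frames received from the desktop app."""
    for _ in range(n_samples):
        await queue.put(read_sample())
    await queue.put(None)  # sentinel: end of stream

async def serialize_for_clients(queue):
    """Consumer: turn each sample into the JSON payload a dashboard
    client would receive (e.g. over a WebSocket)."""
    payloads = []
    while (sample := await queue.get()) is not None:
        payloads.append(json.dumps(sample))
    return payloads

async def main():
    queue = asyncio.Queue()
    producer = asyncio.create_task(stream_samples(queue, 5))
    payloads = await serialize_for_clients(queue)
    await producer
    return payloads

payloads = asyncio.run(main())
print(len(payloads))  # 5 JSON frames ready for the dashboard
```

In the real app the consumer would write each payload to connected WebSocket clients instead of collecting them in a list.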
Challenges I ran into
- Learning how to use the information provided by the Muse.
- New technologies and languages (networking, backend development)
- Making many languages and frameworks work together
- Packaging the solution so it can be used by many (Dockerfile)
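The packaging could look something like the following Dockerfile sketch. The base image, paths, port, and entry-point name are all illustrative assumptions, not the project's actual file.

```dockerfile
# Illustrative only: file names, port and entry point are assumptions.
FROM python:3
WORKDIR /qamuse
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8080
CMD ["python", "app.py"]
```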
Accomplishments that I'm proud of
- Learning so many things in a short period
What I learned
- How to use WebSockets and FTP/SFTP file upload
- A new language: Python
- Shell usage
- How to build a web server from scratch
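"Building a web server from scratch" can be illustrated with plain sockets: accept a connection, read the HTTP request, and write a well-formed response back. This is a minimal sketch, not the project's actual server; the placeholder body text is made up.

```python
import socket
import threading

def handle(conn):
    """Read one HTTP request and answer with a fixed plain-text page."""
    conn.recv(1024)  # read (and ignore) the request for this sketch
    body = "QAmuse dashboard placeholder"
    response = (
        "HTTP/1.1 200 OK\r\n"
        "Content-Type: text/plain\r\n"
        f"Content-Length: {len(body)}\r\n"
        "Connection: close\r\n"
        "\r\n"
        f"{body}"
    )
    conn.sendall(response.encode("utf-8"))
    conn.close()

def serve_once(host="127.0.0.1"):
    """Accept a single connection in a background thread; returns the
    OS-assigned port so a client can reach the server."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, 0))
    srv.listen(1)
    bound_port = srv.getsockname()[1]

    def run():
        conn, _ = srv.accept()
        handle(conn)
        srv.close()

    threading.Thread(target=run, daemon=True).start()
    return bound_port

# Quick self-check: request the page from the same process.
port = serve_once()
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\n\r\n")
reply = client.recv(4096).decode("utf-8")
client.close()
print(reply.splitlines()[0])  # HTTP/1.1 200 OK
```

A framework like TurboGears handles the same request/response cycle, plus routing, templating and error handling, on top of this socket layer.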
What's next for QAmuse
- Designing a dashboard for tablets and phones.
- Add other metrics:
  - Eye motion tracking (with a webcam/camera)
  - Mouse and keyboard heatmaps
  - Ability to see many users at one time
  - Pulse and body motion
  - Live data