As technology advances, we begin to see a trend: from clicking buttons to touch screens, from touch screens to motion sensors, interfaces demand less and less of the user, making them easier to use. The natural next step is using thought itself as a source of control. Our team had the opportunity to take part in MHacks 6 and was given access to Muse, an EEG headset. We decided to take this chance to make an app that would connect to the language of the world: music.
What it does
Museic is an application that uses the Muse headset to detect the user's level of excitement and, from that data, builds a personalized music playlist from Spotify.
How we built it
We made use of the Muse Research Tools SDK: muse-io to read data from the Muse over Bluetooth, and node-osc to feed that data into our Node/Electron GUI application. To find songs, we use the Spotify API, proxying to it through a custom Azure backend.
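As a sketch of what that bridge receives: muse-io publishes band-power readings as OSC messages, one float per EEG channel, on addresses such as `/muse/elements/alpha_absolute` (the exact paths depend on the SDK version, so treat them as assumptions). node-osc hands each message to a callback; here that callback is modeled as a plain function so the sketch stays self-contained:

```javascript
// Handling the OSC messages muse-io emits, assuming addresses like
// "/muse/elements/alpha_absolute" carrying one float per EEG channel
// (four channels on the Muse headband). In the real app, node-osc's
// Server invokes a callback with (address, args); this plain function
// stands in for that callback.

const latest = { alpha: null, beta: null };

function handleOscMessage(address, args) {
  // Average the per-channel band power into one number per band.
  const mean = args.reduce((a, b) => a + b, 0) / args.length;
  if (address === '/muse/elements/alpha_absolute') latest.alpha = mean;
  if (address === '/muse/elements/beta_absolute') latest.beta = mean;
}

// Example: a four-channel alpha reading averages to 0.5.
handleOscMessage('/muse/elements/alpha_absolute', [0.4, 0.5, 0.6, 0.5]);
```

Averaging across channels is a simplification; per-channel values could also be weighted or filtered before use.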
Challenges I ran into
This project put each of the team members into new situations and environments. For each of us, challenges appeared around every corner.
This was my first time writing an Electron app, so figuring out Electron's interprocess communication (IPC) architecture was the first challenge. I eventually figured out that we could use the IPC and webContents modules to perform bi-directional asynchronous communication. Additionally, the Muse SDK for Linux did not appear to work, so early on I primarily built the frontend.
Additionally, I was the most experienced frontend developer on the team at this hackathon, and I wanted each of my teammates to build significant portions of the app, so I spent most of my time tutoring them. I hope they feel like they did something, because they were awesome!
Accomplishments that I'm proud of
Simply completing this project within the time limit was an accomplishment that every one of us was very proud of. Along the way, we also each had individual accomplishments to be proud of.
Ningxian Fan: When we first started this project, we thought that alpha-wave data alone would be a good enough indicator of a person's level of excitement. However, as we started testing, we found a correlation, but not as strong a one as we wanted. Some research turned up that alpha waves dominate while a person is meditating, whereas beta waves dominate during conscious thinking. I shared this with the team, and we started recording the ratio of beta to alpha waves instead. We then found a very strong correlation between that ratio and the user's excitement level.
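The ratio check itself is just a division of band powers; a minimal sketch, where the 1.0 threshold is illustrative rather than the value we actually tuned:

```javascript
// Excitement estimate from EEG band powers: beta dominates conscious,
// engaged thinking while alpha dominates relaxed/meditative states, so
// a higher beta/alpha ratio suggests higher excitement.
function excitementRatio(betaPower, alphaPower) {
  if (alphaPower <= 0) return 0; // guard against bad sensor readings
  return betaPower / alphaPower;
}

function isExcited(betaPower, alphaPower, threshold = 1.0) {
  // threshold is illustrative; ours was tuned against live readings
  return excitementRatio(betaPower, alphaPower) > threshold;
}
```

Using a ratio rather than raw alpha power also normalizes away some per-user variation in overall signal strength.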
Anjana Rajagopal: I am very proud of the learning experience that this project and my peers gave me. I'm happy to have gotten the opportunity to work so well in a team and to get to know several different languages and programming tools. It showed me a side of computer science that I could not have imagined, and I had the opportunity to build something big and unique. Our combined intelligence, teamwork, determination, and passion really paid off in the end, and it leaves me yearning to come back for more experiences like this.
Vinay Hiremath: I was definitely proud of the experience I gained working with more complex APIs, in this case ones requiring OAuth authentication, which I did not have much experience with. By persisting and reading through various pieces of documentation, I was able to create a backend API that fulfilled the needs of the project: responding with songs of various mood types, discovered by searching playlists with relevant keywords.
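One way to picture that keyword search: map an excitement score to a search term and build the query against the Spotify Web API's search endpoint (`GET /v1/search` with `type=playlist`). The score buckets and keywords below are illustrative, not the ones our backend actually uses:

```javascript
// Map an excitement score in [0, 1] to an illustrative playlist keyword.
// Our real backend's buckets and terms may differ.
function moodKeyword(score) {
  if (score < 0.33) return 'chill';
  if (score < 0.66) return 'focus';
  return 'party';
}

// Build a playlist-search URL for the Spotify Web API. The real request
// would also carry an OAuth bearer token in the Authorization header.
function playlistSearchUrl(score) {
  const q = encodeURIComponent(moodKeyword(score));
  return `https://api.spotify.com/v1/search?q=${q}&type=playlist&limit=10`;
}
```

Proxying these calls through the Azure backend keeps the OAuth credentials off the client.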
What I learned
Nathan Moos: I learned how to process the massive brain-wave data stream, how to tune a threshold loop when under pressure, how the Electron multi-process architecture works, and how we can make use of programs written in a variety of languages via the Open Sound Control (OSC) protocol. Additionally, I learned more about how to teach people languages, runtimes, and systems they are not familiar with.
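The "threshold loop" amounts to smoothing the noisy brain-wave stream before comparing it to a cutoff, so brief spikes don't flip the excitement state. A sketch using an exponential moving average; the smoothing factor and threshold are illustrative, not the values we actually tuned:

```javascript
// Smooth a noisy sample stream with an exponential moving average (EMA),
// then compare the smoothed value to a cutoff. The smoothing factor and
// threshold here are made-up constants, not our tuned values.
function makeDetector(smoothing = 0.25, threshold = 2.0) {
  let ema = null;
  return function feed(sample) {
    ema = ema === null ? sample : smoothing * sample + (1 - smoothing) * ema;
    return ema > threshold; // true once sustained activity exceeds cutoff
  };
}

const detect = makeDetector();
detect(1.0); // baseline: ema = 1.0, below threshold
detect(4.0); // single spike smoothed to ema = 1.75, still below threshold
```

Tuning under pressure then comes down to adjusting just two knobs: how aggressively to smooth, and where to place the cutoff.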
Vinay Hiremath: I learned to use the Express web framework and better ways of handling OAuth with Node.js, but also a lot more about the intricacies of the brain waves that the Muse device captures. There was a large stream of data that needed to be processed efficiently in real time, and I learned more about the significance of each frequency band.
What's next for Museic
- Increasing accuracy for the mood detection feature
- Improving the UI to facilitate generating client OAuth tokens
- A method for users to receive an email with their generated music