Inspiration

As technology advances, a trend emerges: from pressing buttons to touch screens, from touch screens to motion sensors, interfaces demand less and less of the user, making them ever easier to use. The natural next step is control by thought alone. Our team had the opportunity to take part in MHacks 6 and was given access to the Muse, an EEG headset. We decided to take this chance to build an app that connects that control to the universal language of the world: music.

What it does

Museic is an application that uses the Muse headset to detect the user's level of excitement and, from that data, builds a personalized music playlist from Spotify.

How we built it

We made use of the Muse Research Tools SDK. We used muse-io to read data from the Muse over Bluetooth, and node-osc to connect it to our Node/Electron GUI application. To find songs, we used the Spotify API, proxied through a custom Azure backend.
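
As a concrete illustration, here is roughly what the node-osc side of that pipeline looks like. It assumes muse-io is streaming to localhost (e.g. muse-io --osc osc.udp://localhost:5000); the port is arbitrary and the handling is a sketch, not our production code.

    // Minimal sketch: receive Muse band-power messages over OSC in Node.
    const { Server } = require('node-osc');

    const oscServer = new Server(5000, '0.0.0.0');

    oscServer.on('message', (msg) => {
      const [address, ...args] = msg; // args: one value per EEG channel
      if (address === '/muse/elements/alpha_absolute') {
        console.log('alpha:', args);
      } else if (address === '/muse/elements/beta_absolute') {
        console.log('beta:', args);
      }
    });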

Challenges I ran into

This project put each of us into new situations and environments, and challenges appeared around every corner.

Ningxian Fan: For me, this project was very challenging because I was still relatively new to coding. One of the parts I had to finish was written in JavaScript, a language I had no previous experience with. My biggest challenge was learning enough JavaScript to do some simple coding in it within a short amount of time.

Anjana Rajagopal: This project was a huge learning experience for me. I had the opportunity to expand my knowledge of JavaScript/Node, GitHub, and audio-oriented programming. Although it was challenging, I learned a lot and got to experience all the different possibilities that can come out of projects like these. My role was mainly front end: working on the audio functionality, the changes in mood, fades between songs, and fetching songs via URL. Nathan taught me so much during this project, and I'm coming out of MHacks a more confident hacker, programmer, and problem-solver.

Nathan Moos: This was my first time writing an Electron app, so figuring out Electron's interprocess communication (IPC) architecture was the first challenge. I eventually figured out that we could use the IPC module together with webContents to perform bi-directional asynchronous communication. The Muse SDK for Linux also did not appear to work, leaving me to focus primarily on the frontend early on. Finally, as the most experienced frontend developer on the team, I wanted each of my teammates to build significant portions of the app, so I spent most of my time tutoring them. I hope they feel like they did something, because they were awesome!
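
For readers unfamiliar with Electron, the bi-directional pattern described above looks roughly like the sketch below; the channel names and payloads are invented for illustration.

    // Main process: answer renderer requests and push unsolicited updates.
    const { app, BrowserWindow, ipcMain } = require('electron');

    let win;
    app.on('ready', () => {
      win = new BrowserWindow({ width: 800, height: 600 });
      win.loadURL(`file://${__dirname}/index.html`);
    });

    // Renderer -> main -> renderer: asynchronous request/reply.
    ipcMain.on('request-song', (event, mood) => {
      // look up a track for this mood, then reply asynchronously
      event.sender.send('song-ready', { mood });
    });

    // Main -> renderer push (no request needed), e.g. on new EEG data:
    // win.webContents.send('mood-update', { excitement: 0.7 });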

Vinay Hiremath: This project was a wonderful experience, though it was not without challenges. While building the Node.js backend, which integrates with Spotify to provide curated results based on mood, I delved into JavaScript constructs such as nested promises, which at first produced some difficult-to-trace errors that were resolved only after more extensive debugging. Closer to completion, we also ran into some inconsistencies with the Spotify API, likely due to copyright restrictions, that necessitated a tricky fix. Overall, I was able to use the Express web framework for the first time to create a simple yet reliable API.
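
The shape of that Express API is roughly the sketch below. The route, query parameter, and searchSpotifyPlaylists helper are hypothetical stand-ins, but the flat promise chain is the pattern that replaced the hard-to-trace nesting.

    const express = require('express');
    const app = express();

    // Hypothetical helper: search Spotify playlists by mood keyword.
    function searchSpotifyPlaylists(mood) {
      return Promise.resolve([{ title: 'placeholder track', mood }]);
    }

    app.get('/songs', (req, res) => {
      const mood = req.query.mood; // e.g. 'excited' or 'calm'
      searchSpotifyPlaylists(mood)
        .then((tracks) => res.json(tracks))  // one flat chain, no nesting
        .catch((err) => res.status(500).json({ error: err.message }));
    });

    app.listen(3000);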

Accomplishments that I'm proud of

Simply completing this project within the time limit was an accomplishment every one of us was very proud of. Throughout the production period, we each also had individual accomplishments to be proud of.

Ningxian Fan: When we first started this project, we thought that alpha-wave data alone would be a good enough indicator of a person's level of excitement. However, once we started testing, we found a correlation, but not as strong a correlation as we wanted. I did some research and found that alpha waves dominate when a person is meditating, while beta waves dominate during conscious thinking. I shared this with the team and we tried recording the ratio of beta to alpha waves instead. We then found a very strong correlation between that ratio and the user's excitement level.
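
In code, the signal is as simple as it sounds; something like the sketch below, where the arguments are the per-sensor band powers received over OSC (the function names are ours, for illustration only).

    // Average per-channel band power, then take beta/alpha as the signal.
    function avg(values) {
      return values.reduce((sum, v) => sum + v, 0) / values.length;
    }

    function excitement(alphaChannels, betaChannels) {
      return avg(betaChannels) / avg(alphaChannels); // higher => more excited
    }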

Anjana Rajagopal: I am very proud of the learning experience that this project and my peers gave me. I'm happy to have gotten the opportunity to work so well in a team and to get to know several different languages and programming applications. It showed me a new side of computer science that I could not have imagined, and I had the opportunity to build something big and unique. It really paid off in the end through our combined intelligence, teamwork, determination, and passion, and it leaves me yearning to come back for more of these experiences.

Nathan Moos: I am very proud of the teammates I tutored -- they managed to pick up JavaScript, Node, and Electron within the first four hours of the hackathon. Many of them only had experience from the Intro to Computing course at the University of Michigan, so they not only learned a new language, but also powerful, valuable techniques for building apps.

Vinay Hiremath: I was definitely proud of the experience I gained working with more complex APIs that, in this case, required OAuth authentication, something I did not have much experience with. Through persistence and reading through various pieces of documentation, I was able to create a backend API that fulfilled the needs of the project: responding with songs of various mood types, discovered by searching playlists with relevant keywords.
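
As one example of what that OAuth dance involves, the sketch below shows Spotify's documented client-credentials token exchange, assuming a Node runtime with built-in fetch; it is not necessarily the exact flow our backend shipped with.

    // Exchange app credentials for a bearer token (Spotify Accounts service).
    async function getToken(clientId, clientSecret) {
      const auth = Buffer.from(`${clientId}:${clientSecret}`).toString('base64');
      const res = await fetch('https://accounts.spotify.com/api/token', {
        method: 'POST',
        headers: {
          Authorization: `Basic ${auth}`,
          'Content-Type': 'application/x-www-form-urlencoded',
        },
        body: 'grant_type=client_credentials',
      });
      const { access_token } = await res.json();
      return access_token; // then call the Web API with "Authorization: Bearer <token>"
    }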

What I learned

Ningxian Fan: For this hack, I learned some basic JavaScript as well as some interesting new facts about brain waves. I had always thought that every computer language was completely different from the others, but after receiving help with JavaScript, I found that it was very similar to C++.

Anjana Rajagopal: I learned so much from this. Not only did I learn Node, JavaScript, Electron, and Git, but I also learned about many of the fascinating tools and details of the Muse. I learned to work better as a teammate, keep myself motivated throughout the 36 hours, and do the best that I could with all that I had, and I had a lot of fun during the whole process. I got a lot of help throughout this project, but I got so much out of it.

Nathan Moos: I learned how to process the massive brain-wave data stream, how to tune a threshold loop when under pressure, how the Electron multi-process architecture works, and how we can make use of programs written in a variety of languages via the Open Sound Control (OSC) protocol. Additionally, I learned more about how to teach people languages, runtimes, and systems they are not familiar with.
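
That threshold loop boils down to two knobs. A standard way to sketch it is an exponential moving average plus a cutoff, as below; the constants are placeholders, not the values we actually tuned on the floor.

    // Smooth the noisy beta/alpha ratio, then compare against a tuned cutoff.
    let smoothed = 0;
    const SMOOTHING = 0.1;  // weight given to each new sample
    const THRESHOLD = 1.5;  // smoothed ratios above this count as "excited"

    function onSample(ratio) {
      smoothed = SMOOTHING * ratio + (1 - SMOOTHING) * smoothed;
      return smoothed > THRESHOLD ? 'excited' : 'calm';
    }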

Vinay Hiremath: I learned to use the Express web framework as well as better ways of handling OAuth with Node.js, and also a lot more about the intricacies of the brain waves that the Muse device captures. There was a large stream of data that needed to be processed efficiently in real time, and I learned more about the significance of each wave type.

What's next for Museic

  • Increasing the accuracy of the mood-detection feature
  • Improving the UI to facilitate generating client OAuth tokens
  • A way for users to receive an email with their generated playlist

Built With

  • node.js
  • electron
  • node-osc
  • express
  • muse
  • spotify
  • azure