Studies have shown that listening to different kinds of music can improve focus, reduce stress, and lift mood during periods of sadness. Given music's powerful impact on mood, music therapy can be used to improve emotional health. A recent review in the World Journal of Psychiatry found music to be a valid therapy with the potential to reduce depression and anxiety and to improve mood, self-esteem, and quality of life.

In these days of social distancing and remote learning and work, it is easy to spend hours at your computer without giving a thought to your mental health. Aura is a simple way to take a step toward a healthier, more productive lifestyle.

What it does

Aura is your personal musical companion, designed to monitor your emotions and generate music to fit your mood. The animated background gradient matches your emotions: a warm-colored background signifies happiness, a cool-colored background sadness, and a rainbow background a neutral mood.

Using your webcam, Aura detects your mood from your facial expression, categorizing it as happy, neutral, or sad. Aura then uses machine learning to generate music that reflects your current emotional state. Aura can also be customized to your preferences for the best possible experience: users can choose to layer drums and nature sounds over the instrumentals.
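face-api.js reports a score for each detected expression, so the detection step boils down to collapsing those scores into Aura's three categories. A minimal sketch of that mapping (the 0.5 threshold and the "everything else is neutral" rule are illustrative assumptions, not necessarily the project's actual logic):

```javascript
// Collapse a face-api.js expressions object (e.g. from
// detectSingleFace(video).withFaceExpressions()) into one of Aura's
// three moods. Threshold and neutral fallback are assumptions.
function classifyMood(expressions) {
  const { happy = 0, sad = 0 } = expressions;
  if (happy >= 0.5 && happy >= sad) return "happy";
  if (sad >= 0.5) return "sad";
  return "neutral"; // surprise, anger, etc. all fall back to neutral
}

classifyMood({ happy: 0.92, sad: 0.01, neutral: 0.05 }); // "happy"
classifyMood({ happy: 0.1, sad: 0.2, neutral: 0.7 });    // "neutral"
```

Collapsing the other expressions into "neutral" keeps the music generation simple while the model only supports three moods.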

Aura also tracks your emotions throughout the day. The 'Analytics' page presents three graphs summarizing the emotions you expressed.
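The tallies behind those graphs could be built with a small aggregation like this (the reading format, plain mood strings, is our assumption):

```javascript
// Count how often each mood was detected during the day. The counts
// can then feed a chart's data array; the reading format is assumed.
function tallyMoods(readings) {
  const counts = { happy: 0, neutral: 0, sad: 0 };
  for (const mood of readings) {
    if (mood in counts) counts[mood] += 1;
  }
  return counts;
}

tallyMoods(["happy", "happy", "sad", "neutral"]);
// → { happy: 2, neutral: 1, sad: 1 }

// In the browser, the counts could drive a Chart.js pie chart, roughly:
// new Chart(ctx, { type: "pie", data: { labels: Object.keys(counts),
//                  datasets: [{ data: Object.values(counts) }] } });
```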

How we built it

Aura was developed with HTML/CSS and Node.js. Facial expression recognition was implemented with face-api.js, and the AI audio is generated by the MusicRNN model from the Magenta.js library. Audio samples of a piano, guitar, and marimba were loaded into the Tone.js Sampler, which plays back the generated notes. Chart.js renders the two line graphs and the pie chart showing the user's expressions throughout the day.
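Roughly, a pipeline like this hands a short seed melody to MusicRNN and plays the continuation through the Sampler. The sketch below shows one way to build a mood-dependent seed in Magenta.js's quantized NoteSequence format; the seed pitches, step counts, temperature, and sample paths are illustrative assumptions, not Aura's actual values:

```javascript
// Build a short seed NoteSequence in Magenta.js's quantized format.
// The pitches chosen per mood are illustrative assumptions.
function seedForMood(mood) {
  const pitches = mood === "happy" ? [60, 64, 67]  // C major triad
                : mood === "sad"   ? [57, 60, 64]  // A minor triad
                : [62, 65, 69];                    // D minor for neutral
  return {
    totalQuantizedSteps: pitches.length * 2,
    quantizationInfo: { stepsPerQuarter: 4 },
    notes: pitches.map((pitch, i) => ({
      pitch,
      quantizedStartStep: i * 2,
      quantizedEndStep: i * 2 + 2,
    })),
  };
}

// In the browser, the seed would then be continued and played back,
// roughly (CHECKPOINT_URL and the sample path are placeholders):
//   const rnn = new mm.MusicRNN(CHECKPOINT_URL);
//   await rnn.initialize();
//   const melody = await rnn.continueSequence(seedForMood("happy"), 32, 1.1);
//   const sampler = new Tone.Sampler({ C4: "samples/piano-C4.mp3" }).toDestination();
//   // schedule each note of `melody` via sampler.triggerAttackRelease(...)
```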

Challenges we ran into

One challenge we ran into was sharing data between the JavaScript files behind different pages. We wanted to display the facial expression data on the analytics page, but the data was collected in the JavaScript file for our home page. To solve this, we stored daily and weekly data in localStorage, where the analytics page could read it back.
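The pattern is simple: serialize the day's counts to JSON under a date-based key, so any page can read them back. A minimal sketch, where `storage` is `window.localStorage` in the browser and the `"aura:"` key prefix is our own assumption:

```javascript
// Persist and retrieve one day's mood counts as JSON. `storage` is
// window.localStorage in the browser; the key prefix is an assumption.
function saveDailyCounts(storage, dateKey, counts) {
  storage.setItem("aura:" + dateKey, JSON.stringify(counts));
}

function loadDailyCounts(storage, dateKey) {
  const raw = storage.getItem("aura:" + dateKey);
  return raw ? JSON.parse(raw) : { happy: 0, neutral: 0, sad: 0 };
}
```

Because both pages read and write the same keys, the analytics page can rebuild its charts without any server round-trip.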

Another challenge we ran into was finding samples for Tone.js: it was difficult to find good audio files that also covered a wide enough pitch range.

Accomplishments that we're proud of

We're proud of the calming and energizing music our AI produces. We are especially pleased with the harmonies it generates and with hearing the audio shift when the user's mood changes.

What we learned

We had never used face-api.js or accessed a computer's webcam before. It was a new and exciting experience to see ourselves reflected on our own screens!

What's next for Aura

face-api.js can also detect anger, disgust, surprise, and fear. We would love to incorporate these emotions into Aura and generate interesting music for each of them!
