Inspiration

Music is already a massive part of everyone's day-to-day life, used for all sorts of purposes, so we know it's a useful tool in and of itself. Music is known to have therapeutic capabilities, but to our knowledge very few tools also leverage direct user feedback to adjust the musical experience toward a goal.

What it does

Our project, in its first incarnation, puts together two pieces of a dynamic musical system driven by the user's heartbeat. Based on the user's heart rate, the musical experience varies in a way that should - in the context of meditation/relaxation - tend to reduce the heart rate, helping to regulate the user's mood.
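As a rough illustration of the feedback idea, heart rate can be mapped to how many musical layers are playing. This is a minimal sketch, not our actual implementation; the names (`RESTING_BPM`, `choose_layer_count`) and the one-layer-per-10-bpm rule are assumptions for illustration only:

```python
# Hypothetical mapping from measured heart rate to target layer count.
RESTING_BPM = 60   # assumed relaxation target, in beats per minute
MAX_LAYERS = 4     # assumed number of pre-composed layers

def choose_layer_count(bpm: float) -> int:
    """Calmer heart rates get the full layered mix; elevated rates
    get a sparser mix, nudging the listener toward relaxation."""
    if bpm <= RESTING_BPM:
        return MAX_LAYERS
    # drop one layer for every 10 bpm above resting, never below 1
    excess = int((bpm - RESTING_BPM) // 10)
    return max(1, MAX_LAYERS - excess)
```

In practice the mapping could be smoothed over time so single noisy readings do not cause abrupt musical changes.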

How we built it

We built this project by combining our individual areas of expertise:

  • Designer
  • Front-end developer
  • Back-end developer
  • AI Music Researcher

We first defined a scope for the project that would be manageable in the allotted time. Then came brainstorming an initial design and agreeing on roughly which components we wished to include, followed by discussion of the engineering elements. Fortunately, team members had some tools and resources handy that formed the backbone of the application:

  • A heart rate simulation with visualization, which served as the base of the final front end
  • A music playback tool that allowed layering of pre-composed musical pieces, composed in such a way that layering would result in pleasant sounds

The application as it stands is a vanilla JavaScript, HTML, and CSS project. The back end is a Python Flask web server that wraps the aforementioned music playback tool, exposing the capability of adding and removing layers of the pre-composed tracks.
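The shape of that wrapper can be sketched roughly as follows. This is a minimal illustration, assuming REST-style endpoints and an in-memory layer set; the route names are our own invention, and the real server drives the playback tool rather than just tracking state:

```python
# Minimal sketch of a Flask wrapper exposing add/remove-layer operations.
# Endpoint names and the bare in-memory state are assumptions.
from flask import Flask, jsonify

app = Flask(__name__)
active_layers = set()  # indices of currently playing pre-composed tracks

@app.route("/layers/<int:layer_id>", methods=["PUT"])
def add_layer(layer_id):
    active_layers.add(layer_id)   # real code would start mixing this track
    return jsonify(sorted(active_layers))

@app.route("/layers/<int:layer_id>", methods=["DELETE"])
def remove_layer(layer_id):
    active_layers.discard(layer_id)  # real code would fade the track out
    return jsonify(sorted(active_layers))
```

The front end can then toggle layers with plain `fetch` calls as the heart-rate signal changes.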

Challenges we ran into

Given that some elements of the final product were written by team members who had never met before, and were developed in completely different contexts, combining the tools was itself a challenge, not least due to the necessary knowledge transfer and convergence of development environments.

Technically, the audio playback tool is a fairly low-level implementation that writes samples directly to the sound device, so manipulating the audio signals required some interesting solutions.

Accomplishments that we're proud of

We managed to implement a programmatic fade-in of audio directly over the raw samples. We managed to combine all the disparate tooling into a functioning application. We managed to incorporate the UI design into the final front end.
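The fade-in amounts to ramping the amplitude of the first samples up from zero. A minimal sketch of the idea, assuming floating-point samples in a plain list (the real code operates on the raw sample buffers fed to the sound device):

```python
# Linear fade-in applied directly to raw samples.
# Sample format (floats in a list) is an assumption for illustration.
def fade_in(samples: list[float], fade_len: int) -> list[float]:
    """Ramp amplitude linearly from 0 to 1 over the first fade_len samples."""
    out = list(samples)
    for i in range(min(fade_len, len(out))):
        out[i] *= i / fade_len  # gain grows from 0.0 toward 1.0
    return out
```

An equal-power (e.g. sine-shaped) ramp is a common refinement over a linear one, since it sounds smoother when cross-fading layers.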

What we learned

We learned about complex characteristics of music as we considered its possible effects on human physiology. We learned about the market space that our concept would fit into.

What's next for RelaxSync

The underlying technology behind music control is quite complex and very much an open research question. The application is conceptual at this point, so refinements to the software elements are required.
