Inspiration

ConductMe was inspired by our collective passion and interest in music. Our initial plan involved using text analysis to generate the appropriate music given a particular story’s tone. We decided, however, to move in the direction of actually conducting music when we realized that we had hardware such as Kinect available to us.

What it does

At its core, ConductMe is about allowing the user to become a conductor of his or her own music.

ConductMe uses the Kinect Motion Capture to track the velocity of a user's hand and calculate the BPM at which he or she is conducting. The measured value is then used to adjust the speed at which the music is played. ConductMe aims to give users more control over the playback of their music, allowing them to experience music more interactively.
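The tempo calculation can be sketched roughly as follows. This is a hypothetical illustration, not ConductMe's actual code: it assumes beat timestamps have already been extracted from the Kinect hand-velocity data (for instance, at each reversal of the hand's vertical direction) and simply averages the intervals between them.

```javascript
// Hypothetical sketch: estimate conducting BPM from timestamps (in ms)
// of detected beats, e.g. moments when the hand's vertical velocity
// changes sign. Names here are illustrative.
function estimateBpm(beatTimesMs) {
  if (beatTimesMs.length < 2) return null; // need at least one interval
  // Average interval between consecutive beats
  let total = 0;
  for (let i = 1; i < beatTimesMs.length; i++) {
    total += beatTimesMs[i] - beatTimesMs[i - 1];
  }
  const avgIntervalMs = total / (beatTimesMs.length - 1);
  return 60000 / avgIntervalMs; // convert ms-per-beat to beats-per-minute
}

// A conductor beating once every 500 ms is conducting at 120 BPM
console.log(estimateBpm([0, 500, 1000, 1500])); // → 120
```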

Construction

The final product consists of a web app for both the client and server ends that integrates with OneDrive and relies on input from Unity/Visual Studio. The web app displays a visualizer that allows users to see the progression and rhythm of the song as it plays; OneDrive was used to store music files, and Unity/Visual Studio was used to process input from the Kinect. The web app was constructed using Node.js; we used libraries that included Express, Busboy, Node-Sass, Jade, and more. We used the standard libraries in Unity/Visual Studio to parse input and communicate with our web app; web sockets were created in both the web app and Visual Studio for data transfer.
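As a rough sketch of the data flow over those sockets (the message shape and field names here are assumptions, not ConductMe's actual protocol), the Unity/Visual Studio side could send the conducted BPM as JSON, and the web app could convert it into a playback rate relative to the song's own BPM:

```javascript
// Hypothetical sketch of the socket message handling on the web-app side.
// Assumes the Kinect side sends messages like {"bpm": 90}; field names
// are illustrative, not ConductMe's actual protocol.
function playbackRateFromMessage(jsonMsg, songBpm) {
  const { bpm } = JSON.parse(jsonMsg);
  const rate = bpm / songBpm;
  // Clamp so a jittery reading can't make playback unlistenable
  return Math.min(2, Math.max(0.5, rate));
}

// Conducting at 90 BPM over a 120 BPM song → play at 0.75x speed
console.log(playbackRateFromMessage('{"bpm": 90}', 120)); // → 0.75
```

The resulting rate could then be assigned to an HTML5 `<audio>` element's `playbackRate` property on the client.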

Challenges we ran into

Only one person had experience with Node.js, and none had experience with any other SDKs that were used; this in itself posed a challenge for us. There were two major components to this project; one consisted of researching a way to accurately calculate the BPM (beats-per-minute) of a song, and the other consisted of constructing a web app to both play songs and interact with users to speed up and slow down playback.

Calculating the BPM posed a challenge for several reasons; the first and foremost is that music formats don't expose lower-level song properties (e.g. BPM!). Eventually, we found a method that relied on filtering for only lower-frequency notes, identifying clusters of notes, and calculating how often these clusters appeared. This was done by taking the raw data for the song and passing it through the HTML5 Web Audio API. This was especially interesting because it revealed the sophistication of the new HTML5 APIs, and it seems like an inspiring future for high-quality media apps on the web.

Constructing the web app was more straightforward but involved more work; we had to build a web app using Node that would post audio files to the server, where the server would store and analyze the files, sending the results back to the client, which would appropriately respond and change playback.
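A minimal sketch of the clustering step in this kind of BPM detection, assuming the samples have already been run through a low-pass filter (e.g. a Web Audio `BiquadFilterNode` rendered via an `OfflineAudioContext`) so that mostly low-frequency energy remains; the thresholds and names are illustrative, not ConductMe's actual code:

```javascript
// Hypothetical sketch: find peaks above `threshold` in low-pass-filtered
// samples, merging peaks closer than `minGapMs` into a single cluster
// (one "note"). Returns the start time of each cluster in seconds.
function findBeatClusters(samples, sampleRate, threshold, minGapMs) {
  const minGap = (minGapMs / 1000) * sampleRate; // min gap in samples
  const clusterTimes = [];
  let lastPeak = -Infinity;
  for (let i = 0; i < samples.length; i++) {
    if (Math.abs(samples[i]) >= threshold) {
      // A peak far enough from the previous one starts a new cluster
      if (i - lastPeak > minGap) clusterTimes.push(i / sampleRate);
      lastPeak = i;
    }
  }
  return clusterTimes;
}

// Demo: a synthetic pulse every 0.5 s at 8 kHz → clusters 0.5 s apart
const rate = 8000;
const samples = new Float32Array(rate * 2);
for (let t = 0; t < 2000; t += 500) samples[Math.round((t / 1000) * rate)] = 1;
const beats = findBeatClusters(samples, rate, 0.5, 100);
console.log(beats); // → [0, 0.5, 1, 1.5]
```

The average spacing between clusters (0.5 s in this demo) then gives the tempo (here, 120 BPM).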

Most challenging of all was working with the Kinect. The majority of our issues at the beginning came from getting set up and becoming familiar with the environment; because of the slow internet (to say the least) and the number and size of SDKs that we had to install to grab input from the Kinect, we weren't able to start working with the Kinect until about twelve hours into the hackathon. Moreover, after spending a few hours with Unity paring down useless objects and writing VB code to parse Kinect input, we found that Unity continually crashed; it became necessary to restart Unity after every single test we ran.

What we learned

Most of what we learned consisted of gaining experience in APIs and languages that we had never used before. It was the first time that any of us had seen C#, Visual Studio, and Unity, and most of us had little to no experience with web development, let alone Node.js and its libraries.

We ran into quite a few Git issues as well and were able to learn more about version control tactics and resolving merge conflicts.

What's next for ConductMe

ConductMe can be applied for both recreational and educational purposes. It can help rising conductors gain awareness of the impact their gestures have on the speed and volume of the ensemble. For the general public, it allows people to adjust music speed, fast forwarding and slowing down at their leisure with a small wave of the hand. This could have applications to audio files of all types. For example, students could use ConductMe to easily play audio lectures aloud at 2x speed. More hand signals could also be integrated into the ConductMe program. A closed fist could signal the program to pause the audio, and an open hand could signal the program to play. A waving hand could be used to switch to the next song.
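The proposed gesture-to-action mapping could be sketched as a simple lookup; the gesture labels and action names here are placeholders, since this feature has not been built yet:

```javascript
// Hypothetical gesture-to-action table for the proposed hand signals.
// All names are illustrative placeholders.
const GESTURE_ACTIONS = {
  closedFist: "pause",
  openHand: "play",
  wave: "nextSong",
};

function actionForGesture(gesture) {
  return GESTURE_ACTIONS[gesture] || null; // ignore unrecognized gestures
}

console.log(actionForGesture("closedFist")); // → "pause"
```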

ConductMe could be influential beyond the audio world if we can link it to video files as well as audio. ConductMe would then empower the user to control the playback speed of videos as well.
