[Gallery: Music Composition in Progress · Matplotlib Frequency Identifications · Physics Formulas for Frequency Analysis]
How many of you like to sing, hum, or play around with music for fun? What if the music you just made could be the next great hit, but you forgot most of the notes? This web application records frequencies from a microphone and maps them onto musical notes, so you never have to worry about forgetting your music. It also helps you compose quickly without having to write out the notes and measures by hand; the application does it all for you in real time!
What it does
This application provides real-time music composition by using a microphone to detect frequencies that fall within the ranges of typical musical notes. On the frontend, a play/pause button starts or pauses the recording. The refresh button lets the user restart the entire composition, so we included a confirmation alert in case it is pressed by mistake. The pencil button lets the user add or edit the title of their masterpiece, which can be changed at any time. The musical-note button lets the user confirm their recording before sharing it with everyone via the download icon. The download feature is a quick one-click process that converts the rendered HTML directly into a PDF, so users can save a composition while starting a new one. The shopping-cart button fetches similar sheet music through the eBay API, giving users inspiration for the particular music they want to write.
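Mapping a detected frequency onto a musical note can be done with the standard equal-temperament formula. This is a minimal sketch (the helper name and note-spelling convention are our assumptions, not taken from the project source):

```python
import math

# Note names in one octave, sharps only (a simplifying assumption).
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_note(freq_hz):
    """Return (note_name, octave) of the nearest equal-tempered pitch.

    Uses the MIDI note formula n = 69 + 12 * log2(f / 440),
    where A4 = 440 Hz corresponds to MIDI note 69.
    """
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return NOTE_NAMES[midi % 12], midi // 12 - 1

print(frequency_to_note(440.0))   # → ('A', 4)
print(frequency_to_note(261.63))  # → ('C', 4), middle C
```

Rounding to the nearest semitone also gives detection some tolerance: a slightly sharp or flat input still maps to the intended note.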
On the backend, matplotlib generates an amplitude-vs-frequency plot with up to six peak frequencies labeled for the user to see. This not only teaches users about the different musical notes, but also trains their ears, potentially toward perfect pitch.
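A plot like that could be produced along these lines. This is a hedged sketch, not the project's actual code: the function name, the FFT-based peak picking, and the headless Agg backend are our assumptions.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed on a server
import matplotlib.pyplot as plt

def plot_top_peaks(samples, rate, max_peaks=6, out_file="spectrum.png"):
    """Plot amplitude vs. frequency and label the strongest peaks."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

    # Pick up to max_peaks bins with the largest amplitude, skipping DC.
    order = np.argsort(spectrum[1:])[::-1] + 1
    peaks = sorted(order[:max_peaks])

    fig, ax = plt.subplots()
    ax.plot(freqs, spectrum)
    for i in peaks:
        ax.annotate(f"{freqs[i]:.0f} Hz", (freqs[i], spectrum[i]))
    ax.set_xlabel("Frequency (Hz)")
    ax.set_ylabel("Amplitude")
    fig.savefig(out_file)
    return [freqs[i] for i in peaks]

# Example: a 440 Hz sine sampled at 44.1 kHz
rate = 44100
t = np.arange(4096) / rate
labeled = plot_top_peaks(np.sin(2 * np.pi * 440 * t), rate)
```

With a 4096-sample window at 44.1 kHz, the frequency resolution is about 10.8 Hz per bin, so the labeled peaks cluster around 440 Hz.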
How we built it
On the backend, we used pyaudio to record audio and compute frequencies, tuning the sampling interval, number of channels, and chunk size to transfer information efficiently from the backend to the frontend. In addition, the backend uses matplotlib to label up to six frequency peaks, which are used to identify a guitar's chords. Firebase serves as a real-time database that channels data from the backend to the frontend.
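The capture-and-analyze loop can be sketched as follows. The parameter values (chunk size, rate, channel count) are illustrative assumptions, and the pyaudio portion is shown in comments since it requires a microphone; the frequency computation itself is pure numpy.

```python
import numpy as np

CHUNK = 4096        # samples per buffer read (assumed value)
RATE = 44100        # sampling rate in Hz (assumed value)
CHANNELS = 1        # mono microphone input

def dominant_frequency(samples, rate):
    """Return the frequency (Hz) of the largest FFT magnitude bin."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

# With pyaudio, the chunks would come from the microphone, e.g.:
#
#   import pyaudio
#   pa = pyaudio.PyAudio()
#   stream = pa.open(format=pyaudio.paInt16, channels=CHANNELS,
#                    rate=RATE, input=True, frames_per_buffer=CHUNK)
#   data = np.frombuffer(stream.read(CHUNK), dtype=np.int16)
#   print(dominant_frequency(data, RATE))

# Synthetic demo: a 330 Hz tone (E4) is recovered within one bin width.
t = np.arange(CHUNK) / RATE
print(dominant_frequency(np.sin(2 * np.pi * 330 * t), RATE))
```

The chunk size trades latency against resolution: larger chunks give finer frequency bins (RATE / CHUNK Hz apart) but slower updates to the frontend.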
Challenges we ran into
It was a bit difficult to develop the frequency computations on the backend, as there is a relatively steep learning curve to the physics concepts involved, including but not limited to Fourier transforms and spectral noise. Initially, there was also some difficulty setting up a Python 3 environment that runs pyaudio. Finally, it was quite challenging to render the music notes while maintaining the bar measures as new notes are added in real time.
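One simple way to cope with spectral noise, sketched below under our own assumptions (this is not the project's actual filter), is to discard any FFT bin that does not rise well above the median magnitude, which approximates the noise floor:

```python
import numpy as np

def prominent_peaks(samples, rate, snr=5.0):
    """Keep only spectral bins well above the median noise floor.

    A bin counts as a peak when its magnitude exceeds `snr` times the
    median magnitude, a crude guard against labeling spectral noise.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    floor = np.median(spectrum)
    keep = spectrum > snr * floor
    keep[0] = False  # ignore the DC component
    return freqs[keep]

# A 440 Hz tone buried in mild noise: only bins near 440 Hz survive.
rng = np.random.default_rng(0)
t = np.arange(4096) / 44100
noisy = np.sin(2 * np.pi * 440 * t) + 0.1 * rng.standard_normal(4096)
print(prominent_peaks(noisy, 44100))
```

A fixed threshold like this is crude; more robust schemes estimate the noise floor per frequency band, which is one reason chord detection proved harder than single notes.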
What's next for Decomposition
We would like to add chord progressions to the composition; we already have them on the backend but chose not to include them because noise caused significant deviations. We would also like to add undo/delete buttons so users can quickly fix mistakes in the composition. In addition, we would like to integrate social-media sharing to further encourage users to share their masterpieces with the community.