What it does
Amadeus is an interactive application that teaches you about waveforms using a repurposed Guitar Hero controller and an ESP32 connected to Unity over Bluetooth. Through the Quest 2 VR headset, you are immersed in a sea of particles visualizing the transformations of modulated waveforms.
Waves are all around us, and music and audio are built on concepts from physics: the transformations and harmonies we hear are created by waves traveling through the air. Yet we usually learn these concepts separately, in physics lessons and music lessons, and we typically only see waveforms as two-dimensional plots. This inspired us to create an interactive, immersive experience that brings users into a world where they can control sound and visualize its transformations in 3D. We all have musical backgrounds, and we wanted to share our love of music with younger people.
How we built it
We built this with the help of many amazing mentors and the advice of friends in music and audio engineering. The application is made in Unity, the particle effects are created with the Unity VFX Graph, and the audio is generated by a theremin-style synthesizer built on the Csound library, originally created by Barry Vercoe at the MIT Media Lab in 1985. The instrument is a Guitar Hero controller repurposed with added potentiometers and an ESP32 for logic and connectivity. The ESP32 streams its data to Unity over Bluetooth, where it is received via the Singularity Bluetooth package.
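As a rough illustration of the signal path (not the actual Csound instrument), here is a minimal Python sketch of how a potentiometer reading could drive a theremin-style tone. The 12-bit ADC range and the frequency bounds are assumptions for the example:

```python
import math

def pot_to_pitch(raw, lo_hz=110.0, hi_hz=880.0):
    """Map a 0..4095 ESP32 ADC reading to a frequency between lo_hz and hi_hz."""
    return lo_hz + (raw / 4095.0) * (hi_hz - lo_hz)

def sine_samples(freq_hz, amp, n, sample_rate=44100):
    """Generate n samples of a sine tone, the simplest theremin-style voice."""
    return [amp * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# A mid-travel knob position maps to a pitch near the middle of the range.
samples = sine_samples(pot_to_pitch(2048), 0.5, 441)
```

In the real system this mapping happens inside the Csound synthesizer, with the control values arriving continuously over Bluetooth rather than as a fixed buffer.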
Challenges we ran into
While we have worked in teams before, building a cohesive project management system on a tight time frame was a challenge for all of us. We were new to working in a relatively large team, so gauging the range of our skill sets and our speed was difficult. Time management and scoping also proved to be issues as we started to cut input devices; in hindsight, we should have used a waterfall-like project management structure to stay rigid to the original concept and an agile one during development. Sound generation was also new to us, and learning Csound took time. We ran into issues with Csound build targets not working on the Quest 2 (ARMv7 vs. ARM64), and getting Bluetooth data into the Quest was another major challenge that consumed much of Saturday, though with mentor help we got it working.
Accomplishments that we're proud of
We were able to build independent systems that talked to each other and integrated successfully. Members of the team learned Unity development for the first time, and others tackled sound generation with Csound for the first time.
What we learned
- JSON in Unity
- Singularity (Unity library)
- Working in a large team
- Multi-branched projects for a hackathon
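For context on the "JSON in Unity" point: controller state can travel as small JSON messages over the Bluetooth link. The Python sketch below shows the shape of such a packet and one way to decode it; the field names are hypothetical, not the actual schema used in Amadeus:

```python
import json

# Hypothetical message from the controller: two potentiometer readings
# and a button state, serialized as one JSON object per update.
packet = '{"pitch": 2048, "mod": 1024, "strum": true}'

def parse_controller_packet(raw):
    """Decode one JSON message from the controller into a plain dict."""
    data = json.loads(raw)
    return {
        "pitch": int(data["pitch"]),
        "mod": int(data["mod"]),
        "strum": bool(data["strum"]),
    }

state = parse_controller_packet(packet)
```

On the Unity side, the equivalent decoding would be done in C# (for example with `JsonUtility`) after the bytes arrive through the Singularity Bluetooth package.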
In the future we would like to add more inputs, make the instrument a more finished-looking guitar, and model more accurate sound effects (reverb, interference, reflection from the environment). We also envision in-depth lesson plans, a tutorial on building your own device, a refined game environment, and spatial audio with audience participation.