UnderwaterAcoustics

We are team Lilia: Nam, Hoang, Duy, and Sunny.

What it does

It converts sound into color and simulates it underwater. You can either sing or input a song. The app calculates the frequency of the sound, converts it into RGB channels for visualization, and then simulates the sound underwater using the Navier-Stokes equations. After that, you can view the simulation in VR.
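The exact frequency-to-color rule isn't spelled out above, so here is a minimal sketch of one possible mapping in Python: spread the singable pitch range over the hue wheel on a log scale and convert to RGB. The function name, frequency limits, and the log-hue mapping are illustrative assumptions, not the project's actual code.

```python
import colorsys
import math

def frequency_to_rgb(freq_hz, f_min=80.0, f_max=4000.0):
    """Map a frequency (Hz) to an (r, g, b) tuple in 0..255 by spreading the
    pitch range over the hue wheel on a logarithmic scale (hypothetical mapping)."""
    freq_hz = min(max(freq_hz, f_min), f_max)          # clamp to the chosen range
    hue = math.log(freq_hz / f_min) / math.log(f_max / f_min)  # 0..1
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)        # full saturation and value
    return int(r * 255), int(g * 255), int(b * 255)

print(frequency_to_rgb(440.0))  # A4 lands on a mid-spectrum hue
```

Any monotonic frequency-to-hue curve would work here; a log scale is used only because pitch perception is roughly logarithmic.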

Tech

Languages: Python, JavaScript, HTML/CSS. Libraries: PyAudio, three.js, SciPy, Matplotlib, p5.js. Tools: PyCharm, VS Code

Inspiration

We were inspired by underwater acoustics and musical fountains, which carry sound underwater and turn sound into art. This project can help deaf people experience music: they will be able to see the colors of a song. You can create art simply by singing.

Math

Fourier transform to filter the sound. Quadratic interpolation to find the peak frequency. Navier-Stokes equations for the fluid simulation.
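As a hedged sketch of the peak-frequency step (the function name, framing, and windowing choice are assumptions, not the project's actual code): take the FFT of a frame, find the strongest bin, then refine it with quadratic (parabolic) interpolation over the neighboring bins.

```python
import numpy as np

def peak_frequency(samples, fs):
    """Estimate the dominant frequency of a mono frame via FFT plus
    quadratic (parabolic) interpolation around the strongest bin."""
    window = np.hanning(len(samples))            # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(samples * window))
    k = int(np.argmax(spectrum[1:]) + 1)         # strongest bin, skipping DC
    if 0 < k < len(spectrum) - 1:
        alpha, beta, gamma = spectrum[k - 1], spectrum[k], spectrum[k + 1]
        # Vertex of the parabola through the three bins, as an offset in bins
        p = 0.5 * (alpha - gamma) / (alpha - 2 * beta + gamma)
    else:
        p = 0.0
    return (k + p) * fs / len(samples)           # bin index -> Hz

# Example: a pure 440 Hz tone should come back very close to 440 Hz
fs = 44100
t = np.arange(fs) / fs
print(peak_frequency(np.sin(2 * np.pi * 440 * t), fs))
```

The interpolation matters because the FFT only gives frequencies at bin centers; fitting a parabola to the three bins around the maximum recovers a sub-bin estimate of the true peak.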

Difficulty

None of us knew how sound frequency works, so we had to learn everything about frequency, wavelength, amplitude, pitch, and so on. There isn't much prior work on coding with sound, especially on converting sound to color. Time constraint: we worked on separate tasks, but in the end we didn't have enough time to put everything together and implement some of the features we had in mind.

