Raghib wants to be one of the first experts in "Architectural Robotics"; Ben is a photographer with a passion for computer vision; "Jarvis" is a BIG robotics guy; and Zach... well, Zach does everything from satellites to bio-CS. So we're 4 IoT guys, and we think the future is in connectivity! That's why we're not fans of localizing control to a single device: it's inconvenient, boring, not sexy, and yet it's exactly how we control music today.

What it does

Uses an array of sensors and computer vision to let you toggle music controls via hand motion and speech. The attached music visualizer takes hundreds of sound samples and encodes their frequencies into RGB values. In short, it's one BIG, COLORFUL, and FREE music player.
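The frequency-to-RGB idea can be sketched roughly like this (a minimal illustration with SciPy; the band edges, sample rate, and function name are our assumptions, not the exact IRIS code):

```python
import numpy as np
from scipy.fft import rfft, rfftfreq

SAMPLE_RATE = 44100  # assumed sample rate in Hz

def chunk_to_rgb(samples):
    """Map one audio chunk's spectrum to an (R, G, B) triple.

    Energy in the bass, mid, and treble bands drives the red, green,
    and blue channels. Illustrative only, not IRIS's exact mapping.
    """
    spectrum = np.abs(rfft(samples))
    freqs = rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    bands = [(20, 250), (250, 2000), (2000, 8000)]  # bass, mid, treble (Hz)
    energies = [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
    total = sum(energies) or 1.0  # avoid division by zero on silence
    return tuple(int(255 * e / total) for e in energies)

# A pure 440 Hz tone lands in the mid band, so green dominates.
t = np.arange(2048) / SAMPLE_RATE
r, g, b = chunk_to_rgb(np.sin(2 * np.pi * 440 * t))
```

From there, each (R, G, B) triple gets pushed to the LEDs once per chunk.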

How we built it

Gesture recognition: OpenCV, IR sensors
Math: Fast Fourier Transform (SciPy)
Electronics: RGB LEDs, 4 breadboards, Arduino, resistors...
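To show the flavor of the gesture side, here is a dependency-free sketch of frame differencing (the real build uses OpenCV plus IR sensors; the thresholds and command names here are our own assumptions):

```python
import numpy as np

def gesture_from_frames(prev_gray, curr_gray, threshold=25, min_pixels=50):
    """Classify a hand swipe from two grayscale frames.

    Pixels that changed by more than `threshold` count as motion; if
    their centroid sits in the right half of the frame we call it
    "next_track", otherwise "prev_track". A toy stand-in for the
    OpenCV pipeline, not the actual IRIS model.
    """
    diff = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
    ys, xs = np.nonzero(diff > threshold)
    if len(xs) < min_pixels:
        return None  # not enough motion to count as a gesture
    return "next_track" if xs.mean() > curr_gray.shape[1] / 2 else "prev_track"

# A bright blob appearing on the right half reads as "next_track".
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:80, 120:150] = 200
command = gesture_from_frames(prev, curr)
```

In the real pipeline the resulting command would be forwarded to the music player (e.g. via the Spotify API).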

Challenges we ran into

Half the team was 8 hours late due to an IEEE commitment. To our surprise... music tends to have only 2 dominant frequencies at a time. So how do we map that to 8 different colors??? It's an inherently lossy conversion, so we had to do some "hacking" :) And then there was the Spotify API.
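One way to force a low-dimensional spectrum into 8 colors is to quantize the dominant frequency into 8 log-spaced buckets. This is our guess at the shape of such a "hack"; the palette and bucket edges are invented for illustration:

```python
import numpy as np
from scipy.fft import rfft, rfftfreq

PALETTE = [  # 8 illustrative RGB colors, one per frequency bucket
    (255, 0, 0), (255, 128, 0), (255, 255, 0), (0, 255, 0),
    (0, 255, 255), (0, 0, 255), (128, 0, 255), (255, 0, 255),
]

def pick_color(samples, rate=44100):
    """Pick one of 8 colors from the chunk's dominant frequency.

    Buckets are log-spaced over ~20 Hz to 8 kHz. An invented mapping,
    not the team's actual workaround.
    """
    spectrum = np.abs(rfft(samples))
    freqs = rfftfreq(len(samples), d=1.0 / rate)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    edges = np.geomspace(20.0, 8000.0, 9)
    bucket = int(np.clip(np.searchsorted(edges, dominant) - 1, 0, 7))
    return PALETTE[bucket]

# A 440 Hz tone falls in the 5th bucket (~400-846 Hz) of this palette.
t = np.arange(4096) / 44100
color = pick_color(np.sin(2 * np.pi * 440 * t))
```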

Accomplishments that we're proud of

The Spotify API... and touching a new dimension of data: sound.

What we learned

Signal processing, RGB encoding, Matplotlib, the Spotify API, time management.

What's next for IRIS

A more robust CV model and a more aesthetic music visualizer.
