The idea: a musical instrument that acts as a collaborative tool, whose sonic characteristics are the result of the physical and emotional presence of the space.

What it does

Our program uses **Lidar** as input to take in spatial data. It analyzes and maps that data to produce MIDI and/or parameters for controlling oscillators. In parallel, a user's facial sentiment is analyzed with the Google Cloud Vision API; the resulting analysis acts as metadata for the piece's timbre/mode/mood/genre. The two processes run in conjunction and are fused to create meaningful sonic output.
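A minimal sketch of the spatial-to-MIDI mapping idea: distance readings from a Lidar scan are linearly mapped onto a MIDI note range. The function names, note range, and distance bounds here are illustrative assumptions, not the project's actual implementation.

```python
def distance_to_midi(distance_m, min_d=0.2, max_d=4.0, low_note=48, high_note=84):
    """Map a distance reading (meters) linearly onto a MIDI note range.

    Note range and distance bounds are illustrative defaults (C3..C6).
    """
    clamped = max(min_d, min(max_d, distance_m))          # keep reading in range
    frac = (clamped - min_d) / (max_d - min_d)            # normalize to 0..1
    return round(low_note + frac * (high_note - low_note))

def scan_to_midi(scan):
    """Convert a list of (angle, distance) Lidar points into MIDI note numbers."""
    return [distance_to_midi(d) for _, d in scan]

# Example: three points at increasing distance produce rising notes
notes = scan_to_midi([(0, 0.5), (90, 2.0), (180, 3.9)])
```

A real version would feed `notes` to a MIDI output library or use the normalized fraction directly as an oscillator parameter.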

How we built it

We built the program mostly in Python, using the Google Cloud API for the sentiment analysis.
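Google Cloud Vision face detection reports per-emotion likelihood scores (0 = UNKNOWN up to 5 = VERY_LIKELY) for joy, sorrow, anger, and surprise. A hedged sketch of how such scores might be fused into a single mood parameter for the synth; the function name and weights are assumptions for illustration:

```python
def mood_from_likelihoods(joy, sorrow, anger, surprise):
    """Collapse four likelihood scores (0-5) into a mood value in [-1.0, 1.0].

    Weighting surprise at half strength is an illustrative choice.
    """
    positive = joy + 0.5 * surprise
    negative = sorrow + anger
    total = positive + negative
    if total == 0:           # no confident detection: treat as neutral
        return 0.0
    return (positive - negative) / total

# Example: a very likely joyful face with no negative signals
mood = mood_from_likelihoods(joy=5, sorrow=0, anger=0, surprise=1)
```

The resulting mood value could then scale timbre or mode-selection parameters downstream.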

Challenges we ran into

The Lidar sponsor showed up without an input box, so just getting data out of the Lidar was a challenge.

Accomplishments that we're proud of

We made a working prototype and made friends ;p !!!

What we learned

The importance of reading documentation, and the value of collaboration!

What's next for SoundSpace

More musical creation modes, better performance, nicer sounds.

Built With

python, google-cloud