We were inspired by the facial expressions live music performers make and thought it would be interesting to use them to control music.
What it does
FaceMod tracks your facial features to control musical effects in real time. It currently lets you map how wide your mouth is open and how high your eyebrows are raised to a tremolo and a high-pass filter running on a MOD Duo.
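The core of the mapping is simple: take a normalized facial measurement (how open the mouth is, how raised the eyebrows are) and scale it into an effect parameter's range. A minimal sketch of that idea, with illustrative names and ranges rather than FaceMod's actual code:

```javascript
// Map a normalized tracker reading (0..1) onto an effect parameter range.
// The function name and the tremolo range below are assumptions for
// illustration, not values from the real project.
function mapToParam(value, min, max) {
  // Clamp first so a noisy tracker frame can't push the parameter
  // outside its legal range.
  const clamped = Math.min(1, Math.max(0, value));
  return min + clamped * (max - min);
}

// e.g. mouth openness 0.7 -> tremolo rate between 0.5 and 12 Hz (assumed range)
const tremoloRate = mapToParam(0.7, 0.5, 12);
```

Clamping before scaling is the important part: face trackers occasionally produce out-of-range values on bad frames, and an unclamped mapping would send invalid parameter values to the effect.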
How we built it
We used ReactJS on the frontend with a library called clmtrackr to get facial feature points. The frontend web app sends effect parameter values to MOD's mod-ui over a WebSocket, through a Node.js WebSocket proxy.
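The proxy's job is just to relay messages from the browser-facing socket to mod-ui's socket. A minimal sketch of that wiring, assuming socket-like objects with a `send` method and an `on('message', cb)` registration (the real version would use something like the `ws` npm package, and the actual message format mod-ui expects is not shown here):

```javascript
// Forward every message arriving from the browser connection to the
// mod-ui connection. Both arguments are socket-like objects:
// { send(msg), on(event, callback) }.
function wireProxy(browserSocket, modUiSocket) {
  browserSocket.on('message', (msg) => modUiSocket.send(msg));
}
```

Keeping the proxy this thin is what lets the browser drive mod-ui directly, without maintaining heavy server-side state.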
Challenges we ran into
Accomplishments that we're proud of
Successfully hacked mod-ui without having to stand up heavy server-side instances.
What we learned
Not to mess with the DSP.
What's next for FaceMod
More input modes, more plugins to map, and a more customizable UI.