Inspiration

We were inspired by the facial expressions live music performers make and thought it would be interesting to use them to control music.

What it does

FaceMod tracks your facial features to control musical effects in real time. It currently lets you map how wide your mouth is open and how high your eyebrows are raised to a tremolo and a high-pass filter running on a MOD Duo.
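
The mapping itself is essentially a clamped linear scaling of a normalized facial measurement onto a parameter range. The sketch below illustrates the idea; the ranges and values are assumptions for illustration, not the exact parameters used on the device.

```js
// Minimal sketch of the mapping idea: scale a normalized facial measurement
// (0 = neutral, 1 = fully open/raised) onto an effect parameter range.
// The ranges below are placeholders, not the actual plugin defaults.
function mapToParam(normalized, min, max) {
  const clamped = Math.min(1, Math.max(0, normalized));
  return min + clamped * (max - min);
}

// Example: mouth openness drives the tremolo rate, eyebrow raise drives
// the high-pass cutoff (values and ranges assumed for illustration).
const tremoloRateHz = mapToParam(0.7 /* mouth openness */, 0.5, 12);
const highPassCutoffHz = mapToParam(0.3 /* eyebrow raise */, 40, 2000);
console.log(tremoloRateHz, highPassCutoffHz);
```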

How we built it

We used ReactJS on the frontend with a library called clmtrackr to extract facial feature points. The web app sends filter parameter values to MOD's mod-ui over a WebSocket connection, routed through a Node.js WebSocket proxy.
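
Roughly, the frontend loop looks like the sketch below: clmtrackr tracks the face on a video element, we derive simple distances from the returned feature points, and push them to the proxy over a WebSocket. The point indices, proxy URL, and message shape here are assumptions for illustration; consult clmtrackr's point map for the real indices.

```js
// Browser-side sketch (assumes clmtrackr is loaded and <video id="cam"> is
// already streaming the webcam). Point indices and message format are
// illustrative assumptions, not necessarily the ones FaceMod uses.
const video = document.getElementById('cam');
const tracker = new clm.tracker();
tracker.init();
tracker.start(video);

const ws = new WebSocket('ws://localhost:8081'); // our Node.js proxy (assumed port)

function tick() {
  const points = tracker.getCurrentPosition(); // array of [x, y] pairs, or false
  if (points) {
    // Rough vertical distances; see clmtrackr's point map for actual indices.
    const mouthOpen = Math.abs(points[57][1] - points[60][1]);
    const browRaise = Math.abs(points[21][1] - points[27][1]);
    if (ws.readyState === WebSocket.OPEN) {
      ws.send(JSON.stringify({ mouthOpen, browRaise }));
    }
  }
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```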

Challenges we ran into

We found it very challenging to hack the MOD Duo. Although we could SSH into the device's Linux environment, it was hard to shut down the default mod-host service and bootstrap our own instance. Instead, we dug into mod-ui's JavaScript code, publicly available at https://github.com/moddevices/mod-ui, to see how its WebSocket protocol works.
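
Reading that code suggested parameter changes can be driven through messages on mod-ui's WebSocket endpoint, so our proxy just reformats the browser's JSON into that form. The sketch below uses the ws package; the device hostname, endpoint path, port symbols, and exact message strings are assumptions for illustration rather than the verified protocol.

```js
// Node.js proxy sketch using the 'ws' package: accept JSON from the browser
// and forward commands to mod-ui's WebSocket. Hostname, endpoint, port names,
// and message format are assumptions for illustration.
const WebSocket = require('ws');

const MOD_UI_WS = 'ws://modduo.local/websocket'; // assumed device address/path
const server = new WebSocket.Server({ port: 8081 });

server.on('connection', (browser) => {
  const modui = new WebSocket(MOD_UI_WS);

  browser.on('message', (raw) => {
    if (modui.readyState !== WebSocket.OPEN) return;
    const { mouthOpen, browRaise } = JSON.parse(raw.toString());
    // Translate into parameter-set commands; the "param_set <port> <value>"
    // shape is a simplified stand-in for what mod-ui's source actually expects.
    modui.send(`param_set /graph/tremolo/rate ${mouthOpen}`);
    modui.send(`param_set /graph/highpass/freq ${browRaise}`);
  });

  browser.on('close', () => modui.close());
  modui.on('error', () => browser.close());
});
```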

Accomplishments that we're proud of

We successfully hacked mod-ui's WebSocket interface without having to stand up heavy server-side infrastructure of our own.

What we learned

Not to fuck with the DSP.

What's next for FaceMod

More input modes, more plugins to map, and a more customizable UI.

Built With

ReactJS, clmtrackr, Node.js, MOD Duo (mod-ui)