Inspiration

We think the musical instruments of the future will leverage our gestures and body movements to shorten, as never before, the time between feeling an emotion and translating it into sound. We see great potential for FaceBeat as a time-saving tool for creating new beats, as well as an accessible instrument that requires very few hand movements to get up and running. It can also be calibrated to a person's own facial expressions, similar to how the MiMu gloves are tailored to each user's needs.

What it does

FaceBeat creates sound that reacts to facial expressions in real time: it reads blendshape data from ARKit and transmits it to audio software such as Max MSP or Pure Data.
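
As a sketch of the first stage, here is roughly how blendshape coefficients can be read in Unity. This assumes the Unity ARKit Plugin's face tracking API; exact type and event names may differ between plugin versions.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit Plugin namespace

public class BlendShapeReader : MonoBehaviour
{
    // Latest coefficients, keyed by blendshape name (e.g. "jawOpen").
    // Each value is a float between 0.0 (neutral) and 1.0 (fully expressed).
    public Dictionary<string, float> Current = new Dictionary<string, float>();

    void Start()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += OnFaceUpdated;
    }

    void OnFaceUpdated(ARFaceAnchor anchor)
    {
        // ARKit delivers a fresh set of blendshape coefficients every time
        // the tracked face updates.
        Current = anchor.blendShapes;
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= OnFaceUpdated;
    }
}
```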

How we built it

We used Unity to access ARKit's blendshape data and OSC to transmit it to Max MSP. From there we could read each blendshape's float value (each coefficient ranges from 0.0 to 1.0) and manipulate it, for example triggering specific audio samples or playing frequencies scaled to how high the blendshape's value was.
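
To keep the example self-contained, here is a minimal hand-rolled OSC sender over UDP rather than a particular Unity OSC library. The /facebeat address prefix and port 9000 are illustrative, not necessarily what we used.

```csharp
using System;
using System.Linq;
using System.Net.Sockets;
using System.Text;

public class OscSender : IDisposable
{
    readonly UdpClient udp;

    public OscSender(string host = "127.0.0.1", int port = 9000)
    {
        udp = new UdpClient(host, port);
    }

    // Sends a single-float OSC message, e.g. SendFloat("/facebeat/jawOpen", 0.73f).
    public void SendFloat(string address, float value)
    {
        byte[] packet = Pad(Encoding.ASCII.GetBytes(address))  // address pattern
            .Concat(Pad(Encoding.ASCII.GetBytes(",f")))        // type tag: one float
            .Concat(BigEndian(BitConverter.GetBytes(value)))   // 32-bit float argument
            .ToArray();
        udp.Send(packet, packet.Length);
    }

    // OSC strings are null-terminated and zero-padded to a multiple of 4 bytes.
    static byte[] Pad(byte[] s)
    {
        byte[] padded = new byte[(s.Length / 4 + 1) * 4];
        Array.Copy(s, padded, s.Length);
        return padded;
    }

    // OSC numeric arguments are big-endian.
    static byte[] BigEndian(byte[] b)
    {
        if (BitConverter.IsLittleEndian) Array.Reverse(b);
        return b;
    }

    public void Dispose() { udp.Close(); }
}
```

On the Max side, a chain along the lines of [udpreceive 9000] → [route /facebeat/jawOpen] → [scale 0. 1. 200. 800.] → [cycle~] turns a blendshape stream into a pitch, which is roughly the shape of our frequency mappings.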

Challenges we ran into

We struggled for a long time with parsing the OSC data from Unity in Pure Data, splitting up the strings and converting them into floats. We decided to continue building the project in Max MSP rather than Pure Data, as Max coped better with the data load and we could access the data more easily. There were then a lot of blendshape values to map, as well as figuring out which blendshape value corresponded to which expression. For our working prototype we mapped a selection of the blendshapes to convey the concept; further development would incorporate all of them.
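
One design choice that sidesteps the string-splitting problem is to give each blendshape its own OSC address, so the receiving patch can [route] by address instead of parsing strings. A hypothetical forwarding loop built on the sketches above:

```csharp
using UnityEngine;

public class BlendShapeForwarder : MonoBehaviour
{
    public BlendShapeReader reader;  // sketched under "What it does"
    OscSender sender;                // sketched under "How we built it"

    void Start()     { sender = new OscSender(); }
    void OnDestroy() { sender.Dispose(); }

    void Update()
    {
        // One OSC message per blendshape per frame; the receiver can then
        // [route] on /facebeat/<name> with no string splitting at all.
        foreach (var pair in reader.Current)
            sender.SendFloat("/facebeat/" + pair.Key, pair.Value);
    }
}
```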

Accomplishments that we're proud of

ARKit's face tracking modules are still quite new, and we have not come across anyone else who has used this data for anything sound-based. We're proud of the flexibility that FaceBeat can provide.

What we learned

We've learned a lot about serial communication, as well as how to cope with large data streams and network connection issues when working with ARKit's face tracking remote.
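
For example, one way to tame the data load (illustrative, not necessarily our final approach) is to forward a blendshape only when it has changed noticeably since the last send:

```csharp
using System.Collections.Generic;
using UnityEngine;

public class ChangeFilter
{
    const float Epsilon = 0.01f;  // ignore sub-1% jitter
    readonly Dictionary<string, float> lastSent = new Dictionary<string, float>();

    // Returns true if the value moved enough to be worth re-sending,
    // and records it as the new baseline.
    public bool ShouldSend(string name, float value)
    {
        float prev;
        if (lastSent.TryGetValue(name, out prev) && Mathf.Abs(value - prev) < Epsilon)
            return false;
        lastSent[name] = value;
        return true;
    }
}
```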

What's next for FaceBeat

Next steps would be to integrate the full Pure Data patch and Unity application into a single mobile app, so users can take full advantage of its portability. In the future, we would love to integrate with other wearables, such as an Apple Watch that sets the BPM of your tracks from your heartbeat, or EEG-equipped headphones.

Built With

ARKit, Unity, OSC, Max MSP, Pure Data
