Inspiration

Mobile AR is big these days, but the user is usually limited to screen gestures for interacting with the augmented world. I wanted to build a mobile AR application that talks to a hardware peripheral to make the experience more physical and interactive.

What it does

Produce-AR is an iOS augmented reality music production app built with ARKit that connects to a Bluetooth shoe peripheral. In Recording mode the shoe registers sound hits; hover over the shoe to change instruments. In Playback mode you can walk through your arrangement and listen to it as you move (both forwards and backwards), or simply press play. Built over one weekend at the Reality Virtually Hackathon 2017 at MIT.
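
To give a feel for how that spatial playback could work, here's a minimal Swift sketch: each recorded hit keeps the world position where it was laid down, and it fires whenever the camera gets close enough, regardless of which direction you're walking. The names here (RecordedHit, triggerDistance, checkPlayback) are illustrative, not the actual implementation.

```swift
import ARKit
import AVFoundation
import simd

// One hit laid down during Recording mode (hypothetical type).
struct RecordedHit {
    let position: simd_float3      // world position where the hit was recorded
    let player: AVAudioPlayerNode  // instrument sample, attached to a running AVAudioEngine
}

let triggerDistance: Float = 0.3   // metres; proximity-based, so walking direction doesn't matter

// Call once per frame with the current ARCamera.
func checkPlayback(hits: [RecordedHit], camera: ARCamera) {
    let t = camera.transform.columns.3
    let cameraPos = simd_float3(t.x, t.y, t.z)
    for hit in hits where simd_distance(hit.position, cameraPos) < triggerDistance {
        // A real implementation would debounce so each hit fires once per pass.
        hit.player.play()
    }
}
```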

How I built it

The iOS app is built with ARKit and connects to a hardware peripheral (a shoe) via Bluetooth. The app sends the shoe a notification to vibrate when the instrument selection menu pops up, and the shoe sends the app sound hits. An Arduino on the shoe controls the Bluetooth shield, accelerometer, and vibration motor.
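
Here's a rough Swift sketch of the app side of that link using CoreBluetooth. The service and characteristic UUIDs are placeholders, not the shoe's real ones: one characteristic notifies the app of hits from the accelerometer, the other takes a vibrate command when the menu opens.

```swift
import CoreBluetooth

// Placeholder UUIDs standing in for the shoe's actual BLE profile.
let shoeServiceUUID = CBUUID(string: "FFE0")
let hitCharUUID     = CBUUID(string: "FFE1") // shoe -> app: accelerometer hit events
let vibrateCharUUID = CBUUID(string: "FFE2") // app -> shoe: vibrate command

final class ShoeLink: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var shoe: CBPeripheral?
    private var vibrateChar: CBCharacteristic?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: [shoeServiceUUID], options: nil)
        }
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        shoe = peripheral
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([shoeServiceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] {
            peripheral.discoverCharacteristics([hitCharUUID, vibrateCharUUID], for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for char in service.characteristics ?? [] {
            if char.uuid == hitCharUUID {
                peripheral.setNotifyValue(true, for: char) // stream hit events to the app
            } else if char.uuid == vibrateCharUUID {
                vibrateChar = char
            }
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard characteristic.uuid == hitCharUUID, let data = characteristic.value else { return }
        // Hand the hit off to the audio/recording layer here.
        print("Hit event: \(data as NSData)")
    }

    // Called when the instrument selection menu pops up.
    func vibrateShoe() {
        guard let shoe = shoe, let char = vibrateChar else { return }
        shoe.writeValue(Data([0x01]), for: char, type: .withoutResponse)
    }
}
```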

The app runs animations, listens to Bluetooth input, and runs computer vision on separate threads to avoid main-thread latency.
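
As a sketch of that split, the Vision work can run on its own dispatch queue per ARKit frame, CoreBluetooth can be handed its own queue at creation time, and only the final SceneKit/UI updates hop back to the main thread. The queue labels and the rectangle-detection request below are illustrative stand-ins for whatever CV the app actually runs.

```swift
import ARKit
import Vision

// Hypothetical per-frame processor; assumes it's wired up as the ARSession delegate.
final class FrameProcessor: NSObject, ARSessionDelegate {
    private let visionQueue = DispatchQueue(label: "produce-ar.vision") // CV off the main thread
    private var busy = false // drop frames while a request is still in flight

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard !busy else { return }
        busy = true
        let pixelBuffer = frame.capturedImage
        visionQueue.async { [weak self] in
            let request = VNDetectRectanglesRequest { _, _ in
                // ... react to a detected region (e.g. the shoe) ...
            }
            let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
            try? handler.perform([request])
            DispatchQueue.main.async {
                // Only SceneKit/UI updates touch the main thread.
                self?.busy = false
            }
        }
    }
}

// CoreBluetooth can likewise get its own queue up front, so its delegate
// callbacks never block rendering:
// CBCentralManager(delegate: self, queue: DispatchQueue(label: "produce-ar.ble"))
```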

Challenges I ran into

The system has a ton of different parts (Bluetooth, sound, animation, computer vision), so getting them all to run simultaneously without race conditions was a challenge.

Accomplishments that I'm proud of

Got it working!

What I learned

A lot about ARKit, SceneKit, and iOS development in general, plus how to communicate between an Arduino and an iOS app.

What's next for Produce-AR

Making some sick beats, yo.

Built With

  • adafruit-accelerometer
  • adafruit-bluetooth
  • arduino
  • arkit
  • ios-avaudio
  • ios-corebluetooth
  • ios-vision-framework
  • scenekit