Euterpe turns the motion sensors of an Android phone into a MIDI controller for DAWs running on a computer (we demonstrate with Ableton Live). We wanted to break away from the traditional MIDI controllers and XY-pads that performing artists often use. The on-board motion sensors (accelerometer, orientation sensor, ambient light sensor, proximity sensor) let performers express themselves in a different way, completely freeing them from the knobs and sliders that would otherwise tie them to a table.
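As a rough illustration of the idea (a sketch under our own assumptions, not Euterpe's actual code), a raw accelerometer reading can be reduced to a single normalized control value. Here the phone's gravity components are turned into a roll angle and scaled to [0, 1]; the class and method names are hypothetical:

```java
// Hypothetical sketch: deriving a control value from gravity components.
// Android's accelerometer reports (x, y, z) in m/s^2; when the phone is held
// still these approximate gravity, so the roll angle around the long axis can
// be recovered with atan2 and normalized for use as a control signal.
public class TiltNormalizer {
    /** Map a roll angle derived from gravity components onto [0, 1). */
    public static double rollToControl(double gx, double gz) {
        double roll = Math.atan2(gx, gz);        // radians in (-pi, pi]
        return (roll + Math.PI) / (2 * Math.PI); // normalized to [0, 1)
    }
}
```

A phone lying flat (gx = 0, gz ≈ 9.81) yields the midpoint value 0.5, and tilting left or right sweeps the output toward 0 or 1.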

Euterpe connects to any device running Core MIDI or a compatible RTP-MIDI implementation and sends notes and control signals over that interface. Each sensor dimension can be individually switched on or off for ease of isolation (for example, when applying a MIDI map in Ableton Live). Once connected, the phone continuously streams MIDI notes and control messages via RTP to the computer, and the DAW is free to interpret the received MIDI messages.
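The control messages themselves follow the standard MIDI format: a Control Change message is three bytes, a status byte (0xB0 OR'ed with the channel), a controller number, and a 7-bit value. A minimal sketch of packing a normalized sensor reading into such a message, before handing it to the RTP-MIDI transport, might look like this (class and method names are our own, not from Euterpe's source):

```java
// Hypothetical sketch: building a MIDI Control Change message from a
// normalized sensor reading. The three bytes are: status (0xB0 | channel),
// controller number (0-119), and a 7-bit data value (0-127).
public class CcMapper {
    /** Clamp a normalized reading to [0, 1] and build the 3-byte CC message. */
    public static byte[] controlChange(int channel, int controller, double normalized) {
        double clamped = Math.max(0.0, Math.min(1.0, normalized));
        int value = (int) Math.round(clamped * 127);
        return new byte[] {
            (byte) (0xB0 | (channel & 0x0F)), // status: CC on the given channel
            (byte) (controller & 0x7F),       // controller number
            (byte) (value & 0x7F)             // 7-bit data value
        };
    }
}
```

Mapping each sensor dimension to its own controller number is what lets a DAW's MIDI-map mode isolate one dimension at a time.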

Euterpe is aimed at performing artists, both with and without instruments. Attached to an instrument, an Android device (with its sensors) can drive effect processing, enabling expression without any finger movement. DJs and mixers can also use it with existing sound samples for a more visual stage experience.

The main challenge was the set of limitations in the nmj library: it is poorly documented, and a lot of guesswork went into building a useful prototype. Furthermore, the wireless spectrum itself was overly crowded, and external interference disrupted the packet streams between the phone(s) and the computer, greatly affecting the stability and latency of the controller.
