The natural flow of creating music requires a seamless way to record sounds on a computer. The current method requires a MIDI keyboard, which restricts the creativity of the artist. Using a simple wireless device, any monophonic musical instrument can be digitized as a MIDI instrument or used to power new human-computer interfaces.

Executive Summary

Pyed Piper is an intelligent electronic music assistant that turns the sound of any monophonic instrument into MIDI notes as the musician plays. These MIDI notes can be interpreted by a computer and used with the musician's existing digital audio workstation (DAW). This allows electronic music to be created with the instrument the musician knows best; they can even whistle. Going further, particular notes can be mapped to keystrokes, so an ocarina can be used to play Legend of Zelda: Ocarina of Time.

Context of Creation

Pyed Piper was created on January 16 and 17, 2016, as part of Dragon Hacks 2016, held at Drexel University in Philadelphia, Pennsylvania. It was created by Christopher Frederickson, Nick Felker, and Max Bareiss.

Market Landscape

There are several applications for this technology. Electronic music artists can benefit from a wider set of composition tools, moving from just a keyboard to the entire range of analog musical instruments. It can also benefit students studying music theory, as they will not need to learn piano to develop simple compositions. Educational software can be written around students playing their own instruments without relying on expensive, proprietary solutions. Using a musical instrument as a general input device also opens possibilities for both music software and gaming.


Features

  • Allows for natural instrument input
  • Laptop software creates a virtual MIDI port that works with any MIDI-capable program
  • Laptop software can also wirelessly send/receive MIDI commands from another computer
  • Android app can wirelessly receive MIDI commands using Marshmallow’s built-in MIDI APIs
  • Keyboard events can be sent by detecting the pitch of your instrument

Hardware Description

Pyed Piper consists of a software program that runs on a computer. It uses the laptop’s microphone to capture sound from any monophonic instrument, such as an ocarina. The sound is analyzed for pitch and intensity, and a corresponding MIDI command is generated. This MIDI command can be used locally through a virtual MIDI port or sent over TCP/IP to another computer.
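As a sketch of the network leg, a MIDI Note On is a three-byte channel voice message. Assuming bare MIDI bytes are sent over the socket (the actual wire framing Pyed Piper uses is not documented here, and these function names are illustrative), it might look like:

```python
import socket

def note_on_bytes(note, velocity=100, channel=0):
    """Build a 3-byte MIDI Note On message: status 0x9n, note, velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def send_note_on(sock, note, velocity=100, channel=0):
    """Send a Note On message over an already-connected TCP socket."""
    sock.sendall(note_on_bytes(note, velocity, channel))
```

The receiving side would read three bytes at a time and hand them to a synthesizer or virtual MIDI port.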

The dataflow for Pyed Piper starts with the musician playing an instrument. The computer, running our software, records this audio. The software detects the note the musician is playing and filters the detection to stabilize the result. The note signal can then be sent as MIDI data over a network to commercial audio software, or as keystroke data to any computer program, giving the musician flexibility in how their music is used.
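The pitch-to-note step of this dataflow can be sketched as follows, assuming equal temperament with A4 = 440 Hz mapped to MIDI note 69 (the function names are illustrative, not Pyed Piper's actual API):

```python
import math

def freq_to_midi(freq_hz):
    """Convert a detected fundamental frequency to the nearest MIDI note number,
    using the equal-temperament relation with A4 = 440 Hz = MIDI note 69."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def midi_to_name(note):
    """Return a human-readable name (e.g. 'A4') for a MIDI note number."""
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    return names[note % 12] + str(note // 12 - 1)
```

For example, a detected frequency of 261.63 Hz maps to MIDI note 60, middle C.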

Development Insights

Determining the pitch of audio is easy within a given octave, but extending it over a larger range has been an active area of research for the past fifty years. While developing our project, we restarted several times with different libraries and algorithms in order to reduce the effects of harmonics.

A variety of sound sources, including slide whistle, electric guitar, ocarina, recorder, and even whistling, were used to test the robustness of our algorithm.

The greatest success came from limiting the range of pitches that could be detected. Additionally, using the ocarina provided the best results as the sound generated was the closest to a sinusoid.

At that point, the project shifted to using an instrument as a general input device. In the video game Legend of Zelda: Ocarina of Time, the protagonist plays an in-game ocarina by pressing particular buttons. We mapped those buttons to fire when the musician plays the corresponding notes on a real ocarina.
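The note-to-button mapping can be sketched as a simple lookup table. The specific note and key assignments below are hypothetical placeholders, not the mapping Pyed Piper actually used:

```python
# Hypothetical mapping from detected MIDI note numbers to game input keys.
# The actual note-to-button assignment used in Pyed Piper is not documented here.
NOTE_TO_KEY = {
    62: "down",   # D4
    65: "right",  # F4
    69: "left",   # A4
    71: "up",     # B4
    74: "a",      # D5
}

def key_for_note(midi_note):
    """Return the key to press for a detected note, or None if unmapped."""
    return NOTE_TO_KEY.get(midi_note)
```

A detected note outside the table is simply ignored, which also helps reject spurious pitch detections.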

There were a number of problems getting this project to run on Windows. Setting up libraries for the first iteration took several hours of building and installing the tools needed to build those libraries. On a Linux-based operating system, the same work was finished with a single apt-get command.

Also, when we shifted the project to a general input device, we had difficulty getting keys pressed correctly. Although keys could be pressed down, Windows would immediately release them. There was no way to hold a key down, which prevented the game from detecting the inputs, since each key must be held.

Appendix A

Version 0.1

Used the Aubio library with Python in an Ubuntu VM. By Chris and Nick.

Version 0.2

Switched to a new frequency-detection algorithm written by Max. It runs Goertzel’s algorithm for each frequency band of interest and compares the results over time to determine which note the musician is playing.
Sending MIDI messages over TCP/IP now works.
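As a hedged sketch of the per-band scheme (illustrative, not Max's actual code), Goertzel's algorithm computes the signal power at a single target frequency without a full FFT:

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Return the spectral power of `samples` at `target_hz` via Goertzel's algorithm."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)      # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:                            # second-order IIR recurrence
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # magnitude-squared of the selected bin
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2
```

Running this once per candidate note band and picking the strongest band gives a cheap note detector.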

Version 0.3

Switched from pysoundboard to pyaudio for audio input. By Chris and Nick.

Version 0.4

Used the Python library SoundAnalyse to determine pitch. By Chris and Nick.

Version 0.5

Normalized autocorrelation algorithm written in Python and tested by Max. It works within a single octave, but not beyond that range.
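A minimal sketch of a normalized-autocorrelation pitch detector of this kind (illustrative, not the actual implementation) picks the lag whose autocorrelation, normalized by signal energy, is largest:

```python
def autocorr_pitch(samples, sample_rate, fmin=100.0, fmax=1000.0):
    """Estimate the fundamental frequency as sample_rate / best_lag, where
    best_lag maximizes the energy-normalized autocorrelation."""
    n = len(samples)
    lag_min = int(sample_rate / fmax)            # shortest period considered
    lag_max = int(sample_rate / fmin)            # longest period considered
    energy = sum(x * x for x in samples)
    best_lag, best_score = lag_min, -1.0
    for lag in range(lag_min, min(lag_max, n // 2)):
        acf = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        score = acf / energy                     # normalize by signal energy
        if score > best_score:
            best_lag, best_score = lag, score
    return sample_rate / best_lag
```

A pure tone well inside the search range is found reliably; octave errors appear when a sub- or super-multiple of the period also scores highly, which matches the single-octave limitation noted above.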

Version 0.6

Max-peak FFT-based algorithm by Max. Not a proper pitch-detection algorithm; it was created as a baseline, and it performed worse than the previous algorithms.
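A baseline of this kind can be sketched as follows (a naive DFT is used here for self-containedness; the names are illustrative):

```python
import cmath

def fft_peak_pitch(samples, sample_rate, fmax=2000.0):
    """Baseline 'pitch' estimate: the frequency of the strongest DFT bin.

    Not a true pitch detector: for many instruments the strongest partial
    is a harmonic rather than the fundamental, which is why this baseline
    underperforms real pitch-detection algorithms.
    """
    n = len(samples)
    max_bin = min(n // 2, int(fmax * n / sample_rate))
    best_bin, best_mag = 1, 0.0
    for k in range(1, max_bin):  # skip bin 0 (the DC component)
        s = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(samples))
        if abs(s) > best_mag:
            best_bin, best_mag = k, abs(s)
    return best_bin * sample_rate / n
```

On a pure sine it works, but on a guitar or recorder tone the peak bin often lands on an overtone, returning a note one or more octaves too high.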

Version 0.7

Returned to SoundAnalyse for pitch detection and further tuned its configuration.

Version 0.8

Nick developed a MIDI service for Android devices running Android 6.0 Marshmallow. It receives MIDI note-on messages over a TCP/IP socket and plays the corresponding note with a synthesizer.

Version 0.9

With a proof of concept for turning analog sounds into MIDI notes in place, the next step was using music as an input device for a video game, such as Legend of Zelda: Ocarina of Time.

Version 1.0

After more iterating, the game now responds to ocarina notes.