Inspiration

This project was inspired by an interest in music, animation, and experimental audiovisual technology. We wanted to build an instrument/toy that would draw a user in and be fun to make noises with, and we were interested in learning how to use an FPGA to create procedural, audio-reactive graphics.

What it does

This project combines a digital audio synthesizer with an FPGA-based visualiser. We created a circuit on the FPGA to procedurally generate animated shapes and sync them with the sounds created by the synth. We also planned a unique physical synth controller to control the whole arrangement.

How we built it

We used Verilog, Python, and C in this project. Half of our project ran on QNX, and the other half on the FPGA. Many pieces of code tie the project together: a TCP server that injects commands into the Verilog shell to spawn animated shapes, and a microcontroller that interfaces sensors and interactive elements with the synth engine on the Pi.
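The command-injection path can be sketched roughly like this: a small TCP server accepts newline-delimited text commands and translates them into lines for the FPGA-side shell. This is a minimal illustration only; the command names (like "spawn"), the port, and the protocol format are assumptions, not the project's actual interface.

```python
# Minimal sketch of a TCP command server, assuming a newline-delimited
# text protocol. The "spawn <shape>" command is hypothetical.
import socket
import threading

def handle_command(line: str) -> str:
    """Parse one command and return the reply that would be sent back
    (and, in the real project, forwarded to the Verilog shell)."""
    parts = line.strip().split()
    if not parts:
        return "ERR empty"
    if parts[0] == "spawn" and len(parts) >= 2:
        return f"OK spawn {parts[1]}"
    return f"ERR unknown {parts[0]}"

def serve_once(server_sock: socket.socket) -> None:
    """Accept a single connection, answer one command, and return."""
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024).decode()
        conn.sendall(handle_command(data).encode())

# Usage: start the server on an ephemeral port and send one command.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

t = threading.Thread(target=serve_once, args=(srv,))
t.start()

cli = socket.socket()
cli.connect(("127.0.0.1", port))
cli.sendall(b"spawn circle\n")
reply = cli.recv(1024).decode()
cli.close()
t.join()
srv.close()
print(reply)  # → OK spawn circle
```

In the actual project the reply side would drive the FPGA animation system rather than echoing an acknowledgement; the round trip above just shows the shape of the protocol.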

Challenges we ran into

The hardware was definitely one of our biggest challenges; we struggled to prototype reliable hardware and connect it to the larger project. Working with QNX was challenging, but it was rewarding to create custom tools to fill the gaps in the OS.

Accomplishments that we're proud of

We are very proud of our GOAT Sam, who built an entire live-programmable FPGA animation system and made it possible to inject code directly into the FPGA environment to change visuals on the fly.

What we learned

We learned many things: how to work with bits and bytes, how to write low-level drivers, and how to bring up a project that spans an FPGA and a real-time OS.

What's next for PIPIS MACHINE

Built With

c, python, qnx, verilog
