Inspiration

Quake rendered on an oscilloscope.

What it does

It takes the audio output from a laptop and feeds the left channel into the x-axis and the right channel into the y-axis of an oscilloscope, letting us render images and games, including Flappy Bird.
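
The idea can be sketched as a stereo audio file whose left channel carries the x coordinates and right channel the y coordinates, so an oscilloscope in XY mode traces the shape. This is a minimal illustration, not the project's actual code; the function names, the 44.1 kHz sample rate, and the WAV-file approach are our assumptions.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # assumed sample rate
AMPLITUDE = 32767    # full-scale for 16-bit samples

def points_to_stereo_wav(points, path, repeats=1000):
    """Write (x, y) points in [-1, 1] to a stereo WAV file.

    Left channel = x, right channel = y, so a scope in XY mode
    traces the point sequence. The shape is repeated so it stays
    on screen for a visible amount of time.
    """
    with wave.open(path, "wb") as wav:
        wav.setnchannels(2)   # stereo: left drives X, right drives Y
        wav.setsampwidth(2)   # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        frames = bytearray()
        for x, y in points:
            frames += struct.pack("<hh",
                                  int(AMPLITUDE * x),
                                  int(AMPLITUDE * y))
        wav.writeframes(bytes(frames) * repeats)

# Example: a circle traced over 1000 samples
circle = [(math.cos(2 * math.pi * i / 1000),
           math.sin(2 * math.pi * i / 1000)) for i in range(1000)]
points_to_stereo_wav(circle, "circle.wav")
```

Playing the resulting file with the scope probes on the left and right audio channels should draw a circle.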

How we built it

We set up communication between two programs, built an audio-to-display pipeline, and projected 3D vertices onto a 2D plane to render 3D objects on the oscilloscope screen.
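
The projection step can be sketched as follows. This is an illustrative perspective projection, not the project's actual API; the function name and focal length are our assumptions. Each vertex (x, y, z) maps to (f·x/z, f·y/z), so objects farther from the camera appear smaller.

```python
def project(vertices, focal_length=1.0):
    """Project 3D vertices onto a 2D plane via perspective projection.

    z is the distance from the camera along the viewing axis and is
    assumed to be positive (in front of the camera).
    """
    return [(focal_length * x / z, focal_length * y / z)
            for x, y, z in vertices]

# A unit square at depth 2 shrinks by half with focal_length 1
square = [(1, 1, 2), (-1, 1, 2), (-1, -1, 2), (1, -1, 2)]
print(project(square))  # → [(0.5, 0.5), (-0.5, 0.5), (-0.5, -0.5), (0.5, -0.5)]
```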

Challenges we ran into

At the start we struggled with the structure of the project, and it took a few attempts to work out how the different components would fit together. We also ran into a few problems with Git that took quite a while to solve. Overall, we found it difficult to cooperate as a team: the project had so many interconnected parts that it was hard to assign tasks that weren't dependent on each other.

Accomplishments that we are proud of

Any 3D object can now be rendered on the oscilloscope using perspective projection, and good progress was made on both games.

What we learned

We learnt how to work together as a team, and we were cooperating much more effectively by the end of the 24 hours. We also learnt about game development and how games can be rendered from edges and vertices, as well as the maths behind projections.
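
The edges-and-vertices idea can be sketched like this: a wireframe is a list of vertices plus pairs of vertex indices, and each edge is sampled into a run of points so the beam has a continuous path to trace. The function names and sample counts here are hypothetical, not the project's actual implementation.

```python
def edge_to_points(a, b, n=50):
    """Linearly interpolate n points from vertex a to vertex b,
    giving the oscilloscope beam a path to trace along the edge."""
    return [(a[0] + (b[0] - a[0]) * t / (n - 1),
             a[1] + (b[1] - a[1]) * t / (n - 1))
            for t in range(n)]

def trace_edges(vertices, edges, samples_per_edge=50):
    """Flatten a wireframe (2D vertex list + edge index pairs) into
    one continuous list of (x, y) points for the audio signal."""
    path = []
    for i, j in edges:
        path += edge_to_points(vertices[i], vertices[j], samples_per_edge)
    return path

# A triangle as a vertex list and an edge list
verts = [(0.0, 1.0), (-1.0, -1.0), (1.0, -1.0)]
edges = [(0, 1), (1, 2), (2, 0)]
path = trace_edges(verts, edges)
```

Feeding `path` through the audio output (x on the left channel, y on the right) traces the triangle's outline on the scope.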

What's next for osci-render

Cleaning up and consolidating the code base, and adding more functionality (e.g. potentially keyframed animations).

Built With
