Gallery captions:
- Our intermediate circuit boards are quite messy.
- The Hyper Mainframe Interface, i.e., where the data comes in.
- Our finished hack, with all the elements labeled.
- A diagram indicating how we wanted the W.T.A. to work.
- A checklist of module ideas. They were crossed off when they were either completed or deemed infeasible.
Whimsical Transmission Apparatus!
Now with a better name!
Inspiration
While on the train over to Astonhack, we pondered the idea of building interesting layer 1 transports. As we discussed various weird and wacky ways of moving bits, we realised that the best project would be to do them all!
What it does
Our project transfers bytes, but each bit in the byte is transferred over a different physical transport.
At the start of our system we have a Raspberry Pi Pico as the interface with our sender, and at the other end we have another Pi Pico set up as the receiver. In between are a number of different transports that handle the bits, such as signal-flag communication, QR codes, physical buttons, and a connection through liquid.
The receiving end displays the code of the received character on a seven-segment display, and also prints the character that was sent on a connected laptop.
How we built it
After deciding on the interfaces at each end, we split the modules among our team and built them in parallel. This allowed us to quickly build up a number of interesting methods.
A module here is just a self-contained circuit that takes a digital input from the transmitter and, in some convoluted and interesting way, transfers that value to the receiver.
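As a minimal sketch of how the transmitting Pico could drive the modules (the pin numbers and timing here are hypothetical, not our exact wiring), each bit of the byte is simply placed on its own GPIO line:

```python
# Hypothetical transmitter sketch (MicroPython on a Pi Pico).
# One GPIO output per module; pin numbers are placeholders.
from machine import Pin
import time

BIT_PINS = [2, 3, 4, 5, 6, 7, 8, 9]           # one pin per bit of the byte
outputs = [Pin(n, Pin.OUT) for n in BIT_PINS]

def send_byte(value):
    """Drive each module's input line with one bit of the byte."""
    for i, pin in enumerate(outputs):
        pin.value((value >> i) & 1)

while True:
    for ch in "HI":
        send_byte(ord(ch))
        time.sleep(5)   # leave time for the slower physical transports
```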
We required a computer vision system for two of the modules: the coloured flags and the QR codes. For the first, we made two coloured paper flags and attached them to opposite ends of a servo so that we could swap them in and out of a webcam's view - the green flag corresponding to a logical high and the red flag to a logical low. The QR code system consists of two hand-drawn QR codes on a piece of card which a servo can flip round to present the code for the string "one" or "zero" to the webcam.

Using the OpenCV computer vision library, we wrote software to capture the webcam feed and analyse its contents. We made a custom calibration mode to specify a rectangle in the view window which is expected to contain only pixels of whichever flag is being presented. Then, continuously, we use OpenCV to iterate over the pixels in this region of interest and output whichever colour is dominant, red or green. The calibration lets us adjust the system to different webcam mounting angles and environmental factors like lighting. We had to investigate different colour spaces and comparison methods to find the best way to distinguish red from green with as little sensitivity to error as possible.

The QR code detection itself was straightforward, since OpenCV exposes a function that searches for QR codes anywhere in an image and returns the decoded data. The problems came on the hardware side: any wires, glare, or other distractions in front of the code would ruin the reading. To resolve this, invalid readings are ignored and retried until a successful read is found.
The computer vision system for these two modules runs on a MacBook for performance reasons; it communicates over a serial connection (via a Raspberry Pi Pico) with the main system to integrate with the receiver circuit.
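A rough sketch of the detection loop, assuming a calibration step has already produced the region-of-interest rectangle (the numbers below are placeholders, and our real comparison experimented with other colour spaces):

```python
# Flag and QR detection sketch with OpenCV; the ROI and camera index are
# hypothetical values standing in for the calibration step.
import cv2

cap = cv2.VideoCapture(0)
qr = cv2.QRCodeDetector()
ROI = (100, 100, 200, 200)       # x, y, w, h from calibration

def flag_bit(frame):
    """Return 1 if green dominates the ROI, 0 if red does."""
    x, y, w, h = ROI
    region = frame[y:y + h, x:x + w]
    b, g, r = cv2.split(region)  # OpenCV frames are BGR
    return 1 if g.sum() > r.sum() else 0

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    bit = flag_bit(frame)
    data, points, _ = qr.detectAndDecode(frame)
    if data in ("one", "zero"):  # ignore glare/partial reads and retry
        qr_bit = 1 if data == "one" else 0
        print(bit, qr_bit)       # forwarded over serial in the real system
```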
The module where a servo pushes a button is simple in principle, but due to our limited tooling we had a lot of issues. First of all, a major problem throughout every module was connecting things together: we had very limited access to soldering (we were able to get into an electronics lab for about an hour, so James soldered pin headers onto a few of the Pi Picos we were given), but most of the circuitry couldn't be soldered. This was fine for the most part, but sometimes we had to wrap wires around thumb tacks and resort to other such hacky solutions.

The module itself takes a digital input and, if it detects a high value, sends a PWM signal to the servo motor to make it press the button; otherwise the servo moves back up to a ready position. Luckily the motor has enough torque to press the quite firm button - more than we expected. It was a pain to secure the servo to the breadboard in exactly the right position to press the button, but with a lot of duct tape it worked okay. This module transmits the first bit of the byte.
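In MicroPython terms, the module boils down to something like the sketch below (pin numbers and duty values are made up; the real positions were tuned by hand):

```python
# Button-pressing module sketch (MicroPython on a Pi Pico).
from machine import Pin, PWM
import time

bit_in = Pin(15, Pin.IN, Pin.PULL_DOWN)   # digital input from the transmitter
servo = PWM(Pin(16))
servo.freq(50)                            # standard 50 Hz servo signal

PRESS_DUTY = 3000    # hypothetical duty_u16 value that reaches the button
READY_DUTY = 6000    # hypothetical raised "ready" position

while True:
    if bit_in.value():
        servo.duty_u16(PRESS_DUTY)        # high bit: press the button
    else:
        servo.duty_u16(READY_DUTY)        # low bit: back to ready
    time.sleep(0.1)
```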
The receiver is the last part of the system. It takes eight digital inputs (from the various modules) and shows the binary number they represent as a decimal number on a seven-segment display. Additionally, it sends the output, converted to ASCII, to a connected laptop to complete the network message. The receiver was really fun to make, mainly because seven-segment displays are satisfying when they work. This circuit ended up looking like a bit of a mess because of all the inputs coming into it.
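The receiver logic is roughly the following sketch (pin numbers are hypothetical, and driving the seven-segment display is left out for brevity):

```python
# Receiver sketch (MicroPython on a Pi Pico): reassemble the byte from the
# eight module outputs and report it over USB serial to the laptop.
from machine import Pin
import time

INPUT_PINS = [2, 3, 4, 5, 6, 7, 8, 9]      # placeholder GP assignments
inputs = [Pin(n, Pin.IN, Pin.PULL_DOWN) for n in INPUT_PINS]

def read_byte():
    value = 0
    for i, pin in enumerate(inputs):
        value |= pin.value() << i
    return value

last = None
while True:
    value = read_byte()
    if value != last:                       # only report changes
        print(value, chr(value))            # shows up on the connected laptop
        last = value
    time.sleep(0.05)
```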
Challenges we ran into
As with any hardware project, limited access to soldering and other tools held us back. We managed to make it through component shortages by scouring the shops of Birmingham for the cables and construction materials we needed.
We also ran into the limitations of the Raspberry Pi 2B we were using: it could not keep up with real-time image processing. This meant we needed a bunch of workarounds, which, combined with the general unreliability of the SD cards we brought with us, led to us swapping the Pi out for a MacBook Pro.
We had to overcome a few issues with the servo motors at first. When first connected to the PWM signal from the Pi, a servo would rotate continuously. After further investigation, we discovered that the duty cycle had to sit between specific minimum and maximum values (about 1000 and 8000 in our case) to map onto 0 to 180 degrees of rotation; values in between gave a partial rotation. Once these issues were overcome, the servos were used to move the flags and the QR codes.
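Assuming those were duty_u16 values on the Pico (a guess at the exact API we used), the fix amounts to a simple linear mapping from angle to duty cycle:

```python
# Sketch of mapping a servo angle onto the usable duty-cycle range we found.
from machine import Pin, PWM

servo = PWM(Pin(16))          # hypothetical pin, as in the earlier sketch
servo.freq(50)

def angle_to_duty(angle, min_duty=1000, max_duty=8000):
    """Map 0-180 degrees onto the duty range that gave full travel."""
    angle = max(0, min(180, angle))
    return int(min_duty + (max_duty - min_duty) * angle / 180)

servo.duty_u16(angle_to_duty(180))   # e.g. flip from the red flag to the green
```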
Accomplishments that we're proud of
Getting a hardware project working as expected is an accomplishment at any hackathon.
Drawing the QR codes by hand took some precision, but it was very nice to see them working.
Being able to learn to use new hardware quickly. It was also nice to finish the project on time; our time management was pretty good.
What we learned
Division of labour ended up a bit weird: our hardware expert ended up writing the software, and our software expert built our hardware, which was quite the learning experience. It was also all of our first time getting hands-on with Raspberry Pi Picos, and for some of us our first time programming microcontrollers, which made it a very educational experience.
What's next for Network over Rube Goldberg Machine
We currently implement only 4 bits of transport, but we have some interesting ideas for expanding it:
- Kinematic networking - Using a catapult to send bits.
- Multithread QR code and flag detection, which is the current bottleneck in the rate at which we can reliably send data.
- Audio transmission - We didn't have the components necessary to drive our transducers today, but it would be a cool thing to add.
Built With
- hardware
- micro-python
- microcontroller
- raspberry-pi