Inspiration

The premise of the movie Ratatouille is that a rat named Remy controls the movements of a junior chef and turns him into a masterful cook. We thought, "Could we bring this to life?" Except rats kind of suck to work with, so we chose to have a cat.

What it does

The goal of our system was to have the movements of a cat in a life-sized box control the movements of a human. A camera above the box tracks the cat's position, and a complementary camera tracks the human's position outside the box. Based on the cat's position, a computer sends signals to a device worn by the human that stimulates their nervous system via galvanic vestibular stimulation (GVS), which artificially alters the human's sense of balance and involuntarily prompts them to tilt, lean, and move in a particular direction. Just for fun, we also added buttons in the wooden floor that, when the cat steps on them, trigger the human to lift an arm.

How we built it

Our system consists of the following components:

  1. H-bridge circuit with outputs connecting to electrodes on the sides and front of the head, placed in spots that trigger vestibular stimulation. The H-bridge's input is preceded by a current-limiting circuit that caps the current draw at 3 mA for safety: if the resistance of a person's skull turns out to be lower than we expect, the current limiter clamps the current at 3 mA (though for demo purposes we stay well below this level). This circuit is controlled via an ESP32.
  2. Plywood-constructed life-sized box meant to comfortably hold our cat, Junkie.
  3. Raspberry Pi connected to a camera module that tracks the human's location, generating X and Y coordinates.
  4. Raspberry Pi connected to a camera module mounted above the wooden box, which runs an ML model trained to perform object detection on the cat's location in the box (using VIAM), generating its X and Y position and comparing them with the human's to calculate which direction the human needs to lean to follow the cat. The result is sent as signals to the ESP32.
  5. TENS modules connected to the ESP32, which are attached to the human's bicep muscles via electrodes. When activated, the TENS modules stimulate muscular contraction. Each module is switched via a relay. These modules were intended to be triggered by the cat pressing a button in the box, with a cooldown between stimulations for safety.
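The control logic above boils down to two small pieces: comparing the cat's and human's positions to pick a lean direction, and rate-limiting the TENS pulses. Here's a minimal Python sketch of both; the function names, deadzone threshold, and cooldown value are illustrative, not our actual firmware.

```python
import time

# Within this distance (in the shared floor coordinate frame), the human
# is considered "close enough" to the cat and no stimulation is sent.
DEADZONE = 0.1

def lean_direction(cat_pos, human_pos, deadzone=DEADZONE):
    """Return 'left', 'right', 'forward', 'back', or 'hold' based on
    where the cat is relative to the human."""
    dx = cat_pos[0] - human_pos[0]
    dy = cat_pos[1] - human_pos[1]
    if abs(dx) < deadzone and abs(dy) < deadzone:
        return "hold"
    # Push along the dominant axis only, since the GVS electrodes
    # bias balance in one direction at a time.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "forward" if dy > 0 else "back"

class CooldownGate:
    """Rate-limits TENS activations, mirroring the safety cooldown."""
    def __init__(self, cooldown_s=2.0):
        self.cooldown_s = cooldown_s
        self.last_fired = float("-inf")

    def try_fire(self, now=None):
        now = time.monotonic() if now is None else now
        if now - self.last_fired >= self.cooldown_s:
            self.last_fired = now
            return True   # OK to pulse the relay
        return False      # still cooling down
```

In practice the Pi would run `lean_direction` each frame and only forward a TENS command to the ESP32 when the gate allows it.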

Challenges we ran into

We had a lot of moving parts in our project. The complicated circuitry meant a lot of hardware bugs (including fried transistors and unresolved shorts), not to mention some complicated computer vision techniques, including homography (e.g. estimating the human's "bird's-eye" X and Y position from a side-mounted camera, since an overhead camera would have had to be mounted much higher than the one we had for the cat box). Unfortunately, our integration tests also didn't go very well, because each person got stuck debugging their own module for much longer than expected. And one of the circuits that we built and tested early on ended up frying itself at the last minute, and we never figured out what was wrong.
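The homography trick mentioned above maps a pixel seen by the side camera onto bird's-eye floor coordinates via a 3x3 matrix. The matrix itself is estimated from calibration points (e.g. with OpenCV's `cv2.findHomography`); the sketch below only shows the projective mapping step, using a made-up toy matrix (a uniform 2x scale) rather than a real calibration.

```python
# Toy homography matrix H: maps a side-camera pixel (u, v) to a
# bird's-eye floor coordinate (x, y). A real H would be estimated from
# four or more known floor points; this one just scales by 2.
H = [
    [2.0, 0.0, 0.0],
    [0.0, 2.0, 0.0],
    [0.0, 0.0, 1.0],
]

def apply_homography(H, point):
    """Map an image point (u, v) to floor coordinates (x, y)."""
    u, v = point
    # Homogeneous multiply: [x', y', w'] = H @ [u, v, 1]
    xp = H[0][0] * u + H[0][1] * v + H[0][2]
    yp = H[1][0] * u + H[1][1] * v + H[1][2]
    w  = H[2][0] * u + H[2][1] * v + H[2][2]
    # Perspective divide recovers Cartesian floor coordinates
    return (xp / w, yp / w)
```

With a properly calibrated H, this is how the human's side-camera position gets into the same coordinate frame as the cat's overhead position.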

Accomplishments that we're proud of

Our team had a pretty balanced split of expertise (e.g. the EEs worked on the circuitry and firmware, the MechE worked on the box, and the ML guy worked on the computer vision). We put a lot of thought and effort into safety, since we had to make sure we weren't frying our brains; especially given the scale of the hackathon, we didn't take many risks and did a lot of testing to make sure the currents and voltages were as expected.

What we learned

We learned that integration testing takes longer than you think it will. "I have too much hubris and I'm not cut out to be an EE." -Anika

What's next for Catatouille

Hopefully we can actually get integration working :D
