Inspiration

The project stems from the observation that while technology often isolates us behind screens, it also has untapped potential to reinforce our most basic human needs. In high-pressure environments like a hackathon, stress levels are high and physical interaction is low; we wanted to build a "physical firewall" against burnout, a device that motivates and connects friends.

What it does

The Connection Crate is an AI-powered "Affective Computing" system designed to bridge the gap between technology and human touch. Using an NVIDIA Jetson Nano, the device monitors its surroundings for a physical hug. Once detected, it captures a snapshot of the moment, triggers Google Gemini to compose a unique motivational poem, and signals an Arduino-controlled dispenser to drop a candy reward. It’s a complete "Sense-Think-Act" loop that turns a social gesture into a digital memory and a physical treat.
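The Sense-Think-Act loop described above can be sketched roughly as follows. This is a minimal illustration with hypothetical names, not the project's actual code: the camera, the Gemini call, and the dispenser are all stubbed out.

```python
# Minimal sketch of the Sense-Think-Act loop; every hardware and API call
# here is a stand-in for illustration only.

def sense():
    """Stand-in for the Jetson's vision pipeline: True when a hug is detected."""
    return True

def think(snapshot):
    """Stand-in for the Gemini call that turns a snapshot into a poem."""
    return f"A poem inspired by {snapshot}"

def act(poem):
    """Stand-in for showing the poem and telling the Arduino to dispense candy."""
    return ("DISPENSE", poem)

def run_once():
    """One pass through the loop: hug in, poem and candy out."""
    if not sense():
        return None
    poem = think("snapshot.jpg")
    return act(poem)
```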

How we built it

It begins with the Sense layer, where the NVIDIA Jetson Nano runs a computer vision pipeline that monitors the camera feed for specific pose landmarks. When the model detects two overlapping bounding boxes in a hug configuration, it captures a high-resolution snapshot. The Think layer then calls the Google Gemini API to compose a unique motivational poem for the moment. Finally, in the Act layer, the Jetson sends a serial command to an Arduino Uno, which rotates a servo-driven dispenser to deliver the physical candy reward.
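To make the detection step concrete, here is a simplified sketch of the overlap test (our own illustration, not the team's exact model): a hug is flagged when two person bounding boxes overlap beyond a tuned threshold. The threshold value is an assumption and would be calibrated empirically.

```python
# Illustrative sketch of "two overlapping bounding boxes in a hug
# configuration"; boxes are (x1, y1, x2, y2) tuples.

def overlap_ratio(a, b):
    """Intersection area divided by the smaller box's area."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    iw, ih = max(0, ix2 - ix1), max(0, iy2 - iy1)
    inter = iw * ih
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / min(area_a, area_b) if inter else 0.0

HUG_THRESHOLD = 0.4  # assumed value; in practice tuned on real footage

def is_hug(box_a, box_b, threshold=HUG_THRESHOLD):
    """True when the two person boxes overlap enough to look like a hug."""
    return overlap_ratio(box_a, box_b) >= threshold
```

Using the smaller box's area (rather than the union, as in IoU) makes the test robust when one hugger partially occludes the other.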

Challenges we ran into

Running a Vision-Language Model (VLM) or a pose estimation library such as MediaPipe alongside a live camera feed can easily max out the Jetson Nano's 4 GB (or 2 GB) of RAM. When Wi-Fi is slow or unavailable, there is a "dead air" period between the hug and the poem appearing. We also had to fine-tune confidence thresholds: set them too low, and the machine gives away all the candy to people standing in line; set them too high, and a genuine hug goes unrewarded.

Accomplishments that we're proud of

What we learned

We learned to optimize high-level computer vision models to run locally on the NVIDIA Jetson Nano's hardware. We gained hands-on experience in cross-platform communication, specifically managing the "handshake" between Python-based AI on a Linux environment and C++ control code on an Arduino. We also tackled the real-world engineering challenges of mechatronics and power management, ensuring our servo motor and LCD screen could operate simultaneously without system failure.
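The Python side of such a serial handshake might look like the sketch below. The port name, baud rate, and the `DISPENSE`/`DONE` protocol strings are our assumptions for illustration; with pyserial the port would typically be opened as `serial.Serial("/dev/ttyACM0", 9600, timeout=2)`.

```python
# Hypothetical sketch of the Jetson-to-Arduino handshake (protocol strings
# assumed). The `port` object is anything with write()/readline(), such as
# a pyserial Serial instance.

def dispense(port, command=b"DISPENSE\n", ack=b"DONE"):
    """Send the dispense command and block until the Arduino acknowledges."""
    port.write(command)      # Arduino side might use Serial.readStringUntil('\n')
    reply = port.readline()  # Arduino side might reply with Serial.println("DONE")
    return reply.strip() == ack
```

Waiting for an explicit acknowledgement keeps the two boards in lockstep, so the Jetson never queues a second candy drop while the servo is still moving.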

What's next for Vital Groove

Built With
