Inspiration
University students often struggle to cook meals due to the stress and burnout of academic life. Skipping meals is not laziness but a symptom of the executive function gap. We wanted to solve this real public health problem by building a robotic meal companion to help students with physical disabilities and executive dysfunction. The goal was to remove the friction of cooking and provide an encouraging presence in the kitchen.
What it does
SousChef is a stationary assistive cooking robot that guides users through recipes both verbally and physically. Users paste a recipe link into our Next.js web application. The robot then reads the recipe, announces each step in an encouraging voice, and performs kitchen tasks like stirring or picking up ingredients using a statically mounted robotic arm. It also uses a webcam to detect ingredients and ensure safety in the kitchen. Once the meal is done, the app allows users to take a victory photo to log their achievement and build momentum.
How we built it
We split the architecture into three main components. For the hardware, we built a six-jointed robotic arm with a popsicle-stick claw. An Arduino handles the real-time servo timing, while a Raspberry Pi serves as the main hub for networking and AI processing.
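The Pi-to-Arduino handoff can be sketched as a small command encoder; the command names, six-angle format, and newline-terminated protocol below are illustrative assumptions rather than our exact firmware protocol:

```python
# Hypothetical Pi-side helper: encode one arm command for the Arduino.
# The Arduino sketch would read one newline-terminated line at a time
# and handle the real-time servo timing itself.

VALID_COMMANDS = {"STIR", "PICK", "PLACE", "HOME"}
NUM_JOINTS = 6  # six servo joints on the arm

def encode_command(command: str, angles: list[int]) -> bytes:
    """Build one serial line, e.g. b'PICK 90 45 120 10 0 30\\n'."""
    if command not in VALID_COMMANDS:
        raise ValueError(f"unknown command: {command}")
    if len(angles) != NUM_JOINTS:
        raise ValueError(f"expected {NUM_JOINTS} joint angles, got {len(angles)}")
    if not all(0 <= a <= 180 for a in angles):
        raise ValueError("servo angles must be in 0-180 degrees")
    return (command + " " + " ".join(str(a) for a in angles) + "\n").encode("ascii")
```

On the Pi this line would be written out with pyserial's `serial.Serial(...).write(...)`; keeping motor timing on the Arduino means the Pi only ever sends high-level targets.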
For the software backend, we built a Python FastAPI server hosted on DigitalOcean. We deployed our frontend as a Next.js web app and secured the domain SousChef.io through GoDaddy Registry. We connected the web app, backend, and Raspberry Pi with a persistent WebSocket relay that passes JSON dictionaries between them.
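The relay idea can be sketched as a small router that decides which side of the system each JSON dictionary should be forwarded to; the `target` and `type` field names here are assumptions for illustration, not our exact message schema:

```python
import json

# Hypothetical message router for the WebSocket relay: each JSON
# dictionary names a "target" ("pi" or "web") and a "type"; the server
# forwards it to the matching persistent connection.

def route_message(raw: str) -> tuple[str, dict]:
    """Parse one relayed message and decide which side it goes to."""
    msg = json.loads(raw)
    target = msg.get("target")
    if target not in ("pi", "web"):
        raise ValueError(f"unknown target: {target!r}")
    if "type" not in msg:
        raise ValueError("message missing 'type'")
    return target, msg

# Example: the web app asks the robot to perform a recipe step.
target, msg = route_message(json.dumps({
    "target": "pi",
    "type": "execute_step",
    "step": {"action": "stir", "duration_s": 10},
}))
```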
For the AI logic, we utilized several APIs. We used K2 Think V2 as our logic brain to translate user requests into precise robotic steps. We integrated Google Gemini 2.5 Flash for high-speed computer vision, which ensures safety by verifying that human hands are clear of the cutting board before the Arduino actuates the arm. Gemini also parses recipes into structured JSON steps. Finally, we used ElevenLabs to give the robot a warm and comforting voice.
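The structured steps might look like the sketch below; the field names and action list are assumptions rather than Gemini's actual output schema, but the key idea is that every parsed step must resolve to one of the robot's predefined physical actions before it is accepted:

```python
# Illustrative shape of a parsed recipe step. The action whitelist is
# what keeps free-form recipe text from producing moves the arm cannot
# physically (or safely) perform.

ALLOWED_ACTIONS = {"stir", "pick", "place", "wait", "announce"}

def validate_step(step: dict) -> dict:
    """Reject any step whose action the arm does not support."""
    if step.get("action") not in ALLOWED_ACTIONS:
        raise ValueError(f"unsupported action: {step.get('action')!r}")
    if not isinstance(step.get("instruction"), str) or not step["instruction"]:
        raise ValueError("step needs a human-readable instruction to speak aloud")
    return step

steps = [
    {"action": "pick", "target": "onion", "instruction": "Grab the onion!"},
    {"action": "stir", "duration_s": 15, "instruction": "Give it a good stir."},
]
validated = [validate_step(s) for s in steps]
```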
Challenges we ran into
Building a reliable physical arm out of hackathon materials like popsicle sticks and rubber bands required constant tuning. Managing the real-time WebSocket communication between the cloud server and the physical Raspberry Pi was another major hurdle. We also had to figure out how to parse complex recipes into a strict set of predefined physical actions.
Accomplishments that we're proud of
We are incredibly proud of our computer vision safety constraints. By using Gemini Vision to explicitly check that hands are clear before sending commands to the Arduino, we demonstrated real-world alignment and safety on an agentic system. We also succeeded in creating a genuinely empathetic digital companion that makes cooking feel less overwhelming.
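The safety constraint can be sketched as a simple check-before-actuate gate. In the real system the check would come from a Gemini Vision call on a webcam frame; here it is stubbed out so the gating logic itself is visible, and all names are illustrative:

```python
# Hypothetical safety gate: an arm command is dispatched only if the
# latest vision check says the workspace is clear of hands. The
# vision_check callable stands in for a Gemini 2.5 Flash query.

def safety_gate(vision_check, send_command, command: str) -> bool:
    """Return True if the command was sent, False if blocked for safety."""
    if not vision_check():
        return False  # hands detected near the cutting board: refuse to move
    send_command(command)
    return True

sent = []
assume_clear = lambda: True
assume_blocked = lambda: False

ok = safety_gate(assume_clear, sent.append, "STIR")
blocked = safety_gate(assume_blocked, sent.append, "STIR")
```

Putting the check immediately before actuation (rather than once at the start of a step) means a hand entering the frame mid-recipe still blocks the next motion.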
What we learned
We learned how to translate messy and unstructured text into structured JSON that physical hardware can understand and execute. We also gained a lot of experience managing serial communication between a Raspberry Pi and an Arduino.
What's next for SousChef
We plan to implement a fully autonomous mode where the robot can wake up on a schedule, scan the kitchen, and have a meal ready for the user. We also want to add a mobile drivetrain so the robot can navigate the kitchen on its own. Finally, we want to add social features like a friends feed to make cooking a more collaborative experience.
Built With
- arduino
- c++
- elevenlabs
- fastapi
- next.js
- opencv
- pyserial
- python
- raspberry-pi
- sqlite
- tailwind
- websockets