Inspiration

Panko started with a personal story. One of our teammates has dealt with vitamin deficiency for years — and despite knowing they needed to take their supplements daily, actually remembering to do it was a constant battle. It sounds simple, but when life gets busy, medication is one of the first things that slips. That experience made us ask a harder question: if someone who knows they need medicine still struggles to take it, what about patients who can't easily help themselves? What about elderly patients in assisted living, or anyone who depends on a caregiver to get it right, every time? We built Panko because we believe the system should do the remembering — not the patient.

What it does

Panko is a fully automated medication dispensing and delivery system designed for assisted-living environments. It has three core components working together: a caretaker app, a robotic dispensing arm, and an autonomous delivery rover — all running on AMD hardware and connected through Espressif's wireless technology.

Caretakers start by using the Panko app, whose entire storage and backend functionality runs locally on an AMD Ryzen AI PC. This keeps patient data private, on-premise, and fast — no cloud dependency required. To map the facility, the rover carries a LiDAR sensor paired with an Espressif ESP32-S3 microcontroller. The ESP32-S3's built-in WiFi capability streams the raw LiDAR scan data wirelessly to the AMD Ryzen AI PC in real time, where it is processed and rendered as a live floor plan inside the app. Caretakers can then pin delivery locations directly onto that map — a patient's room, a common area, a bedside — and assign a medication schedule to the dispensing arm, specifying which bottles to dispense and when.

The intelligence behind the robotic arm is built on AMD's ROCm framework and runs on an AMD Instinct GPU Droplet, enabling real-time computer vision to identify the correct pill bottle on the shelf. When a scheduled time arrives, the arm uses that AI pipeline to locate, grip, and load the right bottle onto the waiting rover. The rover then navigates autonomously to the pinned delivery location, guided by the LiDAR map transmitted earlier via the ESP32-S3, reaching the right patient without any further input from staff.

The result is a closed-loop system: Espressif handles the sensing and wireless data layer, AMD handles the compute and intelligence layer, and Panko ties it all together — reliably, on schedule, and without the risk of human error in the dispensing chain.
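To make the scheduling flow above concrete, here is a minimal sketch of what the caretaker-side data model and dispatch check could look like. All names (`PinnedLocation`, `MedicationSchedule`, `due_now`) are hypothetical illustrations, not Panko's actual code:

```python
from dataclasses import dataclass
from datetime import time
from typing import List

@dataclass
class PinnedLocation:
    """A delivery point the caretaker pinned on the LiDAR floor plan."""
    name: str   # e.g. "Room 204, bedside"
    x: float    # map coordinates from the rendered floor plan
    y: float

@dataclass
class MedicationSchedule:
    """Which bottle the arm should dispense, when, and where it goes."""
    patient: str
    bottle_id: int
    dose_times: List[time]
    destination: PinnedLocation

def due_now(schedule: MedicationSchedule, now: time, window_min: int = 5) -> bool:
    """Return True if any scheduled dose falls within the dispatch window."""
    now_min = now.hour * 60 + now.minute
    for t in schedule.dose_times:
        if abs(now_min - (t.hour * 60 + t.minute)) <= window_min:
            return True
    return False
```

In a loop on the Ryzen AI PC, a check like `due_now` would be what triggers the arm's vision pipeline and the rover dispatch.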

How we built it

Panko was built in layers, with each subsystem developed in parallel before being integrated into a single pipeline. The core of the entire system is an AMD Ryzen AI Mini PC, which serves as both the compute brain and the local data hub. The caretaker app runs entirely on this machine — all storage, scheduling logic, map rendering, and AI inference stays on-device, with no cloud dependency. The robotic dispensing arm's computer vision pipeline runs on AMD's ROCm framework via an AMD Instinct GPU Droplet, giving us GPU-accelerated inference to identify the correct pill bottle on the shelf in real time.

For sensing, we used an Espressif ESP32-S3 paired with a LiDAR sensor mounted on the rover. The ESP32-S3 captures spatial scan data and transmits it wirelessly over WiFi directly to the AMD Mini PC, where it is processed and rendered as a live floor map in the caretaker app. Caretakers use that map to pin delivery destinations and set medication schedules for the arm. The delivery rover receives its destination from the app and navigates autonomously to the correct location to complete the handoff — guided entirely by the LiDAR map built through the ESP32-S3 wireless data stream.
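As a rough illustration of the PC side of that wireless pipeline, the receiver could look something like the sketch below: unpack (angle, distance) samples from each packet and convert them to Cartesian points for the floor map. The packet format and port are assumptions for illustration, not the actual protocol our ESP32-S3 firmware uses:

```python
import math
import socket
import struct

# Assumed wire format: little-endian (angle in radians, distance in metres)
# pairs, packed back-to-back in each WiFi datagram from the ESP32-S3.
PACKET_FMT = "<ff"

def parse_scan(payload: bytes):
    """Unpack (angle, distance) pairs and convert to Cartesian map points."""
    size = struct.calcsize(PACKET_FMT)
    points = []
    for off in range(0, len(payload) - size + 1, size):
        angle, dist = struct.unpack_from(PACKET_FMT, payload, off)
        points.append((dist * math.cos(angle), dist * math.sin(angle)))
    return points

def listen(port: int = 5005):
    """Receive scan packets streamed over WiFi and yield map points."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        payload, _addr = sock.recvfrom(4096)
        yield parse_scan(payload)
```

Each batch of points would then be accumulated into the live floor plan that caretakers pin delivery locations onto.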

Challenges we ran into

Our biggest hardware challenge was arm precision. Pill bottles vary in shape, diameter, and surface texture, and getting the arm to grip them consistently required extensive servo calibration and mechanical tuning — small misalignments that seemed trivial on paper caused real failures during testing. On the software side, getting AMD's ROCm stack configured correctly was a significant time sink. ROCm has specific hardware and driver requirements, and working through the setup during a hackathon — without the luxury of a controlled environment — cost us hours we hadn't budgeted for. Time was ultimately our biggest constraint. With a system this interconnected, every delay in one subsystem cascades into the next. We had to make tough calls about what to fully build versus what to scope as a proof of concept, and we didn't get as far with the rover's autonomous navigation as we had hoped.

Accomplishments that we're proud of

We're proud that the arm works. Getting a robotic arm to reliably identify and grip the correct pill bottle using a GPU-accelerated vision pipeline, built from scratch over a hackathon weekend, is no small feat. We're also proud of the architecture we designed. The decision to run everything locally on the AMD Ryzen AI PC, use the ESP32-S3 as a wireless LiDAR bridge, and offload AI inference to the AMD Instinct GPU Droplet gave Panko a real, defensible technical foundation — not just a demo.

What we learned

We learned that hardware integration is where hackathon projects live or die. Writing code is one thing — getting a servo, a microcontroller, a GPU, and a WiFi module to all agree on what's happening in real time is another challenge entirely. Every interface between components is a potential failure point, and respecting that early would have saved us time. We also learned a lot about AMD's ROCm ecosystem. Going in, most of our team had limited experience with ROCm as a compute framework. Wrestling with it under time pressure forced us to understand it deeply and quickly, and we came out with real working knowledge of GPU-accelerated inference on AMD hardware that none of us had before. Most importantly, we learned that building for a real human need changes how you make decisions. Knowing that a system like Panko could genuinely help someone's grandmother get her medication on time made every debugging session feel worth it.

What's next for Panko, a medical delivery system

The most immediate next step is completing the rover's autonomous navigation. We have the LiDAR mapping pipeline and the app infrastructure in place — finishing the rover means connecting those pieces into a fully closed loop, which is well within reach. Beyond that, we want to add a confirmation layer: a simple display or alert on the rover that logs each delivery and flags any missed doses back to the caretaker app. That feedback loop is critical for a real medical environment where accountability matters. Long-term, we envision Panko integrating with electronic health record systems so that medication schedules are pulled automatically, requiring zero manual input from caretakers. The hardware stack — AMD for compute, Espressif for wireless sensing — is already built to scale. The vision is one Panko system per facility, handling every delivery for every patient, every day, without fail.
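A minimal sketch of what that confirmation layer might look like: each dispatched delivery gets a record, and any record still unconfirmed past a grace period is flagged back to the caretaker app. The names and fields here are illustrative assumptions, not a finished design:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class DeliveryRecord:
    """One dispatched dose: confirmed is set when the rover reports handoff."""
    patient: str
    scheduled: datetime
    confirmed: Optional[datetime] = None

def missed_doses(log: List[DeliveryRecord], now: datetime,
                 grace: timedelta = timedelta(minutes=30)) -> List[DeliveryRecord]:
    """Return deliveries past their grace period with no confirmation."""
    return [r for r in log if r.confirmed is None and now - r.scheduled > grace]
```

In a real medical environment this log would feed the accountability trail — who was due what, when it arrived, and which doses need caretaker follow-up.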

Built With

  • amd-instinct-gpu-droplet
  • amd-ryzen-ai-mini-pc
  • autonomous-rover
  • c++
  • data-streaming
  • esp-idf
  • espressif-esp32-s3
  • lidar
  • lidar-sensor
  • python
  • robotic-arm
  • rocm
  • servo-motors
  • wifi