🚀 Inspiration

The inspiration for this project came from wondering if it's feasible to run a pedestrian detection model on smaller autonomous vehicles like drones and electric scooters, without relying on bulky, high-performance processors.

The alternative approach we explored was using a microcontroller combined with TinyML to deploy the ML model on a resource-constrained, low-power device. This direction felt exciting since most industry solutions still rely on powerful hardware, and we wanted to explore something different and energy-efficient.


🚶‍♂️ What It Does — Pedestrian Detection Using TinyML on Microcontrollers

This project demonstrates that it's possible to detect pedestrians using TinyML running directly on a microcontroller. It opens the door to lightweight, low-power smart systems that operate independently at the edge, without relying on cloud computing or expensive hardware.


🔧 How We Built It

  • We used Edge Impulse to collect and label image data.
  • Trained MobileNetV2 SSD FPN-Lite 320x320, a lightweight object detection model suitable for constrained devices.
  • Deployed the model onto a microcontroller for real-time inference.
  • Ran tests to verify that the model can detect pedestrians effectively under limited compute and memory.
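A practical step in the pipeline above is converting raw camera frames into the float tensor the model expects. As a hedged sketch (the exact frame format depends on the camera module and deployment SDK; `rgb565_to_rgb888` and `to_unit_float` are our own illustrative helper names, not part of any SDK), preprocessing an RGB565 frame, a common format on microcontroller cameras, might look like:

```cpp
#include <cstdint>

// Unpack one RGB565 pixel (5 bits red, 6 bits green, 5 bits blue)
// into 8-bit channels, scaling each field to the full 0-255 range.
static void rgb565_to_rgb888(uint16_t px, uint8_t &r, uint8_t &g, uint8_t &b) {
    r = static_cast<uint8_t>(((px >> 11) & 0x1F) * 255 / 31);
    g = static_cast<uint8_t>(((px >> 5)  & 0x3F) * 255 / 63);
    b = static_cast<uint8_t>(( px        & 0x1F) * 255 / 31);
}

// Normalize an 8-bit channel to [0, 1] for a float input tensor.
static float to_unit_float(uint8_t c) {
    return static_cast<float>(c) / 255.0f;
}
```

On a 320x320 input this conversion runs once per pixel per frame, so keeping it to integer shifts and one divide matters on a slow core.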

🚧 Challenges We Ran Into

  • Finding the right microcontroller and TinyML model: sourcing a microcontroller with enough compute, and a model capable of running pedestrian detection on it, was tricky.
  • Memory constraints: The microcontroller had limited RAM and storage.
  • Model optimization: balancing model size against accuracy.
  • Inference speed: Ensuring low latency without sacrificing performance.
  • Data collection: Getting diverse and labeled pedestrian images suitable for edge training.
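The standard answer to the memory and model-size challenges above is int8 quantization, which shrinks weights and activations from 32-bit floats to 8-bit integers. As a minimal sketch of the affine quantization arithmetic used by TensorFlow Lite (the helper names here are illustrative, and the scale/zero-point values come from the converter, not these formulas):

```cpp
#include <cstdint>
#include <cmath>
#include <algorithm>

// Affine int8 quantization: q = round(x / scale) + zero_point,
// clamped to the int8 range [-128, 127].
static int8_t quantize_int8(float x, float scale, int zero_point) {
    int q = static_cast<int>(std::lround(x / scale)) + zero_point;
    return static_cast<int8_t>(std::max(-128, std::min(127, q)));
}

// Dequantize back to float: x is approximately (q - zero_point) * scale.
static float dequantize_int8(int8_t q, float scale, int zero_point) {
    return (static_cast<int>(q) - zero_point) * scale;
}
```

The rounding and clamping are where the size/accuracy tradeoff lives: values outside the calibrated range saturate, which is why representative calibration data matters.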

🏆 Accomplishments We're Proud Of

  • Successfully running pedestrian detection on a microcontroller.
  • Keeping the model accurate while staying under tight memory limits.
  • Demonstrating through research that the approach is feasible for real-world use cases like scooters, drones, and delivery robots.

💡 What We Learned

  • How to use Edge Impulse for embedded ML workflows.
  • Best practices for deploying ML models to microcontrollers.
  • Insights into the hardware-software tradeoffs when building edge AI systems.

🔮 What's Next for TinyML Pedestrian Detection

  • Improve the dataset with more real-world pedestrian scenarios.
  • Port the model to different microcontrollers, compare performance, and test on the microcontroller we originally targeted.
  • Integrate with full autonomous systems for real-world testing on drones or scooters.
  • Explore object detection instead of just classification for bounding-box level insights.
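For the bounding-box direction above, the standard metric for matching a predicted box against ground truth is intersection-over-union (IoU). A minimal sketch (the `Box` struct and `iou` helper are illustrative, not tied to any particular SDK):

```cpp
#include <algorithm>

// Axis-aligned box given by opposite corners, with x1 < x2 and y1 < y2.
struct Box { float x1, y1, x2, y2; };

// Intersection-over-union: overlap area divided by combined area.
// 1.0 means identical boxes, 0.0 means no overlap.
static float iou(const Box &a, const Box &b) {
    float ix = std::max(0.0f, std::min(a.x2, b.x2) - std::max(a.x1, b.x1));
    float iy = std::max(0.0f, std::min(a.y2, b.y2) - std::max(a.y1, b.y1));
    float inter = ix * iy;
    float area_a = (a.x2 - a.x1) * (a.y2 - a.y1);
    float area_b = (b.x2 - b.x1) * (b.y2 - b.y1);
    float uni = area_a + area_b - inter;
    return uni > 0.0f ? inter / uni : 0.0f;
}
```

The same function would also drive non-maximum suppression, which an SSD-style detector needs to merge duplicate detections of the same pedestrian.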

Built With

  • ai
  • c++
  • code-composer-studio
  • edge-impulse
  • image-classification
  • kaggle
  • live-classification
  • microcontroller
  • ml
  • mobilenetv2-ssd-fpn-lite-320x320
  • neural-networks
  • object-detection
  • pedestrian-detection
  • tensor-flow
  • tinyml
  • uniflash