EcoScan — AI-Powered Trash Sorting System

Inspiration

Improper waste disposal is one of the biggest environmental challenges we face today. Every day, recyclable materials end up in landfills simply because people are unsure whether an item is trash or recyclable. I wanted to build a system that could automatically identify and sort waste using artificial intelligence, making recycling easier and more accurate without human effort.

What I Learned

This project taught me a huge amount across multiple fields:

- Machine learning — how to collect image data, label it, and train an object detection model using Edge Impulse
- Embedded systems — how to deploy a trained AI model directly onto a microcontroller (ESP32-WROVER)
- Computer vision — how cameras capture and process images for real-time classification
- Electronics — how to wire and control a stepper motor using a ULN2003 driver
- Web development — how to build a real-time monitoring website using WebAssembly

How I Built It

  1. Data Collection — I collected images of trash and recyclable items using a camera, capturing them from different angles and in different lighting conditions to make the model more robust.
  2. Model Training — I used Edge Impulse to train a FOMO (Faster Objects, More Objects) object detection model based on MobileNetV2 0.35. After several iterations of tuning the learning rate and training cycles, I achieved an F1 score of 0.968, with the following per-class performance:

| Class | Accuracy | F1 Score |
| --- | --- | --- |
| Background | 100% | 1.00 |
| Recycle | 87.5% | 0.93 |
| Trash | 100% | 1.00 |
  3. Hardware

- Freenove ESP32-WROVER — main microcontroller with built-in camera
- Stepper motor — moves the sorting platform
- ULN2003 driver board — controls the stepper motor
- Custom sorting platform — moves left for trash, right for recycle

  4. Deployment — The trained model was exported as an Arduino library and deployed directly onto the ESP32-WROVER. The device runs inference locally, with no internet connection required.
  5. Monitoring Website — I built a real-time web dashboard using WebAssembly that connects to the ESP32 camera stream and displays live detection results with confidence scores.

How It Works

Camera → AI Model → Classification → Motor. The platform moves a configured distance (about 30 cm, or whatever amount we set) right or left toward the matching trash can, and then a servo opens the lid so the item drops into the right bin.

- The camera captures a frame every ~900 ms
- The FOMO model runs inference on the 96 × 96 pixel image
- If confidence exceeds the threshold, the stepper motor moves the platform to the correct bin:

♻️ Recycle → moves RIGHT (stepper motor) 🗑️ Trash → moves LEFT (stepper motor)

After 2 seconds, the platform returns to center

Challenges

- Pin conflicts — the camera and stepper motor share GPIO pins on the ESP32-WROVER, requiring careful remapping
- Model accuracy — the Recycle class was initially only 50% accurate, requiring tuning of the learning rate and training cycles
- WebAssembly SIMD — browser compatibility issues required switching from the SIMD build to the standard WebAssembly build
- USB cable — lost significant time discovering the USB cable was charge-only and couldn't transfer data
- Library versions — ESP32 Arduino core version conflicts required downgrading to version 2.0.x for compatibility with Edge Impulse

We are still finding and fixing more bugs. Our next step is to integrate the ESP32-CAM and its detection data with a linear slider so the two work together.

Built With

  • Arduino