Inspiration

Each year, over 350 major disasters strike worldwide, affecting hundreds of millions of people. In 2024 alone, natural disasters caused $368 billion in global economic losses. Survivors are often trapped in unstable buildings, while human rescuers face dangerous delays and high risk. On average, it takes 36 hours to locate victims after major natural disasters. To address this gap, we were inspired to create a low-cost autonomous drone that can rapidly scan hazardous areas and deliver life-saving insights before responders arrive.

What it does

Firefly autonomously navigates disaster zones using its onboard camera, IMU, and AI-powered vision. It detects potential victims even from partial body parts hidden in rubble, while simultaneously mapping the environment and marking hazards. All data is transmitted via Wi-Fi to a connected device, where rescuers view a live dashboard that provides situational awareness and helps prioritize rescue efforts.

How we built it

Hardware

ESP32-CAM: Serves as the central flight controller, handling motor control, onboard processing, and real-time video/data transmission to a ground station via Wi-Fi.

Camera Module: Streams 30 FPS video for a first-person view (FPV) and telemetry.

Motors (x4): Brushed DC motors providing lift and enabling directional control through differential thrust.

3D-Printed Mount: Custom mount that secures electronic components, reduces vibration, and stabilizes the onboard camera during flight.

IMU (Inertial Measurement Unit): Provides real-time orientation, acceleration, and angular velocity data for flight stabilization and attitude control.

3.7V Li-Po Battery: Lightweight power source supplying energy to both the flight controller and motors, optimized for portability and endurance.
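
The differential thrust mentioned above can be sketched as a standard X-configuration motor mixer. This is an illustrative sketch, not the project's actual firmware; the function name and sign convention are ours, and the real mapping depends on motor positions and spin directions.

```python
def mix_motors(throttle, roll, pitch, yaw):
    """Map a base throttle plus attitude corrections onto four motor
    outputs for an X-configuration quad, clamped to the 0..1 range.
    Signs follow one common convention (illustrative only)."""
    m = [
        throttle + roll + pitch + yaw,  # front-left
        throttle - roll + pitch - yaw,  # front-right
        throttle + roll - pitch - yaw,  # rear-left
        throttle - roll - pitch + yaw,  # rear-right
    ]
    return [min(max(v, 0.0), 1.0) for v in m]

# A positive roll command raises the left pair and lowers the right pair:
outputs = mix_motors(0.5, 0.1, 0.0, 0.0)
```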

Software

ESP32-CAM: Captures real-time video feed.

AI (YOLO): Detects human figures and fires, even partially obscured.
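
The detections coming out of a YOLO model still need to be filtered down to the classes rescuers care about. A minimal post-processing sketch, assuming each detection is a `(class_name, confidence, box)` tuple; the function name, tuple shape, and threshold are our assumptions, and the model inference call itself is omitted:

```python
# Hypothetical post-processing of YOLO detections, where each detection
# is (class_name, confidence, (x1, y1, x2, y2)) in pixel coordinates.
def filter_detections(detections, wanted=("person", "fire"), min_conf=0.4):
    """Keep only the classes relevant to rescue, above a confidence
    floor. Partially occluded bodies tend to score lower, so the
    threshold is kept deliberately permissive."""
    return [d for d in detections if d[0] in wanted and d[1] >= min_conf]

raw = [
    ("person", 0.82, (120, 40, 180, 160)),
    ("person", 0.31, (300, 90, 340, 150)),   # below threshold, dropped
    ("fire",   0.57, (10, 10, 60, 80)),
    ("chair",  0.90, (200, 200, 260, 280)),  # irrelevant class, dropped
]
kept = filter_detections(raw)
```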

SLAM algorithms: Enable autonomous navigation and generate a live 3D map of explored areas.
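
The mapping half of SLAM can be illustrated with a toy occupancy-grid update: cells along a range reading are marked free and the endpoint occupied. This is a deliberate simplification (a real SLAM stack also estimates the pose itself); all names and the cell size are assumptions of ours.

```python
import math

def update_grid(grid, pose, bearing, dist, cell=0.1):
    """Mark the cell at the end of a range reading as occupied (1) and
    the cells along the ray as free (0) in a coarse 2D grid keyed by
    (col, row). `pose` is (x, y, heading); in a real SLAM system the
    pose would be estimated jointly with the map."""
    x0, y0, heading = pose
    steps = round(dist / cell)
    for i in range(steps + 1):
        d = i * cell
        cx = round((x0 + d * math.cos(heading + bearing)) / cell)
        cy = round((y0 + d * math.sin(heading + bearing)) / cell)
        grid[(cx, cy)] = 1 if i == steps else 0
    return grid

# One reading straight ahead at 0.5 m from the origin:
grid = update_grid({}, (0.0, 0.0, 0.0), 0.0, 0.5)
```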

Dashboard + Frontend: Tags victim and hazard locations on the live disaster map shown to rescuers.
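
Tagging a detection on the map amounts to combining the drone's map pose with where the detection was projected relative to the drone. A hedged sketch of that step, producing JSON a dashboard could consume; the marker schema and function name are invented for illustration:

```python
import json

def tag_on_map(pose, detections):
    """Convert detections into dashboard map markers. `pose` is the
    drone's (x, y) on the map; each detection carries an (dx, dy)
    offset of where it was projected relative to the drone
    (illustrative schema, not the project's actual one)."""
    markers = []
    for label, conf, (dx, dy) in detections:
        markers.append({
            "type": "victim" if label == "person" else "hazard",
            "x": pose[0] + dx,
            "y": pose[1] + dy,
            "confidence": conf,
        })
    return json.dumps(markers)

payload = tag_on_map((2.0, 3.0),
                     [("person", 0.8, (1.0, -0.5)), ("fire", 0.6, (0.0, 2.0))])
```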

Challenges and Solutions (Learning Moments)

Building the drone from scratch: We designed and assembled the frame using 3D printing, integrated the camera and electronics, and ensured all components fit securely while minimizing vibration during flight.

Optimizing weight for flight stability: Lightweight motors are limited in lift capacity, so we carefully balanced the frame, battery, and electronics to maintain stability without overloading the system.

Complex flight control and IMU feedback: We implemented real-time stabilization using the IMU, translating orientation, acceleration, and angular velocity data into precise motor adjustments to maintain smooth, controlled flight.
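
The feedback loop described above is commonly built from two pieces: a complementary filter that fuses gyro and accelerometer data into one attitude estimate, and a PD term that turns the attitude error into a motor correction. A minimal sketch under those assumptions (gains and names are illustrative, not the project's tuned values):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyro integration (smooth but drifting) with the
    accelerometer-derived angle (noisy but drift-free) into a single
    attitude estimate, in degrees."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def pd_correction(error, error_rate, kp=1.2, kd=0.05):
    """Turn an attitude error and its rate into a motor correction
    term; gains here are placeholders, not tuned values."""
    return kp * error + kd * error_rate
```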

Autonomous navigation: The drone navigates disaster zones independently, using SLAM algorithms and AI vision to map the environment, detect victims, and avoid hazards without GPS or manual input.

Power management with a 3.7V battery: We had to optimize energy usage across the motors, the ESP32-CAM, and the flight controller to maximize flight time while ensuring reliable operation of sensors and video streaming.
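
The endurance trade-off above reduces to a simple budget: usable battery capacity divided by average current draw. The figures in the example below are assumed for illustration, not measured on Firefly:

```python
def flight_time_minutes(capacity_mah, avg_current_ma, usable=0.8):
    """Rough endurance estimate in minutes: usable capacity divided by
    average draw. The 80% usable fraction protects the Li-Po from
    deep discharge (a common rule of thumb, not a measured figure)."""
    return 60 * (capacity_mah * usable) / avg_current_ma

# e.g. a 600 mAh 1S Li-Po against ~3 A combined draw from four brushed
# motors plus the ESP32-CAM (assumed numbers):
est = flight_time_minutes(600, 3000)
```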

What's next for Firefly

Next, we want to:

  1. Improve stability with stronger motors and flight controllers
  2. Expand computer vision to better detect human survivors in varied conditions (e.g., darkness, smoke) by adding sensors such as thermal cameras
  3. Add mesh networking to allow multiple Fireflies to work together in the future
  4. Collaborate with disaster response organizations to pilot real-world deployments
