Ember: Autonomous First-Response Fire Suppression


Inspiration

Every year, thousands of structural fires cause irreparable damage before emergency services can even arrive on the scene. We wanted to build a proactive solution—a local "first responder" that lives inside a building and neutralizes threats the moment they appear, bridging the gap between detection and professional intervention.

Part of the inspiration for Ember stems from the devastating Wang Fuk Court fire in Tai Po, Hong Kong. In late 2025, this tragic five-alarm blaze burned for over 43 hours and claimed 168 lives. In high-density urban environments, the speed at which a fire spreads through scaffolding and renovation materials often outpaces the arrival of emergency services. We built Ember to be the "zero-hour" responder—a device that lives on-site to stop a flame the moment it sparks, preventing a small ember from turning into a city-wide catastrophe.

What it does

Ember is an autonomous fire-suppression robot designed for proactive building safety. It follows pre-programmed patrol paths to monitor high-risk zones. Using an Intel RealSense camera and an onboard machine-learning model, it scans for the visual signatures of fire. Once a threat is confirmed, Ember interrupts its patrol, maneuvers to the optimal stand-off distance, and deploys its onboard fire extinguisher via a high-torque servo mechanism to neutralize the threat immediately.
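The patrol → detect → approach → suppress flow described above can be sketched as a small state machine. The state names, stand-off distance, and tick-based `step` function here are illustrative assumptions, not Ember's actual control code:

```python
from enum import Enum, auto

class State(Enum):
    PATROL = auto()    # following the pre-programmed route
    APPROACH = auto()  # closing on a confirmed detection
    SUPPRESS = auto()  # triggering the extinguisher servo

# Assumed stand-off distance for the extinguisher, in meters.
ENGAGE_DISTANCE_M = 1.5

def step(state: State, fire_detected: bool, distance_m: float) -> State:
    """Advance the high-level control state machine by one sensor tick."""
    if state is State.PATROL:
        return State.APPROACH if fire_detected else State.PATROL
    if state is State.APPROACH:
        if not fire_detected:
            return State.PATROL       # detection lost: resume patrol
        if distance_m <= ENGAGE_DISTANCE_M:
            return State.SUPPRESS     # in range: deploy the extinguisher
        return State.APPROACH         # keep closing the distance
    return State.SUPPRESS             # latch until suppression completes
```

In a design like this, the ML detector only produces the `fire_detected` flag and the RealSense depth stream produces `distance_m`; keeping the state logic this small makes it easy to test without hardware in the loop.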

How we built it

  • The Brain: An NVIDIA Jetson Nano serves as the primary compute module, handling ML inference and high-level logic locally for ultra-low latency.
  • The Senses: An Intel RealSense camera provides depth-sensing and HD visual data, allowing the robot to distinguish between fire and environmental light.
  • The Muscle: An Arduino manages the motor drivers for movement and controls the servos that physically trigger the extinguisher.
  • The Logic: We trained a custom machine learning model on thousands of fire and smoke images, optimized to run on edge hardware without needing a cloud connection.
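The Jetson ("The Brain") and the Arduino ("The Muscle") talk over a serial link, so each movement or trigger command needs a frame the Arduino can validate. The layout below (a `0xAA` start byte, a command byte, a signed 16-bit argument, and an XOR checksum) is an assumed wire format for illustration, not Ember's actual protocol:

```python
import struct

START = 0xAA  # assumed frame delimiter

def encode(cmd: int, arg: int) -> bytes:
    """Pack a command byte and a signed 16-bit argument into a 5-byte frame."""
    body = struct.pack(">Bh", cmd, arg)   # big-endian: u8 command, i16 argument
    checksum = 0
    for b in body:
        checksum ^= b                     # simple XOR checksum over the body
    return bytes([START]) + body + bytes([checksum])

def decode(frame: bytes):
    """Validate a frame; return (cmd, arg) or None if it is corrupt."""
    if len(frame) != 5 or frame[0] != START:
        return None
    body, checksum = frame[1:4], frame[4]
    calc = 0
    for b in body:
        calc ^= b
    if calc != checksum:
        return None                       # dropped or flipped byte on the wire
    return struct.unpack(">Bh", body)
```

The Jetson side would write `encode(...)` to the serial port (e.g. via pyserial), and a matching decoder on the Arduino would reject corrupt frames instead of executing them, which is one way to keep motor commands from drifting under noisy conditions.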

Challenges we ran into

The primary hurdle was the mechanical demand of the suppression system. Calibrating a servo to exert the significant physical force required to depress a standard fire extinguisher handle—without stripping the gears—required several iterations of lever-arm design. We also had to refine the serial communication between the Jetson Nano and the Arduino to ensure that movement commands were executed with zero "drift" during high-stakes detection.
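The lever-arm iterations above come down to one relation: a rigid arm on the servo shaft delivers a force of F = τ / r at its tip, so a shorter arm trades travel for force. The numbers below (handle force, servo rating, arm lengths) are illustrative assumptions, not measurements from Ember:

```python
def handle_force_n(servo_torque_nm: float, arm_length_m: float) -> float:
    """Force at the tip of a rigid arm mounted on the servo shaft: F = tau / r."""
    return servo_torque_nm / arm_length_m

def can_depress(servo_torque_nm: float, arm_length_m: float,
                required_force_n: float, margin: float = 1.2) -> bool:
    """True if the servo delivers the required handle force with a safety margin."""
    return handle_force_n(servo_torque_nm, arm_length_m) >= required_force_n * margin

# Assumed figures: ~100 N to depress a typical extinguisher handle,
# a 35 kg-cm hobby servo (~3.4 N*m), and candidate arm lengths.
SERVO_TORQUE_NM = 3.4
HANDLE_FORCE_N = 100.0
```

With these numbers, a 2 cm arm yields 170 N (enough, with margin) while a 5 cm arm yields only 68 N, which is one reason shortening the lever arm, rather than swapping servos, can solve a stripped-gear problem.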

Accomplishments that we're proud of

We successfully achieved fully local processing. By running our vision models directly on the Jetson Nano, Ember remains 100% operational even if the building’s Wi-Fi or electrical infrastructure is destroyed by the fire. This independence is a critical feature that ensures the robot works when it is needed most.

What we learned

This project taught us the complexities of Hardware-in-the-Loop (HIL) testing. We gained deep experience in optimizing Computer Vision models for edge computing and learned the importance of mechanical leverage and torque distribution when bridging the gap between digital logic and physical action.

What's next for Ember

Our next phase involves implementing SLAM (Simultaneous Localization and Mapping) to allow Ember to navigate unfamiliar floor plans without pre-programming. We also aim to integrate thermal imaging to detect heat signatures through smoke or walls, allowing Ember to "see" a fire before it even breaks out into open flames.
