The LA Wildfires and Limited Personnel
In the face of devastating wildfires that recently engulfed parts of Los Angeles, over 7,500 firefighting and emergency personnel rushed to battle the inferno. The response force swelled as reinforcements arrived from across the nation - Texas alone contributed more than 130 personnel and 45 engines/ambulances, while Oregon and several other states dispatched their bravest. Even international assistance poured in from three countries.
Yet despite this massive mobilization, we discovered a haunting reality: the sheer scale of such disasters forces emergency teams to make impossible choices. With the majority of firefighters necessarily focused on containing the spread of fires, fewer resources remain for rescue operations. The tragic consequence? People trapped in areas that rescue teams simply cannot reach in time are lost to smoke inhalation and burns.
This sobering realization drove us to ask: What if we could multiply the reach of our rescue teams without dividing their resources? What if we could send scouts into dangerous conditions where human rescuers cannot safely go?
Autonomous Control for Finding Victims Among the Ember
EmberScout is our answer to this critical challenge - an autonomous RC car built specifically for deployment in disaster zones where traditional rescue methods fall short. But what sets it apart isn't just its ability to navigate treacherous terrain autonomously. We designed EmberScout to operate entirely on edge computing, a crucial feature in disaster scenarios where smoke, embers, and infrastructure damage can render cloud-based systems useless.
Our prototype can continue searching for survivors even when network connections fail by running its detection models locally, looking for visible human body parts that can reveal victims partially buried in debris. This edge-first approach means EmberScout doesn't rely on external GPU systems or cloud services that are often unavailable in emergencies such as wildfires, so it stays useful when it's needed most.
How We Built EmberScout
EmberScout's hardware foundation is a custom 3D-printed PRL chassis housing a TreeHacks-provided Jetson Orin Nano. By integrating NanoOwl with our computer vision pipeline, we built an autonomous control system that lets the RC car navigate independently and identify people trapped in rubble, running entirely on the edge so it stays functional in communication-compromised environments. DC brushless motors controlled by an ESP-32 drive the car, while a Next.js interface built with v0 supports both real-time demonstrations and remote teleoperation - crucial for keeping human operators at a safe distance during active disasters. When EmberScout detects a human body part, it treats that as a victim sighting and sends a signal to nearby rescuers to pinpoint the victim's location.
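To give a flavor of the autonomous-control loop described above, here is a minimal sketch of how a detection bounding box could be turned into a steering decision. The function name, the `(x_min, y_min, x_max, y_max)` box format, and the command strings are illustrative assumptions, not EmberScout's actual API.

```python
# Hypothetical sketch: map a detected body part's bounding box to a
# steering command for the ESP-32 motor controller.
# Box format and command strings are assumptions for illustration.

def steer_toward(box, frame_width, dead_zone=0.1):
    """Return 'left', 'right', or 'forward' depending on where the
    detection sits relative to the horizontal center of the frame."""
    x_min, _, x_max, _ = box
    center = (x_min + x_max) / 2.0
    # Normalized offset: -0.5 (far left) .. +0.5 (far right)
    offset = center / frame_width - 0.5
    if offset < -dead_zone:
        return "left"
    if offset > dead_zone:
        return "right"
    return "forward"
```

In a real loop this decision would be re-evaluated on every camera frame and forwarded to the ESP-32, with obstacle-avoidance logic able to override it.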
Challenges and What We Learned
Hardware development at hackathons presents a unique set of challenges, and our experience at TreeHacks drove this point home. First was printing the RC car itself, being able to adjust for the modularity of the 3D print such that we didn't need to run back and forth. In the end, we realized that we needed to bootstrap much of the hardware with tape and hot glue, especially with a design that revolved around our ever-changing electronics.
Power management emerged as our most significant challenge. Fitting sufficient power delivery for robust motors onto a compact RC car base required creative thinking. Our breakthrough was a 5V voltage regulator that powers the Jetson Orin Nano through its GPIO pins (thanks, Sarvesh!). The next was finding a battery powerful enough to drive the motor systems while carrying the weight of the Jetson Orin Nano and the other devices on the car.

Ultimately, we found that the real complexity lies in the interdependence of our software and hardware systems. Building a solution where both elements needed to work in harmony demanded more than technical skill: it required a clear vision at every iteration, without losing sight of our overall goal under the time crunch. While working on parallel development pipelines, we needed not only strong product direction but a comprehensive picture of our complete implementation pipeline before writing a single line of code, ensuring seamless integration between software and hardware components.
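The 5V-regulator decision comes down to a simple budget. As a back-of-the-envelope check (the ~15 W figure is an assumed worst-case draw for the Jetson Orin Nano, which depends on its configured power mode):

```python
# Rough power-budget check for the 5V rail feeding the Jetson's GPIO pins.
# JETSON_POWER_W is an assumed worst-case draw, not a measured value.
JETSON_POWER_W = 15.0   # assumed peak draw of the Jetson Orin Nano
RAIL_VOLTAGE_V = 5.0    # regulator output on the GPIO power pins

required_current_a = JETSON_POWER_W / RAIL_VOLTAGE_V
print(required_current_a)  # 3.0 -- amps the regulator and battery must sustain
```

That ~3 A is on top of the motors' draw, which is why the battery hunt mattered as much as the regulator.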
After that, wiring everything correctly and making sense of hardware we had never worked with from a software background was another hurdle: we parsed through hundreds of pages of unfamiliar documentation on our electronics. Ultimately, we got everything integrated through lots of communication, help from the TreeHacks hardware team, and trial and error.
What's Next
Currently, the implemented features include computer vision for object avoidance, path-finding, and manual remote control once a human is found. EmberScout's detection target can be set to human body parts such as arms or hands, so the car can detect humans partially hidden under rubble. The next step is a response protocol for when the RC car finds a person in need of rescue: we plan to attach a speaker/microphone module to relay conversation with the victim, along with a robust flare/pinpointing system so that rescuers can find them. A thermal camera would further aid in finding people, leading to a more effective disaster response.
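The target-matching step described above can be sketched as a small filter over the vision model's detections. Everything here is an illustrative assumption: the `(label, confidence)` detection format, the label set, and the threshold are stand-ins for whatever the deployed model actually emits.

```python
# Hypothetical sketch: decide whether to trigger the victim-found
# response protocol based on body-part detections.
# Detection format, labels, and threshold are illustrative assumptions.

TARGET_PARTS = {"arm", "hand", "leg", "face"}

def victim_detected(detections, threshold=0.6):
    """Return the first body-part detection at or above the confidence
    threshold as (label, confidence), or None if nothing qualifies."""
    for label, confidence in detections:
        if label in TARGET_PARTS and confidence >= threshold:
            return label, confidence
    return None
```

A planned response protocol would hang off this result: stop the car, activate the flare/pinpointing system, and open the speaker/microphone channel to the victim.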
In conclusion, we'd like to thank the amazing TreeHacks team for being so ready to help us out, the PRL, the amazing sponsors, and all the mentors (especially Sarvesh, who stayed up with us until 5 AM to aid our hacking and solve crucial problems with us). We hope that EmberScout can serve as a rough prototype for AI-on-edge devices for disaster recovery and we will continue to iterate on the idea due to our vision of its potential.
Demo
Built With
- arduino
- c++
- esp
- jetson-nano
- next.js
- nvidia
- openai
- perplexity
- python
- v0
- vercel
- vlm