Inspiration

Our inspiration comes from one of our team members, a first responder who saw firsthand the growing gap between modern technology and the raw, chaotic reality of a disaster zone. They experienced the “Golden Hour” paradox: the crushing weight of knowing that victims are statistically far more likely to survive if reached within the first 60 minutes, while the sheer density of concrete, twisted rebar, and electromagnetic interference can make finding them manually incredibly difficult. Time becomes the boundary between life and death, especially in cases involving active shooters, military search and rescue, and dense urban buildings.

Currently, many search and rescue operations are still a brute-force endeavor. In large-scale structural collapses, first responders can spend critical time clearing areas that are already empty simply because they lack the tools to verify signs of life from a distance. We realized the industry is still failing the people who risk everything to save others.

Our approach is inspired by:

- MIT CSAIL (Professor Dina Katabi): Her groundbreaking research on RF-Pose and wireless sensing demonstrated that radio frequency signals can be used to recover human silhouettes and track vital signs without contact.

- NASA JPL (FINDER Technology): The NASA-supported FINDER showed that the same signal processing principles used to track spacecraft could also detect the millimetric pulse of a human heart beneath rubble.

We want to build a firefighter-ready workflow using commercial mmWave radar, edge intelligence, and a responder-facing interface. The question that drove us was simple: if RF can reveal life through occlusion, debris, and low visibility, why is that capability still mostly trapped in research labs instead of helping responders make faster, better decisions in the moments that matter most?

Using commercially available 60 GHz mmWave hardware, we aim to build a smart building system that can scale far beyond the lab. While MIT and NASA proved the undeniable potential of RF- and radar-based life detection, these systems are often locked behind massive institutional budgets or exist as highly specialized, bulky architectures. The everyday local search-and-rescue team, fire department, or forward-deployed defense unit does not have access to an agile, tactical version of this capability.

That is the gap we are aggressively going after. We are taking the principles behind space-grade radar and state-of-the-art RF human sensing and engineering them into a highly deployable, precise mmWave rescue tool. Because when a building comes down, first responders should not be forced to guess where the heartbeat is.

What it does and hardware capabilities

Our system, the HUB, is a highly integrated edge-computing unit designed to bridge the gap between complex RF physics and actionable field intelligence. Here is the technical breakdown of the architecture:

The Sensing Core: At the heart of the device is the IWR6843 mmWave sensor. It operates in the 60 GHz spectrum, sending out chirps that penetrate debris to detect micro-movements, specifically the sub-millimeter chest displacements of a breathing human.
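As a sketch of how that chest micro-motion becomes a breathing-rate estimate, the snippet below runs an FFT over the radar phase signal and picks the spectral peak in the respiration band. This is a minimal illustration under our own assumptions (sample rate, function names, synthetic data), not the IWR6843's actual processing chain.

```python
import numpy as np

def breathing_rate_bpm(phase, fs):
    """Estimate breathing rate from radar phase samples (chest displacement).

    phase: 1-D array of phase values in radians; fs: sample rate in Hz.
    """
    phase = np.unwrap(phase)
    phase = phase - phase.mean()                  # remove the DC offset
    spectrum = np.abs(np.fft.rfft(phase))
    freqs = np.fft.rfftfreq(len(phase), d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.5)        # ~6-30 breaths per minute
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic chest motion: 0.25 Hz breathing (15 bpm) plus sensor noise
fs = 20.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
phase = 0.8 * np.sin(2 * np.pi * 0.25 * t) + 0.05 * rng.standard_normal(len(t))
print(breathing_rate_bpm(phase, fs))  # → 15.0
```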

Precision Scanning: To achieve a full field of view in static environments, the sensing core is mounted on a stepper motor. This allows the HUB to perform a mechanical sweep, creating a high-resolution, 360-degree radial map of surrounding vital signs without moving the entire unit.
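The sweep can be pictured as a loop that advances the motor one step at a time and tags each radar read with its azimuth. This is a hypothetical sketch: the callables `radar_read` and `step_motor` stand in for the real drivers, and the step count is an assumed typical value.

```python
STEPS_PER_REV = 200  # assumes a typical 1.8-degree-per-step motor

def sweep(radar_read, step_motor, steps=STEPS_PER_REV):
    """One mechanical sweep: read the radar at each step and tag every
    detection with its azimuth, building a 360-degree radial map."""
    radial_map = {}
    for step in range(steps):
        azimuth_deg = step * 360.0 / steps
        detections = radar_read()      # e.g. list of (range_m, snr) tuples
        if detections:
            radial_map[azimuth_deg] = detections
        step_motor(1)                  # advance the motor by one step
    return radial_map
```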

Edge Processing & Orientation: Data from the mmWave sensor is streamed via UART to an ESP32-S3. This microcontroller acts as the brain, fusing the radar data with directional heading from a QMC5883L digital compass (connected via I2C). This ensures that every "hit" is geographically anchored, allowing rescuers to know exactly which direction to dig.
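The geographic anchoring itself reduces to adding the sweep angle (measured relative to the HUB body) to the compass heading, modulo 360. A one-line sketch with illustrative names:

```python
def absolute_bearing(sweep_azimuth_deg, compass_heading_deg):
    """Combine the stepper sweep angle (relative to the HUB) with the
    QMC5883L magnetic heading to get a world-referenced bearing."""
    return (compass_heading_deg + sweep_azimuth_deg) % 360.0

print(absolute_bearing(270.0, 100.0))  # → 10.0
```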

The Tactical Link: The system communicates with the responder's handheld device through a dual-protocol link:

BLE (Bluetooth Low Energy): Streams the processed "vitals" and orientation data to the mobile interface.

UWB (Ultra-Wideband): Utilizing the DWM3001C, the system provides precise indoor localization and ranging, allowing the responder to track their own position relative to the HUB and the detected victims with centimeter-level accuracy.

We also trained a neural network that extracts multi-person 3D poses from RF signals. The model is trained using labeled examples from a camera system; once training is complete, it can infer 3D poses from RF signals alone.

A neural network is trained to minimize a loss function that captures the difference between the network's current output and the desired output on labeled examples. The result is a rugged, localized network that turns "blind" searching into a precision operation.
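That training loop can be illustrated with a toy example: gradient descent on a mean-squared-error loss between a model's output and labeled targets. A one-layer linear "network" stands in for the real RF-pose model, and all the data here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((256, 4))       # stand-in "RF features"
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w                          # labels (from the "camera system")

w = np.zeros(4)                         # network weights, zero-initialized
lr = 0.1
for _ in range(200):
    pred = X @ w
    grad = 2.0 * X.T @ (pred - y) / len(y)   # gradient of the MSE loss
    w -= lr * grad                           # step toward lower loss
print(np.round(w, 3))                   # converges toward true_w
```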

Sensor fusion and tracking

The TI IWR6843 uses mmWave radar to detect people, track their position, and capture micro-motion data related to vital signs. That data is streamed over UART to the ESP32-S3, where it is preprocessed and prepared for the rest of the system. The ESP32-S3 then connects with the DWM3001C over BLE, combining radar-based detection with UWB localization for more accurate spatial awareness. The final fused data is sent to the iPhone, where responders can view tracking information in a simple, usable interface.
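One way to picture the fused payload that reaches the iPhone: a radar detection, the compass heading, and the UWB range merged into a single message. The field names below are illustrative, not our actual BLE wire format.

```python
import json

def fuse(radar_hit, uwb_range_m, heading_deg):
    """Merge one radar detection with the compass heading and the UWB
    range to the responder into a single message for the mobile app."""
    return json.dumps({
        "bearing_deg": (heading_deg + radar_hit["azimuth_deg"]) % 360.0,
        "target_range_m": radar_hit["range_m"],
        "breathing_bpm": radar_hit.get("bpm"),
        "responder_range_m": uwb_range_m,
    })

msg = fuse({"azimuth_deg": 90.0, "range_m": 4.2, "bpm": 14}, 1.7, 300.0)
print(msg)
```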

Architecture


Challenges we ran into

A major challenge was that this was a fairly closed hardware system with very limited guidance on how to make sense of its output. Our first hurdle was simply understanding what the sensor data even meant. The other big issue was noise: the data was messy, inconsistent, and full of clutter, so getting to a signal we could actually trust took far more filtering and debugging than we expected.

What makes it the next startup

The global Search and Rescue (SAR) equipment market is already a $139 billion powerhouse, and it is not slowing down: it is projected to surge to $180 billion by 2034. Our mmWave system sits at the exact intersection where Anduril and Palantir have proven there is massive alpha: rugged, AI-enhanced sensing. Safety standards are also shifting from reactive to proactive. Just as smoke detectors became mandatory in the 20th century, life-detection infrastructure is the next frontier for smart buildings. Most importantly, reaching a victim within the first hour increases survival rates by over 80%.

What we learned

We learned that mmWave is powerful, but not magic. Making it useful meant understanding how waves reflect, how clutter and noise affect detection, and how much filtering is needed before raw data becomes something reliable.
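One of the simplest filters that helped us reason about clutter is static-clutter removal: subtracting each range bin's mean across frames so stationary reflectors cancel out. A minimal sketch with made-up data, not the exact pipeline we shipped:

```python
import numpy as np

def remove_static_clutter(frames):
    """Subtract each range bin's mean across frames so stationary
    reflectors (walls, rubble) cancel and only moving targets remain."""
    frames = np.asarray(frames, dtype=float)
    return frames - frames.mean(axis=0, keepdims=True)

# Bin 0 is a static wall; bin 1 has a slowly moving (breathing) target
frames = [[5.0, 5.0], [5.0, 7.0], [5.0, 9.0]]
print(remove_static_clutter(frames))
```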

We also learned that communication and integration matter just as much as sensing. Getting radar, embedded hardware, and software to work together meant dealing with parsing, timing, and clean data flow across the whole system.

Most of all, we learned that building something like this is deeply interdisciplinary. It takes RF understanding, embedded systems, signal filtering, and software integration all working together to turn a closed piece of hardware into a system that can actually help someone.

What's next for RescueVision

We have already spoken with first responders who made it clear this is a real and urgent need. The next step is to move beyond off-the-shelf hardware and build our own system designed for the realities of emergency response. From there, we want to test it in more realistic environments, validate it directly with our customers, and keep iterating until it becomes something responders can genuinely trust in the field.

Built With

  • dwm3001
  • edgeai
  • esp32
  • fft
  • finetuning
  • ios
  • iwr6843aopevm
  • mmwave
  • neuralnetwork
  • python
  • rf
  • sensorfusion
  • signalprocessing
  • tensorflow