Inspiration
Hurricane Milton, one of the most devastating storms in over 30 years, left more than 3 million people without power and overwhelmed the emergency services scrambling to respond. While drones now generate vast amounts of data, the real challenge is making sense of it in rapidly evolving environments: operators are still manually sifting through critical information while time runs out. In moments of chaos, the ability to scale autonomous search-and-rescue missions and intelligently uncover patterns in data becomes essential.
SkySearch helps operators surface insights buried in vast seas of data by integrating real-time environmental data, drone telemetry, and previous missions into a single pane of glass: a software mission control system. Operators can deploy fleets of drones to investigate regions and collect video feeds used to reconstruct the scene, then drill down on areas of interest through a semantic search engine.
What it does
Our goal is to enable operators to interact with data and uncover hidden patterns effortlessly.
SkySearch is built around the end-to-end search-and-rescue workflow in the following use cases:
- Search: Drones are deployed through the software by operators and autonomously navigate through terrain to identify objects of interest in real-time.
- Rescue: Operators can interact with live data to isolate hazards and locate people through a unified search interface. Based on this data, the system then recommends risk-aware, optimized rescue routes for first responders.
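The "risk-aware, optimized rescue routes" above can be sketched as a shortest-path search that minimizes accumulated risk instead of distance. This is a minimal illustration, not the project's actual implementation: the grid, risk scores, and the `safest_route` helper are all assumptions standing in for real hazard data.

```python
import heapq

def safest_route(risk, start, goal):
    """Dijkstra over a grid where each cell carries a risk score.

    `risk` is a 2D list of non-negative floats (higher = more dangerous);
    the route minimizes total accumulated risk rather than distance.
    """
    rows, cols = len(risk), len(risk[0])
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + risk[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk predecessors back from goal to reconstruct the route
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Toy example: the flooded center cell (risk 9.0) gets routed around
grid = [
    [0.1, 0.1, 0.1],
    [0.1, 9.0, 0.1],
    [0.1, 0.1, 0.1],
]
route = safest_route(grid, (0, 0), (2, 2))
```

In a real deployment the per-cell risk would be derived from the reconstructed scene and detected hazards rather than hard-coded.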
Core features
- Environment reconstruction of damaged regions and infrastructure with Gaussian splatting
- Risk-aware pathfinding for rescue operations
- Semantic Search through disparate data sources to uncover patterns and recommend actions
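At its core, semantic search ranks records by the similarity of their embeddings to a query embedding. The sketch below shows that ranking step only, with tiny hand-written 3-d vectors standing in for real embedding-model output; the record names and `semantic_search` helper are illustrative assumptions, not SkySearch's actual code.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def semantic_search(query_vec, records, top_k=3):
    """Return the ids of the `top_k` records most similar to the query.

    `records` is a list of (id, embedding) pairs; embeddings would come
    from an embedding model in practice.
    """
    ranked = sorted(records, key=lambda r: cosine(query_vec, r[1]), reverse=True)
    return [rid for rid, _ in ranked[:top_k]]

# Toy corpus of detections with stand-in embeddings
records = [
    ("collapsed-bridge", [0.9, 0.1, 0.0]),
    ("stranded-group",   [0.1, 0.9, 0.1]),
    ("downed-lines",     [0.8, 0.2, 0.1]),
]
hits = semantic_search([1.0, 0.0, 0.0], records, top_k=2)
```

A production system would store the embeddings in the database and let it perform the nearest-neighbor ranking instead of scanning in application code.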
How we built it
We designed an embedded architecture that lets the software and hardware interfaces exchange data and commands bidirectionally.
- Drone SDK used for live video streaming
- TP-Link antennas forming a local Wi-Fi network for a more robust data pipeline between the drone and the software interface, rather than relying on satellite or public Wi-Fi links
- OpenCV and Apple Depth Pro used to process footage, estimate depth, and classify objects of interest
- SingleStore for real-time database management
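One way the pieces above connect: a monocular depth estimate plus the drone's heading can turn an on-frame detection into an approximate map coordinate for the database. This is a hedged sketch under simplifying assumptions (flat-earth offset math, a fixed meters-per-degree constant, and a hypothetical `detection_to_record` helper); it is not the project's actual pipeline code.

```python
import math

def detection_to_record(det, drone_lat, drone_lon, meters_per_deg=111_320):
    """Convert a detection with estimated depth and bearing into a telemetry row.

    `det` carries a class label, a confidence score, a depth in meters
    (e.g. from a monocular depth model), and a bearing in degrees
    clockwise from north. Uses a flat-earth approximation, which is
    adequate at drone-scale distances.
    """
    rad = math.radians(det["bearing_deg"])
    dlat = det["depth_m"] * math.cos(rad) / meters_per_deg
    dlon = det["depth_m"] * math.sin(rad) / (
        meters_per_deg * math.cos(math.radians(drone_lat))
    )
    return {
        "label": det["label"],
        "confidence": det["confidence"],
        "lat": drone_lat + dlat,
        "lon": drone_lon + dlon,
    }

# A person spotted ~111 m due north of the drone's position
rec = detection_to_record(
    {"label": "person", "confidence": 0.9, "depth_m": 111.32, "bearing_deg": 0.0},
    drone_lat=27.0, drone_lon=-82.5,
)
```

Each such record could then be inserted into the real-time database so the search interface can query detections by location and label.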
Challenges we ran into
- Handling drones running low on battery mid-mission
- Integration between hardware and software interfaces
- Balancing human judgment with autonomy
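The low-battery challenge above typically comes down to a return-to-home failsafe: trigger the return leg while the projected battery on arrival still clears a safety reserve. The function below is a minimal sketch of that check; the parameter names and the reserve threshold are assumptions, not values from the project.

```python
def should_return_home(battery_pct, dist_home_m, speed_mps,
                       drain_pct_per_min, reserve_pct=15.0):
    """Return True once remaining battery barely covers the trip home.

    Estimates the minutes needed to fly `dist_home_m` at `speed_mps`,
    projects the battery level on arrival, and triggers the failsafe
    when that projection dips below the reserve margin.
    """
    minutes_home = dist_home_m / speed_mps / 60.0
    projected_on_arrival = battery_pct - minutes_home * drain_pct_per_min
    return projected_on_arrival < reserve_pct
```

In practice the drain rate would be estimated from live telemetry rather than passed as a constant, and the check would run on every telemetry tick.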
Accomplishments that we're proud of
- Implemented an autonomous swarming framework to detect objects of interest
- Integrated Gaussian splatting for scene reconstruction
- Risk-aware map traversal and recommended "safe routes" for emergency responders
- Dynamic data generation: generating and querying data on the fly enabled efficient testing and analysis, improving the app's responsiveness and visibility into critical information during rescue missions
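A common building block for a swarming search framework like the one above is dividing the search area among the available drones so the swarm covers it without overlap. The sketch below shows the simplest such scheme, equal-width vertical strips; the `partition_search_area` helper and rectangular-bounds representation are illustrative assumptions, not the project's actual coordination logic.

```python
def partition_search_area(bounds, n_drones):
    """Split a rectangular search area into equal-width vertical strips.

    `bounds` is ((min_x, min_y), (max_x, max_y)); returns one
    sub-rectangle per drone so each can sweep its strip independently.
    """
    (min_x, min_y), (max_x, max_y) = bounds
    strip_width = (max_x - min_x) / n_drones
    return [
        ((min_x + i * strip_width, min_y),
         (min_x + (i + 1) * strip_width, max_y))
        for i in range(n_drones)
    ]

# Three drones splitting a 90 x 30 search area
strips = partition_search_area(((0, 0), (90, 30)), 3)
```

Real swarms usually refine this with lawnmower sweep patterns inside each strip and dynamic reassignment when a drone drops out.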
Built With
- amazon-web-services
- depthpro
- fastapi
- nextjs
- opencv
- react
- singlestore
- tello
- ultralytics