Inspiration

Climate change has made natural disasters like wildfires and floods more frequent and severe. While satellites constantly orbit our planet collecting data, there is often a critical delay between orbital detection and ground response. We realized that in emergency situations, seconds save lives.

We wanted to bridge the gap between space technology and humanitarian aid. Inspired by the "ORBIT" theme, we asked ourselves: How can we use the vantage point of space to protect people on Earth? OrbitEye was born from the idea of a "Planetary Dashboard"—a tool that turns complex satellite telemetry into actionable insights for first responders.

What it does

OrbitEye is a real-time disaster response dashboard designed to visualize anomalies detected from Low Earth Orbit (LEO).

- Live Event Tracking: plots different types of disasters (wildfires in red, floods in blue) on an interactive global map.
- Command Center UI: a futuristic, dark-mode "heads-up display" (HUD) that simulates a satellite control room, complete with live scanning animations.
- Instant Dispatch: click any anomaly marker to view its satellite telemetry (coordinates, confidence level, satellite ID) and simulate dispatching a ground team to that location.
- Data Visualization: a sidebar tracks active satellites and total anomalies in real time, giving commanders a bird's-eye view of the crisis.
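The event-tracking behavior above can be sketched as a small mapping from anomaly records to marker styles. This is a minimal sketch, not the project's actual code; the field names `type`, `lat`, `lng`, and `confidence`, and the specific hex colors, are assumptions for illustration:

```javascript
// Map each anomaly type to the marker color used on the dashboard.
const TYPE_COLORS = {
  wildfire: '#e74c3c', // red
  flood: '#3498db',    // blue
};

// Turn a raw anomaly record into a marker spec ready for plotting.
function toMarkerSpec(anomaly) {
  return {
    latlng: [anomaly.lat, anomaly.lng],
    color: TYPE_COLORS[anomaly.type] || '#f1c40f', // yellow fallback for unknown types
    popup: `${anomaly.type.toUpperCase()} | confidence ${anomaly.confidence}%`,
  };
}

// In the browser, each spec would become a Leaflet circle marker, e.g.:
// L.circleMarker(spec.latlng, { color: spec.color }).bindPopup(spec.popup).addTo(map);
```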

How we built it

We focused on creating a lightweight, high-performance web application accessible from any browser.

- Frontend: HTML5, CSS3, and vanilla JavaScript for the core structure.
- Mapping Engine: Leaflet.js with a dark-matter tile layer to render the map and handle geospatial coordinates.
- Design: glassmorphism CSS techniques and Font Awesome icons for the translucent, sci-fi aesthetic suited to space tech.
- AI Tools Disclosure: to accelerate development within the hackathon timeframe, we used generative AI (ChatGPT/Claude) to help debug complex CSS animations for the "radar scanner" effect and to generate the mock JSON dataset that simulates incoming satellite signals. The core logic and design implementation were handled by the team.
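A minimal sketch of the map setup described above. It assumes Leaflet is loaded as the global `L` and the page contains a `<div id="map">`; the tile URL is CARTO's public dark-matter basemap, which is one common choice for this look:

```javascript
// CARTO's dark-matter tile layer, used for the "command center" look.
const DARK_TILES = 'https://{s}.basemaps.cartocdn.com/dark_all/{z}/{x}/{y}{r}.png';

// Default view: whole-world overview centered near the equator.
const MAP_OPTIONS = { center: [20, 0], zoom: 2, worldCopyJump: true };

function initMap() {
  const map = L.map('map', MAP_OPTIONS);
  L.tileLayer(DARK_TILES, {
    attribution: '&copy; OpenStreetMap contributors &copy; CARTO',
    maxZoom: 19,
  }).addTo(map);
  return map;
}
```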

Challenges we ran into

- Map Integration: getting the Leaflet map to render correctly with the custom dark theme took some trial and error.
- Responsive Design: making the "glass" sidebar look good on different screen sizes without covering the important map data was a CSS challenge.
- Simulating Realism: we didn't have access to a real-time satellite feed, so we had to write a script that generates realistic-looking dummy data at logical coordinates (e.g., placing wildfires in forests rather than the ocean).
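The "logical coordinates" idea can be sketched by constraining random points to rough bounding boxes where each disaster type plausibly occurs. The region boxes below are illustrative values, not the project's actual data:

```javascript
// Rough land bounding boxes per disaster type (illustrative, not exhaustive).
const REGIONS = {
  wildfire: [
    { name: 'California', latMin: 33, latMax: 41, lngMin: -124, lngMax: -116 },
    { name: 'Australia',  latMin: -38, latMax: -20, lngMin: 140, lngMax: 153 },
  ],
  flood: [
    { name: 'Bangladesh', latMin: 22, latMax: 26, lngMin: 89, lngMax: 92 },
  ],
};

function randomBetween(min, max) {
  return min + Math.random() * (max - min);
}

// Generate one fake anomaly of the given type inside a plausible region,
// so wildfires land on forested continents instead of the open ocean.
function fakeAnomaly(type) {
  const regions = REGIONS[type];
  const r = regions[Math.floor(Math.random() * regions.length)];
  return {
    type,
    lat: randomBetween(r.latMin, r.latMax),
    lng: randomBetween(r.lngMin, r.lngMax),
    confidence: Math.round(randomBetween(70, 99)),
    satelliteId: 'SAT-' + Math.floor(randomBetween(100, 999)),
  };
}
```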

Accomplishments that we're proud of

- The Aesthetic: we are really proud of the UI; it genuinely looks like software you would see in a sci-fi movie or a SpaceX control room.
- Interactivity: the smooth transition when zooming into a marker feels very professional.
- MVP Completion: we successfully built a functioning prototype that demonstrates the core concept clearly within the tight deadline.

What we learned

- Geospatial Data: we learned how latitude and longitude coordinate systems work in web development.
- UI/UX for Emergencies: for disaster apps, clarity is key; the design can be cool, but the data must be instantly readable.
- Working with Libraries: we improved our skills in reading documentation for third-party libraries like Leaflet.js.

What's next for OrbitEye: Satellite Disaster Response

- Real NASA API Integration: replace the simulated data with the NASA EONET (Earth Observatory Natural Event Tracker) API to show actual real-time events.
- Computer Vision: implement an AI model that scans raw satellite images to automatically detect smoke plumes or rising water levels.
- Mobile App: convert the web dashboard into a React Native mobile app so field teams can carry OrbitEye in their pockets.
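The EONET integration could look roughly like the sketch below. The v3 events endpoint and its `category`, `status`, and `limit` parameters are public and require no API key; the response field names (`events`, `categories`, `geometry`) reflect our reading of the v3 format and should be verified against the official docs:

```javascript
// Base endpoint for NASA's EONET v3 events feed (no API key required).
const EONET_BASE = 'https://eonet.gsfc.nasa.gov/api/v3/events';

// Build a query URL; category and limit are optional filters.
function eonetUrl({ category, status = 'open', limit = 50 } = {}) {
  const params = new URLSearchParams({ status, limit: String(limit) });
  if (category) params.set('category', category);
  return `${EONET_BASE}?${params}`;
}

// Fetch live events and reduce each one to the fields the dashboard needs.
async function fetchEvents(options) {
  const res = await fetch(eonetUrl(options));
  const data = await res.json();
  return data.events.map(ev => ({
    id: ev.id,
    title: ev.title,
    category: ev.categories[0]?.title,
    // EONET geometries are GeoJSON-style [lng, lat] pairs.
    coords: ev.geometry[0]?.coordinates,
  }));
}
```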

Built With

- css3
- fontawesome
- html5
- javascript
- leaflet.js
