Inspiration

I come from a fisherman's family. Growing up, the sea wasn't just scenery — it was life itself. I watched coastal communities depend entirely on the ocean, and grew a deep love for the marine world and all its creatures.

When I learned that over 50% of the world's coral reefs have been lost, I felt I had to act. Coral reefs have existed for millions of years — they are the most biodiverse ecosystems on Earth, the nurseries where fish lay their eggs, the foundation of the entire ocean food chain. Their loss isn't just an environmental statistic. For fishing families like mine, dead reefs mean dead livelihoods.

The people who depend on healthy reefs have no tools to monitor or protect them. ReefGuard AI was built to change that.

What it does

ReefGuard AI is a web application that lets anyone — scientist, diver, or coastal fisherman — upload a coral reef photo and instantly receive an AI-powered health diagnosis.

The AI analyzes color, texture, and structural integrity to classify the coral as Healthy, Bleached, or Dead, then delivers a health score out of 10, a scientific explanation of the causes, and a tiered set of conservation recommendations — from immediate local action to broader policy advocacy.

No expertise required. Just a photo, and the ocean's story told back to you in seconds.

How we built it

ReefGuard AI is built on a lightweight but powerful stack: Python + Flask for the web server, serving a fully animated, ocean-themed front-end in HTML/CSS/JavaScript.

The core intelligence comes from Google Gemma 3 12B, a vision-capable model accessed via the Featherless.ai API. User images are converted to base64 and sent to the model with a structured prompt that asks it to evaluate coral color, texture, and structure — returning a diagnosis, score, root cause analysis, and actionable steps.
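The request flow described above can be sketched roughly as follows. This is an illustrative outline, not the project's actual source: it assumes Featherless.ai exposes an OpenAI-compatible chat completions endpoint, and the endpoint URL, model ID, and prompt wording shown here are placeholders. It uses only the standard library so the payload-building step stays easy to test; in the real app a Flask route would wrap `diagnose()` around the uploaded file's bytes.

```python
import base64
import json
import os
import urllib.request

# Assumed OpenAI-compatible endpoint and hypothetical model ID —
# the real values depend on the Featherless.ai account and catalog.
API_URL = "https://api.featherless.ai/v1/chat/completions"
MODEL = "google/gemma-3-12b-it"

# Condensed stand-in for the structured prompt described in the text.
PROMPT = (
    "Evaluate this coral photo's color, texture, and structural integrity. "
    "Classify it as Healthy, Bleached, or Dead, give a health score out of 10, "
    "explain the likely causes, and list tiered conservation recommendations."
)


def build_payload(image_bytes: bytes, mime: str = "image/jpeg") -> dict:
    """Base64-encode the image and wrap it in an OpenAI-style vision message."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": MODEL,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": PROMPT},
                {"type": "image_url",
                 "image_url": {"url": f"data:{mime};base64,{b64}"}},
            ],
        }],
    }


def diagnose(image_bytes: bytes) -> str:
    """POST the encoded image to the model and return its diagnosis text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(image_bytes)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Keeping `build_payload()` separate from the network call makes the base64 encoding and message structure testable without hitting the API.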

Built entirely from scratch as a first development project.

Challenges we ran into

Our biggest challenge was finding the right vision-capable model compatible with our API provider. Many models don't accept image input, and testing took significant time.

As a first-time developer, building a complete web application — server, front-end, API integration, error handling — was a steep learning curve tackled in parallel with the AI work.

We also discovered that many widely-used coral reef datasets contain mislabeled images. Interestingly, our AI was often more accurate than the dataset labels, correctly identifying pale, washed-out coral as bleached when labels said healthy. That was a proud moment, and a reminder that ground truth is harder to establish than it sounds.

Accomplishments that we're proud of

We built a fully functional AI-powered web application from scratch — no prior experience in Python, Flask, API integration, or front-end development.

The AI delivers genuinely impressive output: a medical-style diagnosis with scientific visual reasoning, multi-factor root cause analysis, and five levels of conservation action — from local community response to international policy. This is exactly the kind of tool that didn't exist for the people who need it most.

Most of all, we're proud that technology we built reflects something real. The ocean isn't an abstract cause to us — it's home.

What we learned

We learned Python, Flask, API integration, HTML/CSS, JavaScript, and how vision AI models process and interpret images — all from scratch in the span of this hackathon.

But the deeper lesson was this: technology is a powerful tool for protecting the natural world, and you don't need to be an expert to wield it. If you care enough, you'll learn what you need to.

What's next for ReefGuard AI

We want to put ReefGuard AI into the hands of the people who need it most: coastal fishing communities, marine biology researchers, conservation NGOs, and environmental agencies monitoring reef systems over time.

Our next major feature integrates Google Maps to turn every diagnosis into a geo-tagged data point. When a user uploads a photo, they pin its location on a live map — health status, score, causes, and recommended actions attach to that pin. Over time, this builds a crowd-sourced global reef health atlas, where fishermen in the Philippines and researchers in Australia contribute to the same real-time picture of the ocean's health.
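The record attached to each map pin might look something like the sketch below. Every field name here is illustrative — the actual schema is still being designed — but it shows how a single diagnosis becomes one geo-tagged entry in the atlas.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ReefReport:
    """One geo-tagged diagnosis pin on the reef health map (fields are illustrative)."""
    latitude: float
    longitude: float
    status: str                 # "Healthy", "Bleached", or "Dead"
    score: int                  # health score out of 10
    causes: list                # root-cause analysis from the model
    recommendations: list       # tiered conservation actions
    reported_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
```

Timestamping each report is what would let the atlas show reef health changing over time, not just a snapshot.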

Beyond visual diagnosis, we are developing a low-cost acoustic reef monitoring device — a waterproof hydrophone that a fisherman can drop off the side of their boat. Healthy reefs are extraordinarily loud: the snapping of pistol shrimp, the grazing of parrotfish, hundreds of fish species calling. Bleached and dead reefs fall silent. Our AI will analyze the acoustic signature of the recording and classify reef health — no diving required, no expertise needed. This is grounded in established reef bioacoustics research and makes a proven scientific method accessible to anyone on the water.

Ultimately, we believe ReefGuard AI — visual diagnosis today, acoustic monitoring and live mapping tomorrow — could become the early-warning system the ocean doesn't currently have.
