Inspiration
Coral reefs are among the most important ecosystems in the world, but they are increasingly threatened by bleaching, disease, and environmental stress. We wanted to build something that could make reef health monitoring more accessible, especially in situations where researchers, conservation groups, or divers need a fast way to assess coral conditions from photos. Coralink was inspired by the idea of creating a practical tool that turns coral images into usable health data.
What it does
Coralink is a reef health monitoring platform that allows users to upload photos of coral reefs and receive an ML-based health assessment. The model analyzes each image and classifies the coral's condition as healthy, bleached, or diseased. After analysis, the results are displayed in a dashboard that tracks where each image was taken and records the reef's health status over time. This makes it easier to visualize reef conditions across locations and spot patterns in coral health.
How we built it
We built Coralink as a combination of image analysis and a web-based dashboard. Users can upload coral reef photos, which are then passed through a machine learning pipeline for classification. The prediction results are stored and surfaced through a dashboard interface that highlights scan activity, reef condition summaries, and map-based location tracking. The platform was designed to make the data easy to interpret, not just for technical users but also for people focused on conservation and field monitoring.
The backend runs a MobileNetV2 model trained via transfer learning on the PlantVillage coral dataset, served through a Flask API. The frontend is a React dashboard built with Tailwind CSS, Leaflet for mapping, and Recharts for data visualization.
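To make the serving side concrete, here is a minimal sketch of what a Flask classification endpoint like ours could look like. The model filename, route path, and class-label order are illustrative assumptions, not the exact implementation; the model is loaded lazily so the app can start without TensorFlow present.

```python
# Sketch of a Flask endpoint serving a Keras MobileNetV2 classifier.
# Assumptions (hypothetical): model saved as coralink_mobilenetv2.h5,
# three softmax outputs in the order healthy / bleached / diseased.
import io

import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)

CLASS_NAMES = ["healthy", "bleached", "diseased"]  # assumed label order
IMG_SIZE = (224, 224)  # MobileNetV2's default input resolution
_model = None  # loaded lazily on first request


def load_model():
    global _model
    if _model is None:
        from tensorflow.keras.models import load_model as keras_load
        _model = keras_load("coralink_mobilenetv2.h5")  # hypothetical path
    return _model


def format_result(probs):
    """Turn a softmax probability vector into a (label, confidence) pair."""
    idx = int(np.argmax(probs))
    return CLASS_NAMES[idx], float(probs[idx])


@app.route("/api/classify", methods=["POST"])
def classify():
    from PIL import Image

    file = request.files["image"]
    img = Image.open(io.BytesIO(file.read())).convert("RGB").resize(IMG_SIZE)
    x = np.asarray(img, dtype="float32")[None] / 255.0  # batch of one, in [0, 1]
    probs = load_model().predict(x)[0]
    label, confidence = format_result(probs)
    return jsonify({"status": label, "confidence": confidence})
```

The React dashboard would then POST an uploaded photo to this endpoint and render the returned label and confidence alongside the photo's location on the Leaflet map.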
Challenges we ran into
One of our biggest challenges was the hardware component. At first, we wanted Coralink to include an ESP32-CAM setup so divers could take low-resolution underwater photos directly in the field. In practice, we ran into reliability issues with the setup, and we were also missing a necessary component, which made it difficult to get the hardware working in time. Because of that, we set the hardware aside for now and focused on building a strong software prototype instead.
Another challenge was thinking through the real-world limitations of coral image analysis, especially since underwater conditions can affect lighting, clarity, and color. That made us think more carefully about how much model accuracy depends on image quality and how much future fine-tuning will be needed.
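One common way to reduce the blue-green color cast of underwater photos is gray-world white balancing, which rescales each color channel so its mean matches the image's overall mean. This is not part of the current Coralink pipeline, just a sketch of the kind of preprocessing future fine-tuning could pair with:

```python
# Illustrative gray-world white balance, one simple correction for the
# color cast underwater lighting introduces. Not the actual Coralink
# preprocessing step, just a sketch of the idea.
import numpy as np


def gray_world_balance(img):
    """Rescale each RGB channel so its mean matches the overall mean.

    img: float array of shape (H, W, 3) with values in [0, 1].
    Returns a corrected array clipped back to [0, 1].
    """
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gray = channel_means.mean()
    scale = gray / np.maximum(channel_means, 1e-6)  # avoid divide-by-zero
    return np.clip(img * scale, 0.0, 1.0)
```

Applied before classification, a correction like this would make the model less sensitive to the strong blue tint that varies with depth and water clarity, though real robustness would still depend on fine-tuning with representative underwater imagery.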
Accomplishments that we're proud of
We’re proud that we were able to turn the core idea into a working prototype. Even after pivoting away from the hardware portion, we still built a system that lets users upload reef images, get a health classification, and view the results in a dashboard with location-based tracking. We’re also proud of creating something that feels practical and mission-driven, not just a demo, but the start of a tool that could be useful for reef monitoring and conservation efforts.
What we learned
We learned that building an ML-powered product is not just about training a model. It also requires thinking about the full workflow: image collection, input quality, prediction reliability, and how results are presented to users. We also learned how quickly hardware constraints can affect scope during a hackathon, and how important it is to adapt fast without losing the core value of the project. Most importantly, we learned how to balance ambition with execution.
What's next for Coralink
Next, we want to improve the accuracy and reliability of the model by fine-tuning it on better coral reef datasets and making it more robust to real underwater conditions. We also want to add user accounts so individuals and organizations can store and manage their own scan history. In the long term, we want Coralink to be accessible globally, so reef health data can be collected and monitored across different regions. We would also like to revisit the hardware component in the future and bring back a more reliable underwater capture workflow.
Built With
- flask
- keras
- leaflet.js
- mobilenetv2
- python
- react
- recharts
- tensorflow