L.I.O.N. - Live Invasive-species Observation Network

Helping people see reefs in time: not after the outbreak, after the collapse, after the silence. In time.

Abstract

L.I.O.N. is a hybrid reef-monitoring platform for invasive and bioindicator species detection and ecological observation. It allows users to upload underwater media (images and video), route those inputs through specialized detection pipelines, and review annotated outputs directly in the browser. L.I.O.N. is a system for reading reef stress through invasive species, coral predators, fish and invertebrate indicators, and megafauna presence.

From a technical standpoint, L.I.O.N. combines a Next.js monitoring surface, a detector-routing API gateway, hosted Roboflow inference for fast deployment-friendly workflows, a remote FastAPI service for heavier YOLO models, an interactive species map sourced from federal observation data, and model-report pages for evaluation transparency. The system currently supports three detector lanes (Lionfish Watch, Crown of Thorns, and a broader Reef Health Suite), each backed by a different inference strategy suited to its computational demands. Every detection run produces structured, exportable outputs: annotated overlays, bounding boxes, species labels, confidence scores, frame-level JSON metadata, and run manifests.

We seek to be at the forefront of computational marine ecology.

Inspiration

A reef in decline does not make noise. The change arrives quietly, over weeks, in ways that a single dive cannot confirm. Herbivorous fish thin out. Algae begins to fur the coral in places it had never reached before, a faint green film at first, then thicker, then suffocating. Juvenile fish vanish from the crevices where they once sheltered. The water column, which on a healthy reef pulses with small bodies and flickers of reflected light, grows emptier. None of this announces itself. Reefs are patient structures. They have been building themselves, polyp by polyp, for thousands of years. They do not die loudly.

Coral reefs are some of the most biodiverse ecosystems on Earth. Often called the "rainforests of the sea," they occupy less than one percent of the ocean floor yet sustain roughly a quarter of all known marine species. A healthy reef is habitat, nursery, shelter, food web, and infrastructure all at once. When reefs decline, the damage does not remain local. It moves outward through fisheries, coastal protection, tourism economies, and the broader marine ecosystem. Yet by many scientific estimates, nearly half of the world's coral reefs have already been lost over the past century, and those that remain face mounting pressure from ocean warming, acidification, pollution, overfishing, and disease. Each of these alone is serious. Together, they create conditions under which even small additional disturbances can tip a reef from stressed to collapsing. Our team investigated some of the primary contributors to coral degradation and found that invasive species like lionfish are accelerating reef decline by disrupting the delicate ecological balance that coral depends on to survive. They are one more disturbance on reefs that can afford very few, and they are far from small.

The Lionfish Problem

Pterois volitans and Pterois miles, native to the Indo-Pacific, were first documented in Atlantic waters off the Florida coast in the mid-1980s, likely released through the aquarium trade. They are beautiful animals. Striped, elaborate, venomous, and spectacularly efficient at what they do, which is eat. Since their introduction, lionfish have spread through the Caribbean, the Gulf of Mexico, and deep into South American coastal waters with a speed that has alarmed marine ecologists for decades.

What makes lionfish so ecologically destructive is not any single trait but the way several converge. A single lionfish can reduce juvenile reef fish populations on a patch of reef by up to 79% in just five weeks. They consume over 50 species of fish and invertebrates, many of which are critical to reef health: the herbivores that keep algae from smothering coral, the cleaner species that maintain symbiotic relationships, the juvenile fish that represent the next generation of reef biodiversity. In their invaded range, lionfish have virtually no natural predators. They mature quickly. They thrive at depths from shallow mangroves to mesophotic reefs hundreds of feet down, well beyond the reach of most removal divers.

The result is a predator that does not merely arrive on a reef but restructures predation within it. Herbivores decline. Algae overgrows coral. Juvenile recruitment collapses. The reef, already under pressure, loses another layer of resilience, and the feedback loop tightens. Coral reefs do not need another efficient predator. They have one.

Why We Built L.I.O.N.

Existing community-science efforts that rely on manually reported photos and sightings are valuable. They are also slow and labor-intensive, depending on the sustained effort of many volunteers. Reports arrive days or weeks after the observation. Location data is often imprecise. And a form that says "lionfish sighted" tells you something, but a system that can say "lionfish detected, herbivore density low, crown-of-thorns aggregation forming at these coordinates, megafauna absent from the transect" tells you considerably more. We wanted to build something more active: a system that can begin reading reef footage as it arrives, identify invasive threats early, and place those detections beside broader biological signals. So we created L.I.O.N., a platform dedicated to tracking invasive and bioindicator populations to protect coral reef ecosystems.

L.I.O.N. began with lionfish. It did not stay there. Once the detection pipeline was working, we realized that the same architecture could serve a much larger ecological purpose. A reef cannot be understood through a single species, however destructive. It must also be read through indicator organisms, predator-prey ratios, coral-specific threats like crown-of-thorns starfish, and the broader texture of what is present and what is missing. The platform grew to match that realization.

So what is L.I.O.N.?

L.I.O.N. (short for Live Invasive-species Observation Network) is a system dedicated to identifying distinct marine species to protect coral reef ecosystems. While L.I.O.N.'s original purpose was the early detection of invasive lionfish, our team eventually realized the platform had the potential to help scientists not just combat lionfish, but also identify a wide array of other species that serve as biological indicators of overall ecosystem health.

To accomplish this, L.I.O.N. uses a YOLO-based machine learning pipeline to analyze underwater footage in real time, drawing detections directly onto the video frame by frame. Users can upload footage through our web interface, select a detector lane, adjust the confidence threshold, and receive results: labeled bounding boxes identifying each detected species, with confidence scores indicating how certain the model is about each detection.

Across its detector lanes, L.I.O.N. supports object detection for more than 20 species, which we believe makes it one of the largest specialized aggregate marine object-detection suites available.

For each run, L.I.O.N. returns:

  • Annotated image or video overlays
  • Bounding boxes
  • Species labels
  • Confidence scores
  • Frame-level JSON metadata
  • Run manifests for reproducibility and downstream analysis
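
To make the shape of these outputs concrete, here is a minimal TypeScript sketch of what the frame-level metadata and run manifest might look like. The field names are illustrative assumptions, not the exact production schema:

```typescript
// Hypothetical shapes for L.I.O.N.'s structured outputs.
// Field names are illustrative, not the literal production schema.

interface Detection {
  label: string;       // species name, e.g. "lionfish"
  confidence: number;  // model confidence in [0, 1]
  box: { x: number; y: number; width: number; height: number }; // pixel coords
}

interface FrameRecord {
  frameIndex: number;  // 0-based frame number within the video
  timestampMs: number; // frame time in milliseconds
  detections: Detection[];
}

interface RunManifest {
  runId: string;       // unique identifier for the inference run
  lane: "lionfish-watch" | "crown-of-thorns" | "reef-health-suite";
  confidenceThreshold: number; // user-selected slider value
  inputFile: string;
  frames: FrameRecord[];
}
```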

Around that detection workflow, the platform also includes:

  • An interactive invasive-species map built from a CSV dataset of 150 mapped observations across 6 invasive marine species
  • A prediction gallery tied to real marine organisms rather than generic placeholders
  • Dedicated model-report pages showing precision-recall curves, confusion matrices, F1-confidence behavior, and training-loss plots

L.I.O.N. has one simple mission: to protect the reefs that the ocean cannot afford to lose.

System Architecture

L.I.O.N. is structured as a layered system with separated responsibilities for presentation, request routing, inference, and structured output.

Front-End Monitoring Surface

The front end is built in Next.js, React, and TypeScript. The main page (app/page.tsx) operates as a long-scroll monitoring dashboard that introduces the ecological context, presents the available detector lanes, embeds the Live Lab upload and review interface, renders a multi-species prediction gallery, visualizes invasive-species sighting data on the interactive map, and provides access to model analytics pages. We designed the interface to feel like a monitoring surface. Someone who works in marine conservation should be able to look at it and think, "I could actually use this." That standard shaped decisions throughout: layout hierarchy, color, typography, control placement, result presentation. The interface was first designed in Figma with a focus on accessibility and ease of use. We wanted L.I.O.N. to convey competence before a user even uploaded a file.

Live Lab Interaction Layer

The Live Lab (app/components/live-lab.tsx) is the operational center of the application. It manages:

  • File upload for images and video, with client-side validation and progress indication
  • Detector lane selection across all three pipelines
  • Confidence threshold adjustment via slider, so users can tune the sensitivity-specificity tradeoff for their use case
  • Preview generation through URL.createObjectURL
  • Remote polling for asynchronous video processing jobs
  • Overlay rendering, by mapping prediction coordinates onto the preview panel
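
The overlay step is worth a sketch. Detections arrive in the pixel space of the original media and have to be rescaled to the rendered preview. A minimal TypeScript illustration, with names of our own choosing rather than the component's actual internals:

```typescript
// Minimal sketch of mapping a detection box from original-media pixel
// space into the rendered preview's coordinate space. Names are
// illustrative; the actual live-lab.tsx implementation may differ.

interface Box { x: number; y: number; width: number; height: number }

function scaleBoxToPreview(
  box: Box,                           // box in original media pixels
  mediaW: number, mediaH: number,     // intrinsic media dimensions
  previewW: number, previewH: number, // rendered preview dimensions
): Box {
  const sx = previewW / mediaW;
  const sy = previewH / mediaH;
  return {
    x: box.x * sx,
    y: box.y * sy,
    width: box.width * sx,
    height: box.height * sy,
  };
}
```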

When a user selects the Reef Health Suite, the Live Lab exposes specialty toggles for Fish + Invertebrates and MegaFauna + Rare Species, which determine which paired YOLO models are invoked for the run. The component is aware of different deployment conditions and degrades gracefully: if a remote service is unavailable, the interface displays appropriate messaging rather than failing silently.

API Gateway and Request Routing

The Next.js API route (app/api/live-lab/detect/route.ts) serves as a routing layer between the browser and the appropriate inference backend. Its logic, sketched in code after this list:

  1. If the user selects Lionfish Watch or Crown of Thorns, the request routes to the hosted Roboflow model.
  2. If the user selects Reef Health Suite and a remote marine-detect service is configured via environment variables, the input is routed to our custom FastAPI model service, deployed as a Hugging Face Docker Space.
  3. If no remote service is configured, the system falls back to the local Python YOLO pipeline.
  4. For video uploads too large to proxy safely within serverless function limits, the file is first staged in Vercel Blob storage, then a remote job is initiated from the blob URL.
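
A simplified sketch of that decision tree in TypeScript; the helper functions and the environment variable name are illustrative assumptions, not the literal contents of route.ts:

```typescript
// Simplified sketch of the gateway's routing decision. All helpers
// and MARINE_DETECT_URL are illustrative, not the real route.ts code.

type Lane = "lionfish-watch" | "crown-of-thorns" | "reef-health-suite";

const MAX_PROXY_BYTES = 4 * 1024 * 1024; // example serverless body limit

declare function callHostedRoboflow(lane: Lane, file: File): Promise<Response>;
declare function stageInVercelBlob(file: File): Promise<string>;
declare function startRemoteJobFromUrl(service: string, blobUrl: string): Promise<Response>;
declare function forwardToFastApi(service: string, file: File): Promise<Response>;
declare function runLocalYoloPipeline(file: File): Promise<Response>;

async function routeDetection(lane: Lane, file: File): Promise<Response> {
  // Lanes 1 and 2: hosted Roboflow inference.
  if (lane === "lionfish-watch" || lane === "crown-of-thorns") {
    return callHostedRoboflow(lane, file);
  }
  // Lane 3: remote FastAPI service, if configured via env vars.
  const remoteUrl = process.env.MARINE_DETECT_URL;
  if (remoteUrl) {
    if (file.size > MAX_PROXY_BYTES) {
      // Large videos: stage in Vercel Blob, start a remote job from the URL.
      const blobUrl = await stageInVercelBlob(file);
      return startRemoteJobFromUrl(remoteUrl, blobUrl);
    }
    return forwardToFastApi(remoteUrl, file);
  }
  // No remote service configured: fall back to the local YOLO pipeline.
  return runLocalYoloPipeline(file);
}
```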

This routing logic means the platform adapts to different deployment configurations without code changes. A cloud-hosted production environment, a local development machine, and a researcher's laptop in a field station with intermittent connectivity can each run a version of the system suited to their available resources.

Hosted Detection Lanes

Two lanes use hosted Roboflow inference:

  • Lionfish Watch
  • Crown of Thorns

These are lightweight and deployment-friendly. For images, the system retrieves prediction JSON directly. For videos, it initiates asynchronous remote jobs and polls for completion. Hosted inference keeps latency low and eliminates the need for local GPU resources on these two well-scoped detection tasks.
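
The pattern, in rough TypeScript form; the endpoint path, response fields, and polling flag below are illustrative assumptions rather than Roboflow's exact API surface:

```typescript
// Hedged sketch of the hosted-inference pattern: synchronous JSON for
// images, job-plus-polling for videos. Paths and fields are assumptions.

async function detectImage(imageBase64: string, apiKey: string) {
  const res = await fetch(
    `https://detect.roboflow.com/lionfish-watch/1?api_key=${apiKey}`, // illustrative model path
    {
      method: "POST",
      body: imageBase64,
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
    },
  );
  return res.json(); // prediction JSON: boxes, labels, confidences
}

async function pollJob(statusUrl: string, intervalMs = 3000) {
  // Poll a remote video job until it reports completion.
  for (;;) {
    const status = await (await fetch(statusUrl)).json();
    if (status.done) return status; // assumed completion flag
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```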

Remote Reef Health Suite

The broader reef-health workflow runs on a separate FastAPI service (services/marine_detect_api/main.py) written in Python and deployed as a Hugging Face Docker Space. This service exists because these detection models are too large and computationally expensive for a serverless function to handle within reasonable timeout windows.

The service accepts uploads through /detect/upload and remote URLs through /detect/url, resolves YOLO model weights from environment variables or downloadable URLs, creates isolated run directories for each inference request, invokes the detection pipeline, and returns structured JSON with per-frame detection metadata and annotated media. Current reef-health specialties:

  • Fish + Invertebrates: butterflyfish, grouper, parrotfish, snapper, moray eel, giant clam, urchin, sea cucumber, lobster, crown of thorns
  • MegaFauna + Rare Species: sharks, rays, sea turtles, and other less frequently observed large fauna
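
For readers integrating against the service, a minimal client sketch follows. The /detect/upload and /detect/url routes come from the service itself; the form field name, request body shape, and response shape are our assumptions:

```typescript
// Minimal client sketch for the marine-detect FastAPI service.
// Field names and response shapes are assumptions, not the real schema.

async function detectViaUpload(serviceUrl: string, file: File) {
  const form = new FormData();
  form.append("file", file); // assumed field name
  const res = await fetch(`${serviceUrl}/detect/upload`, {
    method: "POST",
    body: form,
  });
  if (!res.ok) throw new Error(`marine-detect failed: ${res.status}`);
  return res.json(); // per-frame detection metadata + annotated media refs
}

async function detectViaUrl(serviceUrl: string, mediaUrl: string) {
  const res = await fetch(`${serviceUrl}/detect/url`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ url: mediaUrl }), // assumed request body shape
  });
  return res.json();
}
```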

Local CLI Fallback

The system includes a command-line pipeline (lionfish_yolo.py) that supports both hosted-predict and local-predict execution modes. It processes single images, image directories, and video files. It can accept multiple YOLO model weight files simultaneously, draw bounding boxes with OpenCV, export structured JSON sidecars alongside annotated media, and write comprehensive run manifests to last_run.json.

How we built it:

Front End: Built the web interface using Next.js and TypeScript, giving users an aesthetic UI where they can upload footage and view detections directly in the browser.

ML Pipeline: Trained and integrated YOLO-based models to detect lionfish and other reef species across video frames, drawing bounding boxes and reporting confidence scores. We did this with a mix of train-test splits and transfer learning.

Video Editor: Used DaVinci Resolve to produce our pitch video, incorporating smooth transitions and color grading.

Challenges we faced:

While creating L.I.O.N., we encountered a variety of challenges, both technical and collaborative. For instance, some of our team members had never competed in a hackathon before SMathHacks. Time management was another major challenge: balancing different tasks across the group was a difficult endeavor. We also faced plenty of technical difficulties, such as running out of backend storage on Vercel, fixing bugs through countless iterations, and working simultaneously in GitHub, which sometimes produced merge conflicts when pull requests overlapped.

What we learned:

Working on L.I.O.N. during this 36-hour hackathon let us vastly improve both our soft and hard skills. Besides providing valuable hands-on learning, the experience highlighted the importance of team coordination and clear communication. We learned to navigate challenges together and to delegate roles to prevent overlap. Learning to adapt to failure and grow from it was vital to moving our project forward. Building L.I.O.N. also taught us many new technical skills, from video editing in DaVinci Resolve, to server-side development, to building and training YOLO-based machine learning models for species detection, to crafting a polished UI in TypeScript.

Accomplishments We’re Proud Of

We are proud of the map module we added to our application. Using https://catalog.data.gov/dataset/?tags=lionfish, we found the GPS locations of sighted invasive species, compiled them into a dataset, and built a map that our website loads and populates from that data. Going forward, we hope to use this information to predict future migration patterns of these invasive species, so that researchers can better position capture devices and manage affected environments.
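
As a rough illustration of how observation data flows into the map, here is a TypeScript sketch that parses a sightings CSV into marker records. The column names are assumptions about the dataset's layout, not its literal schema:

```typescript
// Sketch of loading the invasive-species observation CSV into marker
// records for the map. Column names are assumed, not the real schema.

interface Sighting {
  species: string;
  lat: number;
  lng: number;
}

function parseSightingsCsv(csv: string): Sighting[] {
  const [header, ...rows] = csv.trim().split("\n");
  const cols = header.split(",");
  const iSpecies = cols.indexOf("species"); // assumed column names
  const iLat = cols.indexOf("latitude");
  const iLng = cols.indexOf("longitude");
  if (iSpecies < 0 || iLat < 0 || iLng < 0) {
    throw new Error("unexpected CSV header");
  }
  return rows
    .map((row) => row.split(","))
    .map((cells) => ({
      species: cells[iSpecies],
      lat: Number(cells[iLat]),
      lng: Number(cells[iLng]),
    }))
    .filter((s) => Number.isFinite(s.lat) && Number.isFinite(s.lng));
}
```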

Our Next Step

There are several ways we want to improve our product so it has a greater impact on the world. The first is clearly articulating the impact that these invasive species actually have on the environment. We want to do this by showing tags based on the animals detected, for example: "ESTIMATED IMPACT: 20 native fish consumed/week." To implement this feature, however, we will need to collect more metadata about the fish. For example, by enhancing our model to also estimate the weight, size, and behavior of a detected species, we can more accurately calculate the impact it has on the reef or the surrounding environment.

Other potential features include:

  • Predicting the behavioral patterns of marine species, giving users an estimate of the area and direction a species is likely to spread over the next 48 hours. L.I.O.N. would use camera detections together with local environmental variables such as GPS position, temperature, and current speed.
  • Grading the photos and videos the model analyzes on sharpness and lighting, so the system can flag a "bad" photo before it is ingested (see the sketch below). This lets us maintain data integrity and ensures our database isn't filled with junk data.
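
To show what the grading idea could look like, here is a small TypeScript sketch of a standard blur metric, the variance of the Laplacian. This is our illustration of the proposed feature, not code that exists in L.I.O.N. today:

```typescript
// Illustration of a sharpness grade via variance of the Laplacian, a
// common blur metric. Proposed feature only; not in L.I.O.N. today.

function laplacianVariance(gray: Float32Array, w: number, h: number): number {
  // gray: row-major grayscale intensities, length w * h.
  const lap: number[] = [];
  for (let y = 1; y < h - 1; y++) {
    for (let x = 1; x < w - 1; x++) {
      const i = y * w + x;
      // 4-neighbor Laplacian kernel: -4 * center + up + down + left + right
      lap.push(
        -4 * gray[i] + gray[i - 1] + gray[i + 1] + gray[i - w] + gray[i + w],
      );
    }
  }
  const mean = lap.reduce((a, b) => a + b, 0) / lap.length;
  return lap.reduce((a, b) => a + (b - mean) ** 2, 0) / lap.length;
}

// Frames scoring below a chosen threshold could be flagged as "bad"
// before they ever reach the detection models or the database.
```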

Built With

  • Next.js, React, TypeScript
  • Python, FastAPI, OpenCV
  • YOLO object detection models
  • Roboflow hosted inference
  • Hugging Face Docker Spaces
  • Vercel (hosting and Blob storage)
  • Figma, DaVinci Resolve