Inspiration

Ecosystem restoration has always felt like something that happens elsewhere — in a lab, behind a grant proposal, managed by people with acronyms after their names. For the billions of people living inside degraded ecosystems every day, that distance is not just frustrating. It is a structural failure.

We asked a simple question: what if anyone, anywhere, could point at a piece of land and immediately start healing it?

Not after a six-month study. Not after hiring a consultant. Right now, with the phone in their pocket.

Every marker on our map is a person — a farmer, a student, a grandmother — who decided that their corner of the world was worth fighting for. That is the real magic of RootNet.

We built RootNet because restoration shouldn't be gatekept by scientists in labs. It should be open, collaborative, and radically accessible — to indigenous communities, urban volunteers, smallholder farmers, and citizen scientists worldwide.


What it does

RootNet is a real-time, community-driven ecosystem restoration platform built on an interactive 3D globe. Click anywhere on Earth, snap a photo, and within seconds you receive a hyper-local restoration plan generated from live biodiversity data, AI-powered image analysis, and verified scientific literature.

Every action you take updates a shared ecosystem health score visible to every contributor active in that region — turning isolated individual effort into collective, measurable impact.

The platform pulls from the world's largest open biodiversity datasets, analyzes your photos with a multimodal AI model, cross-checks every species recommendation against current peer-reviewed literature, and writes your contribution to a community health record that compounds over time.


How we built it

RootNet is a full-stack platform assembled entirely from open source frameworks and open data APIs.

| Layer | Stack |
| --- | --- |
| Frontend | React + Framer Motion + Three.js — interactive 3D globe with physics-based animations |
| Backend | Node.js REST API — built for concurrency at global scale |
| Database | PostgreSQL + PostGIS — relational, battle-tested, precision geospatial queries |
| AI layer | Mistral Pixtral — open-weight multimodal model, image → ecological insight in under 3 seconds |
| Verification | Tavily — live cross-referencing against current scientific literature |

When a user submits a location, the following pipeline executes automatically:

  1. Biodiversity pull — GBIF and iNaturalist queried in parallel for all species occurrences at those coordinates, returning taxonomy, conservation status, and ecological relationships.
  2. Visual analysis — Mistral Pixtral scans the uploaded photo for vegetation coverage, soil condition, invasive species, and canopy density.
  3. Plan generation — a step-by-step restoration plan is generated, calibrated to the specific biome, climate zone, and seasonal conditions.
  4. Literature verification — every species recommendation cross-checked by Tavily against peer-reviewed sources. No hallucination reaches the user unchecked.
  5. Community update — the action is written to PostgreSQL and the ecosystem health score propagates in real time to all active contributors in that region.

| Source | What it contributes |
| --- | --- |
| GBIF | 2+ billion species occurrence records — the most comprehensive biodiversity dataset on Earth |
| iNaturalist | Citizen science observations providing hyper-local, community-verified ground truth |
| Mistral API | Open-weight multimodal model for photo-based species and vegetation analysis |
| Tavily | Grounds every recommendation in current, verifiable scientific literature |

Challenges we ran into

Cold data problem. Biodiversity records are not evenly distributed. Central Africa, interior Southeast Asia, and rural Latin America have sparse coverage in GBIF and iNaturalist. In data-poor regions, our restoration plans are less precise — and the communities that need RootNet most are exactly those in under-surveyed areas. We are actively working with citizen science networks to close these gaps.

AI hallucination. Mistral occasionally produces plausible but incorrect species names. We built a confidence-scoring layer and are developing a community review system where users flag bad recommendations, weighted by their contribution history.
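Those two guards can be sketched as a pair of small functions. The 0.7 threshold and the log-scaled reviewer weight below are illustrative assumptions, not the shipped tuning:

```typescript
interface SpeciesCandidate { name: string; confidence: number } // model output, 0–1

// Guard 1: only surface species the model is sufficiently confident about.
// The 0.7 default is an illustrative choice, not a calibrated value.
function filterByConfidence(cands: SpeciesCandidate[], threshold = 0.7): SpeciesCandidate[] {
  return cands.filter((c) => c.confidence >= threshold);
}

interface Flag { userId: string; contributionCount: number }

// Guard 2: community review. A recommendation is retracted once the combined
// weight of flags crosses a limit; weight grows with contribution history,
// but only logarithmically, so no single prolific user dominates the vote.
function shouldRetract(flags: Flag[], limit = 3): boolean {
  const weight = flags.reduce((sum, f) => sum + 1 + Math.log1p(f.contributionCount), 0);
  return weight >= limit;
}
```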

API contradictions. Different biodiversity databases return different species counts for the same coordinate. Rather than picking one source arbitrarily, we built a weighted aggregation system that combines confidence scores and community observation density — surfacing disagreement transparently instead of hiding it.
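A simplified sketch of that aggregation follows. The field names and the specific weighting (confidence × local observation density) are assumptions about the shape of the system, chosen to show the key property: the per-source counts and their spread stay visible in the result instead of being collapsed away.

```typescript
interface SourceCount {
  source: string;             // e.g. "GBIF" or "iNaturalist"
  speciesCount: number;       // species reported at this coordinate
  confidence: number;         // source confidence, 0–1
  observationDensity: number; // local observations per km², a coverage proxy
}

interface AggregateResult {
  estimate: number;          // weighted species count
  spread: number;            // max disagreement between sources
  perSource: SourceCount[];  // surfaced for the UI, not hidden
}

// Combine contradictory counts by confidence × density weight, and report
// the raw spread so the interface can show disagreement transparently.
function aggregateSpeciesCounts(counts: SourceCount[]): AggregateResult {
  const weights = counts.map((c) => c.confidence * c.observationDensity);
  const total = weights.reduce((a, b) => a + b, 0) || 1;
  const estimate = counts.reduce(
    (sum, c, i) => sum + c.speciesCount * (weights[i] / total), 0,
  );
  const values = counts.map((c) => c.speciesCount);
  return {
    estimate,
    spread: Math.max(...values) - Math.min(...values),
    perSource: counts,
  };
}
```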

Geospatial precision at scale. Mapping restoration actions at exact coordinates across the entire globe, with real-time shared state between contributors, required careful PostGIS indexing and query optimisation.
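One common shape this takes is a `geography`-typed column with a GiST index, queried via `ST_DWithin` for radius lookups. The table and column names below are hypothetical; the sketch builds a parameterised query rather than executing one, since the point is the query shape:

```typescript
// Hypothetical schema: restoration_actions(id, user_id, geom geography(Point, 4326), ...)
// A GiST index lets ST_DWithin prune candidates before exact distance math:
//   CREATE INDEX idx_actions_geom ON restoration_actions USING GIST (geom);

interface NearbyQuery { text: string; values: (number | string)[] }

// Build a parameterised radius query (in metres) around a coordinate.
// The geography type makes ST_DWithin metre-accurate anywhere on Earth,
// which suits a planet-wide action map better than planar geometry.
function nearbyActionsQuery(lat: number, lon: number, radiusMetres: number): NearbyQuery {
  return {
    text: `
      SELECT id, user_id
      FROM restoration_actions
      WHERE ST_DWithin(
        geom,
        ST_SetSRID(ST_MakePoint($1, $2), 4326)::geography,
        $3
      )`,
    values: [lon, lat, radiusMetres], // note: PostGIS points are (lon, lat)
  };
}
```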


Accomplishments that we're proud of

  • A fully functional restoration pipeline — from a click on a globe to a verified, hyper-local restoration plan — built and shipped end to end.
  • Real-time shared ecosystem health scores that make individual actions visible as collective progress, not isolated data points.
  • A multimodal AI integration that turns a phone photo into actionable ecological insight without requiring any scientific expertise from the user.
  • An architecture that is genuinely open at every layer — open source frameworks, open data APIs, open model weights — so any community on Earth can run, fork, or extend it.
  • The moment we added the live contributor counter and someone saw "6 people restoring this watershed right now" — and stayed.

What we learned

Data is messy — and that is honest. APIs contradict each other. Two databases give different species counts for the same coordinate. We stopped seeing this as a bug and started seeing it as an accurate reflection of how much we still don't know about the natural world. Transparency over perfection.

People respond to community. A solitary form that saves data is invisible. A live map showing seven people working on the same ecosystem together is magnetic. Collective identity drives sustained action in a way that individual dashboards never will.

Offline-adjacent beats offline-first. Full offline-first architecture is a significant engineering investment. What field teams actually need is locally-buffered forms that sync when connectivity resumes. Same real-world impact, dramatically faster to ship.

Localization is not optional. Restoration happens in places where English is not the language of daily life. RootNet needs to speak Spanish, Portuguese, Q'eqchi', and Guaraní at minimum. The internationalisation architecture is ready. The translations are one community sprint away.


What's next for RootNet | Open Source Ecosystem Restoration

Offline support. A service worker layer so restoration teams can log actions, submit photos, and receive plans without an internet connection — critical for field teams working in remote ecosystems.

Community verification system. A structured review flow where experienced contributors validate AI-generated recommendations, building a human-in-the-loop quality layer on top of the automated pipeline.

Indigenous knowledge integration. The most valuable ecological knowledge on Earth — oral histories, generational land management practices, seasonal burning calendars — exists outside any digitised corpus. We want to build structured pathways for local communities to contribute that knowledge directly into RootNet as a first-class data source, not a footnote.

Localization sprint. Spanish, Portuguese, Q'eqchi', Guaraní. Open call for translators and community partners.

Expanded biodiversity coverage. Partnering with regional citizen science networks to increase iNaturalist observation density in under-surveyed areas — so RootNet's restoration plans are as precise in the Congo Basin as they are in Central Park.
