Why I Made This Project In recent years, the Urban Heat Island (UHI) effect has quietly become one of the most dangerous environmental stressors in modern cities. As concrete and asphalt replace trees and vegetation, densely built areas trap and re-radiate far more heat. The result? Higher temperatures, skyrocketing energy consumption, worsening air quality, and increased public health risks. This is especially problematic because urban heat disproportionately impacts vulnerable communities, including lower-income neighborhoods that have less green space and fewer resources to adapt. At the same time, AI itself is being called out as an environmental burden because of the massive energy demands of training and deploying large-scale models. That paradox inspired this project: what if we used AI not to harm, but to help the planet?
Why Releaf Matters and What It Does Releaf is more than a heatmap; it's a decision-making tool. Unlike most tools that merely visualize environmental problems, Releaf gives data-driven, location-specific solutions with projected impact. It uses real-world datasets and AI predictions to help users, from citizens to city planners, take meaningful climate action. What makes it unique is the combination of geospatial intelligence, real-time weather data, and actionable recommendations, all in one place.
How I Built It I used React with TypeScript for the frontend and Leaflet.js to build the map interface. The intelligence layer is powered by TensorFlow.js, which runs a trained neural network model directly in the browser on live weather and urban data. I gathered datasets from NASA, the EPA, and NOAA.
AI The ML model is a 4-layer neural network trained on environmental features like vegetation index, building density, and historical temperature data. It outputs a risk score for each map zone and classifies areas into severity levels. Based on those predictions, a second layer of logic recommends the best interventions and estimates potential cooling impact. All of this runs in real time, right in the browser.
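The second layer of logic can be sketched as a simple threshold mapping from the model's risk score to a severity level, an intervention, and a projected cooling estimate. The thresholds, intervention names, and cooling values below are illustrative placeholders, not the ones Releaf actually uses:

```typescript
// Map a model risk score (0..1) to a severity level and a suggested
// intervention. Thresholds and interventions are hypothetical examples.
type Severity = "low" | "moderate" | "high" | "critical";

interface ZoneRecommendation {
  severity: Severity;
  intervention: string;
  estimatedCoolingC: number; // rough projected cooling in °C (placeholder values)
}

function recommend(riskScore: number): ZoneRecommendation {
  if (riskScore >= 0.85) {
    return { severity: "critical", intervention: "tree canopy + cool roofs", estimatedCoolingC: 3.0 };
  } else if (riskScore >= 0.6) {
    return { severity: "high", intervention: "street trees and shade structures", estimatedCoolingC: 2.0 };
  } else if (riskScore >= 0.35) {
    return { severity: "moderate", intervention: "reflective pavement coatings", estimatedCoolingC: 1.0 };
  }
  return { severity: "low", intervention: "maintain existing green space", estimatedCoolingC: 0.2 };
}
```

Because this stage is plain branching logic rather than a second model, it stays fast enough to run on every zone in real time in the browser.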
Challenge I Overcame One of the biggest challenges I faced was integrating real-time environmental APIs with the frontend map while maintaining consistency across multiple cities. I used data from Open-Meteo, NOAA, and NASA EarthData, which all have different formats, rate limits, and coordinate systems; feeding them into the Leaflet-based visualization while keeping everything in sync with the machine learning model's predictions was difficult. I had to write multiple functions to normalize the data and add caching to prevent bursts of back-to-back API calls. It's not perfect, but it works.
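The normalize-and-cache pattern described above can be sketched like this. The payload shapes (an Open-Meteo-style `temperature_2m` field, a NOAA-style all-strings record) and the TTL value are simplified assumptions for illustration, not the exact API schemas:

```typescript
// Normalize differently-shaped API payloads into one internal record,
// and cache results by key with a TTL to avoid back-to-back API calls.
// Payload shapes below are simplified stand-ins, not exact API schemas.
interface ClimateRecord {
  lat: number;
  lon: number;
  temperatureC: number;
}

function fromOpenMeteo(p: { latitude: number; longitude: number; current: { temperature_2m: number } }): ClimateRecord {
  return { lat: p.latitude, lon: p.longitude, temperatureC: p.current.temperature_2m };
}

function fromNoaa(p: { LATITUDE: string; LONGITUDE: string; TEMP_F: string }): ClimateRecord {
  // NOAA-style records often arrive as strings, in Fahrenheit.
  return {
    lat: parseFloat(p.LATITUDE),
    lon: parseFloat(p.LONGITUDE),
    temperatureC: (parseFloat(p.TEMP_F) - 32) * (5 / 9),
  };
}

// Tiny TTL cache: repeated lookups within `ttlMs` reuse the stored value
// instead of hitting the API again.
class TtlCache<T> {
  private store = new Map<string, { value: T; expires: number }>();
  constructor(private ttlMs: number) {}

  getOrCompute(key: string, compute: () => T, now: number = Date.now()): T {
    const hit = this.store.get(key);
    if (hit && hit.expires > now) return hit.value;
    const value = compute();
    this.store.set(key, { value, expires: now + this.ttlMs });
    return value;
  }
}
```

Converting everything to one `ClimateRecord` shape up front means the map layer and the model never have to know which upstream API a data point came from.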
What I Tested Most I tested the environmental data APIs and the prediction model the most. I started by validating API calls to Open-Meteo, NOAA GHCN, and NASA LAADS DAAC to ensure they returned accurate, consistent climate and vegetation data for cities like Phoenix and Miami, using custom checks against known historical records. The machine learning model was tested with over 2,400 data points, including NDVI scores, elevation, and temperature, and I compared its predicted heat zones with real urban heat maps from NASA and NOAA.
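One simple way to run the heat-zone comparison described above is to line up predicted zone labels against reference labels (e.g. derived from the NASA/NOAA maps) and report the fraction that agree. The function and labels below are an illustrative sketch, not Releaf's actual test harness:

```typescript
// Compare predicted heat-zone labels against reference labels and
// return the fraction that agree. Labels here are illustrative.
function agreementRate(predicted: string[], reference: string[]): number {
  if (predicted.length !== reference.length || predicted.length === 0) {
    throw new Error("label arrays must be non-empty and the same length");
  }
  let matches = 0;
  for (let i = 0; i < predicted.length; i++) {
    if (predicted[i] === reference[i]) matches++;
  }
  return matches / predicted.length;
}
```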
What I'd Improve If I had more time, I’d integrate community reporting tools to let residents share heat-related problems, expand city coverage globally, build a backend to generate city-specific reports for local governments or environmental nonprofits, and incorporate more consistent APIs to improve the accuracy and timeliness of real-time data.
How User Input Works Users simply select a city and click a heat zone to view analysis and recommendations. This was designed for simplicity and speed, avoiding complex forms or inputs. Data is pre-fetched and processed on click, with AI running in-browser for privacy and responsiveness.
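The click flow can be sketched as a lookup into pre-fetched zone data followed by in-browser analysis. The zone fields, helper names, and loading message are hypothetical, chosen only to illustrate the pattern:

```typescript
// On click, look up the pre-fetched data for a zone and run the
// already-loaded, in-browser analysis on it. Names are illustrative.
interface ZoneData {
  zoneId: string;
  vegetationIndex: number; // e.g. NDVI in 0..1
  surfaceTempC: number;
}

// Populated when the user selects a city, keyed by zone id.
const prefetched = new Map<string, ZoneData>();

function handleZoneClick(
  zoneId: string,
  analyze: (z: ZoneData) => string
): string {
  const zone = prefetched.get(zoneId);
  if (!zone) return "Data still loading for this zone.";
  return analyze(zone); // runs entirely in the browser; no data leaves the page
}
```

Keeping the analysis as a function of already-fetched data is what makes the click feel instant: the only work on click is a map lookup and a local computation.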