As cities like Atlanta become home to more people than ever, there is increasing strain on urban green (or "lung") spaces. We want to help preserve these spaces by educating the public with data visualizations.
What it does
A web app that uses satellite imagery and image-processing models to identify urban green spaces and compute metrics on their health. For example:
- NDVI (Normalized Difference Vegetation Index)
- Vegetation density (per square kilometer/mile, or per pixel of satellite imagery)
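NDVI is computed per pixel from near-infrared and red reflectance as (NIR − Red) / (NIR + Red): healthy vegetation reflects strongly in NIR, so values approach 1. A minimal sketch with NumPy, assuming the two bands are already available as arrays (the sample values are illustrative, not from our data):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero where both bands are 0.
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1, denom))

# A vegetated pixel (high NIR) versus a bare pixel (NIR ~= Red).
nir = np.array([[0.8, 0.1]])
red = np.array([[0.1, 0.1]])
print(ndvi(nir, red))  # ~0.78 for the vegetated pixel, 0.0 for the bare one
```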
How we built it
We used the following technologies:
- Microsoft Bing Maps API - More specifically, Bing Maps REST Services for image acquisition, location queries and polygon marking.
- OpenCV - We built image-processing algorithms using built-in OpenCV functions. These were used to measure the metrics above.
- ReactJS - We used React to build our front-end web interface, which provides the data visualizations and user controls.
- NodeJS - We used NodeJS to serve our React app as well as to run an experimental version of our image-processing algorithms.
Challenges we ran into
We intended to deploy on an Azure Web App but weren't able to due to technical difficulties.
Accomplishments that we're proud of
- Effectively recognizing healthy green spaces (differentiating parks from lawns, and trees from grass).
- Using the Microsoft Bing Maps API to collect static images from a user's query (queries don't have to be descriptive).
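Fetching an aerial image for a free-form query goes through the "Get a Static Map" endpoint of Bing Maps REST Services, which geocodes the query text itself. A sketch of building such a request URL (`YOUR_KEY` and the map size are placeholders):

```python
from urllib.parse import quote

def static_map_url(query: str, key: str, size: str = "800,600") -> str:
    """Build a Bing Maps REST 'Get a Static Map' URL for aerial imagery."""
    base = "https://dev.virtualearth.net/REST/v1/Imagery/Map/Aerial"
    return f"{base}/{quote(query)}?mapSize={size}&key={key}"

# Queries don't have to be descriptive: Bing geocodes free-form text.
print(static_map_url("Piedmont Park Atlanta", "YOUR_KEY"))
```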
What's next for Urban Green Cover Analysis
- Making our application robust and bug-free.
- Expanding the capabilities of our analysis and the metrics we can measure from imagery alone.
- Working with historical data to present trends, extrapolations and predictions.