Gallery captions:
- Title and logo
- Litter detections in a streetside image
- Each node is an image scraped from Google Street View over Manhattan
- Real pictures from Manhattan Street View
- Example street map (blue) with litter heat map (red→yellow) and optimized trash cans (green)
- Manhattan litter heat map and optimized waste bin placement
Our team grew up in the Bay Area and Chicago, watching our beautiful cities and the natural landscapes around them get continually polluted and littered. It's heartbreaking to see the parks and streets you love trashed by people who don't seem to care. But we've all been there: you need to throw something out but don't see a waste bin for blocks in any direction. Do you wait for a bin or gently set your empty coffee cup next to the curb? No one sets out to scorn the environment; it comes down to convenience, and cities can't afford to place trash bins everywhere.
That's why we created LitterBug to help tackle cities' trash problem. LitterBug uses easily accessible, collectible data to build heat maps of where trash is concentrated in a city, and it makes smart suggestions, such as placing a minimal number of trash bins so that the walking distance from clusters of litter to the nearest bin is as short as possible.
What it does
We decided to attack the dirtiest city in America: New York City (per travelandleisure.com). LitterBug can use data from either a vehicle recording geotagged images (for up-to-date imagery) or Google Street View static images. We used Google's Street View Static API to collect images in a grid over Manhattan corresponding to the city blocks. We oriented the heading of each image perpendicular to the street it's on, so the image faces the sidewalk. Each image becomes a node in a graph of nodes and edges.
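As a rough sketch of this step (the endpoint and parameters are the Street View Static API's; the function names, key placeholder, and 90° offset convention are ours), the heading for each image can be computed from the bearing of the street segment and then rotated toward the sidewalk:

```python
import math
import urllib.parse
import urllib.request

def bearing(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def fetch_streetview(lat, lon, street_bearing, api_key, out_path):
    """Save one Street View image facing the sidewalk at (lat, lon)."""
    heading = (street_bearing + 90) % 360  # perpendicular to the street
    params = urllib.parse.urlencode({
        "size": "640x640",
        "location": f"{lat},{lon}",
        "heading": heading,
        "key": api_key,
    })
    url = "https://maps.googleapis.com/maps/api/streetview?" + params
    urllib.request.urlretrieve(url, out_path)
```

Calling `fetch_streetview` twice per point (offsets of +90° and −90°) would capture both sidewalks of the street.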
The images are stored on a Microsoft Azure cloud server and used to run image recognition for litter. This is done with TensorFlow and an open-source object detection model we found. Detection runs on GPUs, and the number of litter detections is stored for each coordinate.
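We don't reproduce the detection model here; as an illustration of the bookkeeping only (all names are ours, and `detect` stands in for the TensorFlow model's inference call), counting above-threshold detections per coordinate looks like:

```python
from typing import Callable, Dict, List, Tuple

Coordinate = Tuple[float, float]  # (lat, lon)

def count_litter(
    images: Dict[Coordinate, object],
    detect: Callable[[object], List[float]],
    score_threshold: float = 0.5,
) -> Dict[Coordinate, int]:
    """Run a detector over every image and record, per coordinate,
    how many detections clear the confidence threshold."""
    counts: Dict[Coordinate, int] = {}
    for coord, image in images.items():
        scores = detect(image)  # one confidence score per detected object
        counts[coord] = sum(1 for s in scores if s >= score_threshold)
    return counts
```

Keeping the detector injectable like this also made it easy to test the pipeline without spinning up the GPU server.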
Finally, the map is combined with the litter detections to create a heat map of litter in New York. This can guide the deployment of volunteers or street cleaners as well as the placement of new recycling and waste bins. We created a novel algorithm that groups litter into clusters and places a small number of waste bins so as to minimize the walking distance along each street from pieces of litter to the nearest bin. This lets the municipal body maintain trash bins only where they are needed while reducing litter and inconvenience for residents.
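The write-up doesn't spell the algorithm out, but one standard way to realize "few bins, short walks" is greedy facility location over the street graph. This sketch (function and variable names are ours; the street distances are assumed precomputed, e.g. via shortest paths on the graph) adds one bin at a time wherever it most reduces the total litter-weighted walking distance:

```python
def place_bins(dist, litter, k):
    """Greedily choose k bin locations among graph nodes.

    dist[i][j] -- walking distance along streets between nodes i and j
    litter[i]  -- number of litter detections at node i
    Returns the indices of the chosen bin nodes.
    """
    n = len(litter)
    best = [float("inf")] * n  # distance from each node to its nearest bin so far
    chosen = []
    for _ in range(k):
        # Pick the candidate that minimizes total weighted distance if added.
        def cost(c):
            return sum(litter[i] * min(best[i], dist[i][c]) for i in range(n))
        c = min(range(n), key=cost)
        chosen.append(c)
        best = [min(best[i], dist[i][c]) for i in range(n)]
    return chosen
```

The exact version of this objective (k-median on a graph) is NP-hard, so a greedy heuristic like this keeps the optimization fast at city scale.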
How we built it
The project has three major parts:
1-Transforming the map of a geofenced area into a graph model and requesting Google Street View images along each edge
2-Setting up litter object detection on a Microsoft Azure server and labeling the city images using GPU processing
3-Importing the graph and labels into a litter heat map and running waste bin optimization algorithms
Together, these pieces make up the pipeline that can help city governments make smart decisions on keeping their city clean.
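As a sketch of how the stages hand data off (the function name and normalization choice are ours), the per-coordinate counts from stage 2 reduce to heat-map intensities for stage 3:

```python
def to_heatmap(counts):
    """Scale per-coordinate litter counts to [0, 1] heat-map intensities.

    counts: {(lat, lon): number_of_detections}
    """
    if not counts:
        return {}
    peak = max(counts.values())
    if peak == 0:
        return {coord: 0.0 for coord in counts}  # no litter anywhere
    return {coord: n / peak for coord, n in counts.items()}
```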
Challenges we ran into
1-The main issue we ran into was installing GPU drivers on the Azure server. Without them, running image recognition on so many images would have taken over a day.
2-Deciding how the waste bin algorithm should cluster litter and balance the number of bins against the quantity of litter
3-Working with the Google Maps Static imagery to save images with the right heading (extracted from a 360° sphere)
Accomplishments that we're proud of
We are super proud of the scalability of the project: going from a few pictures of Mass Ave to a large part of a huge city is very exciting. We are also proud of our use of cloud computing to speed up the image recognition, as well as our novel waste bin optimization algorithm!
What we learned
We learned a lot about APIs and cloud computing while working on LitterBug. It's one thing to write code in an IDE, but another altogether to combine several libraries, APIs, and hosts into an efficient pipeline. Working with APIs was syntactically difficult, especially when you don't fully understand the code you're working with, but in the end it makes the project much more powerful. We also learned to think big. Originally, we planned to use a dozen pictures from Mass Ave, but partway through the project we realized that through Google Cloud we could process data for a whole city despite being hundreds of miles away.
What's next for LitterBug
The most important addition to make LitterBug useful is a good GUI, either for geofencing an area on Google Maps or for importing data from a municipal vehicle that circuits the city. With Azure, the pipeline can demand very little from the user or the user's computer.