Description

RecycleHub is an iOS application we created to revolutionize recycling by making it simple yet profitable. The app has three main features: the scanner, the map, and the logs. The scanner classifies the type of trash the user is scanning and identifies the specific recyclable materials in the object. The map points users to nearby recycling centers and provides specific information about when each center is open, which recyclables it accepts, and how to get there using Google Maps. The logs help the user keep track of their recycling history, provide statistics on the amounts of specific materials recycled, and estimate how much money the user can earn from recycling based on that history. Together, these features create an app that gives users an intuitive way to recycle in today's digital age while rewarding them for it.

Helpfulness

As a result of the COVID-19 pandemic and the consequent lockdown, we noticed that sales of single-use goods such as plastic water bottles and soda cans have skyrocketed, even in our own households. At the same time, the general sentiment we've encountered in the Bay Area is that recycling is tedious and offers very little in return. This was our inspiration for RecycleHub, an app that aims to get more people involved in recycling by making it simple, efficient, and profitable, giving users an incentive to help make our city greener.

We want driving to the local recycling center to feel less like a chore, and we aim to make recycling something people actually enjoy doing.

Implementation

To tackle such a large project, we divided the work based on each member's skill set. Kyle and Adithya handled the machine learning portion, using Python to build the convolutional neural network (CNN) that classifies trash on top of Google's MobileNetV2 architecture. We chose MobileNetV2 because it is efficient enough to run easily on a mobile device. The network was then ported to iOS using the LibTorch library. Nikhil focused on the maps feature, building the mapping functionality with the Google Maps iOS SDK. In combination with the Places Library, Nikhil located recycling centers near Cupertino, and by calculating the distance to each candidate center he was able to produce a relevant, user-tailored list of centers and display them on the map's front end.
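The write-up doesn't include the classifier code itself; as a rough illustration of the transfer-learning setup described above, a minimal PyTorch sketch might look like the following. The number of classes and the layer swap are assumptions for illustration, not the team's actual code.

```python
# Minimal sketch of a MobileNetV2-based trash classifier (illustrative only).
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 6  # e.g. plastic, glass, metal, paper, cardboard, other (assumed)

def build_model():
    # Start from MobileNetV2 pretrained on ImageNet: a lightweight backbone
    # that is efficient enough to run on a phone.
    model = models.mobilenet_v2(pretrained=True)
    # Replace the final classification layer with one sized for recyclable categories.
    in_features = model.classifier[1].in_features
    model.classifier[1] = nn.Linear(in_features, NUM_CLASSES)
    return model
```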

Dinesh brought all the features together into an iOS app, designed the app's graphical interface and user experience, and set up the scanner to send correctly formatted images to the CNN. He also created a statistics feature that lets users track their recycling history and accumulated profit, with price values based on guidelines from CARecycle. Dinesh used Yelp's Fusion API to gather details about nearby recycling centers: images of each center, the materials it accepts, and its address.
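For illustration only, the kind of query involved looks roughly like the sketch below against Yelp Fusion's business search endpoint. The app itself makes the equivalent request from Swift; the API key, search term, and radius here are placeholders.

```python
# Rough illustration of a Yelp Fusion business search for nearby recycling centers.
# The app performs the equivalent request from Swift; values below are placeholders.
import requests

API_KEY = "YOUR_YELP_API_KEY"  # placeholder

def find_recycling_centers(latitude, longitude, radius_m=10000):
    response = requests.get(
        "https://api.yelp.com/v3/businesses/search",
        headers={"Authorization": f"Bearer {API_KEY}"},
        params={
            "term": "recycling center",
            "latitude": latitude,
            "longitude": longitude,
            "radius": radius_m,  # in meters
        },
    )
    response.raise_for_status()
    # Each business entry includes a name, address, coordinates, and image URL.
    return response.json()["businesses"]
```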

During the process, we ran into many challenges. Porting the convolutional neural network from Python to Swift was difficult: nothing indicated that we needed to save the network in a special format to make it compatible with Objective-C++ until we hit an error message saying so. The port was made even harder by the fact that none of us knew Objective-C++ before developing the app.
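For LibTorch, the usual requirement is that the model be exported as a TorchScript module rather than a plain state_dict, so the export step likely looked something like the sketch below. The input shape and file names are illustrative assumptions.

```python
# Sketch of exporting the trained network as a TorchScript module so that
# LibTorch (the C++/Objective-C++ API) can load it on iOS. Input shape and
# file names are illustrative.
import torch

model = build_model()  # the MobileNetV2-based classifier sketched earlier
model.load_state_dict(torch.load("trash_classifier.pth", map_location="cpu"))
model.eval()

example_input = torch.rand(1, 3, 224, 224)       # MobileNetV2 expects 224x224 RGB input
traced = torch.jit.trace(model, example_input)   # record the graph via tracing
traced.save("trash_classifier.pt")               # loadable from Objective-C++ via torch::jit::load
```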

Communication between the two sub-teams was also hard, since two of us worked in Swift while the other two worked in Python; each pair had to learn a bit about what the other was doing in order to understand the whole system and bridge the two halves together.

Despite all the setbacks, we persevered, leaning on online resources, APIs, development tools, and each other to make our vision happen. As a team we discussed our plans, setbacks, and successes, and supported one another to make RecycleHub a reality.

Results

After training our CNN for 10 epochs on one thousand images, we achieved 97% accuracy in classifying different types of recyclables, using mini-batch gradient descent with a learning rate of 0.003 and a momentum of 0.9.
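For reference, those hyperparameters correspond roughly to the training loop sketched below. Dataset loading and image transforms are omitted; `train_loader` is assumed to yield batches of labeled images and is not from the original write-up.

```python
# Sketch of a training loop matching the hyperparameters above
# (10 epochs, mini-batch SGD, lr=0.003, momentum=0.9). `train_loader`
# is assumed to yield (image_batch, label_batch) pairs.
import torch.nn as nn
import torch.optim as optim

model = build_model()
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.003, momentum=0.9)

for epoch in range(10):
    for images, labels in train_loader:
        optimizer.zero_grad()
        outputs = model(images)            # forward pass
        loss = criterion(outputs, labels)  # classification loss
        loss.backward()                    # backpropagate
        optimizer.step()                   # mini-batch gradient descent update
```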

We're incredibly proud of the app we have created: an iOS app with a rich UI, an accurate convolutional neural network, and seamless integration of the Google Maps SDK and the Yelp Fusion API. We took on three difficult, seemingly unconnected tasks and combined them into a single app, despite being unable to meet in person because of the pandemic.
