I was recently mailed a flyer from my city's sanitation department on how to dispose of harmful products that can be found anywhere in a home. That gave me my inspiration. So, for this hackathon, I wanted to make an impact on the sanitation market and help people understand how to keep their city and environment well maintained.
What it does
The user takes a photo, and the app passes the image to a trained machine learning model. The model tries to match it to an item that needs to be disposed of in a special manner. The app then gives you information on how to dispose of the item and lists places where it can be safely dropped off.
The app also allows you to keep track of which items have been scanned, which makes it easier to manage the items that need to be disposed of.
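The scan-to-advice flow above can be sketched in plain Python. Everything here is an illustrative stand-in, not the app's real code or data: `classify()` stubs out the on-device model call, and the labels and disposal table are made-up examples.

```python
# Sketch of the scan -> classify -> look-up flow described above.
# The labels and DISPOSAL_GUIDE entries are illustrative placeholders,
# not the app's real dataset; classify() stands in for the model call.

DISPOSAL_GUIDE = {
    "battery": "Take to a household-hazardous-waste drop-off; never put in regular trash.",
    "paint": "Bring leftover paint to a paint-collection site or disposal event.",
    "electronics": "Recycle at an e-waste collection point.",
}

scan_history = []  # items the user has scanned, mirroring the in-app history


def classify(image_bytes):
    """Placeholder for the on-device image classifier; returns a label."""
    return "battery"


def handle_scan(image_bytes):
    """Classify a photo, record it, and return disposal advice."""
    label = classify(image_bytes)
    advice = DISPOSAL_GUIDE.get(label, "No special handling known for this item.")
    scan_history.append(label)
    return label, advice


label, advice = handle_scan(b"...photo bytes...")
print(label)   # battery
print(advice)  # Take to a household-hazardous-waste drop-off; ...
```

The history list here mirrors the tracking feature: every scan is appended so the user can review what still needs to be disposed of.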
How we built it
The app was built with Flutter as the mobile framework of choice; it's fast and performs well. I made the machine learning model with TensorFlow Lite (TFLite), an on-device machine learning runtime, which let me skip setting up a server to run a full TensorFlow model on each image. The TFLite model is generated using a custom Google Colab notebook, and the dataset was created specifically for this model. Firebase serves as the database holding user information, the Google Maps API shows disposal locations, and Figma was used as the wireframing and design tool.
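The two TFLite steps described above (convert a model in Colab, then run it on-device without a server) can be sketched in Python. The tiny Keras model here is a stand-in for the real custom-trained classifier, and the 224×224 input size and three classes are assumptions for illustration:

```python
import numpy as np
import tensorflow as tf

# Build a tiny stand-in classifier and convert it to TFLite, mimicking the
# Colab conversion step (the real model and dataset are custom-trained).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(3, activation="softmax"),  # assumed 3 item classes
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# On-device inference: load the .tflite bytes and run them on one image,
# with no server round-trip involved.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

image = np.random.rand(1, 224, 224, 3).astype(np.float32)  # placeholder photo
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
probs = interpreter.get_tensor(out["index"])[0]
print(int(probs.argmax()))  # index of the predicted class
```

In the app itself the equivalent interpreter runs inside Flutter via a TFLite plugin, but the load/allocate/invoke lifecycle is the same.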
Challenges we ran into
I am still new to Flutter, so creating the mobile application took a considerable amount of time. Making the model was also very difficult, for three reasons:
- There was no dataset that fit my needs, so I needed to make my own dataset.
- I couldn't add all the items I wanted to add because of limited resources and time constraints.
- I have never used TFlite before, so it was a slow process.
Accomplishments that we're proud of
I am very proud of finishing this app by myself, though also a little disappointed, since finding a team might have made it even better. I doubted I would be able to finish on time, since there were so many things to learn. The app came together very well, closely following the Figma wireframe, and the logo actually looks half decent.
What we learned
I've gained more experience with Flutter, Firebase, design, and machine learning. I've also learned more about using APIs in Flutter, such as the Google Maps API.
What's next for Discern
- Implement object detection instead of image classification.
- Improve the model with more images (only a small number of images were used this time).
- Make the app more fluid and faster.
Non-profit (Wild Card)
Name: SAFE Disposal [SAFE stands for: Solvents, Automotive, Flammables and Electronics.]
Goal: To help people understand and learn about how to keep their city and environment well maintained.