The Genesis of Bin Buddy: An Engineer's Answer to Environmental Confusion
My inspiration for Bin Buddy stemmed from a simple, yet deeply impactful observation: a widespread passion for sustainability clashing with the confusing, fragmented reality of local recycling rules. I saw a critical gap between people's desire to do good and their ability to do so correctly. Too often, this confusion leads to "wish-cycling," which contaminates recycling streams, or worse, resignation, where perfectly recyclable items end up in landfills. I knew that a smart, accessible technology solution could empower individuals to make a tangible environmental difference.
The Build: Architecting an On-Device Intelligence Engine
My initial plan involved a cloud-centric architecture, and I invested significant time exploring cloud deployment strategies and integrating services like Firebase. However, as I navigated the complex landscape of cloud LLM providers, I faced a pivotal challenge: how to deliver instantaneous results without compromising user privacy or requiring a constant internet connection.
This challenge inspired a strategic pivot towards a more sophisticated and user-centric solution: a fully on-device, dual-model AI pipeline.
- Instantaneous Object Recognition: The first stage leverages a highly optimized MobileNetV2 Core ML model. This allows the app to perform real-time, on-device image classification, identifying an object the moment the user captures a photo.
- Hyper-Contextual Advice Generation: The classification from MobileNetV2 then seeds a prompt for a powerful, on-device Large Language Model using Apple's Foundation Models. By fusing the identified item with the user's real-time geolocation, the LLM generates definitive, hyper-localized recycling instructions.
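The two stages above can be sketched in Swift. This is a minimal, hedged illustration, not the app's actual source: the function names (`classify`, `recyclingAdvice`) are hypothetical, the `MobileNetV2` class assumes the model was added to the Xcode project under that name, and the `LanguageModelSession` usage assumes Apple's Foundation Models framework as introduced at WWDC 2025.

```swift
import Vision
import CoreML
import CoreLocation
import FoundationModels

// Stage 1: on-device image classification with a bundled MobileNetV2 Core ML model.
// The generated `MobileNetV2` class name depends on the model file added to the project.
func classify(image: CGImage) throws -> String {
    let coreMLModel = try MobileNetV2(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)
    let request = VNCoreMLRequest(model: visionModel)
    try VNImageRequestHandler(cgImage: image).perform([request])
    guard let top = (request.results as? [VNClassificationObservation])?.first else {
        throw NSError(domain: "BinBuddy", code: 1)
    }
    return top.identifier // e.g. "water bottle"
}

// Stage 2: seed an on-device LLM prompt with the label plus the user's location.
func recyclingAdvice(for label: String, near location: CLLocation) async throws -> String {
    let session = LanguageModelSession()
    let prompt = """
    An item was identified as "\(label)" near latitude \(location.coordinate.latitude), \
    longitude \(location.coordinate.longitude). Give concise, locally accurate \
    recycling instructions for this item.
    """
    let response = try await session.respond(to: prompt)
    return response.content
}
```

Because both stages run on device, the photo and coordinates never need to be serialized off the phone; only a short text label crosses the boundary between the two models.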
This on-device architecture is the core of Bin Buddy's magic. It ensures lightning-fast performance, complete offline functionality, and an unwavering commitment to user privacy, as no images or location data ever leave the device. To round out the experience, I built a modern, declarative UI with SwiftUI and used SwiftData for a robust local persistence layer to power the app's gamification engine.
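The SwiftData persistence layer behind the gamification engine could look roughly like this. The model and property names here (`RecycleEntry`, `pointsAwarded`) are illustrative assumptions, not the app's real schema:

```swift
import SwiftData
import SwiftUI

// Hypothetical gamification record: one entry per successfully logged scan.
@Model
final class RecycleEntry {
    var itemName: String
    var pointsAwarded: Int
    var scannedAt: Date

    init(itemName: String, pointsAwarded: Int, scannedAt: Date = .now) {
        self.itemName = itemName
        self.pointsAwarded = pointsAwarded
        self.scannedAt = scannedAt
    }
}

// The container is attached once at the scene level, and views can then
// insert entries via @Environment(\.modelContext) and query totals with @Query.
@main
struct BinBuddyApp: App {
    var body: some Scene {
        WindowGroup { ContentView() }
            .modelContainer(for: RecycleEntry.self)
    }
}
```

Keeping the score data in SwiftData rather than a remote database is consistent with the app's offline-first, privacy-first design.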
The Journey: From Cloud Exploration to On-Device Mastery
This project was a profound learning experience. My initial foray into cloud services provided invaluable insights into product planning and the importance of architectural flexibility. The biggest challenge I faced was the initial attempt to train a custom computer vision model from scratch. While I wasn't able to complete it within the hackathon's timeframe, the process taught me the critical trade-offs between custom solutions and leveraging state-of-the-art, pre-trained models for rapid and effective prototyping.
Ultimately, the decision to pivot to an on-device architecture was the most rewarding. It pushed me to dive deep into Core ML and the new Foundation Models framework, resulting in a product that is not only more powerful but also more respectful of the user.
Bin Buddy is more than just an app; it's a demonstration of how cutting-edge, on-device AI can be harnessed to solve everyday problems and empower communities to build a more sustainable future.
Due to time restrictions, I was not able to show some functionality in the video, but the app can determine whether an object is recyclable, adapt its advice to your location's policies, gamify the experience, show step-by-step recycling instructions, and more.
Built With
- apple-foundation-model
- imagenet
- swift