Inspiration
Seeing how my grandparents struggled to figure out which item goes in which bin, especially since they don’t know English or the city’s recycling rules, inspired me to create EcoBin AI. I wanted to build something that could help them (and others) classify household waste easily, using visuals and simple interactions. The goal was to make waste sorting effortless, educational, and personalized for every household—benefiting both the elderly and children.
What it does
EcoBin AI uses deep learning to classify household waste into the correct bin category based on Durham Region’s recycling rules, while allowing full customization for other cities’ guidelines. Users can upload or scan images of waste items and instantly see which bin each item belongs in, whether that’s the garbage bin, green bin (compost), container bin (dark blue), or paper bin (light blue). The system also tracks recycling activity in detail, showing how many scans have been made all-time and daily, which items and bin groups are scanned most frequently, recent scans, and whether the overall scanning rate is increasing or decreasing. Users can also customize bin images to match their home setup or adapt the system to follow their city’s unique recycling rules, making it flexible and user-friendly. EcoBin AI makes sustainable waste sorting smarter, simpler, and more accessible for everyone.
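The activity-tracking logic described above can be sketched with Pandas. The column names and sample data here are illustrative only, not the app's actual schema:

```python
import pandas as pd

# Hypothetical scan log: one row per classified item.
scans = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2024-05-01 09:00", "2024-05-01 12:30",
        "2024-05-02 10:15", "2024-05-02 14:00", "2024-05-02 18:45",
    ]),
    "bin": ["green", "paper", "green", "container", "green"],
})

total_scans = len(scans)                                   # all-time count
daily = scans.set_index("timestamp").resample("D").size()  # scans per day
top_bin = scans["bin"].value_counts().idxmax()             # most-scanned bin

# Trend: compare the latest day's volume with the previous day's.
trend = "increasing" if daily.iloc[-1] > daily.iloc[-2] else "not increasing"

print(total_scans, top_bin, trend)  # 5 green increasing
```

The same frame feeds the Plotly pie/bar/line charts directly, since each aggregate (`daily`, `value_counts`) is already chart-ready.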
How I built it
Programming Language
- Python 3.11 – Used across backend, machine learning, and frontend scripting
Frontend
- Streamlit – Interactive UI for image upload, predictions, and analytics
- Plotly – Dynamic charts (pie, bar, line) for visualizing recycling data
- Pandas – Data manipulation and analytics tables
- Matplotlib – Supplementary charting support
- Custom CSS + Google Fonts (Poppins, Montserrat, Open Sans) – For a clean, modern UI
- streamlit-cookies-manager – Secure session handling with encrypted cookies
- streamlit_autorefresh – Auto-refreshes the dashboard every 10 seconds
Backend
- FastAPI – Serves ML predictions and API endpoints
- SQLAlchemy – ORM layer for PostgreSQL
- dotenv – Loads environment variables
- requests – Handles communication between frontend and backend
- hashlib – Password hashing and user authentication
Machine Learning
- TensorFlow / Keras – Loads and runs the ResNet50 image classification model
- PIL (Pillow) – Image preprocessing (read, resize, format)
- NumPy – Array transformations for model input and output
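The Pillow + NumPy preprocessing step looks roughly like this. The sketch scales pixels to [0, 1] for simplicity; the real Keras pipeline would instead call `tensorflow.keras.applications.resnet50.preprocess_input`, which applies ImageNet mean subtraction:

```python
import numpy as np
from PIL import Image

def preprocess(img: Image.Image, size: tuple[int, int] = (224, 224)) -> np.ndarray:
    """Resize to ResNet50's input size and add a batch dimension."""
    img = img.convert("RGB").resize(size)
    arr = np.asarray(img, dtype=np.float32) / 255.0
    return arr[np.newaxis, ...]  # shape (1, 224, 224, 3)

# Stand-in image; the app would open the user's uploaded file instead.
batch = preprocess(Image.new("RGB", (640, 480), color="green"))
print(batch.shape)  # (1, 224, 224, 3)
# predictions = model.predict(batch)  # model: the loaded Keras ResNet50
```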
Database & Hosting
- PostgreSQL – Stores user accounts, submissions, predictions, and analytics data
- (Optional: Railway / Render) – Cloud hosting for backend and database
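A minimal sketch of the SQLAlchemy ORM layer, using an in-memory SQLite database as a stand-in for the hosted PostgreSQL instance. The table and column names are hypothetical, not the project's actual schema:

```python
from datetime import datetime
from sqlalchemy import create_engine, Column, Integer, String, DateTime
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Scan(Base):
    """Hypothetical scan record: one row per classified item."""
    __tablename__ = "scans"
    id = Column(Integer, primary_key=True)
    item = Column(String, nullable=False)
    bin = Column(String, nullable=False)  # e.g. "green", "paper"
    created_at = Column(DateTime, default=datetime.utcnow)

# SQLite in memory here; the app would point the engine at PostgreSQL.
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

with Session() as session:
    session.add(Scan(item="banana peel", bin="green"))
    session.commit()
    count = session.query(Scan).count()

print(count)  # 1
```

Swapping the connection string (loaded via dotenv) is all that changes between local development and the cloud-hosted database.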
Challenges I ran into
One of the main challenges was designing a clean and responsive Streamlit dashboard that updates tables in real time while displaying timestamps in a proper date-time format. I also faced issues with app crashes when my Railway credits expired, which disrupted deployment and required rebuilding the backend hosting setup. Another major challenge came after formatting my laptop, where several dependencies and environment variables broke, forcing me to re-sync my GitHub repository and debug multiple configuration errors. Additionally, optimizing the initial page load time for a smoother user experience took several iterations of refactoring and caching to achieve consistent performance.
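The caching idea behind the load-time fix can be illustrated with the standard library. In the actual Streamlit app the equivalent is `@st.cache_resource` on the model-loading function, so the heavy Keras load runs once per server rather than on every rerun; `lru_cache` shows the same effect in plain Python:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=1)
def load_model() -> str:
    """Stand-in for the slow ResNet50 load; cached after the first call."""
    time.sleep(0.1)  # simulate model-loading cost
    return "resnet50-model"

t0 = time.perf_counter()
load_model()                   # first call pays the full load cost
first = time.perf_counter() - t0

t0 = time.perf_counter()
load_model()                   # cached: returns almost instantly
second = time.perf_counter() - t0

print(first > second)  # True
```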
Accomplishments that I'm proud of
Building EcoBin AI was a turning point for me. Before this project, I struggled with severe imposter syndrome and often doubted my ability to code or create something meaningful on my own. But through this process, I gained the confidence to take on more challenging projects and trust my skills as a developer. I successfully built an end-to-end system integrating FastAPI, Streamlit, PostgreSQL, and a ResNet50 model that actually worked in real time and solved a real problem. Seeing the project come together reminded me why I started learning in the first place: to build things that help people. This experience not only improved my technical skills but also helped me overcome self-doubt and believe that I can grow, learn, and create impactful solutions.
What I learned
Through building EcoBin AI, I learned how to connect multiple technologies together into a seamless, end-to-end system. I gained hands-on experience with FastAPI, Streamlit, and PostgreSQL, and learned how to integrate them with a TensorFlow (ResNet50) model to serve real-time AI predictions. I also learned how to preprocess datasets from Kaggle, handle image classification pipelines, and manage frontend–backend communication using REST APIs.
Beyond the technical side, this project taught me patience and problem-solving, from debugging broken deployments to reconfiguring environments after system resets. Most importantly, it helped me understand the full development lifecycle of an AI project, from ideation to deployment, and gave me the confidence to build future machine learning systems independently.
What's next for EcoBin AI
Moving forward, I plan to make EcoBin AI even smarter, faster, and more accessible. My next steps include improving the Streamlit UI for a cleaner and more responsive design, adding smooth animations, and optimizing page load times for better performance. I also plan to expand the dataset to include more diverse waste items, improving the model’s accuracy across different regions.
Another major goal is to introduce multi-language support so that non-English speakers, especially elderly users, can easily interact with the app. I want to make the system fully adaptable to other cities’ recycling rules, not just Durham Region’s, so it can be used globally with minimal setup.
In the long term, I aim to integrate IoT-based smart bin devices that use the same AI model for real-time sorting, making recycling automatic, educational, and truly sustainable.
Built With
- fastapi
- hashlib
- keras
- matplotlib
- numpy
- pandas
- pil
- plotly
- postgresql
- python
- requests
- resnet50
- sqlalchemy
- streamlit
- tensorflow