Sourcely was inspired by our frustration as engineers: we spent hours sourcing components and comparing prices across vendors. We wanted a tool that could read any Bill of Materials (BOM), understand it, and instantly find affordable suppliers.
We built Sourcely with a Flask backend and a React frontend, combining Ollama for extracting structured line items from uploaded BOMs, Gemini for real-time seller discovery, and the Model Context Protocol (MCP) for automatically adding items to online carts. Together, these form a seamless end-to-end sourcing pipeline, from file upload to purchase-ready links.
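As a rough illustration of how the stages fit together, here is a minimal Python sketch of the pipeline shape. The function names (`extract_items`, `find_sellers`, `source_bom`) and the inline price table are purely illustrative stand-ins for the Ollama, Gemini, and MCP integrations, not the actual Sourcely code:

```python
# Hypothetical sketch of a BOM sourcing pipeline: extract -> discover -> cart.
# All names and data here are illustrative, not Sourcely's real implementation.

def extract_items(bom_text: str) -> list[dict]:
    """Parse free-form BOM text into structured line items (Ollama's role)."""
    items = []
    for line in bom_text.strip().splitlines():
        name, _, qty = line.rpartition(",")
        items.append({"part": name.strip(), "qty": int(qty)})
    return items

def find_sellers(item: dict) -> dict:
    """Attach a unit price from a seller lookup (Gemini's role).

    A static placeholder table stands in for live seller discovery.
    """
    catalog = {"resistor 10k": 0.02, "ESP32 devkit": 6.50}
    return {**item, "unit_price": catalog.get(item["part"])}

def source_bom(bom_text: str) -> list[dict]:
    """Run the end-to-end pipeline; the real system would then hand the
    priced items to an MCP client to place them in a cart."""
    return [find_sellers(item) for item in extract_items(bom_text)]

quote = source_bom("resistor 10k, 4\nESP32 devkit, 1")
```

In the real system each stage is a network call (local model, cloud API, MCP server), so the stages run asynchronously rather than as plain function calls.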
Our biggest challenges were ensuring reliable MCP integration and managing inconsistent BOM formats. Despite these issues, we built a functional, multi-model AI system in a short timeframe.
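To give a sense of the inconsistent-format problem, here is a small sketch of the kind of header normalization a BOM parser needs; the canonical schema, alias sets, and function name are assumptions for illustration, not Sourcely's actual approach:

```python
# Hypothetical sketch: mapping vendor-specific BOM headers (e.g. "MPN",
# "Quantity", "Part Number") onto one canonical schema. Aliases are examples.
import csv
import io

ALIASES = {
    "part": {"part", "part number", "mpn", "component"},
    "qty": {"qty", "quantity", "count"},
}

def normalize_bom(raw_csv: str) -> list[dict]:
    """Read a CSV BOM and rename recognized columns to canonical keys."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    canonical = {}
    for col in reader.fieldnames or []:
        for key, names in ALIASES.items():
            if col.strip().lower() in names:
                canonical[col] = key
    # Keep only columns we could map; drop everything else.
    return [{canonical[c]: row[c] for c in row if c in canonical}
            for c_row in () or reader for row in (c_row,)]

rows = normalize_bom("MPN,Quantity\nLM317,3\n")
```

Rule-based mapping like this covers common vendor exports; the LLM step handles the long tail of free-form or PDF-derived BOMs that no fixed alias table can anticipate.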
Through this project, we learned to use local and cloud-based models effectively and gained practical experience working with Ollama, Gemini, and MCP. Moving forward, we plan to expand Sourcely beyond engineering, turning it into a general-purpose shopping assistant for effortless, cost-efficient sourcing.