We are a team of college students who love to shop. However, we have found that discovering when items go on sale is tedious. Although there are products like Honey, they only support specific merchants they have explicitly partnered with. We decided to take on this problem, aiming to build a general-purpose solution that can track any item on the Internet.
What it Does
COP allows you to paste a link to any product and track its price. A user tells us the price point at which they wish to be notified, and we send them a text message as soon as the price falls below that threshold.
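The core alert logic can be sketched as a simple loop over tracked items. The field names and the `send_text` callback here are illustrative assumptions, not the actual COP code:

```python
def check_tracked_items(items, send_text):
    """For each tracked item, send a text once its price drops below
    the user's chosen threshold. `items` is a list of dicts with
    'phone', 'current_price', 'threshold', and 'notified' keys
    (field names are hypothetical)."""
    for item in items:
        if not item["notified"] and item["current_price"] < item["threshold"]:
            # Only notify once per threshold crossing.
            send_text(item["phone"],
                      f"Price dropped to ${item['current_price']:.2f}!")
            item["notified"] = True
    return items
```

In practice this kind of check would run on a schedule, re-scraping each product page and updating `current_price` before the comparison.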
How We Built It
We created a simple backend with Flask, backed by a sqlite3 database so we could prototype and deploy changes as quickly as possible. We used ngrok during development, then deployed our code onto an EC2 instance. The entirety of the iOS client was designed in Figma and coded in Swift.
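A minimal sketch of what a Flask + sqlite3 backend like this might look like; the route name, schema, and field names are assumptions for illustration, not the actual COP code:

```python
import sqlite3
from flask import Flask, request, jsonify

app = Flask(__name__)
DB = "cop.db"  # hypothetical database filename

def init_db():
    # One table holding each tracked item: the product URL, the user's
    # phone number, and the price threshold for the alert.
    with sqlite3.connect(DB) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS items (
                   id INTEGER PRIMARY KEY,
                   url TEXT NOT NULL,
                   phone TEXT NOT NULL,
                   threshold REAL NOT NULL
               )"""
        )

@app.route("/track", methods=["POST"])
def track():
    # Register a product URL along with the notification threshold.
    data = request.get_json()
    with sqlite3.connect(DB) as conn:
        cur = conn.execute(
            "INSERT INTO items (url, phone, threshold) VALUES (?, ?, ?)",
            (data["url"], data["phone"], data["threshold"]),
        )
    return jsonify({"id": cur.lastrowid}), 201
```

The appeal of this stack for a hackathon is that there is nothing to provision: sqlite3 ships with Python, and the whole service is one file that runs identically behind ngrok or on EC2.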
Challenges We Ran Into
The main challenge we ran into was correctly determining the price of an item given only its webpage. Although this seems like a trivial problem, a page contains a significant amount of noise, and each website displays the cost of a product differently. We realized that existing platforms only work on certain websites because they add manual handling for every website they support. We aimed to solve this a different way: by asking the client to submit the current price of the item they want to track. Given this additional information, we found that we had enough signal to correctly track price changes consistently across different sites.
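One way to sketch the "use the submitted price as a signal" idea: collect every price-like token on the page, then pick the candidate closest to the price the user reported. The regex and the nearest-match selection are assumptions about one possible approach, not the actual implementation:

```python
import re

# Matches dollar amounts like "$39.99" or "$1,299.00" (illustrative pattern).
PRICE_RE = re.compile(r"\$\s*(\d{1,5}(?:,\d{3})*(?:\.\d{2})?)")

def extract_price(page_text: str, submitted_price: float):
    """Return the price token on the page closest to the user-submitted
    price, or None if no price-like token is found."""
    candidates = [
        float(m.group(1).replace(",", ""))
        for m in PRICE_RE.finditer(page_text)
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda p: abs(p - submitted_price))
```

For example, on a page containing "Was $59.99 Now $39.99 Ships for $5.00", a submitted price near $40 disambiguates the sale price from the strikethrough price and the shipping cost. On later checks, the most recently confirmed price can serve as the anchor instead.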
Accomplishments That We're Proud Of
We originally planned to work on an existing idea, but were inspired by other teams' pitches to start on a new idea from scratch. We brainstormed different approaches, then designed and developed a fully functional product in 24 hours.
What We Learned
We learned that getting enough sleep before hackathons is crucial, and drinking Red Bulls only works to a certain extent.
What's next for COP
We plan on improving our consistency across new websites as much as possible, and using our data to eventually train an ML model for this problem.