People constantly need to shop for groceries, especially during a pandemic when they are staying at home. However, because of the spiking demand for food, it can be hard to get into a supermarket, only to find the wanted items missing from empty shelves. We wanted to create a mobile application that helps people locate which nearby retailers still have what they need in stock, so that they don't risk their health on a trip to the supermarket only to come away empty-handed.
What it does
The mobile application is essentially an item finder and locator for people who need to shop for essential items. It helps users plan their supermarket visits in a way that minimizes their exposure to the outside world while maximizing shopping efficiency.
- Users can search for the items they need on the search tab.
- Users can check each item's price and availability, along with the distance to the stores that carry it, based on their location.
- Users can save available items to a cart for future reference.
- Users can browse options from multiple stores such as CVS, Target, Walmart, etc.
- The app supports fuzzy search on user input.
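The distance ranking above can be sketched with a haversine (great-circle) calculation. This is only an illustration of the idea, not our actual code, and the store coordinates below are made up:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth radius ~6371 km

# Hypothetical store coordinates, sorted by distance from the user.
stores = [
    {"name": "Target", "lat": 40.7580, "lon": -73.9855},
    {"name": "CVS", "lat": 40.7306, "lon": -73.9866},
]
user = (40.7484, -73.9857)
nearest = sorted(
    stores,
    key=lambda s: haversine_km(user[0], user[1], s["lat"], s["lon"]),
)
```

The app can then show the closest stores carrying the searched item first.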
How we built it
We built a web scraper with various Python packages, such as BeautifulSoup, Selenium, and json. The scraper collects data from the websites of several major mainstream stores and writes it into our database, which is built on Cloud Firestore. These functions serve as the backend of the application and are hosted on a virtual machine on Google Cloud Platform. The code runs once per hour to keep the item data up to date. We also made a RESTful API with Flask that allows the frontend to communicate with our backend.
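The extraction step of the scraper works roughly like this. The real scraper uses BeautifulSoup and Selenium against live store pages; this self-contained sketch uses only the standard library's `html.parser` and invented sample markup to show the idea:

```python
from html.parser import HTMLParser

# Hypothetical product markup; real store pages differ and are loaded with Selenium.
SAMPLE = """
<div class="item"><span class="name">Whole Milk</span><span class="price">$3.49</span></div>
<div class="item"><span class="name">Eggs</span><span class="price">$2.99</span></div>
"""

class ItemParser(HTMLParser):
    """Collects (name, price) pairs from <span class="name"> / <span class="price">."""

    def __init__(self):
        super().__init__()
        self.items = []
        self._field = None
        self._current = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            if "name" in self._current and "price" in self._current:
                self.items.append((self._current["name"], self._current["price"]))
                self._current = {}
            self._field = None

parser = ItemParser()
parser.feed(SAMPLE)
# parser.items now holds the scraped rows, ready to be written to Cloud Firestore.
```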
For the frontend, we built the application with React Native, since people on different mobile operating systems have the same need. The frontend consists of a tab-navigation-based user interface with a shopping list and a search page, which is simple and easy to use. The app retrieves items from Firebase based on the user's geolocation and desired items, and displays the item information to the user.
Challenges we ran into
Our first challenge was that we had never run Selenium on a hosted platform before, so we had to find the right configuration to load the websites correctly before we could scrape them. The hosting setup was built almost from scratch, and it took us a lot of time to find the best configuration.
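A typical headless-Selenium configuration for a server VM looks like the following fragment. The exact flags are an illustrative sketch of common options, not a verbatim copy of our setup, and the URL is a placeholder:

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

opts = Options()
opts.add_argument("--headless")               # no display available on the VM
opts.add_argument("--no-sandbox")             # commonly needed in containerized setups
opts.add_argument("--disable-dev-shm-usage")  # avoid /dev/shm exhaustion
opts.add_argument("--window-size=1920,1080")  # some pages hide items on small viewports

driver = webdriver.Chrome(options=opts)
driver.get("https://www.example.com")  # placeholder URL
html = driver.page_source              # hand this off to BeautifulSoup
driver.quit()
```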
Besides that, we had to keep item information up to date, because users always need the latest availability data. To deal with this, we decided to run the scraping code once per hour using "apscheduler" in Python, so that our database always holds fresh item information. This turned out to be an effective strategy.
We also had to figure out a way to find the items users need when their input is vague, say "whole milk" versus "milk". To deal with this, we integrated a fuzzy search algorithm into our frontend so that it always looks for the right items in our database. Even if a user misspells a word, we can still retrieve the correctly spelled item from the database.
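The idea can be sketched in Python with the standard library's `difflib` (the actual app does this in the React Native frontend, so the catalog and function below are illustrative assumptions): substring matches handle the "milk" vs. "whole milk" case, and close matches handle misspellings.

```python
from difflib import get_close_matches

# Hypothetical item catalog; the real one lives in Cloud Firestore.
CATALOG = ["milk", "whole milk", "eggs", "bread", "toilet paper"]

def fuzzy_search(query, catalog=CATALOG):
    """Substring matches first ('milk' -> 'whole milk'),
    then difflib close matches for typos ('mlik' -> 'milk')."""
    q = query.lower().strip()
    hits = [item for item in catalog if q in item.lower()]
    for match in get_close_matches(q, catalog, n=3, cutoff=0.6):
        if match not in hits:
            hits.append(match)
    return hits
```

For example, `fuzzy_search("mlik")` still finds "milk" despite the typo.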
Accomplishments that we're proud of
We think the real-time backend scraping is an awesome feature of our app. It is achieved with the Selenium package in the backend, scheduled to run every hour to scrape the latest stock data from mainstream retailers like Walmart, Target, and CVS.
What we learned
Definitely teamwork! We also learned how powerful web scraping is and, most importantly, how much technology can bring to our lives!