“The goal of tidying is to create space for meaningful objects, people, and experiences” -- Marie Kondo
This project was inspired by the teachings of the wonderful Marie Kondo, who envisions a future where people can live in harmony with their possessions rather than be suffocated by them.
This web application incentivizes people to clean up the world around them by giving them a streamlined “picture-to-post” user experience accomplished with the click of a button. From a single image, this application is able to determine an object name, description, and acceptable starting price from scraped data found across multiple e-commerce websites. The program then automatically creates a listing on eBay for the object.
We hope this product helps spark joy for all who use it.
How it Works
The program is a web application built with a React frontend and a Flask backend. We use the Google Cloud Vision library to perform object detection, scrape online marketplaces for price and description data with the BeautifulSoup module, and finally create an eBay listing through the eBay API.
The frontend of the website is built with React, initialized using the create-react-app tool. Since the goal of the project was a seamless user experience, we avoided a cluttered UI and stuck to just an upload button and a login button. The upload button opens a file selection dialog, and the chosen file is sent over the network to a Flask server listening on a specific port. When the server receives a file, it procures the relevant information from the image and uses it to create an eBay posting, then sends a response containing the URL of the posting back to the frontend.
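The server side of that flow can be sketched as a small Flask endpoint. The route name, port, and `create_listing` helper below are illustrative assumptions, not the project's exact code:

```python
# Minimal sketch of the Flask upload endpoint. The route name, port, and
# create_listing helper are assumptions for illustration only.
from flask import Flask, request, jsonify

app = Flask(__name__)

def create_listing(image_bytes):
    # Placeholder for the real pipeline: vision keywords -> scraping -> eBay.
    return "https://www.ebay.com/itm/example"

@app.route("/upload", methods=["POST"])
def upload():
    image = request.files["image"]        # file sent from the React frontend
    url = create_listing(image.read())    # run the picture-to-post pipeline
    return jsonify({"listing_url": url})  # frontend shows this link to the user

if __name__ == "__main__":
    app.run(port=5000)
```

The frontend only needs to POST the selected file as multipart form data and read the returned JSON.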
Object Detection
We use the Google Cloud Vision platform to perform object detection on the input image. Google Cloud lets us offload the heavy lifting, as we simply call the vision platform's web_detection function to recognize the objects in an image. We then use the detected entity labels as keywords when scraping the web for object price and description information.
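The real `web_detection` call requires Google Cloud credentials, so the sketch below only shows the post-processing step: turning a web-detection response into search keywords. The score threshold and limit are assumptions:

```python
# Sketch of converting a Cloud Vision web_detection result into search
# keywords. The min_score threshold and limit are illustrative assumptions;
# the response object itself would come from
# google.cloud.vision.ImageAnnotatorClient().web_detection(image=...).
def keywords_from_web_detection(web_detection, min_score=0.5, limit=5):
    entities = [
        e.description
        for e in web_detection.web_entities
        if e.score >= min_score and e.description
    ]
    return entities[:limit]  # top entities become the scraping search terms
```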
Web Data Scraping
These keywords are passed to another Python script which scrapes online marketplaces for description and pricing data. The two websites currently implemented are Amazon.com and the Walmart online store. A heavy focus was put on structuring the script and dictionaries such that adding new marketplaces would be as simple as possible.
For all currently implemented and future web stores accessed by this app, the search results page is the best place to scrape product information, as it shows information for multiple products at the same time. By looking at the structure for the URLs generated when each website is searched, a function was created that generates a valid search result URL based on the provided keywords. This URL is then read into the script using the Python requests library.
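A URL builder along those lines might look like the sketch below. The query-parameter templates match the public search pages at the time of writing, but the exact parameters are assumptions:

```python
# Sketch of a per-marketplace search URL builder. The URL templates are
# assumptions based on the public search pages; adding a marketplace only
# requires a new entry in SEARCH_URLS.
from urllib.parse import quote_plus

SEARCH_URLS = {
    "amazon": "https://www.amazon.com/s?k={query}",
    "walmart": "https://www.walmart.com/search?q={query}",
}

def build_search_url(site, keywords):
    query = quote_plus(" ".join(keywords))  # URL-encode the keyword phrase
    return SEARCH_URLS[site].format(query=query)
```

The resulting URL is what gets fetched with the requests library.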
The first step in creating parsing functions for both of the currently implemented websites was determining how the data on the search page could best be accessed with BeautifulSoup. For Amazon, the solution was parsing the soup as XML, which yielded an array of objects corresponding to the individual product listings on the page; the price and name of each product could then be fished out of these objects. A similar process was implemented for the Walmart online store, though its search page was easier to scrape when parsed as HTML.
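In outline, the extraction step looks like the sketch below. The tag names and classes here are purely illustrative; the real selectors for Amazon and Walmart were found by hand and change frequently:

```python
# Sketch of pulling product names and prices out of a search-results page.
# The "product", "name", and "price" class names are illustrative assumptions,
# not the actual markup used by Amazon or Walmart.
from bs4 import BeautifulSoup

def parse_products(html):
    soup = BeautifulSoup(html, "html.parser")
    products = []
    for item in soup.find_all("div", class_="product"):
        name = item.find("span", class_="name")
        price = item.find("span", class_="price")
        if name and price:
            # Strip the currency symbol before converting the price to a float.
            products.append((name.get_text(strip=True),
                             float(price.get_text(strip=True).lstrip("$"))))
    return products
```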
Each product name is then compiled into a list. That list is filtered to remove common but irrelevant words (and, of, with, etc.) and sorted by the number of occurrences of each word across all product names. The 12 most common words (in order) are selected as the item description for the listing, and the product prices are averaged to find a good starting price.
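The description and price step can be sketched as follows. The stop-word list below is an abbreviated assumption:

```python
# Sketch of the description/price step: filter stop words, count word
# frequency across all product names, keep the 12 most common words, and
# average the prices. STOP_WORDS is an abbreviated, assumed list.
from collections import Counter

STOP_WORDS = {"and", "of", "with", "the", "for", "a"}

def make_description(product_names, top_n=12):
    words = [
        w.lower()
        for name in product_names
        for w in name.split()
        if w.lower() not in STOP_WORDS
    ]
    # most_common sorts by frequency, so the result is ordered by relevance.
    return [word for word, _ in Counter(words).most_common(top_n)]

def starting_price(prices):
    return round(sum(prices) / len(prices), 2)
```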
Listing on eBay
The generated description and average price are then passed to a script that interfaces with the eBay API to create a listing, using the most relevant keywords as the product name. The first step of this process is authentication: when the user presses the login button, a session ID is requested with GetSessionID, which lets us construct the full login URL from the RuName and SessionID. Once the user logs in, a GetToken call gives us the credentials needed to perform API actions on their behalf.
With the user authenticated, we use the eBay SDK to list the item via the AddItem call in the Trading API. Key parameters of the listing, such as the item name, description, starting price, and condition, are packed into the request, and once the request has been made the item is listed for sale. Item photos are hosted through the Imgur API. Finally, the URL of the new listing is passed back to the website so the user can immediately view it on eBay, completing the loop.
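The shape of the AddItem request can be sketched as below. The category ID, listing duration, and condition code are placeholder assumptions, and the actual call additionally requires the user token from the login flow:

```python
# Sketch of the AddItem payload sent through the eBay Trading API. The
# ConditionID, ListingDuration, ListingType, and CategoryID values are
# placeholder assumptions, not the project's exact configuration.
def build_add_item_request(title, description, price, picture_url):
    return {
        "Item": {
            "Title": title,
            "Description": description,
            "StartPrice": str(price),
            "Currency": "USD",
            "Country": "US",
            "ConditionID": "3000",          # eBay's code for "Used"
            "ListingDuration": "Days_7",
            "ListingType": "Chinese",       # auction-style listing
            "PictureDetails": {"PictureURL": picture_url},
            "PrimaryCategory": {"CategoryID": "99"},  # placeholder category
        }
    }

# The request would then be sent with the ebaysdk library (needs credentials):
# from ebaysdk.trading import Connection as Trading
# api = Trading(appid=APP_ID, devid=DEV_ID, certid=CERT_ID, token=USER_TOKEN)
# response = api.execute("AddItem", build_add_item_request(...))
```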
Future Work
- Integration with eBay charity donations
- Integration with eBay “list for free” functionality
- Streamlining of packaging and sending of individual items
- Integration with more online marketplaces for pricing information
- Better predictive pricing and NLP for name/description
- Conversion to a react-native app for iOS and Android
Challenges & Development Story
Our team began this project knowing we were interested in using the eBay and Google Cloud APIs, and we were inspired by the “2020 Vision” theme to incorporate some kind of computer vision or image processing. We quickly landed on the idea of using a single image to create a smart eBay listing, and soon realized that this would fit very well into Marie Kondo’s decluttering philosophy. The user experience we focused on was that of someone going through old things in their house, determining whether or not each of them sparked joy, and then deciding to list an object on eBay. We wanted to make that last step as streamlined as possible, so that the only actions required would be logging into eBay and providing a single image of the object being sold. Our program would then create the best possible listing based solely on that image.
We kept our development as agile as possible by dividing individual responsibilities along the program’s modules. Nikhil managed the React frontend for the image upload and the Flask backend that used the Google Cloud Vision library to convert the image into a set of keywords, Alcaeus spearheaded working with the eBay API and interfacing it with React, and Lucas handled the web scraping and listing parameter generation. We allowed features to come and go during development; our one priority was to keep adding features and not get stuck in any one place. By keeping that one user experience in mind, we managed to stay on target while actively changing the scope of the program as we built it.
We set an internal deadline of 12 hours before the project submission. We had a fairly well implemented solution by that time, but a few more encompassing features we wanted to add (eBay login in particular) ended up taking some extra time.
We then carried our scheduling and division-of-work philosophies over from the program itself to the documentation, setting quick deadlines for ourselves in the last few hours of development in order to finish the video and DevPost.
I, Lucas, had a particularly interesting experience with this project. This was the first hack-a-thon, virtual or otherwise, that I have ever participated in. It’s too bad that I didn’t get to experience the full event, but holing up in my apartment with my team for 36 hours and busting out an application was uniquely rewarding. This was also my first end-to-end software project. My undergrad so far has been focused primarily on hardware, but I have always wanted to learn more about software development, and this was the perfect opportunity to do so. Not only did I learn much more about writing Python pythonically and using git correctly, I also got to see how these tools allow for a fast and modular development cycle from start to finish.
My responsibility in this program was to develop the web-scraping and data-analysis module in Python. I faced a lot of challenges figuring out how to even approach the giant walls of text generated by BeautifulSoup, but with some help from my teammates I was able to implement more features than I expected to. This part of the program, which produces the description and starting price for the eBay listing, is arguably the least crucial part: the pipeline could be drawn straight through it and all of the main functionality would still be there. However, I feel it acts as the cherry on top. A description that makes some sense and triggers the correct eBay search keywords, along with a reasonably accurate starting price, gives the program an air of polish it would not have otherwise. I was also able to contribute to ongoing discussions of development decisions, and I feel I had a positive effect on both the end result of this project and the speed at which it was completed.
I am proud of my first hack-a-thon performance, and I am especially proud of my team for what they accomplished and grateful to them for helping me through my many, many blocks.
Check out a screencap of our demo! https://www.youtube.com/watch?v=Jkee14ydp9Q