Inspiration

We noticed that although plenty of information is readily available on most vehicles on the market, there is a blind spot when it comes to salvaged vehicles, so we decided to help fill that void.

What it does

This project extracts and compiles data into a structured format that is easy to access and read, so that anyone can understand what the data describes.

How we built it

Using Python, we created a web scraper that collects the price, damage type, odometer reading, and other information on salvaged cars from Salvage Bid, then writes the results to a .csv file.
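The core of the approach can be sketched with the standard library alone. The markup below (a div with class "listing" containing price, damage, and odometer spans) is an invented illustration, not Salvage Bid's actual HTML, and the names ListingParser and scrape_to_csv are our own; fetching the live page (e.g. with urllib or requests) is omitted.

```python
import csv
from html.parser import HTMLParser

# Hypothetical markup: each car is a flat <div class="listing"> holding
# <span class="price">, <span class="damage">, and <span class="odometer">.
class ListingParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows = []        # one dict per car listing
        self._row = None      # dict being filled while inside a listing div
        self._field = None    # name of the span we are currently reading

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if tag == "div" and cls == "listing":
            self._row = {}
        elif self._row is not None and tag == "span" and cls in ("price", "damage", "odometer"):
            self._field = cls

    def handle_data(self, data):
        if self._field is not None:
            self._row[self._field] = data.strip()

    def handle_endtag(self, tag):
        if tag == "span":
            self._field = None
        elif tag == "div" and self._row is not None:
            self.rows.append(self._row)
            self._row = None

def scrape_to_csv(html, path):
    """Parse listing HTML and write the extracted rows to a .csv file."""
    parser = ListingParser()
    parser.feed(html)
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["price", "damage", "odometer"])
        writer.writeheader()
        writer.writerows(parser.rows)
    return parser.rows
```

In practice each listing page would be fetched, fed through the parser, and appended to the same .csv file.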

Challenges we ran into

The major issue we ran into was the amount of computing resources required to collect and process large data sets with a web scraping tool.

Accomplishments that we're proud of

We are proud of the successful implementation of a web scraper for collecting data from Salvage Bid, since we were all new to the realm of web scraping.

What we learned

This project gave us more insight into web scraping and into compiling and analyzing useful data in a readable format that can then be applied in various ways. We also discovered that computing resources are a major factor in how much data a web scraper can collect.

What's next for The World of Wreckage: Analyzing Salvaged/Damaged Vehicles

The next step is to move the web scraper from a physical host system to a cloud-based system, raising the ceiling on how much data it can collect. We would also like to build a machine learning model that uses factors from the scraped data, such as odometer reading and damage type, to predict a car's sale price, helping people who flip cars estimate the potential profit of a flip.
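As a toy sketch of that prediction step, ordinary least squares on a single feature (odometer reading vs. sale price) can be written in a few lines of pure Python. The function names and the sample numbers in the usage note are illustrative assumptions, not real Salvage Bid data or our final model.

```python
def fit_line(xs, ys):
    """Fit y = slope * x + intercept by minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of x and y over variance of x gives the OLS slope.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def predict(model, x):
    """Predict a sale price from an odometer reading."""
    slope, intercept = model
    return slope * x + intercept
```

For example, fitting made-up points (50,000 mi, $8,000), (100,000 mi, $6,000), (150,000 mi, $4,000) yields a line predicting $7,000 at 75,000 miles. A real model would use more features (damage type, make, year) and a library such as scikit-learn.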

Built With

python
