Team #: 37 Team members:
- Eduard Var
- Kirynn#1427
- molly#8000
- johnson7267#4128
Inspiration
We were inspired to make this project after our own struggles trying to find student housing at Queen's University. Landlords post rentals on so many different websites that finding the perfect house is almost impossible.
What it does
This application scrapes rental postings from the websites Kingston landlords most commonly post their houses on. The postings are then conveniently displayed on one page, making our application the only site students need to find their future home. It also tailors the units shown to each user with a machine learning model that predicts the price threshold the user is willing to spend and estimates what a rental unit is actually worth.
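The aggregation step above can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the `Listing` fields, the dedup key, and the sample data are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical normalized listing record; field names are illustrative,
# not the project's actual schema.
@dataclass(frozen=True)
class Listing:
    source: str
    address: str
    price: float

def aggregate(feeds):
    """Merge listings from several source feeds into one deduplicated,
    price-sorted list, the way a single results page would show them."""
    seen, merged = set(), []
    for feed in feeds:
        for listing in feed:
            key = (listing.address.lower(), listing.price)
            if key not in seen:  # same unit posted on two sites appears once
                seen.add(key)
                merged.append(listing)
    return sorted(merged, key=lambda l: l.price)

# Example: two sources, one duplicate unit.
kijiji = [Listing("kijiji", "12 Division St", 1450.0)]
keystone = [Listing("keystone", "12 Division St", 1450.0),
            Listing("keystone", "88 Princess St", 1200.0)]
combined = aggregate([kijiji, keystone])
```

Deduplicating on a normalized (address, price) pair is just one plausible heuristic; fuzzier matching would be needed for real cross-site duplicates.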
How we built it
We built the project with JavaScript and Python through Visual Studio Code. The machine learning was done through Jupyter Notebook with sklearn and pandas to manage data frames.
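As a rough illustration of the sklearn/pandas piece, here is a minimal price-prediction sketch on synthetic data. The real features and model are not described in this writeup, so the column names, model choice, and numbers below are assumptions for the example only.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic training data; real scraped rental features would replace this.
# Rents here follow an exact linear rule (400 per bedroom + 600 base).
df = pd.DataFrame({
    "bedrooms": [1, 2, 3, 4, 5],
    "rent":     [1000, 1400, 1800, 2200, 2600],
})

# Fit a simple regression of rent on bedroom count.
model = LinearRegression()
model.fit(df[["bedrooms"]], df["rent"])

# Estimate what a 3-bedroom unit "should" rent for.
predicted = float(model.predict(pd.DataFrame({"bedrooms": [3]}))[0])
```

In practice more features (location, unit type, utilities) and a stronger model would be used; the sketch only shows the fit/predict workflow.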
Challenges we ran into
The main challenge we faced was the time constraint: we were trying to accomplish too much in too little time. We were also unfamiliar with some of the languages, which hindered our ability to finish the project in the given time.
Accomplishments that we're proud of
- Populated and prepared a complete backend SQLite database
- Got the front-end web application up and running
- Wrote backend code to maintain communication between the database and backend functions
- Built a machine learning price predictor with a performance of 65%
- Fully operational web scrapers for Kijiji, Keystone, Mackinnon and Panadew
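A minimal sketch of the SQLite layer the backend communicates with. The table and column names are illustrative assumptions, not the project's actual schema, and the in-memory database stands in for the real file-backed one.

```python
import sqlite3

# In-memory database for illustration; the project would use a file on disk.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE listings (
        id      INTEGER PRIMARY KEY,
        source  TEXT NOT NULL,
        address TEXT NOT NULL,
        price   REAL NOT NULL
    )
""")

# The scrapers would insert normalized rows like these.
rows = [("kijiji", "12 Division St", 1450.0),
        ("panadew", "88 Princess St", 1200.0)]
conn.executemany(
    "INSERT INTO listings (source, address, price) VALUES (?, ?, ?)", rows)
conn.commit()

# The front end queries listings under a budget, cheapest first.
under_budget = conn.execute(
    "SELECT address, price FROM listings WHERE price <= ? ORDER BY price",
    (1300.0,)).fetchall()
```

Parameterized queries (the `?` placeholders) keep the scraped text from being interpreted as SQL.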
What we learned
We learned to keep our ambitions in check and to stay in consistent contact with everyone on the team throughout the project, so that every member has a task at all times.
What's next for House Finder
- More data: we made a good start by scraping all of Kijiji and Keystone for Kingston housing rentals, but we were limited by the tools available to us and the time we had to pre-process and feature-engineer parts of the data. A good next step would be to gather more data, either through better tooling or from a cleaner, more organized source.
- Deeper front-end integration: displaying more of the information stored on the back end to the user, which will come with more time to refine it.