As a student, I rarely have the resources or time needed to interact with the booming real estate industry. The CBRE challenge of approximating the value of an investment seemed like a good opportunity to learn more about the industry, apply my ML/AI skills to a large dataset, and help other students dip into learning.
What it does
Site In-Site is a single-page application that lets users with little to no experience and experts alike quickly view real estate that our ML/AI model has flagged as a good jumping-off point for making an offer, along with an approximation of what the land is actually worth. Users select a city, view each house's location via the Google Maps API, and then see a summary of the property. Users with their own data can fill in a form that runs our in-house model to instantly approximate the value of their property.
How we built it
We trained a model in TensorFlow on housing and state-income data that we cleaned ourselves. The model is hosted on a Python Flask backend that exposes an API accessible from anywhere. The application uses a Svelte front end that consumes the Flask API to estimate how much a property is worth before showing the result to the user.
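To make the architecture concrete, here is a minimal sketch of a Flask endpoint like the one our front end consumes. The route name, feature names, and the stand-in linear formula are illustrative assumptions, not our exact code; the real backend calls the trained TensorFlow model instead.

```python
# Minimal sketch of a prediction endpoint served from Flask.
# The route, feature names, and predict_price() are assumptions
# for illustration -- the real app runs the trained TensorFlow model.
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict_price(features):
    # Stand-in for model.predict(); the coefficients are made up.
    return 50_000 + 150 * features["sqft"] + 10_000 * features["bedrooms"]

@app.route("/predict", methods=["POST"])
def predict():
    # The Svelte front end POSTs the form data as JSON.
    data = request.get_json()
    return jsonify({"estimated_value": predict_price(data)})

if __name__ == "__main__":
    app.run(port=5000)
```

The front end just POSTs the user's form fields as JSON and renders the `estimated_value` it gets back, which keeps the model fully decoupled from the UI.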
Challenges we ran into
Our dataset was small with respect to the number of houses across Dallas, SoCal, and Philadelphia. We had to toss some observations when we saw they were massive outliers for the set. Additionally, the model has trouble predicting very expensive homes, likely because there were so few expensive homes available to train on.
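An interquartile-range (IQR) filter is one common way to drop the kind of extreme price observations we removed; this sketch uses made-up prices and the standard 1.5× multiplier as assumptions, not our exact cleaning script.

```python
# Sketch of an IQR-based outlier filter: keep only values inside
# [Q1 - k*IQR, Q3 + k*IQR]. The prices below are toy examples.
def iqr_filter(values, k=1.5):
    ordered = sorted(values)
    n = len(ordered)
    q1 = ordered[n // 4]          # rough first-quartile value
    q3 = ordered[(3 * n) // 4]    # rough third-quartile value
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if lo <= v <= hi]

prices = [180_000, 210_000, 250_000, 275_000, 300_000, 9_500_000]
print(iqr_filter(prices))  # the 9.5M mansion is dropped
```

Dropping such points keeps one mansion from dragging the loss for every normal listing, at the cost of the high-end blind spot we noted above.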
Accomplishments that we're proud of
The front end of our application looks beautiful, and it's the first time our team feels competitive on front end alone. Additionally, for most of our team this was our first time training and deploying a model in such a short period of time. For the time we had, the model's loss is within an acceptable range, and it is actually driving our website.
What we learned
When it comes to predicting a value from a number of features, you sometimes need to reduce the dimensionality to get good results; it's not always wise to one-hot encode a feature with many unique values. We also learned how to use R to quickly and thoroughly clean large datasets we would previously have struggled to comprehend.
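To illustrate the one-hot blow-up: a high-cardinality feature like ZIP code adds one column per unique value, while target (mean) encoding collapses it to a single numeric column. The ZIP codes and prices below are made-up toy data, and target encoding is just one alternative among several.

```python
# Toy comparison: one-hot encoding a high-cardinality feature vs.
# target encoding. All ZIP codes and prices are invented examples.
from collections import defaultdict

rows = [
    ("75201", 300_000), ("75201", 320_000),
    ("90001", 550_000), ("19104", 260_000),
    ("19104", 280_000), ("90001", 600_000),
]

# One-hot: one new column per unique ZIP code.
one_hot_cols = len({z for z, _ in rows})

# Target encoding: replace each ZIP with the mean price seen for it,
# so the whole feature stays a single column.
sums = defaultdict(lambda: [0, 0])
for z, price in rows:
    sums[z][0] += price
    sums[z][1] += 1
target_enc = {z: total / count for z, (total, count) in sums.items()}

print(one_hot_cols)          # 3 columns just for ZIP
print(target_enc["75201"])   # 310000.0
```

With thousands of real ZIP codes, the one-hot version would add thousands of mostly-zero columns, which is exactly the dimensionality problem we ran into.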
What's next for Site In-site
We want to move to a model with more available predictors so we can survey a city more comprehensively. We also want to raise the number of housing observations, including how each property changes over time, so our model can more accurately predict trends.