Inspiration

When we first saw the challenge provided by Fannie Mae, we felt it was a problem whose solution could improve people's lives. The 2008 housing market crash triggered a financial crisis that better predictive models of housing market behavior might have anticipated. Although the outcome of our project is not perfect, we feel it is a definite step toward predicting market behavior to prevent future crashes and forecast future trends.

What it does

Our website provides a friendly user interface that lets visitors select major cities within the United States. It uses a model trained on past housing-sale trends to predict the future volume of housing sales within those cities. The website also displays data including walkability, crime rates, and average home prices in those cities to inform users before they make any important financial decisions.

How I built it

We started by dividing the project into separate tasks that would later interact. The three major components were the website, written in HTML; the interfacing layer, which used JavaScript and Flask in Python; and the back end, which used machine learning in Python. Kevin Gu and Kevin Fu worked together on extracting data from Zillow and the Fannie Mae API and on web scraping to fill in any missing data; Daniel created a friendly, smooth user interface; and David connected the back end to the front end and ensured overall functionality.
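The bridge between the front end and the model can be sketched as a small Flask route. This is a minimal illustration, not our actual code: `predict_sales` is a hypothetical stand-in for the trained model, and the baseline numbers are made up.

```python
# Minimal sketch of the front-end/back-end bridge using Flask.
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict_sales(city: str) -> dict:
    """Hypothetical placeholder for model inference: a lookup of
    made-up baseline sales volumes stands in for the real model."""
    baseline = {"new york": 4200, "chicago": 2100}
    volume = baseline.get(city.lower(), 1000)
    return {"city": city, "predicted_sales": volume}

@app.route("/predict")
def predict():
    # The JavaScript front end calls e.g. /predict?city=Chicago
    # and renders the returned JSON with Chart.js.
    city = request.args.get("city", "")
    return jsonify(predict_sales(city))
```

The front-end JavaScript only ever sees JSON, which keeps the model code and the page markup fully decoupled.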

Challenges I ran into

We ran into problems in every aspect of the website. For the user interface, we had to implement Chart.js and a map from Google's API, both of which proved challenging, but through testing we were able to combine the two elements on the same page. On the training side, formatting the housing data was difficult, and the majority of the back-end time was actually spent retrieving and formatting data. As for connecting the website to the back end, none of us had experience with Flask, so we had to learn it from scratch to allow the two to interact.
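Much of the formatting work was reshaping wide, one-column-per-month tables into the long format a model can train on. A minimal pandas sketch, with invented sample values standing in for the real Zillow-style data:

```python
import pandas as pd

# Hypothetical slice of Zillow-style data: one row per metro,
# one column per month (the wide layout we had to reshape).
wide = pd.DataFrame({
    "RegionName": ["New York, NY", "Chicago, IL"],
    "2019-01": [9800, 5400],
    "2019-02": [9100, 5100],
})

# Melt into one (region, month, sales) row per observation --
# the long format that training code typically expects.
long = wide.melt(id_vars="RegionName", var_name="month", value_name="sales")
long["month"] = pd.to_datetime(long["month"])
long = long.sort_values(["RegionName", "month"]).reset_index(drop=True)
```

Missing months scraped from other sources can then be appended as ordinary rows instead of new columns.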

Accomplishments that I'm proud of

We managed to train a deep learning model in an extremely short amount of time on data from online APIs we were unfamiliar with. We also learned how to use Flask over the course of the project, a useful tool for the future. The website turned out very well, and we were able to make all the moving parts and visualizations work together.

What I learned

We learned how to use Flask to build a Python back end for an HTML website instead of writing the entire server side in JavaScript. We also learned a lot about using default HTML classes to format nice data visualizations, as well as about data augmentation for the model.
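One simple augmentation idea for small sales-history datasets can be sketched as follows. This is an illustrative scheme (multiplicative Gaussian noise), not necessarily the one our model used, and `jitter` is a hypothetical helper:

```python
import random

def jitter(series, noise_frac=0.02, copies=3, seed=0):
    """Make noisy copies of a sales series by scaling each point
    with small Gaussian noise, enlarging the training set."""
    rng = random.Random(seed)
    return [
        [x * (1 + rng.gauss(0, noise_frac)) for x in series]
        for _ in range(copies)
    ]

# Three perturbed variants of one (made-up) monthly sales series.
more_data = jitter([100, 120, 130])
```

Each copy preserves the overall trend while varying the exact values, which helps a model avoid memorizing a short history.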

What's next for Fannie 500

In the future, we hope first to expand our supplementary data, such as crime rates and walkability, to any ZIP code rather than only specific Metropolitan Statistical Areas (MSAs), and to add other data such as traffic and education statistics. We also hope to build a more accurate model using more data from sources beyond Fannie Mae and Zillow.
