One of our team members has a friend who was planning to attend a university with high tuition and fees. However, his father was affected by the wave of layoffs that followed COVID-19, plunging the family into financial stress. Even worse, when he found other employment, he was laid off a second time because he worked in a region hit hard by layoffs.

What it does

Layon is a web application that displays an interactive map of layoff statistics for each state in the United States. It also offers a city-by-city breakdown of layoff statistics so users can drill down into more specific regions. A slider lets users visualize how layoff statistics change over time across different regions. Finally, we used machine learning to predict layoffs for each state five years beyond the current data.

How we built it

We used the WARN database alongside Python and R scripts to filter and clean layoff-notice data. We passed the cleaned data into a time-series forecasting model built with scikit-learn and deployed on Google Cloud to generate predictions. Finally, we fed the results as JSON to a ReactJS web app that rendered them as an interactive map and data displays.
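As a rough sketch of the middle of that pipeline, the Python side can aggregate cleaned layoff notices per state and serialize the result as JSON for the React front end. The record fields below are illustrative only; real WARN exports vary by state and our actual schema was more detailed.

```python
import json
from collections import defaultdict

# Hypothetical WARN-style layoff notices (illustrative fields, not real data).
notices = [
    {"state": "CA", "year": 2020, "affected": 1200},
    {"state": "CA", "year": 2021, "affected": 800},
    {"state": "TX", "year": 2020, "affected": 500},
]

# Aggregate workers affected per state so the front end can color map regions.
totals = defaultdict(int)
for n in notices:
    totals[n["state"]] += n["affected"]

# Serialize to JSON for the ReactJS map to consume.
payload = json.dumps({"layoffs_by_state": dict(totals)}, indent=2)
print(payload)
```

The same aggregation can be keyed by (state, year) to drive the time slider.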

Challenges we ran into

The first challenge we ran into was data formatting. We initially tried to use Java to implement a hash structure for generic data formatting, but switched to Python because its dictionaries convert to JSON much more easily.

The second challenge was framework strategy. We initially started in Splunk because of its ease of implementation and because we believed it suited the needs of the project. As we continued working, however, we recognized that ReactJS provided a more flexible framework that let us implement more specific UI features, such as the US map.

The third challenge was the machine learning itself. At first we struggled to get Prophet working, even though it seemed like the most common approach to time-series forecasting. We ultimately implemented the model with scikit-learn instead and evaluated it using MAPE. We also ran into issues processing the frame data, but incorporating Google Cloud and the Vertex AI API gave us the processing power we needed.
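One minimal way to do time-series forecasting with scikit-learn, since it has no dedicated forecaster, is to recast the series as a supervised problem using lagged features and score it with MAPE. This is only a sketch of that general approach with made-up numbers, not our actual model or data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error

# Illustrative yearly layoff counts for one state (not real WARN figures).
series = np.array([900, 950, 1100, 1050, 1200, 1300, 1250, 1400, 1500, 1450],
                  dtype=float)

# Recast as supervised learning: predict next year's value from the
# previous two years (simple lag features).
lags = 2
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]

# Hold out the last three points to evaluate the forecast.
split = len(X) - 3
model = LinearRegression().fit(X[:split], y[:split])
pred = model.predict(X[split:])

mape = mean_absolute_percentage_error(y[split:], pred)
print(f"MAPE on held-out years: {mape:.2%}")
```

To forecast further ahead, the predicted value is appended to the series and the lag window slides forward one step at a time.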

Accomplishments that we're proud of

Interactive map: We are proud that we were able to design an interactive map with multiple forms of data display throughout the web app. We also became familiar with, and gained practical experience in, SVG maps.

Developing the prediction model: Despite the complexities and intricacies of time-series forecasting, we overcame losing Prophet as an option and implemented the models with scikit-learn instead. We also worked around our processing-power limits by seamlessly integrating Google Cloud.

Data cleaning: Rather than relying on Google Sheets or other editing software, we wrote Python and R scripts that parsed the 44,800 data points in our dataset and filtered out the clutter.
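The kind of filtering those scripts did can be sketched in a few lines of pandas. The rows and columns here are hypothetical stand-ins for the clutter we actually dealt with (missing counts, duplicate notices, inconsistent state labels):

```python
import pandas as pd

# Hypothetical raw WARN-style rows with typical clutter.
raw = pd.DataFrame({
    "state": ["CA", "ca", "TX", "TX", None],
    "employees": [100, 250, None, 75, 40],
})

cleaned = (
    raw.dropna(subset=["state", "employees"])            # drop incomplete notices
       .assign(state=lambda d: d["state"].str.upper())   # normalize state labels
       .drop_duplicates()                                # remove repeated notices
)
print(cleaned)
```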

Strategy: We mapped out exactly what we had in mind for the project and all of its specifics, including edge cases, before we even sat down to write code.

Learning and using new technologies: Members explored new languages, such as Python, and new libraries, such as Pandas. They also picked up new tools such as GitHub Desktop.

Divide and conquer: We made sure every member contributed to a portion of the project and gained practical experience at every step of the way.

What we learned

We learned how collaboration and integration work in software projects, since we had little prior experience with either.

In terms of technology, we learned how time-series forecasting models work and the many features they offer. We also learned to work with SVG graphics and interactive maps in ReactJS and to use UI libraries such as Material UI. Finally, we reinforced our in-class knowledge with practical experience: implementing version control with Git, cleaning data with R, and much more.

What's next for Layon

  1. Implement employment-rate statistics to spot job creation and enable exploration of high-supply jobs
  2. Optimize the model by adding more training data, integrating live data, and expanding processing power through Google Cloud
  3. Expand data collection internationally and across different industries to help alleviate worse economic circumstances at a larger scale
