Inspiration

As millions of people throughout the nation suffer from the sweeping effects of natural disasters, our team reflected on how we might assist those who lose everything. These storms are responsible for billions of dollars in damage and thousands of lives lost.

Our team was determined to mitigate the losses storms generate by predicting their costs and impacts. To have a civic impact, we wanted to help communities and governments adapt and respond more effectively to weather disasters. We wanted to become civically engaged in our government and community, and we realized that providing software to solve massive problems was the best way to do so, especially during quarantine. We hope our solution will bring together the national community of citizens affected by disasters and encourage government-led, crowd-sourced planning to combat these detrimental effects.

Thus, we took an approach unlike the common hackathon project. Instead of creating an application for general use, we developed one specifically for state and city governments. We plan to offer our software as part of a nationwide government plan to promote smarter disaster response and efficient planning. Rather than taking a grassroots approach to helping the community, we believe the government's platform is the best channel for distributing our solution. Since governments often rely on outside developers to build applications, our website fills a normally unoccupied niche, and we believe projects like this should be encouraged in the hackathon community.

Thus, we developed Tempest, an application that uses machine learning to help governments prepare for storms and disasters by providing visualizations and predicting key statistics.

What it does

Tempest is a progressive web application that lets users and governments predict the outcomes of natural disasters. Using web-scraped data, we predict where storms will cause the most damage and create interactive visualizations to aid the process.

We first developed a tornado model that predicts both the probability that a tornado does severe damage and the monetary value of that damage. We trained the model on NOAA data containing tornado attributes such as wind speed, duration, and azimuth. The model outputs a magnitude probability from 0 to 1, with 0 meaning no impact and 1 meaning devastating impact, and it also predicts the monetary damage from each storm in dollars. We trained all our models quickly using AI Fabric from UiPath. Our map includes recorded tornadoes from September 2019 to July 2020, and we also predicted tornadoes for the upcoming month of August, for which data exists. We exported the map data by month from our Python model and fed it into a map visualization we found through its documentation. This allows governments to prepare adequately for disasters, speed up recovery, and minimize costs.
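The model's two outputs (a severe-damage probability in [0, 1] and a dollar-damage estimate) can be sketched as below. The real model was trained with UiPath's AI Fabric on NOAA data; this NumPy-only sketch fits a tiny logistic regression and a linear regression on synthetic data purely to illustrate the input features (wind speed, duration, azimuth) and the two-headed output shape. All feature ranges and labels here are made up.

```python
# Hedged sketch, NOT our production model: two prediction heads over
# synthetic tornado features (wind speed, duration, azimuth).
import numpy as np

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.uniform(20, 120, n),   # wind speed (synthetic)
    rng.uniform(1, 60, n),     # duration in minutes (synthetic)
    rng.uniform(0, 360, n),    # azimuth in degrees (synthetic)
])
severe = (X[:, 0] + X[:, 1] > 110).astype(float)   # synthetic 0/1 label
damage = 1e4 * (X[:, 0] + X[:, 1])                 # synthetic dollar target

# Standardize features, then fit logistic regression by gradient descent.
Xs = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
    w -= 0.5 * (Xs.T @ (p - severe) / n)
    b -= 0.5 * (p - severe).mean()

# Closed-form least squares for the dollar-damage head.
A = np.column_stack([Xs, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, damage, rcond=None)

# Score one hypothetical storm: both heads read the same features.
sample = (np.array([[110.0, 50.0, 180.0]]) - X.mean(0)) / X.std(0)
p_severe = float((1.0 / (1.0 + np.exp(-(sample @ w + b))))[0])  # in [0, 1]
cost = float(np.append(sample, 1.0) @ coef)                     # in dollars
```

The magnitude head saturates to [0, 1] through the sigmoid, matching the 0-to-1 impact scale described above; the dollar head is a plain regression.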

Even more dangerous than tornadoes are hurricanes. We embedded a map of upcoming hurricanes from LivingAtlas.org, then retrained our tornado model on hurricane data. The model takes inputs such as wind speed and temperature and outputs the magnitude of the hurricane on the Saffir-Simpson Hurricane Wind Scale, which classifies hurricanes from Category 1 to 5. We display the three upcoming hurricanes in the US, along with how much monetary damage each will cause and a satellite image of the storm, allowing residents and local governments to allocate proper funds and shelter themselves as much as possible.
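The label space the retrained model targets is the standard Saffir-Simpson scale, which is itself a simple lookup on sustained wind speed (the learned model folds in additional inputs such as temperature, but the categories are fixed). For reference, the published thresholds look like this:

```python
# The Saffir-Simpson Hurricane Wind Scale: category 1-5 by sustained
# wind speed in mph (NHC thresholds). Category 0 here means winds below
# hurricane strength; that convention is ours, not part of the scale.
def saffir_simpson_category(wind_mph: float) -> int:
    if wind_mph < 74:
        return 0   # tropical storm or weaker
    if wind_mph <= 95:
        return 1
    if wind_mph <= 110:
        return 2
    if wind_mph <= 129:
        return 3
    if wind_mph <= 156:
        return 4
    return 5       # catastrophic: 157 mph and up
```

For example, a storm with 120 mph sustained winds is a major Category 3 hurricane.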

Hurricanes often produce floods that can ravage and destroy communities. Understanding how a flood will cause damage lets communities rebuild faster, reducing costs and time spent without a home. We therefore developed a style-transfer model that helps city planners visualize the damage a flood will do to a location. A planner uploads an image of the location before the flood and an image during the flood, and our algorithm predicts the damage and outputs a picture of what it will look like. The model finds commonalities between the images and keeps the outstanding features of the flood image in order to display the damage properly. We deployed a portion of this model on our website for testing, as the full model was too large to deploy. With this information at hand, city planners can respond swiftly to floods and prepare for the aftermath of disasters.
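The compositing intuition described above (keep what the images share, preserve the flood image's outstanding features) can be illustrated without the neural network. The sketch below is NOT our PyTorch style-transfer model; it is a toy NumPy mask on synthetic grayscale arrays that only demonstrates the before/during blending idea.

```python
# Toy sketch of the blending intuition, not the style-transfer model:
# where the flood image departs strongly from the "before" image, keep
# the flood pixel (the outstanding feature); elsewhere keep the original.
import numpy as np

def flood_overlay(before: np.ndarray, flood: np.ndarray,
                  thresh: float = 0.2) -> np.ndarray:
    """Blend two [0, 1] grayscale images of the same location."""
    mask = np.abs(flood - before) > thresh
    return np.where(mask, flood, before)

before = np.full((8, 8), 0.5)   # synthetic "before" image
flood = before.copy()
flood[2:5, 2:5] = 0.9           # a synthetic flooded patch
result = flood_overlay(before, flood)
```

The real model learns which features count as "outstanding" instead of using a fixed threshold, which is why it needs a GPU to run at practical speed.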

How we built it

After numerous hours of wireframing, conceptualizing key features, and outlining tasks, we divided the work amongst ourselves: Ishaan developed the UI/UX; Adithya connected the Google Cloud backend and implemented the interactive map features; Ayaan developed our hurricane and flood models; and Viraaj developed the tornado model and implemented and retrained the hurricane model.

We coded the entire app in six languages and frameworks: HTML, CSS, JavaScript, R, Python (Python 3/IPython), and Flask. We used UiPath to train our algorithms, and Google Cloud and PythonAnywhere for our backend. We developed our interactive maps using HTML and R, and embedded weather websites using web scrapers. We deployed part of our PyTorch model on PythonAnywhere using Flask, and we hosted our website through Netlify and GitHub.
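The Flask deployment on PythonAnywhere boils down to a small JSON endpoint in front of the model. The sketch below shows that shape only; the route name, JSON fields, and the `predict_damage` placeholder are illustrative assumptions, and the placeholder formula stands in for the real PyTorch inference call, which is too large to include here.

```python
# Minimal sketch of a Flask prediction endpoint, assuming a hypothetical
# /predict route and JSON contract; predict_damage() is a stand-in for
# the trained model's inference call.
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict_damage(wind_speed: float, duration: float) -> float:
    """Placeholder for model inference (illustrative formula only)."""
    return round(1e4 * wind_speed * duration / 60, 2)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    cost = predict_damage(payload["wind_speed"], payload["duration"])
    return jsonify({"predicted_damage_usd": cost})
```

A client then POSTs storm features as JSON and receives the dollar estimate back, which is what lets the frontend stay a static site on Netlify.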

In order to collect data for these models, we developed web scrapers that pull from live-updating weather websites. For our home page, we used data from NOAA. For our hurricane model, we collected historical data from Medium and scraped upcoming data via ArcGIS. For our aftermath algorithm, we deployed a version on PythonAnywhere that takes the two input images and creates an aftermath image; however, since we don't have access to a cloud GPU, generating each image takes a while, so we didn't deploy it completely.
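The scraping step amounts to pulling rows out of the HTML that weather pages serve. The stdlib-only sketch below parses a made-up storm table; the real pages (NOAA, ArcGIS) have different markup, so the tag structure here is purely illustrative.

```python
# Hedged sketch of the scraping step using only the standard library:
# collect the text of every <td> cell, grouped by <tr> row. The sample
# markup is invented; real weather pages are structured differently.
from html.parser import HTMLParser

class StormTableParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

html_doc = """
<table>
  <tr><td>Hurricane Alpha</td><td>120</td></tr>
  <tr><td>Hurricane Beta</td><td>95</td></tr>
</table>
"""
parser = StormTableParser()
parser.feed(html_doc)
# parser.rows now holds one list per table row, e.g. name and wind speed
```

In production the HTML would come from an HTTP request to the live page rather than an inline string, and the parsed rows would feed the models above.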

Challenges we ran into

The primary challenge we ran into was developing our geographic models. Because the data was complex and required cleaning, we weren't sure how to start; fortunately, enough exploratory data analysis (EDA) showed us how to build the models and use the data. Training these models was also a huge challenge, as training took a long time, but we found AI Fabric from UiPath, which let us train them easily in the cloud. While we were not able to deploy our models, which are too large for the free servers available to us, as long as governments give us images and data, we can give them cost predictions.

Accomplishments we are proud of

We are incredibly proud of how our team found a distinctive yet viable solution for helping governments prepare for and respond to disasters. We are proud that we were able to develop some of our most advanced models so far, and that, to our knowledge, a solution like this has not been implemented in this setting before.

What we learned

Our team found it incredibly fulfilling to use our Machine Learning knowledge in a way that could effectively assist people who may lose their homes and livelihoods. We are glad that we were able to develop a wide range of predictive and generative models to help a vast range of people. Seeing how we could use our software engineering skills to impact people’s livelihoods was the highlight of our weekend.

From a software perspective, developing geographic models was our main focus this weekend. We learned how to effectively combine web scrapers with machine learning models, and we picked up ML tools and techniques such as UiPath's AI Fabric and transfer learning. We also grew our web development skills and polished our database skills.

What is next for Tempest

We believe that our application would be best implemented at the local and state government level. These governments are in charge of dealing with hurricanes, floods, and tornadoes, and we believe that with the information they acquire through our models, they can take steps to respond to disasters faster and more effectively.

In terms of our application, we would love to deploy our models on the web for automatic integration. Given that our current situation prevents us from buying a web server capable of running the aftermath model frequently, we look forward to acquiring a web server that can process high-level computation, which would automate our services. Lastly, we would like to refine our algorithms to incorporate more factors from hurricanes to more accurately predict damages.

Our Name

Tempest is a synonym for a violent, wind-driven storm.
