Inspiration
Climate change is real, and it is here. Around the world, rising sea levels are causing chaos in low-lying areas, spurring relocation and urban planning efforts that, if poorly forecasted, could waste much of a country's wealth. Current efforts to bring sea level data to the public fall short because they take a global rather than a local view: large sea level visualizations typically show only global averages, ignoring local averages and the small local deviations (sea surface height anomalies) from the global mean. We therefore started this project to put local sea level rise data in the public's hands.
What it does
Our website has a simple interface that prompts the user for an address. To keep it robust, the address can be a city name, a complete address, geographic coordinates, and so on. The website returns a scatter plot (with a regression line) of the sea level rise at the reference point closest to that location. We also include a few statistics: the years when that location, given its current elevation above sea level, would be at low, medium, and high risk of coastal flooding, along with a final prediction of when the location would first become completely submerged.
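The "closest reference point" step can be sketched as a nearest-neighbor search over the anomaly dataset's grid points. This is an illustrative sketch, not our exact code; the function names and the three sample coordinates are made up for the example.

```python
# Hypothetical sketch: given a geocoded (lat, lon), find the nearest
# sea-level reference point by great-circle (haversine) distance.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearest_reference_point(lat, lon, reference_points):
    """Return the (lat, lon) reference point closest to the query."""
    return min(reference_points, key=lambda p: haversine_km(lat, lon, p[0], p[1]))

# Three made-up reference points near the U.S. east coast.
points = [(40.5, -73.9), (25.8, -80.1), (32.8, -79.9)]
print(nearest_reference_point(40.7, -74.0, points))  # (40.5, -73.9)
```

With the full dataset (hundreds of thousands of points), a spatial index such as a k-d tree would replace the linear scan, but the idea is the same.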
How we built it
The first part of our project was data collection. We found two datasets from this site: one describing globally averaged sea level changes since 1992, and one containing roughly 1,000 subsets, each describing sea surface height anomalies at around 600,000 points across the globe. We also wrote Python code to return both the geographic coordinates and the elevation of a given address. We cleaned up the sea level data, resampling it to six-month intervals. We then created a simple React + Flask application to connect the front-end and back-end components, with a form that queries our back-end server for sea level data. Finally, in our Flask back end, we wrote a script that renders a matplotlib scatter plot of sea level change over roughly the past 25 years; it also fits a regression line to the data, predicting when the given location will become at risk of coastal flooding.
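The regression-and-prediction step can be sketched as follows. This is a simplified stand-in for our pipeline: the synthetic data, the ~3.3 mm/yr trend used to generate it, and the 500 mm threshold are all illustrative assumptions.

```python
# Illustrative sketch (not our exact pipeline): fit a linear trend to
# half-yearly sea-level observations and solve for the year the trend
# first reaches a target height above the baseline sea level.
import numpy as np

def year_level_reached(years, levels_mm, target_mm):
    """Fit levels ~ slope*year + intercept, then return the year the
    trend line crosses target_mm (None if the trend is not rising)."""
    slope, intercept = np.polyfit(years, levels_mm, 1)
    if slope <= 0:
        return None
    return (target_mm - intercept) / slope

# Synthetic data: ~3.3 mm/yr rise since 1992, sampled every 6 months,
# with a little noise added on top.
years = np.arange(1992.0, 2018.0, 0.5)
levels = 3.3 * (years - 1992.0) + np.random.default_rng(0).normal(0, 2, years.size)

# Approximate year the trend reaches 500 mm above the 1992 baseline.
print(round(year_level_reached(years, levels, 500.0)))
```

Extrapolating a linear fit a century out is crude, which is part of why we also experimented with an RNN, but it gives a serviceable first estimate.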
Challenges we ran into
Our biggest challenge was predicting when a given location would become at risk of flooding. At first we tried a recurrent neural network (RNN). We trained it and connected it to our back end, but it required more computational power than we had to train properly, so it never became as accurate as we wanted. The datasets were also difficult to work with: the globally averaged sea level dataset was somewhat incompatible with the sea surface height anomaly dataset, so we had to set a base sea level height as a reference point and convert frequently between NumPy arrays, lists, and tensors. It all proved a major hassle, perhaps even more so than connecting the Flask back end to the React front end.
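One way to reconcile the two datasets, sketched here with made-up numbers: treat the globally averaged series as the base level and add the local anomaly at the nearest reference point on top of it.

```python
# Sketch of reconciling the two datasets (all values are made up):
# local sea level = globally averaged sea level
#                 + local sea surface height anomaly at the reference point.
import numpy as np

global_mean_mm = np.array([0.0, 1.7, 3.4, 5.1])    # global average rise
local_anomaly_mm = np.array([2.0, 2.2, 1.9, 2.5])  # deviation at one point

local_level_mm = global_mean_mm + local_anomaly_mm
print([round(v, 1) for v in local_level_mm])  # [2.0, 3.9, 5.3, 7.6]
```

The real work was making sure both series shared the same baseline epoch and sampling interval before adding them, which is where the base reference point came in.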
Accomplishments that we're proud of
We are very proud of successfully integrating all the components of our website: the datasets, the React front end, the Flask back end, the elevation queries, the user input, and the RNN (even if it did not turn out as we had hoped). This was our first time using all of these components except Flask.
What we learned
We learned a lot about web development with this project; there is a miraculous feeling in seeing it all finally work in the end.
What's next for SeaWatch
Properly training an RNN to predict sea level rise more accurately, preferably with a good GPU!