One of our project members has a company, Greenstream, which helps municipalities with early flood detection. Greenstream uses Google Cloud Platform (GCP) as the basis of its IoT system, deploying solar-powered water level sensors whose readings are similar to the existing USGS sensors. Whereas the Greenstream sensors report data every six minutes, the USGS sensors appear to report on a daily basis. That's implied by the public data; perhaps they sample more often, but we're going by the available data reports.
Since Greenstream samples at a 6-minute interval, its data can support early flood detection. However, these sensors cannot be deployed in every location.
With that said, enhancing Greenstream's ability to detect early flood conditions using machine learning is a worthy goal.
What it does
The model predicts the water level for a given latitude, longitude, and daily precipitation level.
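A minimal sketch of what such a model could look like in Keras: a small regression network mapping (latitude, longitude, daily precipitation) to a predicted water level. The layer sizes and training setup here are illustrative assumptions, not the actual model we trained.

```python
import numpy as np
import tensorflow as tf

# Hypothetical sketch: a small Keras regression model mapping
# (latitude, longitude, daily precipitation) -> predicted water level.
# Layer sizes are illustrative, not the real architecture.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),            # lat, long, precipitation
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),                     # predicted water level
])
model.compile(optimizer="adam", loss="mse")

# Example input: one location at (35.0, -80.0) with 1.2 inches of rain.
features = np.array([[35.0, -80.0, 1.2]], dtype=np.float32)
prediction = model.predict(features)              # shape (1, 1)
```

In practice you would `model.fit(...)` on the merged USGS/Greenstream training data first; an untrained network like this just produces an arbitrary value.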
How we built it
We used Google BigQuery, Google Dataprep, Pandas, and TensorFlow.
Challenges we ran into
Getting the data was the most time-intensive part. Some of the data came from the USGS public datasets and some was in Greenstream's Firebase.
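One wrinkle in combining the two sources is the sample rate: Greenstream readings arrive every six minutes while the USGS data is daily. A hedged pandas sketch of downsampling the 6-minute readings to daily means so the two line up (the column name and timestamps are made up for illustration):

```python
import pandas as pd

# Hypothetical sketch: resample 6-minute Greenstream readings to a
# daily mean so they are comparable with the USGS daily data.
readings = pd.DataFrame(
    {"water_level": [2.1, 2.3, 2.2, 2.4]},
    index=pd.to_datetime([
        "2019-10-01 00:00", "2019-10-01 00:06",
        "2019-10-02 00:00", "2019-10-02 00:06",
    ]),
)
daily = readings.resample("D").mean()
# daily now holds one mean water_level per calendar day.
```

The same idea applies whether the raw readings come out of Firebase exports or CSVs in a bucket.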
Accomplishments that we're proud of
The Dataprep wrangle bit was fun. It was amazing to see it grab all the data at the root of a bucket and then provide a mechanism for converting it to a format compatible with the USGS data. Learning how to use TensorBoard in a notebook was also fun.
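For anyone curious about the TensorBoard-in-a-notebook piece, the basic pattern is the Keras `TensorBoard` callback plus the notebook magic; the log directory name here is just an example:

```python
import tensorflow as tf

# Log training metrics to a directory that TensorBoard can read.
# "logs/fit" is an illustrative path, not our actual layout.
tb_callback = tf.keras.callbacks.TensorBoard(log_dir="logs/fit")

# Pass the callback to training, e.g.:
# model.fit(x_train, y_train, epochs=10, callbacks=[tb_callback])

# Then, in a notebook cell, launch TensorBoard inline:
# %load_ext tensorboard
# %tensorboard --logdir logs/fit
```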
What we learned
Oh man, we learned a lot: USGS water monitoring stations, how to use Dataprep and BigQuery, Keras in TensorFlow, and a ton of Pandas.
What's next for Virtual sensors for water level prediction
Gather more data and see if we can improve the predictions at the virtual sensor locations.