Inspiration

The recent fires in Napa were an eye-opener for us. We humans are responsible for climate change and its ill effects, and it's about time we took ownership of the problem. While we strive to combat climate change itself, it is also our responsibility to guard against the damage already done. Protecting people from foreseeable natural disasters, and raising awareness about climate change, were the primary motivations for this project.

What it does

Shelly and I decided to leverage our Computer Science knowledge to develop a Twitter bot that warns about natural disasters made more likely by climate change. These warnings, and the analysis behind them, can help prevent disasters or at least reduce the damage they cause.

How we built it

Data Gathering: We needed three sets of data:

  1. Historic data on ideal/expected weather conditions.
  2. Recent tweets about climate change and the factors driving it (gathered as sketched after this list).
  3. Historic disaster-related data.
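
As a rough illustration of the tweet-gathering step, here is a minimal sketch assuming the `twit` npm client and Twitter credentials in environment variables; the query, count, and variable names are illustrative placeholders, not the exact values the bot uses:

```javascript
const Twit = require('twit');

const T = new Twit({
  consumer_key: process.env.TWITTER_CONSUMER_KEY,
  consumer_secret: process.env.TWITTER_CONSUMER_SECRET,
  access_token: process.env.TWITTER_ACCESS_TOKEN,
  access_token_secret: process.env.TWITTER_ACCESS_TOKEN_SECRET,
});

// Pull a batch of recent English-language tweets mentioning climate change.
T.get('search/tweets', { q: 'climate change', count: 100, lang: 'en' }, (err, data) => {
  if (err) return console.error(err);
  data.statuses.forEach((tweet) => console.log(tweet.text));
});
```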

Data Preprocessing:

  1. The tweets were tokenized, stemmed, and stripped of stop words.
  2. The percentage difference between expected and actual weather conditions was calculated.
  3. Sentiment analysis was run on the climate-change tweets to count the 'negative' tweets per region (a sketch of these steps follows the list).
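
A minimal sketch of these preprocessing steps, assuming the `natural` and `stopword` npm packages and an AFINN lexicon for the sentiment scoring; the actual lexicon and the "negative" threshold may differ:

```javascript
const natural = require('natural');
const sw = require('stopword');

const tokenizer = new natural.WordTokenizer();
const analyzer = new natural.SentimentAnalyzer('English', natural.PorterStemmer, 'afinn');

// Tokenize, lowercase, and drop stop words.
function tokensOf(tweetText) {
  return sw.removeStopwords(tokenizer.tokenize(tweetText.toLowerCase()));
}

// Stemmed tokens, e.g. for building term frequencies.
function preprocess(tweetText) {
  return tokensOf(tweetText).map((t) => natural.PorterStemmer.stem(t));
}

// Percentage deviation of the actual reading from the historic expectation.
function weatherDeviation(expected, actual) {
  return (Math.abs(actual - expected) / expected) * 100;
}

// A tweet counts as 'negative' when its sentiment score is below zero;
// the analyzer stems tokens internally before the lexicon lookup.
function isNegative(tweetText) {
  return analyzer.getSentiment(tokensOf(tweetText)) < 0;
}
```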

Modelling

  1. Based on historic disaster data, negative-tweet counts, and the difference between historic and current weather conditions, a classifier can be trained to predict how prone a region is to a climate-change-driven calamity in the near future (a minimal sketch follows this list).
  2. A map visualization with regions color-coded by their level of disaster-proneness can then be created.
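
The write-up does not pin down a specific algorithm, so as one plausible sketch here is a hand-rolled logistic regression over the three feature groups named above; the feature scaling and training data are purely illustrative:

```javascript
// Each sample: [weather deviation, negative-tweet rate, past-disaster rate],
// each pre-scaled to [0, 1]; the label is 1 if a disaster followed.
const sigmoid = (z) => 1 / (1 + Math.exp(-z));

function train(samples, labels, epochs = 2000, lr = 0.1) {
  let w = [0, 0, 0];
  let b = 0;
  for (let e = 0; e < epochs; e += 1) {
    samples.forEach((x, i) => {
      const p = sigmoid(x.reduce((s, xj, j) => s + w[j] * xj, b));
      const err = p - labels[i]; // gradient of the log loss
      w = w.map((wj, j) => wj - lr * err * x[j]);
      b -= lr * err;
    });
  }
  return (x) => sigmoid(x.reduce((s, xj, j) => s + w[j] * xj, b));
}

const predict = train(
  [[0.9, 0.8, 0.7], [0.1, 0.2, 0.0], [0.7, 0.9, 0.5], [0.2, 0.1, 0.3]],
  [1, 0, 1, 0],
);
console.log(predict([0.8, 0.7, 0.6])); // estimated proneness of a region
```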

Predicting

The bot tweets an image of the visualization, tagging relevant government bodies, NGOs, and influencers, so as to raise awareness and alert the authorities to take precautions. A sketch of the tweeting step follows.
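
A minimal sketch of that step with the same `twit` client as before; the image path and the tagged handles here are placeholders, not the accounts the bot actually targets:

```javascript
const fs = require('fs');
const Twit = require('twit');

const T = new Twit({
  consumer_key: process.env.TWITTER_CONSUMER_KEY,
  consumer_secret: process.env.TWITTER_CONSUMER_SECRET,
  access_token: process.env.TWITTER_ACCESS_TOKEN,
  access_token_secret: process.env.TWITTER_ACCESS_TOKEN_SECRET,
});

// Upload the rendered map, then tweet it with the relevant handles tagged.
const b64 = fs.readFileSync('proneness-map.png', { encoding: 'base64' });

T.post('media/upload', { media_data: b64 }, (err, data) => {
  if (err) return console.error(err);
  T.post('statuses/update', {
    status: 'Regions at elevated disaster risk this week @SomeAgency @SomeNGO',
    media_ids: [data.media_id_string],
  }, (err2) => err2 && console.error(err2));
});
```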

Challenges we ran into

Technical:

  1. Data gathering (still only partially done) is a challenge because the datasets we found online were too curated and clean. To build a general-purpose bot, cleaning real-time data and finding the right historic data is hard. We deliberately avoided the clean, freely available datasets on Kaggle and similar sites.
  2. Developing on a single machine slowed us down.

Non-technical:

  1. Finding teammates took two of the three days allotted for the hackathon, since a few people backed out at the last minute.
  2. This forced us to change problem statements and project ideas midway.
  3. Twitter API access being limited to a single person slowed development drastically.

Accomplishments that we're proud of

Coming up with a design good enough to have a social impact within a day!

What we learned

  1. Project planning needs graceful risk management, especially for last-minute setbacks like losing teammates.
  2. Natural Language Processing techniques for Text Mining in Node.js.

What's next for Divinator

  1. Use multiple datasets to train Divinator better.
  2. Use data from Facebook, Instagram, and other Social Media as well.
  3. Check how other algorithms like Random Forest and SVMs work on the data.
  4. Deploy the bot on Heroku or AWS.
  5. Process images and videos on Twitter and other social media platforms to check whether or not a region is affected by climate change.