Video demo: https://youtu.be/YjGc_3YVWc8
Inspired by the public health response to the COVID-19 coronavirus, we wanted to empower governments and public health officials by showing them how people on the ground are reacting to their policies.
Presently, governments gauge civilian response in a reactive, delayed manner. The #corona tool enables real-time monitoring, so officials can address civilian sentiment as it emerges.
- Twitter has been identified as both a potential facilitator of and a powerful deterrent to adverse civilian behaviour.
- User-generated content from social media platforms can provide early clues about public response to real-world incidents, which is useful in areas such as product analysis, government awareness of civilian sentiment, and brand analysis, to name a few.
What it does
- An interactive tool that displays sentiment around COVID-19 across the world, which helps public health officials gauge which government policies are well-received and which are not.
- We designed an interface to harvest civilian sentiment and responses on Twitter during the COVID-19 epidemic. Coupled with intelligent data mining, visualization, and filtering methods, this data can be compiled into a knowledge base of great utility to decision-makers and authorities for rapid response and monitoring in such scenarios. Using real-world data sourced directly from the Twitter API, we demonstrated that the proposed framework yields meaningful quantitative information about civilian response to the ongoing epidemic.
How I built it
We pulled confirmed case counts, mortality rates, and recovery rates from the Johns Hopkins repository, and tweet data from the Twitter API. We then filtered the tweets by geographic location and ran a sentiment analysis on them using the TextBlob and Tweepy Python libraries. We wanted to plot this data on a map, but that did not work as expected (see Challenges below).
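The filter-then-score step can be sketched as follows. This is a minimal offline sketch: in the project, tweets come from the Twitter API via Tweepy and polarity from TextBlob; here a tiny hand-rolled word lexicon stands in for TextBlob so the example runs without credentials or network access. The lexicon and sample tweets are illustrative assumptions, not project data.

```python
# Illustrative stand-in for TextBlob's polarity lexicon (assumption).
POLARITY = {"good": 0.7, "great": 0.8, "safe": 0.5,
            "bad": -0.7, "scared": -0.6, "lockdown": -0.2}

def polarity(text):
    """Average word polarity in [-1, 1]; 0.0 if no known words.
    Stand-in for TextBlob(text).sentiment.polarity."""
    scores = [POLARITY[w] for w in text.lower().split() if w in POLARITY]
    return sum(scores) / len(scores) if scores else 0.0

def filter_by_location(tweets, country):
    """Keep tweets whose (user-declared) location matches the country."""
    return [t for t in tweets if t.get("location") == country]

def average_sentiment(tweets, country):
    """Mean polarity of all tweets from one country, or None if none match."""
    located = filter_by_location(tweets, country)
    if not located:
        return None
    return sum(polarity(t["text"]) for t in located) / len(located)

tweets = [
    {"text": "Lockdown measures feel safe and good", "location": "UK"},
    {"text": "Scared and bad situation here", "location": "UK"},
    {"text": "Great response by officials", "location": "US"},
]
print(round(average_sentiment(tweets, "UK"), 3))  # prints -0.158
```

With real data, the same aggregation runs per country or region, which is what lets sentiment be compared against the Johns Hopkins case counts.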
Challenges I ran into
We planned to tag tweets with geolocations and display them on an interactive map (Image 3), but we ran into two big issues. First, most of the tweets we were able to scrape had geolocation turned off, which threw the mapping approach off entirely. Second, creating polygons on an interactive Google Map proved difficult. Since we couldn't make geolocation work, we built a tag searcher that returns sentiment values for matching tweets instead. This way users can search for topics, events, or whatever they want and receive a sentiment analysis of the results.
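The fallback tag searcher can be sketched like this: match tweet text against a search term and score each hit. The `score_sentiment` function below is an illustrative stand-in for TextBlob's `TextBlob(text).sentiment.polarity`, and the word lists and sample tweets are assumptions made so the sketch runs standalone.

```python
def score_sentiment(text):
    """Toy polarity score in [-1, 1]: (positive hits - negative hits) / words.
    Illustrative stand-in for TextBlob's polarity (assumption)."""
    positive = {"hope", "recover", "support"}
    negative = {"fear", "panic", "shortage"}
    words = text.lower().split()
    hits = sum(w in positive for w in words) - sum(w in negative for w in words)
    return hits / max(len(words), 1)

def search_tag(tweets, term):
    """Return (text, polarity) for every tweet mentioning the search term."""
    term = term.lower()
    return [(t, score_sentiment(t)) for t in tweets if term in t.lower()]

tweets = [
    "#corona panic buying causing shortage everywhere",
    "Communities show support and hope during #corona",
    "Weather is nice today",
]
for text, score in search_tag(tweets, "#corona"):
    print(f"{score:+.2f}  {text}")
```

In the tool itself the tweet list comes from a live Tweepy query rather than a hard-coded list, but the search-then-score shape is the same.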
Accomplishments that I'm proud of
We were able to scrape Twitter data and perform a sentiment analysis on it!
What's next for #corona
- Integrate regional information to give a fuller picture
- Integrate infection rates by region for added context
- Improve the model to recognize a wider array of keywords relevant to COVID-19, and to recognize and understand multiple languages