Inspiration

Through the Tech & Research track, we came together for this project. Our inspiration was the chance to address a relevant issue, internet censorship, by analyzing the big data we were given. The opportunity was also a great way to gain research experience.

What it does

This project involved parsing big data (JSON files), making the information readable, graphing it visually, and then analyzing those graphs. During this process, we categorized the data into three groups: news, social media, and health. Breaking the data up this way makes it easier to see which countries choose to censor which kinds of information.
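As a rough illustration of that categorization step, here is a minimal Python sketch. The domain lists and the "domain"/"country" field names are assumptions for the example, not the exact ones we used:

    # Hypothetical sketch: bucket measured domains into our three categories.
    CATEGORIES = {
        "news": {"bbc.com", "cnn.com", "reuters.com"},
        "social media": {"facebook.com", "twitter.com", "instagram.com"},
        "health": {"who.int", "cdc.gov", "webmd.com"},
    }

    def categorize(domain):
        """Return the category a domain belongs to, or None if uncategorized."""
        for category, domains in CATEGORIES.items():
            if domain in domains:
                return category
        return None

    def count_by_country(records):
        """Count measurements per (country, category) pair."""
        counts = {}
        for record in records:
            category = categorize(record["domain"])
            if category is None:
                continue
            key = (record["country"], category)
            counts[key] = counts.get(key, 0) + 1
        return counts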

How we built it

We built this project using CensoredPlanet data (JSON files of raw measurements accompanying “Global Measurement of DNS Manipulation”), Excel for data manipulation, and Tableau to visualize our results. We coded in Python.
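A rough sketch of that pipeline, assuming the raw measurements are newline-delimited JSON; the file names and field names below are placeholders, not the exact Censored Planet schema:

    import csv
    import json

    def json_to_csv(in_path, out_path, fields):
        """Read newline-delimited JSON and write a flat CSV for Excel/Tableau."""
        with open(in_path) as src, open(out_path, "w", newline="") as dst:
            writer = csv.DictWriter(dst, fieldnames=fields, extrasaction="ignore")
            writer.writeheader()
            for line in src:
                record = json.loads(line)
                writer.writerow({field: record.get(field) for field in fields})

    # Placeholder file and column names for illustration.
    json_to_csv("measurements.json", "measurements.csv",
                fields=["country", "domain", "anomaly"])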

Challenges we ran into

We initially ran into challenges parsing the raw data and making it easily readable, and went through many attempts involving pandas, Python scripts, and JSON parsers. We also had trouble visualizing our data: we originally planned to use https://experiments.withgoogle.com/chrome/globe, but Tableau proved to be much more user-friendly and efficient.
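One way to do that flattening with pandas, shown here as a sketch (the input file name and nested field layout are assumptions):

    import json
    import pandas as pd

    # Hypothetical sketch: flatten nested JSON records into a readable table.
    with open("measurements.json") as f:
        records = [json.loads(line) for line in f]

    # json_normalize expands nested dictionaries into flat columns
    # (e.g. result.ip, result.error), which makes the data easy to
    # inspect in Excel or load into Tableau.
    df = pd.json_normalize(records, sep=".")
    df.to_csv("flattened.csv", index=False)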

Accomplishments that we're proud of

We're proud of our final result and what we were able to accomplish in 24 hours.

What we learned

We learned how to parse big data, visualize it after making it readable, and analyze our findings through a computer science lens.

What's next for Detecting Internet Censorship

What's next: finding more objective data and drawing on more sources to gain a better understanding of what our visuals show.

Built With

excel, pandas, python, tableau
