Inspiration

In a sense, social media has democratized news media itself -- through it, we have all become "news editors" to some degree, shaping what our friends read through our shares, likes, and comments. Is it any wonder, then, that "fake news" has become such a widespread problem? In such partisan times, it is easy to find ourselves siloed off within ideological echo chambers. After all, we are held in thrall not only by our cognitive biases to seek out confirmatory information, but also by the social media algorithms trained to feed such biases for the sake of greater ad revenue. Most worryingly, these ideological silos can serve as breeding grounds for fake news, as stories designed to mislead their audience are circulated within the target political community, building outrage and exacerbating ignorance with each new share.

We believe that the problem of fake news is intimately related to the problem of the ideological echo chambers we find ourselves inhabiting. As such, we designed "Open Mind" to attack these two problems at their root.

What it does

"Open Mind" is a Google Chrome extension designed to (1) combat the proliferation of fake news, and (2) increase exposure to opposing viewpoints. It does so using a multifaceted approach -- first, it automatically "blocks" known fake news websites from being displayed on the user's browser, providing the user with a large warning screen and links to more reputable sources (the user can always click through to view the allegedly fake content, however; we're not censors!). Second, the user is given direct feedback on how partisan their reading patterns are, in the form of a dashboard which tracks their political browsing history. This dashboard then provides a list of recommended articles that users can read in order to "balance out" their reading history.

How we built it

We used React for the front end, and a combination of Node.js and Python for the back end. Our machine learning models for recommending articles were built with Python's TensorFlow library, and NLP was performed using the Aylien, Semantria, and Google Cloud Natural Language APIs.
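As one example of the NLP step, the sketch below shows how an article's text might be sent to the Google Cloud Natural Language API from the Node.js back end (written here in TypeScript). The function name and return shape are assumptions for illustration; the Aylien and Semantria calls, and the TensorFlow recommendation model, are omitted.

```typescript
// analyze-article.ts -- illustrative sketch of a Google Cloud Natural Language call;
// not the project's actual pipeline.
import { LanguageServiceClient } from "@google-cloud/language";

const client = new LanguageServiceClient();

// Classify an article's text into content categories (e.g. "/News/Politics")
// and score its overall sentiment. Note that classifyText expects a
// reasonably long document (at least ~20 words).
async function analyzeArticle(text: string) {
  const document = { content: text, type: "PLAIN_TEXT" as const };

  const [classification] = await client.classifyText({ document });
  const [sentiment] = await client.analyzeSentiment({ document });

  return {
    categories: classification.categories ?? [],
    sentimentScore: sentiment.documentSentiment?.score ?? 0,
  };
}

// Example usage: results like these could feed the dashboard and the recommender.
analyzeArticle("Full text of a political news article goes here ...").then(console.log);
```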

What we learned

We learned a great deal about fake news, and about NLP in particular.

What's next for Open Mind

We aim to implement a "political thermometer" that appears next to political articles, showing the degree to which a particular article leans conservative or liberal. In addition, we aim to add a Facebook-specific "share verification" feature, which asks users whether they are sure they want to share an article they have not already read (based on their browser history).
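A rough sketch of how the share-verification check might work is below, assuming the extension requests the "history" permission; the function name is hypothetical, and the Facebook share-detection and prompting logic are left out.

```typescript
// share-check.ts -- illustrative sketch only; runs in the extension's background
// context and requires the "history" permission in the manifest.

// Resolve to true if the article's URL already appears in the user's browsing
// history, i.e. they have (probably) read it before sharing.
function hasUserReadArticle(articleUrl: string): Promise<boolean> {
  return new Promise((resolve) => {
    chrome.history.search(
      { text: articleUrl, maxResults: 1, startTime: 0 },
      (results) => resolve(results.length > 0)
    );
  });
}

// A content script watching Facebook's share UI could message the background
// page, await this check, and prompt the user before the share goes through.
```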

Built With

React, Node.js, Python, TensorFlow, Aylien, Semantria, Google Cloud Natural Language API
