The spread of fake news is a serious problem in any context. During the COVID-19 pandemic in particular, controlling misinformation and identifying trustworthy sources is of the utmost importance for successfully overcoming this critical situation.
What it does
We propose a news-aggregator website where users can find news related to COVID-19 and rate it for reliability, optionally providing references and credentials. In this way, news from the most highly voted sources is highlighted, while less trustworthy articles are shadowed.
How it works
The website collects news relevant to the coronavirus pandemic from a variety of sources and allows users to vote on the reliability of each item and add comments. Users can increase their own credibility by citing other sources and submitting their credentials.
How we built it
As a proof of concept, we are building a basic framework in Python. We parse RSS feeds with BeautifulSoup to populate an SQL database of relevant news. The database is then linked to a website, built with Django, which displays the news and provides voting and commenting functions.
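The feed-to-database step can be sketched as follows. The project uses BeautifulSoup on live feeds; to keep this sketch self-contained it instead parses an inline sample feed with the standard-library ElementTree (the same find-items-and-extract pattern) and stores it with sqlite3. The table name and columns are illustrative, not the project's actual schema.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Inline RSS 2.0 sample standing in for a real fetched feed.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <item><title>Vaccine trial update</title><link>https://example.org/a</link></item>
  <item><title>New lockdown measures</title><link>https://example.org/b</link></item>
</channel></rss>"""

def parse_feed(xml_text):
    """Extract (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

def store_news(conn, entries):
    """Insert parsed entries into a simple news table, skipping duplicates."""
    conn.execute("CREATE TABLE IF NOT EXISTS news (title TEXT, link TEXT UNIQUE)")
    conn.executemany("INSERT OR IGNORE INTO news VALUES (?, ?)", entries)
    conn.commit()

conn = sqlite3.connect(":memory:")
store_news(conn, parse_feed(SAMPLE_RSS))
rows = conn.execute("SELECT title FROM news").fetchall()
print(rows)
```

In the real site this table would be a Django model, so the website can query and display the stored headlines directly.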
Challenges we ran into
Since none of us is a web developer, everything was a challenge. For instance, finding a way to automatically collect feeds from relevant sources proved difficult, and we did not implement it in the proof of concept.
Accomplishments that we are proud of
At the moment, we demonstrate the functionality of the website using a couple of RSS feeds (BBC and Nature). The website works as a news aggregator where users can vote on and comment on the news. Moreover, we implemented a way to verify the identity of some users (by submitting their credentials to the admin): a verified user's vote carries more weight than a non-verified user's vote.
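The vetted-voting idea can be sketched as a weighted sum. The weight of 2 for verified users and the score formula below are illustrative assumptions; the site only specifies that verified votes count for more.

```python
# Illustrative weights: a verified vote counts double an unverified one.
VERIFIED_WEIGHT = 2
UNVERIFIED_WEIGHT = 1

def reliability_score(votes):
    """Compute a weighted reliability score for one article.

    votes: list of (value, is_verified) pairs, where value is +1 (reliable)
    or -1 (unreliable). Verified voters move the score more.
    """
    return sum(value * (VERIFIED_WEIGHT if verified else UNVERIFIED_WEIGHT)
               for value, verified in votes)

# One verified upvote, one unverified upvote, one unverified downvote:
votes = [(+1, True), (+1, False), (-1, False)]
print(reliability_score(votes))  # 2 + 1 - 1 = 2
```

The score could then drive the highlight/shadow display logic described above.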
What we learned
We don't want to repeat it a third time, but yes, none of us is a web developer, so we learned a lot of basics. We learned how to work with HTML, Django, and BeautifulSoup, and how to create and query an SQL database.
What's next for Trusted news aggregator website with vetted-voting feature
It would be nice to have an automatic way to collect RSS feeds into a database, over which we could then iterate the parser to obtain the relevant news.
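The planned collection loop might look like the sketch below: a registry of feed URLs the aggregator iterates over, calling the parser on each. The URLs and the `fetch_feed` callable are hypothetical placeholders (the real step would fetch over HTTP and parse with BeautifulSoup); here a stub fetcher keeps the sketch runnable offline.

```python
# Hypothetical feed registry; real entries would be the BBC and Nature
# feeds used in the demo.
FEEDS = [
    "https://example.org/bbc.rss",
    "https://example.org/nature.rss",
]

def collect_all(feeds, fetch_feed):
    """Run the fetch/parse step over every registered feed and merge results.

    fetch_feed(url) is expected to return a list of parsed entries; in the
    real site it would download the feed and hand it to the parser.
    """
    news = []
    for url in feeds:
        news.extend(fetch_feed(url))
    return news

# Usage with a stub fetcher (no network access in this sketch):
entries = collect_all(FEEDS, lambda url: [(url, "stub headline")])
print(len(entries))  # one stub entry per feed
```

Storing the feed list in its own database table would let the admin add new sources without touching the code.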