We were inspired by the pitch of Professor Kurt from ZHAW University and his research, and also by the wide range of open data that governments provide. We connected this to the need to give users a data-driven experience, so we were passionate about combining both challenges.

What it does

The user highlights a part of an article they have doubts about, and the plugin installed in their browser takes this text and relates it to government data in order to verify it. We know that the machine can fail, and in that case we engage the user to lend common sense to the machine by linking the terms of the question with relevant existing data.

How we built it

The project has four parts: ethical journalism, research, front end, and back end. We researched fake news and how people verify information they have doubts about, and found that they use Google. Instead, we decided to build a plugin that helps them verify it immediately, which we wrote in vanilla JS.

We took the SRF website and changed the highlight action: when the user highlights text, the plugin sends that question to the verifier (the algorithm) and pops up the answer.
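The highlight-to-verify flow in the plugin can be sketched roughly as below. This is a minimal illustration, not the actual plugin code: the `/verify` endpoint URL, the payload fields, and the `alert` popup are all assumptions standing in for the real details.

```javascript
// Sketch of the highlight-to-verify flow in vanilla JS.
// VERIFIER_URL is a hypothetical endpoint, not the real back end.
const VERIFIER_URL = "https://example.org/verify";

// Build the payload sent to the verifier from the highlighted text.
// Field names here are illustrative assumptions.
function buildPayload(selectedText, pageUrl) {
  return { question: selectedText.trim(), source: pageUrl };
}

// Only wire up the listener in a browser context.
if (typeof document !== "undefined") {
  // On mouseup, grab the current selection and ask the verifier about it.
  document.addEventListener("mouseup", async () => {
    const text = window.getSelection().toString();
    if (!text.trim()) return;
    const res = await fetch(VERIFIER_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(buildPayload(text, location.href)),
    });
    const { answer } = await res.json();
    alert(answer); // stand-in for the real popup UI
  });
}
```

The key design point is that the plugin only needs the selected text and the page URL; everything else (translation to SQL, querying) happens server-side.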

On the back end, we receive the request (the question) and forward it to ValueNet. ValueNet translates it to SemQL and then to SQL. We then run this SQL query against the government data set to find the relevant information and pop it up for the user.
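The back-end pipeline (question in, relevant government data out) can be sketched as below. Both ValueNet and the database are stubbed here; the function names, the SQL shape, and the in-memory "data set" are illustrative assumptions, since the real translation is done by the ValueNet model.

```javascript
// Stub for the ValueNet service, which in the real system translates
// a natural-language question to SemQL and then to SQL.
function valueNetToSql(question) {
  // Illustrative output only; the real SQL comes from the model.
  return `SELECT fact, value FROM gov_data WHERE topic LIKE '%${question}%'`;
}

// Stub for running the SQL against the government data set,
// here modeled as a plain array of rows.
function runQuery(sql, rows) {
  return rows.filter((row) => sql.includes(row.topic));
}

// The request handler: forward the question to ValueNet, run the
// resulting query, and return the matching rows for the popup.
function handleVerifyRequest(question, rows) {
  const sql = valueNetToSql(question);
  return runQuery(sql, rows);
}
```

The point of the sketch is the three-stage shape of the pipeline (question → SQL → rows), not the stubbed internals.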

Challenges we ran into

  • The model needed a GPU to run, and no cloud service could easily provide one to us.
  • We tried to run the back end from ZHAW University without a GPU. We expected it to be slow, but it did not work at all.

Accomplishments that we're proud of

  • Communication within the team was phenomenal.
  • We built a cross-platform application (mobile + web) that users can try.
  • We combined two challenges into one and addressed an important societal problem (misinformation).
  • We took the know-how from the ZHAW University lab and applied it to a real social problem.

What we learned

  • Technical: We learned different ways to rank crowdsourced answers.
  • Technical: We learned how to combine the wisdom of the crowd with the power of the machine.
  • Soft: We learned about the power of interdisciplinary teams.
  • Soft: We learned how to communicate and break the ice in a multicultural team.

What's next for SwissChecker

We would like to set up a collaboration between ZHAW, the Confederation, and SRF to fight misinformation. On the practical side, we would like to apply the crowdsourcing approach at the different leverage points that could increase the model's efficiency (pre-processing and post-processing of data).
