Many times while browsing the web, we are forced to see images and content that we can't unsee. Such content is often disturbing and inappropriate to view in public, and it is even more harmful when children are exposed to it. To combat this kind of unwanted content, we built a Chrome plugin called Web Sanitizer. Most existing solutions maintain whitelists and blacklists of sites and block entire sites outright. This is often unnecessary, since offending images frequently come from ads and a site's recommendation widgets rather than the site itself. We use a machine learning model provided by Clarifai to flag images whose content can be classified as adult or violent. This approach can be extended to videos as well. So happy, carefree browsing!
What it does
Web Sanitizer is a Chrome plugin that uses deep learning to classify the images on a page as adult or violent and, if an image is flagged, replaces it with a pre-set safe image. It preserves the page layout and doesn't block the whole site. The deep learning classification is done through the API provided by Clarifai.
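The replacement step can be sketched as follows. This is our own illustration, not the plugin's actual code: the function name, placeholder URL, and attribute shape are assumptions. The key point is that the flagged image's width and height are kept so the page layout does not shift.

```javascript
// Sketch: swap a flagged image's source for a pre-set safe placeholder
// while keeping the original dimensions so the page layout is preserved.
// (Written as a pure function for clarity; in the plugin this would
// mutate the actual <img> element in the DOM.)
const SAFE_IMAGE_URL = "https://example.com/safe-placeholder.png"; // hypothetical

function safeReplacement(imgAttrs) {
  return {
    src: SAFE_IMAGE_URL,     // replace the flagged source
    width: imgAttrs.width,   // preserve layout: same width
    height: imgAttrs.height, // preserve layout: same height
    alt: "image hidden by Web Sanitizer",
  };
}
```

Because only the `src` changes, surrounding text and other elements stay exactly where they were, which is what lets us avoid blocking the whole site.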
How we built it
Before the images on a web page are rendered, we take each image's URL and send it to Clarifai's API, which returns the tags the image can be associated with. We then match those tags against a pre-formed list of bad words indicating adult content. If an image is flagged, we redirect its load request to a pre-set safe image.
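The flow above can be sketched like this. The endpoint path, request body, and concept response shape follow Clarifai's v2 REST API as we understand it, but the model ID, API key, bad-word list, and confidence threshold below are illustrative assumptions, not the plugin's actual values.

```javascript
// Build the request we would send to Clarifai's v2 predict endpoint
// for one image URL. Both constants are placeholders.
const MODEL_ID = "YOUR_MODEL_ID";        // assumption: a moderation/NSFW model ID
const API_KEY = "YOUR_CLARIFAI_API_KEY"; // assumption: user-supplied key

function buildClarifaiRequest(imageUrl) {
  return {
    url: `https://api.clarifai.com/v2/models/${MODEL_ID}/outputs`,
    options: {
      method: "POST",
      headers: {
        "Authorization": `Key ${API_KEY}`,
        "Content-Type": "application/json",
      },
      // Clarifai v2 predict takes a list of inputs, each wrapping an image URL.
      body: JSON.stringify({
        inputs: [{ data: { image: { url: imageUrl } } }],
      }),
    },
  };
}

// Match the returned concepts (tags) against a pre-formed list of terms
// indicating adult or violent content. The list and threshold are our
// own illustrative choices.
const BAD_TAGS = ["nsfw", "explicit", "gore", "violence"];
const THRESHOLD = 0.85;

function isFlagged(concepts) {
  // concepts: [{ name: "nsfw", value: 0.97 }, ...] per Clarifai's response shape
  return concepts.some((c) => BAD_TAGS.includes(c.name) && c.value >= THRESHOLD);
}
```

In the plugin, `fetch(req.url, req.options)` would send the request; if `isFlagged` returns true for the concepts in the response, the image load is redirected to the safe placeholder.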
Challenges we ran into
- Deferring the HTTP GET requests for images until the images have been categorized.
- Keeping ourselves up and running for 24 hours while looking at the ugliest photographs on the web. :)
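The deferral challenge can be sketched as a rewrite that runs before the browser fetches images: moving each `src` into `data-src` stops the GET request, and the real or safe source is restored after classification. The helpers below show the idea on an HTML string for simplicity; in the plugin this logic would run as a content script on the live DOM, and the function names are our own.

```javascript
// Defer image loads by renaming src -> data-src before the browser
// issues GET requests for them.
function deferImageLoads(html) {
  // Rewrite <img ... src=...> to <img ... data-src=...> so nothing is fetched yet.
  return html.replace(/<img([^>]*?)\ssrc=/gi, "<img$1 data-src=");
}

// Once an image has been classified, restore its original source,
// or point it at the safe placeholder if it was flagged.
function restoreSource(deferredTag, flagged, safeUrl) {
  if (flagged) {
    return deferredTag.replace(/data-src="[^"]*"/i, `src="${safeUrl}"`);
  }
  return deferredTag.replace(/data-src=/i, "src=");
}
```

The design choice here is that the browser never requests a flagged image at all: its URL only ever leaves the page as a classification query, and the only GET that fires is for either the cleared original or the placeholder.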
Accomplishments that we're proud of
- Finishing our project on time.
- A ready-to-use plugin, which we can use ourselves while browsing in class :p
What we learned
- That, as a team, we can push ourselves into corners we haven't seen before.
What's next for Web Sanitizer
- Filter video content as well.
- Use the surrounding text content to categorize images in context.