Inspiration

For Swamphacks 2016, we wanted to use the Clarifai API to develop software that could help the average internet user.

What it does

GreenLight is a Chrome extension that aims to protect users from opening inappropriate image links by providing a clean, convenient, and non-technical interface to the Clarifai service.

How we built it

Every Chrome extension runs much like a standard web page, built from HTML, CSS, and JavaScript. In our scripts we used the Clarifai API (https://github.com/Clarifai/hackathon) along with jQuery, and some visual elements were designed with the Processing language.
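
Roughly, the flow is: the user right-clicks an image link, the extension sends the URL to Clarifai for tagging, and the returned tags drive the safe/not-safe display. The sketch below illustrates that wiring under some assumptions; the endpoint path, token handling, and response shape are placeholders for illustration rather than the exact calls from the hackathon client, and the extension needs the "contextMenus" permission in its manifest.

```javascript
// background.js -- illustrative sketch of GreenLight's tagging flow.
// The Clarifai endpoint, ACCESS_TOKEN, and response shape below are assumptions;
// see the linked Clarifai hackathon repo for the real client code.
var CLARIFAI_TAG_URL = 'https://api.clarifai.com/v1/tag/'; // assumed tag endpoint
var ACCESS_TOKEN = 'YOUR_OAUTH_TOKEN';                     // placeholder credential

function tagImage(imageUrl, callback) {
  // jQuery is loaded in the extension's background page.
  $.ajax({
    url: CLARIFAI_TAG_URL,
    type: 'POST',
    headers: { 'Authorization': 'Bearer ' + ACCESS_TOKEN },
    data: { url: imageUrl },
    success: function (response) {
      // Assumed response shape: results[0].result.tag.classes holds the tag list.
      callback(response.results[0].result.tag.classes);
    },
    error: function () {
      callback([]); // treat API failures as "no tags" rather than blocking the user
    }
  });
}

// Right-click entry on images and links so the user can check before opening.
chrome.contextMenus.create({
  title: 'Check this image with GreenLight',
  contexts: ['link', 'image'],
  onclick: function (info) {
    var target = info.srcUrl || info.linkUrl;
    tagImage(target, function (tags) {
      console.log('Clarifai tags:', tags);
    });
  }
});
```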

Challenges we ran into

JavaScript, Chrome extension development, and the Clarifai API itself were all technologies we had to learn simultaneously while building this project in only 24 hours. Each presented problems none of us had dealt with before, such as understanding the asynchronous nature of JavaScript, working within the security limitations of Chrome extensions, and learning how to properly call the Clarifai API.

Accomplishments that we're proud of

GreenLight provides an easy way to verify that any image link is safe to open. Furthermore, since GreenLight shows the list of tags Clarifai returns for an image, we hope that anyone using or developing against the Clarifai service can use this extension to see results immediately and conveniently.
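
Once the tags come back, the safe/not-safe decision is essentially a keyword scan over them. The snippet below is a minimal sketch of that idea; the keyword list and the "any match means not safe" rule are illustrative examples, not the exact values shipped in the extension.

```javascript
// Sketch of the "safe / not safe" decision over tags returned by Clarifai.
var NSFW_KEYWORDS = ['nudity', 'explicit', 'gore']; // hypothetical example keywords

function isSafe(tags) {
  var hits = tags.filter(function (tag) {
    return NSFW_KEYWORDS.indexOf(tag.toLowerCase()) !== -1;
  });
  return hits.length === 0; // any flagged keyword marks the image as "Not safe"
}

// Example: isSafe(['beach', 'people', 'sunset']) -> true
```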

What we learned

Along with everything we learned about the technologies themselves, we also learned that identifying release-ready milestones is an important practice to strive for as software engineers.

What's next for GreenLight

We hope to improve GreenLight in many ways. We'd like to improve the visual design of the popup, making it easier on the eyes and responsive to the space needed to display a list of results. Our "NSFW" determination can be greatly expanded and enhanced. Additionally, we'd like to let users choose their own keywords to scan for, decide how many matched keywords should trigger a "Not safe" report, and toggle whether the list of resulting image tags is shown. Finally, we'd like to integrate the extension with social media sites to prevent users from accidentally uploading problematic photos.
