Inspiration

We wanted to give users a free way to fight data discrimination, protect their privacy, and secure their data.

What it does

Using CyberOPT's algorithms, we gather the most accurate information available about the user and report it back to them. We then display this information to the user and give them the option to opt out of these websites and programs.

How we built it

We used several Python web scraping tools, such as BeautifulSoup4 and Selenium, to first collect the data. Then we used Python's smtplib library to email the user's report to them from an address on our own domain. When the user receives the email and chooses to opt out, we use IFTTT to notify the server that the user wants their information removed from data collection websites such as Intelius, MyLife, and TruthFinder. Our bot then automates the opt-out process for several of these services, either by driving the website directly or by sending a simple emailed legal notice. The sketches below illustrate each of these steps.
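First, a minimal sketch of the collect-and-report flow described above, assuming a hypothetical people-search results page, a placeholder CSS selector, and placeholder SMTP credentials on our domain (none of these names come from the real implementation):

```python
import smtplib
from email.mime.text import MIMEText

import requests
from bs4 import BeautifulSoup

# Hypothetical search URL and selector; Intelius, MyLife, and TruthFinder
# each need their own URL parameters and page selectors.
SEARCH_URL = "https://example-people-search.com/results?name={name}"


def scrape_listings(full_name: str) -> list[str]:
    """Collect listing text for the given name from a single source."""
    page = requests.get(SEARCH_URL.format(name=full_name), timeout=30)
    soup = BeautifulSoup(page.text, "html.parser")
    return [item.get_text(strip=True) for item in soup.select(".listing")]


def email_report(recipient: str, listings: list[str]) -> None:
    """Send the report to the user from our domain's address via SMTP."""
    body = "We found the following records about you:\n\n" + "\n".join(listings)
    msg = MIMEText(body)
    msg["Subject"] = "Your CyberOPT data report"
    msg["From"] = "reports@cyberopt.example"  # placeholder sender address
    msg["To"] = recipient

    # Placeholder SMTP host and credentials.
    with smtplib.SMTP("smtp.cyberopt.example", 587) as server:
        server.starttls()
        server.login("reports@cyberopt.example", "APP_PASSWORD")
        server.send_message(msg)


if __name__ == "__main__":
    records = scrape_listings("Jane Doe")
    email_report("jane.doe@example.com", records)
```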
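Next, the IFTTT hand-off. A rough sketch of how the server could be pinged through the IFTTT Webhooks ("Maker") endpoint when the user clicks opt out; the event name and key are placeholders for whatever applet is configured in the IFTTT account:

```python
import requests

# IFTTT Webhooks endpoint; EVENT and KEY are placeholders for our applet.
IFTTT_URL = "https://maker.ifttt.com/trigger/{event}/with/key/{key}"


def notify_opt_out(user_email: str) -> None:
    """Tell IFTTT that a user chose to opt out from their report email."""
    requests.post(
        IFTTT_URL.format(event="user_opted_out", key="IFTTT_KEY"),
        json={"value1": user_email},
        timeout=10,
    )
```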
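Finally, the automated opt-out step with Selenium. This is only a sketch with made-up form field names, since every broker's opt-out page uses different fields and flows:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


def submit_opt_out(opt_out_url: str, record_url: str, email: str) -> None:
    """Fill and submit a broker's opt-out form (field names are placeholders)."""
    driver = webdriver.Chrome()
    try:
        driver.get(opt_out_url)
        wait = WebDriverWait(driver, 15)

        # Locate the form fields; real brokers use different names and ids.
        record_field = wait.until(
            EC.presence_of_element_located((By.NAME, "record_url"))
        )
        email_field = driver.find_element(By.NAME, "email")

        record_field.send_keys(record_url)
        email_field.send_keys(email)
        driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
    finally:
        driver.quit()
```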

Challenges we ran into

Many of the programs we worked with did not integrate cleanly with our code or did not support Python. To work around this, we had to port code from other languages into Python. Our website also had a few issues because our hosting server faltered at times.

Accomplishments that we're proud of

Our program can both find information about a person and opt them out of a website's information distribution pages, just by asking the user a few simple questions. Best part: it's completely free.

What we learned

JavaScript is a better language to work with when you need to integrate with multiple programs.

What's next for CyberOPT

We will be changing the website so that, rather than emailing a report to the user, we immediately bring them to a Google Sheet with all of their information, from which they can choose which sources to opt out of. We are also working to add machine learning for solving Google CAPTCHAs, which should make the process smoother for the user. Along with that, we will expand the number of websites we collect information from so that the information we gather is even more accurate. We were unable to do that for now because many websites require payment to access their data, and our main priority is for users to access this information for free.
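A rough sketch of how the planned Google Sheet hand-off could look, assuming the gspread library and a service-account credentials file; nothing here is implemented yet, and the file name and spreadsheet title are placeholders:

```python
import gspread


def push_report_to_sheet(records: list[dict]) -> str:
    """Write each found record to a sheet so the user can mark opt-outs."""
    # Placeholder credentials file and spreadsheet name.
    client = gspread.service_account(filename="cyberopt-service-account.json")
    spreadsheet = client.open("CyberOPT Report")
    worksheet = spreadsheet.sheet1

    worksheet.append_row(["Source", "Record URL", "Opt out? (yes/no)"])
    for record in records:
        worksheet.append_row([record["source"], record["url"], ""])

    # Return the sheet link to show the user instead of emailing a report.
    return spreadsheet.url
```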
