It was soul-crushing learning that classes at Cal were canceled for the foreseeable future. Losing access to friends was hard, and having no access to study spaces like libraries made me feel unmotivated and bored. The transition to online learning was tough, especially when it came to exams. I took my EE16B midterm alone while nearly all my friends collaborated on theirs. It was hard, to say the least, especially when I was at such a major comparative disadvantage. Not only that, but rumors of professors cracking down on cheating made me scared that I would get caught for an offense I never committed. I realized there needed to be some sort of accountability to ensure that learning is not compromised by students OR by instructors. My computer is a craptop that doesn't have much storage or RAM. I needed something lightweight that would not require ANY download, but it seemed like there was nothing out there of that kind.
What it does
Proctor.ml makes at-home testing possible through computer-vision-aided cheating detection and seamless real-time communication between the student and the teacher. No longer do students have to deal with constantly crashing Google Docs for clarifications, or worry about taking a bathroom break during the exam. Proctor.ml makes online learning transparent and allows students to actually communicate with their professors or teachers. The product consists of a UI that allows professors to create exams to proctor, each with a unique link. Students simply follow the link in their browser and keep the window open in the background during their exam. With no software download required, the instructor can now see the active status of their student. Currently, a student can be connected, not in frame, on a bathroom break, or flagged for having two people in frame.
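As a rough illustration, the statuses above could come from a simple rule over the connection state and the number of faces the detector sees in a frame. This is a hypothetical sketch - the function name and status strings are assumptions, not Proctor.ml's actual code:

```python
# Hypothetical sketch: mapping a student's connection state and the face
# count from the detector to the proctor-facing statuses. Names and status
# strings are illustrative assumptions, not Proctor.ml's real code.

def classify_status(connected: bool, faces_in_frame: int, on_break: bool = False) -> str:
    """Return the status an instructor would see for one student."""
    if not connected:
        return "disconnected"
    if on_break:
        return "bathroom break"
    if faces_in_frame == 0:
        return "not in frame"
    if faces_in_frame >= 2:
        return "two in frame"
    return "connected"
```

The priority order matters: a self-reported bathroom break should win over "not in frame", since the student intentionally left the camera's view.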
How I built it
Using pretrained models for face and eye detection, my partner Kathy and I developed a client-side and a server-side interface, both connected to our backend on Google Cloud, using websockets for real-time communication and SQL to manage our persistent data.
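For the real-time channel, one plausible shape for the traffic is a small JSON envelope per event, sent as the websocket message payload. The field names below (type, exam_id, student_id, status) are illustrative assumptions, not the actual wire format:

```python
import json

# Hedged sketch of a possible JSON envelope for status updates over the
# websocket. Field names are assumptions made for illustration.

def encode_status_update(exam_id: str, student_id: str, status: str) -> str:
    """Serialize one status update for the websocket channel."""
    return json.dumps({
        "type": "status_update",  # lets the server route by message kind
        "exam_id": exam_id,
        "student_id": student_id,
        "status": status,
    })

def decode_message(raw: str) -> dict:
    """Parse an incoming websocket payload, rejecting untyped messages."""
    msg = json.loads(raw)
    if "type" not in msg:
        raise ValueError("malformed message: missing 'type'")
    return msg
```

A "type" field on every message keeps the routing logic on the server to one dictionary lookup, which is handy once chat messages and break requests share the same socket.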
Challenges I ran into
Finding a library that could actually perform the face detection we needed was pretty difficult, as even popular libraries like OpenCV were dysfunctional and finicky. The real-time communication through websockets was tedious work, but worth it for the end product. A huge hurdle was negotiating which data would be managed through HTTP requests and which through our websocket connections. Definitely the most tiresome and annoying part of this project was setting up SSL. Modern browsers require SSL-secured connections for video capture, and setting up the actual product with SSL and a domain was tough. I am sitting here at 6 AM after eight hours of trying to get some simple signed certificates working.
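To sketch the SSL piece: once certificates are issued, the server needs a TLS context that loads them. The helper below is a minimal illustration using Python's standard ssl module - the function name is made up, and the fullchain.pem/privkey.pem paths in the comment follow Let's Encrypt's layout convention, not necessarily the exact setup we shipped:

```python
import ssl

# Hedged sketch: build a server-side TLS context and, if certificate paths
# are supplied, load them. Helper name and paths are illustrative only.

def make_tls_context(certfile=None, keyfile=None):
    """Create a server TLS context, optionally loading a cert chain."""
    ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
    # Modern browsers refuse old protocol versions for media capture pages.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    if certfile and keyfile:
        # e.g. /etc/letsencrypt/live/example.com/fullchain.pem and privkey.pem
        ctx.load_cert_chain(certfile, keyfile)
    return ctx
```

The same context object can then be handed to whatever serves the websocket and HTTP endpoints, so both channels ride the one certificate.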
Accomplishments that I'm proud of
Building a full-fledged web app with authentication, data persistence, and websockets is one thing - but fitting in computer vision with live media capture made it all the more satisfying when we finally typed https://proctor.ml into the browser and were greeted by our shiny new application.
What I learned
Kathy and I learned how important it is to whiteboard things out; it was only through our planning that we were able to get the product done. Specifically, I learned ALL too well not to underestimate how much time deploying the application would take - but I also know that next time I have to deploy an app behind SSL, I can use certbot to get free signed certificates without having to shell out hundreds to GoDaddy. I also learned how to use Google Cloud, which is honestly fantastic. Coming from AWS, it's clear that the cleaner interface and the ability to immediately connect to a browser-based terminal were big wins for my productivity.
What's next for Proctor.ml
Proctor.ml is going to continue its rapid growth by expanding directly to UC Berkeley and local high schools. I have initiated conversations with my alma mater, Bishop O'Dowd High School in Oakland, to see if they would want to use the service, as they have no proctoring service for their finals. I hope to email the Cal administration after the competition is over as well. The technology is going to continue to progress: I hope to add phone detection along with lip-movement tracking and computer-screen tracking as well. Given how much we accomplished in 36 hours, I have no doubt proctor.ml will be the most robust testing platform on the market.