Inspiration

For Americans (and probably students everywhere), filling in those bubble-sheet Scantron tests is a familiar memory. The nightmare of a stray mark outside the bubble haunts this team to this very day, and we still have to deal with these sheets on occasion. However, instructors generally prefer short-response answers over arbitrary multiple choice: they show evidence of actual thought, and they give instructors a way to award partial credit. The problem is that short-answer questions take time to grade. With anywhere from thirty to five hundred students in a class, and anywhere from a few to several dozen short-answer questions per exam per person, grading them manually is nearly impossible.

What it does

It strips the manual grading out of short-response answers. As a proof of concept, a phone app reads a student's answers and compares them against systematically generated math questions with precalculated answers.

How we built it

Questions are generated systematically by a Flask web server running on Google App Engine, using a custom-modified version of the sympy library that screens out "impossible" questions (undefined answers, no real roots, and so on). The answers to those questions are precalculated alongside them. An Android app connects to the server and requests the quiz.
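The generation-and-filtering step could be sketched like this. This is a minimal illustration assuming simple quadratic questions; the function name and value ranges are our own, and the actual modified sympy code handles more cases:

```python
# Hypothetical sketch: generate random quadratics with sympy and discard
# "impossible" ones (complex roots), so every question shipped to the app
# has a precalculated real answer.
import random
import sympy

x = sympy.Symbol('x')

def generate_question(rng):
    """Return (question_text, answers) for a random quadratic with real roots."""
    while True:
        b = rng.randint(-9, 9)
        c = rng.randint(-9, 9)
        expr = x**2 + b*x + c
        roots = sympy.solve(expr, x)
        # Filter step: complex roots mean there is no real answer to grade.
        if roots and all(r.is_real for r in roots):
            return f"Solve {expr} = 0 for x", sorted(float(r) for r in roots)

question, answers = generate_question(random.Random(0))
```

The server would serialize the question text into the quiz payload and keep the precalculated answers on the server side for grading.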

The quiz is returned, and the student has a time limit to complete it. They can move back and forth between questions, but they must take a picture of their paper for each answer. The student's answer is then detected and compared statistically to the "true" answer.
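The grading comparison might look something like this minimal sketch, which uses a plain numeric tolerance as a stand-in for the statistical comparison; the function name and tolerance value are our own illustration:

```python
# Hypothetical sketch of the grading step: parse the OCR'd answer string
# and compare it to the precalculated answer within a tolerance.
def grade(detected: str, expected: float, tol: float = 1e-2) -> bool:
    """Return True if the detected answer matches the expected answer."""
    try:
        value = float(detected.strip())
    except ValueError:
        return False  # unreadable or non-numeric detection gets no credit
    return abs(value - expected) <= tol
```

A tolerance-based comparison lets slightly imprecise handwriting recognition (say, a truncated decimal) still earn credit, which is the same goal the statistical comparison serves.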

Challenges we ran into

Originally, we wanted hand-drawn answers to be read by a Raspberry Pi or another board that would interact with the server. We were not able to get such a device, so we quickly pivoted to an Android app; the intent is still a Pi, or a custom device, and a mock-up was made. On multiple occasions, the official documentation for Google Cloud and Android apps was either contradictory or misleading, and many intended features had to be scrapped in order to submit on time.

Accomplishments that we're proud of

Quickly learning to build said libraries and getting anything on Google Cloud to work at all, despite the contradictory information from official sources and the severe open issues left unresolved online.

What we learned

Teamwork, handling stress and a lack of sleep, and being unintentionally led down the GCP rabbit hole.

What's next for ParseTron

Moving on to both simpler and more complex mathematical expressions, a locked-down custom device to further prevent cheating, and maybe even other disciplines such as English essays (detecting grammatical composition via natural-language-processing techniques). Most significantly, migrating to an Azure or AWS setup, which would not have to deal with any strange Google Cloud subtleties.
