As students, it's incredibly difficult to write down or even photograph every lecture slide a professor goes over. To solve this, we built an easy-to-use application that digitizes the notes that clutter your camera roll but never get read, displaying them as editable digital notes in a web application backed by a Firebase database.
What it does
Using machine learning, SnapNotes parses the notes on PowerPoint lecture slides and whiteboards: a student takes a photo, and SnapNotes extracts the relevant text from it. The captured notes are then stored in a Firebase database that organizes them and allows students to access their notes.
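As a rough illustration of the parsing step, the sketch below shows how raw OCR output (assumed to have already been returned by a text-detection service such as Google Cloud Vision) might be cleaned into individual note lines. The `parse_notes` function name and the noise-filtering heuristic are our own assumptions for this example, not SnapNotes' actual implementation.

```python
def parse_notes(raw_text: str) -> list[str]:
    """Split raw OCR output into cleaned note lines.

    Blank lines and very short fragments (under 3 characters) are
    treated as OCR noise and dropped -- a heuristic chosen for this
    sketch, not necessarily what SnapNotes does.
    """
    lines = [ln.strip() for ln in raw_text.splitlines()]
    return [ln for ln in lines if len(ln) >= 3]
```

The cleaned list could then be written to the database keyed by course or lecture, so each photo becomes a set of searchable note lines rather than an opaque image.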
How we built it
We used Swift to build our mobile application and Flask, a Python-based microframework, to build our web app. For our computer vision model, we used the Google Cloud Platform, and we used Firebase for our backend.
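A minimal sketch of what a Flask route serving stored notes could look like. The in-memory `NOTES` dictionary here is a stand-in for the Firebase backend, and the route name and data shape are illustrative assumptions rather than SnapNotes' real API.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory store standing in for the Firebase database.
NOTES = {"cs101": ["Slide 1: Intro to algorithms"]}

@app.route("/notes/<course>")
def get_notes(course):
    # Return the note lines for a course, or an empty list if none exist.
    return jsonify(NOTES.get(course, []))
```

In the real app, the lookup would query Firebase instead of a local dictionary, but the request/response shape of the web layer stays the same.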
Challenges we ran into
We had problems with the machine learning models: their accuracy was limited, so inputs like handwriting were hard to detect with great precision.
Accomplishments that we're proud of
- Brandon and Kevin were able to learn how to use Photoshop for UI/UX design
- Harris helped develop a design theme that unified the look and feel across platforms, and also recreated James' Google Cloud text parser in a Python-based web app.
- James was able to get our computer vision working after it had been throwing errors.
What we learned
We learned how to manage time and resources during the 12-hour hackathon, since we needed to decide what to drop from and what to add to our application.
What's next for SnapNotes
Adding more features to the mobile application beyond just taking a photo. We'd also like to add user authentication so that multiple users can be on the app at once.