We use notetaking apps every day to take lecture notes, work on problem sets, and share idea sketches. Unfortunately, today's notetaking apps are little more than digital paper, making our tablets essentially thousand-dollar notebooks. We set out to empower our notes with a digital assistant that could save us time on homework through features like inline definition look-up, LaTeX formatting, algebra correction, and live note-sharing. (We accomplished the first two.) The goal was to build our own notetaking app and use the tablet's processing power and internet connection to give us, and students like us, a 21st-century notetaking experience.
What it does
Our app lets you draw and take notes like any other notetaking app, but its core difference is that it uses the powerful MyScript handwriting recognition API to convert your handwriting into text or LaTeX. This lets Notelet compile a LaTeX project from your notes by converting blocks of handwriting into text and math, and it enables more advanced features: users can select words to look them up in the native Apple dictionary, or select math expressions to evaluate or plot inline via Wolfram Alpha.
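To make the Wolfram Alpha step concrete, here is a minimal sketch of how a selected expression could be turned into a request against Wolfram Alpha's public Short Answers endpoint. The endpoint is real, but the helper name and the placeholder app ID are ours, and this is an illustration rather than Notelet's actual networking code:

```python
from urllib.parse import urlencode

# Wolfram Alpha's Short Answers API endpoint (plain-text answers).
WOLFRAM_SHORT_ANSWERS = "https://api.wolframalpha.com/v1/result"

def build_query_url(expression: str, app_id: str) -> str:
    """Build a Short Answers request URL for a user-selected math expression."""
    # urlencode handles percent-escaping characters like '^' and spaces.
    params = urlencode({"appid": app_id, "i": expression})
    return f"{WOLFRAM_SHORT_ANSWERS}?{params}"

# Example: the URL for an expression the user circled in their notes.
url = build_query_url("integrate x^2 dx", "DEMO-APPID")
print(url)
```

Fetching that URL (with a real app ID) returns a plain-text answer that can be rendered inline next to the handwriting.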
How we built it
We built it by experimenting with the MyScript and Wolfram Alpha APIs, and by listening to feedback from students who share the same frustrations.
Challenges we ran into
- Using the Rev.ai API for voice transcription
This was more difficult than anticipated: we couldn't figure out how to encode voice recordings into HTTP POST requests. With a few more hours, we probably could have added it. It would have been an awesome feature, automatically transcribing lectures and providing searchable audio.
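The step we got stuck on, encoding an audio file into a POST body, comes down to building a multipart/form-data payload. Below is a stdlib-only sketch of that encoding. The "media" field name is our reading of how file-upload APIs like Rev.ai's expect the audio, not a verified detail, and the helper is illustrative:

```python
import uuid

def encode_multipart(field: str, filename: str, content_type: str, data: bytes):
    """Encode one file as a multipart/form-data body plus matching headers."""
    # A random boundary string separates parts of the body.
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        f"Content-Type: {content_type}\r\n\r\n"
    ).encode() + data + f"\r\n--{boundary}--\r\n".encode()
    # The boundary must also appear in the request's Content-Type header.
    headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}
    return body, headers

# Example: wrapping raw WAV bytes for upload ("media" field name is assumed).
body, headers = encode_multipart("media", "lecture.wav", "audio/wav", b"\x00\x01fake")
```

The resulting `body` and `headers` could then be sent with `urllib.request.Request(url, data=body, headers=headers)` against the transcription endpoint.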
Accomplishments that we're proud of
- Fully renderable LaTeX document produced from handwritten stroke document
- Dictionary word look-up
- Math plotting and expression evaluation using the Wolfram Alpha API
What we learned
Notetaking is a delicate service to build, but not an overly difficult one. The key difficulty is deciding what constraints to put on user interaction. Because we rely on the user to select the blocks of handwriting we recognize, we traded automatic, asynchronous detection for something much easier to implement.
What's next for Notelet
Big picture, we want to build a fully functional beta with a smooth notetaking experience that we can put in the hands of students, so we can learn what works, what doesn't, and which additional features would help them. We envision Notelet as an IDE of sorts for problem sets and notetaking, and we hope to see how students respond to that vision in the form of a concrete, tablet-based experience.
Additionally, we hope to build out the drawing portion of the app a bit more and provide a more organized way of saving important lecture definitions. It would be awesome to create definitions with associated drawings (e.g., pairing "mitochondria" with a picture of the organelle).
In terms of accessibility, we'd love to expand the voice-as-input portion so that more students can use the functionality we are bringing to this app. With some more thought, we might even be able to turn lecture audio into mini games to make sitting in class more engaging!