Inspiration

We were inspired by the idea of someone crashing a lecture to learn something new, stepping out of their comfort zone, and we wanted to make it easier for them to follow along. CrashFacts encourages learning by making it less troublesome to find facts about what you hear. It can also help students who feel confused and stressed in their own classes when they don't understand what's going on.

What it does

CrashFacts transcribes lecture audio in real time and identifies and defines key vocabulary and complex terms. It also surfaces further relevant information from online and class resources, e.g. Wikipedia and class notes (which users can upload to the web app). The web app doubles as a repository for efficient review of the key concepts taught in lectures.

How we built it

CrashFacts was built with JavaScript and React, using Microsoft Azure Cognitive Services for live speech-to-text and text analysis. We also use Firebase for file storage and user persistence.
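
Under the hood, live transcription can be wired up through the Speech SDK's continuous recognition API. Below is a minimal sketch assuming the `microsoft-cognitiveservices-speech-sdk` npm package and a Create React App style environment; the function name, environment variables, and callbacks are illustrative rather than our exact code.

```js
import * as sdk from "microsoft-cognitiveservices-speech-sdk";

// Placeholder credentials -- assumed to come from environment variables.
const SPEECH_KEY = process.env.REACT_APP_SPEECH_KEY;
const SPEECH_REGION = process.env.REACT_APP_SPEECH_REGION;

// Start continuous recognition from the default microphone and stream
// partial and final transcripts to the provided callbacks.
export function startLiveTranscription(onPartial, onFinal) {
  const speechConfig = sdk.SpeechConfig.fromSubscription(SPEECH_KEY, SPEECH_REGION);
  speechConfig.speechRecognitionLanguage = "en-US";

  const audioConfig = sdk.AudioConfig.fromDefaultMicrophoneInput();
  const recognizer = new sdk.SpeechRecognizer(speechConfig, audioConfig);

  // Fires repeatedly with interim hypotheses while the speaker is talking.
  recognizer.recognizing = (_sender, event) => onPartial(event.result.text);

  // Fires once per utterance with the finalized transcript.
  recognizer.recognized = (_sender, event) => {
    if (event.result.reason === sdk.ResultReason.RecognizedSpeech) {
      onFinal(event.result.text);
    }
  };

  recognizer.startContinuousRecognitionAsync();
  return recognizer; // caller can later call stopContinuousRecognitionAsync()
}
```

The recognizer hands back both interim and finalized text, so the React UI can show a live caption while only the finalized chunks get passed on for analysis.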

Challenges we ran into

Our UI and application logic were quite complex: we had to perform live transcription, then extract chunks of text to run textual analysis on. Certain terms were also problematic to get entity definitions for.
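
For the analysis step, finalized transcript chunks are sent to the Text Analytics entity-linking endpoint, which is where some terms failed to resolve to a definition. Here is a rough sketch, assuming the `@azure/ai-text-analytics` package; the endpoint, key, and function name are placeholders.

```js
import { TextAnalyticsClient, AzureKeyCredential } from "@azure/ai-text-analytics";

// Placeholder endpoint and key for the Text Analytics resource.
const client = new TextAnalyticsClient(
  "https://<your-resource>.cognitiveservices.azure.com/",
  new AzureKeyCredential("<text-analytics-key>")
);

// Take a chunk of finalized transcript and return linked entities
// (term name, source, and URL) to display as definitions in the UI.
export async function extractKeyTerms(transcriptChunk) {
  const [result] = await client.recognizeLinkedEntities([transcriptChunk]);
  if (result.error) return [];

  return result.entities.map((entity) => ({
    term: entity.name,
    source: entity.dataSource, // typically "Wikipedia"
    url: entity.url,
  }));
}
```

When the service returns no linked entity for a niche or course-specific term, nothing can be shown, which is why some vocabulary was hard to define automatically.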

Accomplishments that we're proud of

We managed to build what we had planned :) We are especially excited about what we've built because it is highly practical and we will be using it ourselves!

What we learned

We learned about many APIs and successfully integrated the relevant ones into our project. The development of this project also cemented our knowledge of React.

What's next for CrashFacts

We intend to expand its capabilities with facial recognition features that would let the web app gauge the user's understanding of the content and flag the corresponding sections for further review.
