Inspiration

What it does

Summarises a paragraph of notes into a few sentences, helping the user grasp the meaning more easily and in less time.

How we built it

We first developed the

Challenges we ran into

Learning how to use LSTMs in Seq2Seq architectures proved challenging, and a lot of research was required to understand how they worked before we could implement one.

Accomplishments that we're proud of

Managing to understand how LSTMs work within Seq2Seq architectures, and coding up a basic version of one.

What we learned

How LSTMs and Seq2Seq architectures work.
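The core idea we had to understand is the LSTM cell itself: at each time step it mixes the new input with its previous hidden and cell states through input, forget, and output gates. A minimal NumPy sketch of one LSTM step (weight shapes and gate ordering are our illustrative choices, not the project's actual code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.
    x: input vector (d,); h_prev, c_prev: previous hidden/cell states (n,)
    W: (4n, d) input weights; U: (4n, n) recurrent weights; b: (4n,) bias.
    Gates are stacked in the order [input, forget, candidate, output].
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:n])        # input gate: how much new content to write
    f = sigmoid(z[n:2*n])      # forget gate: how much old cell state to keep
    g = np.tanh(z[2*n:3*n])    # candidate cell content
    o = sigmoid(z[3*n:4*n])    # output gate: how much cell state to expose
    c = f * c_prev + i * g     # new cell state
    h = o * np.tanh(c)         # new hidden state
    return h, c

# Tiny demo: run a 3-step sequence through one cell with random weights.
rng = np.random.default_rng(0)
d, n = 4, 3
W = rng.normal(size=(4 * n, d))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
for t in range(3):
    h, c = lstm_step(rng.normal(size=d), h, c, W, U, b)
```

In a Seq2Seq summariser, an encoder runs steps like this over the input notes and hands its final `(h, c)` to a decoder LSTM, which generates the summary one token at a time.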

What's next for Revision Notes Reducer

Train our model over a larger data set to improve the accuracy of the predictions

Built With

