Inspiration

We dream about making the world a better and fairer place for everybody. We found it very frustrating to skim through lectures to search for specific parts. In addition, TUM Live doesn't provide subtitles for the hearing impaired.

What it does

We transcribe TUM Live lectures and provide an easy-to-use interface for searching keywords. We also generate subtitles in multiple languages to aid non-native German speakers. To make this possible, we have created extensions for the Firefox and Chrome browsers.
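
A minimal sketch of how such a keyword search could be served to the extensions, assuming a Flask backend and an in-memory list of timestamped transcript segments (the segment structure and endpoint name are illustrative, not our exact implementation):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Illustrative transcript store: one entry per transcribed segment,
# with start/end timestamps in seconds.
SEGMENTS = [
    {"start": 12.0, "end": 18.5, "text": "Heute sprechen wir über Hashtabellen."},
    {"start": 95.0, "end": 101.2, "text": "Eine Hashfunktion bildet Schlüssel auf Indizes ab."},
]

@app.route("/search")
def search():
    query = request.args.get("q", "").lower()
    hits = [s for s in SEGMENTS if query and query in s["text"].lower()]
    return jsonify(hits)  # the extension can jump the player to hit["start"]

if __name__ == "__main__":
    app.run(debug=True)
```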

How we built it

We use state-of-the-art AI models by OpenAI to transcribe the lectures. We achieve this by supplying the model with an audio file of the lecture recording. For our project, we use a microservice architecture to offer a flexible and scalable infrastructure.
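
A minimal sketch of the transcription step, assuming OpenAI's Whisper model via the official openai Python client (file name, model choice, and output handling are illustrative):

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Send the lecture audio to the transcription endpoint and request
# per-segment timestamps so we can build subtitles and keyword search.
with open("lecture_audio.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
        response_format="verbose_json",
    )

for segment in transcript.segments:
    print(f"[{segment.start:7.1f}s - {segment.end:7.1f}s] {segment.text}")
```

In a microservice setup, a script along these lines would run as its own worker service, writing the segments to the database that the search and subtitle services read from.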

Challenges we ran into

Sleep. No, seriously.

Accomplishments that we're proud of

We are proud of our design, which fits the TUM Live platform, and of achieving our goal in such a short timeframe.

What we learned

Most importantly, we learned to never give up and to believe in ourselves. We gained insights into Flask, web development, AI, and database systems.

What's next for Wordless

We want to get the browser extension into the browsers' extension stores. We also plan to analyse the lecture slides so that their content becomes searchable as well.
