Why we built it#
We noticed a lack of interactive webcasting tools in the education space and wanted to leverage ML to build a smarter webcasting service.
What it does#
Amnis is a web application that uses Google's Cloud Natural Language API to process audio from live webcasts and generate tags for each video. Users can then search these tags to find live videos on subjects that interest them. Each stream page also has a live comment box where viewers can ask questions or upvote the questions they find most relevant, and the webcaster can view the top questions and answer them.
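The tagging step described above can be sketched roughly as follows. This is a hedged, minimal illustration, not Amnis's actual code: it assumes entity results shaped like the response of the Cloud Natural Language API's entity analysis (a list of entities with a `name` and a `salience` score), mocks that response with plain dicts instead of a live API call, and uses made-up threshold values.

```python
# Sketch: turn entity-analysis results (mocked here as plain dicts, shaped
# like Cloud Natural Language entity output) into searchable video tags.
# The salience threshold and tag cap are illustrative assumptions.

def entities_to_tags(entities, min_salience=0.05, max_tags=10):
    """Deduplicate entity names, drop low-salience ones, rank by salience."""
    best = {}
    for ent in entities:
        name = ent["name"].lower()          # lowercase so search matches
        salience = ent.get("salience", 0.0)
        if salience < min_salience:
            continue                        # skip incidental entities
        best[name] = max(best.get(name, 0.0), salience)
    ranked = sorted(best.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, _ in ranked[:max_tags]]

# Mocked entity results for a physics lecture transcript:
sample = [
    {"name": "Quantum Mechanics", "salience": 0.62},
    {"name": "Schrodinger equation", "salience": 0.21},
    {"name": "quantum mechanics", "salience": 0.10},  # duplicate, merged
    {"name": "chalkboard", "salience": 0.01},          # below threshold
]
print(entities_to_tags(sample))
# → ['quantum mechanics', 'schrodinger equation']
```

In the real pipeline, the webcast audio would first have to be transcribed to text before entity analysis, since the Natural Language API operates on text.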
How it's built#
We built a working prototype with no outside help beyond documentation and starter code.
What we learned#
We learned how to work with APIs and how to stitch together many complicated components of an application into one cohesive unit.
What's next for Amnis#
We would love to grow Amnis into a platform that university students everywhere can use in their studies. As students ourselves, we understand the difficulties our peers face, and we want to make lectures a more accessible experience.