Inspiration

We were exploring Hume's API, which identifies emotions from facial expressions as well as from speech patterns and language. This inspired us to create an app that analyzes a video recording and gives feedback, which is helpful for interview preparation.

What it does

The app takes in information about the candidate and suggests interview questions to answer. The candidate records video answers, which are then analyzed using Hume's API, integrated with OpenAI's API for sentence corrections. The app then returns a report identifying the top five positive and top five negative emotions, along with suggested sentences the candidate can use to improve their response.
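The top-five ranking step can be sketched as below. The `VALENCE` map (which emotions count as positive vs. negative) and the shape of the score dictionary are illustrative assumptions for this sketch, not Hume's actual response format or emotion taxonomy.

```python
# Sketch: split averaged emotion scores by valence and keep the top five of each.
# VALENCE and the input dict shape are assumptions, not Hume's actual format.

# Hypothetical valence labels for a handful of emotions.
VALENCE = {
    "Calmness": "positive", "Joy": "positive", "Interest": "positive",
    "Confidence": "positive", "Satisfaction": "positive", "Pride": "positive",
    "Anxiety": "negative", "Doubt": "negative", "Awkwardness": "negative",
    "Distress": "negative", "Boredom": "negative", "Confusion": "negative",
}

def top_emotions(scores: dict[str, float], n: int = 5):
    """Rank emotion scores and return the top n positive and top n negative."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    positive = [(e, s) for e, s in ranked if VALENCE.get(e) == "positive"][:n]
    negative = [(e, s) for e, s in ranked if VALENCE.get(e) == "negative"][:n]
    return positive, negative
```

The same helper works for both the facial-expression scores and the language scores, since both reduce to a name-to-score mapping.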

How we built it

We built the backend in Python and the front end in React. We wrote functions that generate the different parts of the report: analyzing facial emotions, analyzing language emotions, and returning the transcript of the video. The front end lets the user record video responses and generates sample interview questions based on their inputs.
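The report assembly on the backend can be sketched as follows. The three helper functions are hypothetical stand-ins for the real Hume and OpenAI calls, with canned return values so the sketch runs on its own.

```python
# Sketch of how the backend assembles the report. The helpers below are
# hypothetical placeholders for the real Hume/OpenAI calls.

def analyze_facial_emotions(video_path: str) -> dict[str, float]:
    # Placeholder: the real app sends the video to Hume's expression analysis.
    return {"Calmness": 0.7, "Anxiety": 0.3}

def analyze_language_emotions(transcript: str) -> dict[str, float]:
    # Placeholder: the real app sends the transcript to Hume's language analysis.
    return {"Interest": 0.6, "Doubt": 0.2}

def suggest_corrections(transcript: str) -> list[str]:
    # Placeholder: the real app asks OpenAI for sentence-level corrections.
    return ["Consider a more concise opening sentence."]

def build_report(video_path: str, transcript: str) -> dict:
    """Combine facial analysis, language analysis, transcript, and suggestions."""
    return {
        "facial_emotions": analyze_facial_emotions(video_path),
        "language_emotions": analyze_language_emotions(transcript),
        "transcript": transcript,
        "suggestions": suggest_corrections(transcript),
    }
```

Keeping each part of the report behind its own function made it easy to develop and test the Hume and OpenAI integrations independently.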

Challenges we ran into

Integrating video recording into the app and connecting the front end to the back end.

Accomplishments that we're proud of

We got the back end working fairly quickly with Hume's and OpenAI's APIs, so the application can already analyze a video and produce a full analysis report.

What we learned

Full stack development, including integration of new APIs.

What's next for AI Interview Coach

Adding more functionality, such as analyzing several videos and tracking how a candidate's interview answers improve over time.
