Communication Track, Under 18 Division

Solo: Neshanth Anand (Discord: gunmetalpoem)

Slideshow Link

https://docs.google.com/presentation/d/15jKeemTzV7xgBCm9VcejzE65PZahwG1aIzIZBFjwx6I/edit?usp=sharing

Inspiration 💡

Sometimes I make videos and give presentations that seem to flow one way in my mind, but the experience turns out completely different for everyone else. What if you could understand your audience? Vibe was inspired by the growing importance of video communication in today's digital age. Recognizing that emotions play a pivotal role in how videos are perceived and understood, I set out to give content creators, educators, and communicators a tool that could unravel the emotional layers within their videos. The idea stemmed from a desire to strengthen the emotional connection between video content and its audience, whether in virtual classrooms, on social media, or in professional presentations. By integrating emotion analysis technology, Vibe offers a unique window into the emotional dynamics of video content, empowering users to create more impactful and engaging videos.

What it does ⚙️

Vibe is an app that performs frame-by-frame emotion analysis of MP4 videos. Using the pre-trained VGG-19 model, it identifies and categorizes the dominant emotion in each frame. The analysis is presented through intuitive visualizations such as pie charts and scatter plots, letting users quickly grasp the emotional vibe of their videos. Vibe's custom star-rating system lets users focus on specific emotions and see their intensity rated, providing deeper insight into emotional trends. The app generates a downloadable report for each analysis, making it a valuable tool for content creators, educators, and anyone seeking to improve their video communication skills. Whether gauging student engagement in virtual classrooms, refining public speaking techniques, or checking emotional coherence during video editing, Vibe offers a fresh approach to understanding and improving the emotional impact of videos.

How I built it 🏗️

The heart of the app is the VGG-19 model (loaded through TensorFlow/Keras), known for its high accuracy in image classification, which I used for emotion detection. I processed the video input with OpenCV, extracting frames and preparing each image for the model. The user interface was built with Streamlit, offering an intuitive experience for uploading videos and interacting with the app.
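The analysis loop can be sketched roughly as follows. This is a hypothetical reconstruction, not Vibe's actual code: the frame-sampling and aggregation logic is shown in plain Python, with the OpenCV capture loop and the fine-tuned VGG-19 call stubbed out (in the real app, `cv2.VideoCapture(...).read()` supplies the frames and `model.predict` labels each one). The emotion label set and the one-frame-per-second sampling rate are assumptions.

```python
from collections import Counter

# Hypothetical label set; the real class order depends on how VGG-19
# was fine-tuned for emotion recognition.
EMOTIONS = ["angry", "happy", "neutral", "sad", "surprised"]

def sample_frame_indices(total_frames: int, fps: float,
                         analyzed_per_second: int = 1) -> list[int]:
    """Pick evenly spaced frame indices, e.g. one frame per second of video,
    so long clips don't require classifying every single frame."""
    step = max(1, round(fps / analyzed_per_second))
    return list(range(0, total_frames, step))

def aggregate_emotions(frame_labels: list[str]) -> dict[str, float]:
    """Turn per-frame predictions into the prevalence shares that
    feed the pie chart (fraction of analyzed frames per emotion)."""
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return {emotion: counts.get(emotion, 0) / total for emotion in EMOTIONS}
```

For a 10-second clip at 30 fps (300 frames), `sample_frame_indices(300, 30.0)` selects every 30th frame, and the resulting labels are reduced to per-emotion shares for visualization.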

For visualizing the emotion analysis, I incorporated Plotly for dynamic, interactive charts such as pie charts and scatter plots. The app features a custom star-rating system based on how prevalent each emotion is in the video, and a report-generation function that compiles the analysis and lets users download it as a CSV file. The combination of these technologies made Vibe a user-friendly and efficient tool for video emotion analysis. For the front end and deployment I used Streamlit and its widget code blocks to build a clean, functional UI.
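As a rough sketch of how the star rating and CSV report could fit together (the rating formula and the column names here are my own illustrative choices, not necessarily what Vibe uses; in the app, the CSV string would be handed to Streamlit's `st.download_button`):

```python
import csv
import io

def star_rating(prevalence: float, max_stars: int = 5) -> int:
    """Map an emotion's share of analyzed frames (0.0-1.0) to a 1-5 star
    intensity. Illustrative formula; the real app's thresholds may differ."""
    return max(1, min(max_stars, round(prevalence * max_stars)))

def build_report_csv(emotion_shares: dict[str, float]) -> str:
    """Compile the analysis into a CSV string for download."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["emotion", "prevalence", "stars"])
    for emotion, share in sorted(emotion_shares.items()):
        writer.writerow([emotion, f"{share:.2f}", star_rating(share)])
    return buf.getvalue()
```

Keeping the report as an in-memory string means no temporary files are needed: Streamlit's download widget accepts string or bytes data directly.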

Challenges I ran into 🛑

Some challenges I ran into included setting up the VGG-19 model and the logic for frame-by-frame analysis. After that, I had a tough time deploying the app through Streamlit's GitHub deployment pathway: packages weren't being found, and I learned that dependencies must be listed in a file named "requirements.txt" so the build can install them. Once the base app worked, adding more features and settings was straightforward.
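For reference, Streamlit's cloud deployment looks for a `requirements.txt` at the repository root listing each package by its PyPI name. For this stack it might look like the following (the exact entries and any version pins are illustrative; `opencv-python-headless` is a common choice on headless servers, since the full `opencv-python` wheel expects GUI libraries that cloud hosts often lack):

```
streamlit
tensorflow
opencv-python-headless
plotly
numpy
```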

What's next for Vibe 🔭

Real-Time Analysis Extension: Transform Vibe into a browser extension for live emotion analysis during video playback or editing.

User Forum: Create a platform for users to exchange tips on effectively portraying each of the detected emotions.

Audio Detection Integration: Enhance emotion detection accuracy by incorporating audio cues, like matching crying sounds with sad expressions.

Built With

python · tensorflow · opencv · streamlit · plotly