Inspiration

Stagesense was born from something personal. Public speaking has always been a challenge for me — not because I lack ideas, but because confidence, posture, and delivery don’t always come naturally under pressure. I realized many students and professionals face the same struggle, often without access to meaningful feedback or coaching. I wanted to build something that could act like an intelligent practice partner — one that helps you improve both what you say and how you present yourself.

What it does

Stagesense is an AI-powered public speaking coach that analyzes speech and posture during a live session.

It:

  • Tracks posture through webcam-based pose detection
  • Transcribes speech in real time
  • Identifies filler words and pacing patterns
  • Generates structured AI feedback after the session

The result is actionable insight into confidence, clarity, and overall presence.
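As a rough sketch of the filler-word and pacing analysis described above: count known filler tokens in a transcript chunk and derive a speaking rate. The filler list, metric names, and thresholds here are illustrative assumptions, not the actual Stagesense implementation.

```typescript
// Illustrative filler-word and pacing analysis over a transcript chunk.
// FILLERS and the returned metrics are assumptions for this sketch.
const FILLERS = new Set(["um", "uh", "like", "so", "basically", "actually"]);

interface DeliveryStats {
  wordCount: number;
  fillerCount: number;
  fillerRate: number; // fillers per 100 words
  wordsPerMinute: number;
}

function analyzeDelivery(transcript: string, durationSeconds: number): DeliveryStats {
  // Normalize: lowercase, split on whitespace, strip punctuation.
  const words = transcript
    .toLowerCase()
    .split(/\s+/)
    .map((w) => w.replace(/[^a-z']/g, ""))
    .filter((w) => w.length > 0);

  const wordCount = words.length;
  const fillerCount = words.filter((w) => FILLERS.has(w)).length;

  return {
    wordCount,
    fillerCount,
    fillerRate: wordCount ? (fillerCount / wordCount) * 100 : 0,
    wordsPerMinute: durationSeconds ? (wordCount / durationSeconds) * 60 : 0,
  };
}
```

Running this over each finalized transcript chunk keeps the per-session stats cheap enough to update live.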

How I built it

I built Stagesense as a React web application focused on accessibility and simplicity. It integrates:

  • Browser-based speech recognition
  • Webcam pose detection for posture analysis
  • Session recording and playback
  • An AI API that evaluates transcripts and generates feedback

After each session, the transcript is analyzed and returned as clear, categorized suggestions to help users reflect and improve.
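The posture side of the pipeline can be sketched as pure geometry over pose landmarks. This assumes landmarks in normalized [0, 1] image coordinates (as pose-detection libraries such as MediaPipe commonly emit); the landmark choices and thresholds are hypothetical, not the exact Stagesense logic.

```typescript
// Sketch of posture scoring from pose landmarks in normalized image
// coordinates. Thresholds and the headDrop heuristic are assumptions.
interface Landmark { x: number; y: number; }

interface PostureResult {
  shoulderTiltDeg: number; // 0 = level shoulders
  isSlouching: boolean;
}

function assessPosture(
  leftShoulder: Landmark,
  rightShoulder: Landmark,
  nose: Landmark,
  tiltThresholdDeg = 10,
  headDropThreshold = 0.12, // hypothetical minimum nose-to-shoulder gap
): PostureResult {
  // Angle of the shoulder line relative to horizontal.
  const dx = rightShoulder.x - leftShoulder.x;
  const dy = rightShoulder.y - leftShoulder.y;
  const shoulderTiltDeg = Math.abs((Math.atan2(dy, dx) * 180) / Math.PI);

  // In image coordinates y grows downward, so a small (shoulderY - noseY)
  // gap means the head has dropped toward the chest.
  const shoulderMidY = (leftShoulder.y + rightShoulder.y) / 2;
  const headDropped = shoulderMidY - nose.y < headDropThreshold;

  return {
    shoulderTiltDeg,
    isSlouching: shoulderTiltDeg > tiltThresholdDeg || headDropped,
  };
}
```

A check like this per video frame is cheap, so posture feedback can run continuously alongside transcription.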

Challenges I ran into

Real-time features were the biggest challenge. Coordinating webcam input, transcription, recording, and AI feedback required careful state management and debugging.

Transcription accuracy and keeping recorded playback in sync with the transcript were harder than expected. I also had to troubleshoot API key management and environment-configuration issues when moving between development setups.
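One way to frame the playback-sync problem: given timestamped transcript segments, find the segment active at the current playback position. The segment shape below is an assumption for this sketch, not the exact data model Stagesense uses.

```typescript
// Sketch: binary-search the transcript segment containing a playback time.
// The TranscriptSegment shape is an assumption for illustration.
interface TranscriptSegment {
  start: number; // seconds
  end: number;   // seconds, exclusive
  text: string;
}

// Returns the index of the segment active at `time`, or -1 if the
// position falls in a gap or outside the transcript.
function segmentAt(segments: TranscriptSegment[], time: number): number {
  let lo = 0;
  let hi = segments.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    const seg = segments[mid];
    if (time < seg.start) {
      hi = mid - 1;
    } else if (time >= seg.end) {
      lo = mid + 1;
    } else {
      return mid;
    }
  }
  return -1;
}
```

Wired to a video element's `timeupdate` event, `segmentAt(segments, video.currentTime)` gives the segment to highlight as the recording plays.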

Another challenge was differentiation. I had to clearly define what made Stagesense unique — the combination of posture awareness and speech coaching.

Accomplishments that I'm proud of

I’m proud that Stagesense is a fully working system that merges physical presence feedback with AI-driven speech analysis.

Under hackathon time pressure, I:

  • Built a complete live-to-analysis workflow
  • Integrated multiple real-time systems
  • Focused on core value instead of overengineering

Most importantly, I turned a personal challenge into something functional and meaningful.

What I learned

I learned how to manage asynchronous APIs, real-time browser features, and environment variables more effectively. I also improved my frontend state management and debugging skills.

Beyond technical growth, I learned that strong products come from clarity. Solving one problem well is more powerful than building many features halfway.

What's next for Stagesense

Next, I’d like to expand Stagesense with:

  • Interview and pitch-specific training modes
  • Progress tracking across sessions
  • More advanced gesture and movement analytics
  • Personalized improvement plans

Stagesense started as a hackathon project, but I see it growing into a platform that helps people practice confidence in a structured, measurable way.
