Inspiration

We wanted to make streaming more interactive and effortless by automating common streamer actions with AI, computer vision, and NLP.

What it does

Cue Bot switches cameras based on eye movement, auto-clips hype moments from Twitch chat, and triggers sounds through gesture detection.
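The chat-based clipping can be thought of as spike detection on the chat message rate. A minimal sketch of that idea, assuming a sliding-window counter (the window length, threshold, and function names here are illustrative, not the project's actual values):

```python
from collections import deque

def make_hype_detector(window_seconds=10.0, threshold=25):
    """Flag a hype moment when the chat message rate spikes.

    window_seconds and threshold are illustrative defaults,
    not tuned values from Cue Bot itself.
    """
    timestamps = deque()

    def on_message(now):
        # Record the new message and drop any outside the window.
        timestamps.append(now)
        while timestamps and now - timestamps[0] > window_seconds:
            timestamps.popleft()
        # A clip is triggered when the window holds enough messages.
        return len(timestamps) >= threshold

    return on_message

detect = make_hype_detector(window_seconds=5.0, threshold=3)
print(detect(0.0))   # False: only 1 message in window
print(detect(2.0))   # False: 2 messages
print(detect(3.0))   # True: 3 messages within 5 s -> clip
print(detect(20.0))  # False: earlier messages aged out
```

When the detector fires, the bot would call the Twitch clip endpoint for the current broadcast.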

How we built it

We used Python with OpenCV and MediaPipe for camera and gesture analysis, integrated Twitch APIs for chat monitoring and clipping, and added audio triggers for real-time reactions.
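For the eye-based camera switching, MediaPipe Face Mesh reports normalized landmark coordinates, and the camera choice reduces to classifying where the iris sits between the eye corners. A hedged sketch of that classification step (the camera names, margin, and function signature are assumptions for illustration, not Cue Bot's actual code):

```python
def pick_camera(iris_x, eye_left_x, eye_right_x, margin=0.35):
    """Choose a camera from a horizontal iris position.

    All inputs are normalized [0, 1] landmark x-coordinates,
    as MediaPipe Face Mesh would report them. The margin and
    camera names are illustrative placeholders.
    """
    # Iris position within the eye: 0.0 = left corner, 1.0 = right corner.
    ratio = (iris_x - eye_left_x) / (eye_right_x - eye_left_x)
    if ratio < margin:
        return "cam_left"
    if ratio > 1.0 - margin:
        return "cam_right"
    return "cam_center"

print(pick_camera(0.30, 0.25, 0.45))  # cam_left  (ratio 0.25)
print(pick_camera(0.35, 0.25, 0.45))  # cam_center (ratio 0.50)
print(pick_camera(0.42, 0.25, 0.45))  # cam_right (ratio 0.85)
```

In the real pipeline, OpenCV supplies the frames, MediaPipe extracts the landmarks each frame, and a decision like this one drives the scene switch.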

Challenges we ran into

Syncing multiple real-time inputs, such as camera and audio, was difficult. Gesture-detection accuracy and the API integration also required a lot of debugging.

Accomplishments that we're proud of

We successfully automated camera control and chat-based clipping, and we integrated it with OBS and Twitch.

What we learned

We learned how to combine computer vision, real-time data, and APIs to build smarter, more responsive systems for creators.

What's next for Cue Bot

We plan to add facial emotion recognition and customizable gesture-to-action mapping for streamers.
