Inspiration
Getting better at a sport has always required access to a great coach. It takes someone who can watch you move, identify subtle flaws in your technique, and give you a personalized drill to fix them. That kind of feedback is expensive, hard to come by, and certainly not available at 11 pm when you're reviewing footage from today's match.
We built PlayLog because we believe every athlete deserves a world-class coaching tool in their pocket. With the convergence of TwelveLabs and large language models, we saw an opportunity to build something that can understand your game and help take it to the next level.
What it does
- Upload — Record and upload a video clip through the web app
- Index — TwelveLabs indexes the video for deep understanding
- Analyze — Sport-specific prompts extract insights about your technique
- Coach — Google Gemini synthesizes everything into strengths, improvements, and drills
- Ask — Have an insight-based conversation with an AI coach grounded in your actual footage
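The five stages above can be sketched as one typed pipeline. This is a hypothetical shape for illustration only; the names and types are our assumptions, not PlayLog's actual API.

```typescript
// Hypothetical shape of the coaching pipeline; names and types are
// illustrative assumptions, not PlayLog's real interfaces.
type Insight = { area: string; note: string };
type Feedback = { strengths: string[]; improvements: string[]; drills: string[] };

interface CoachingPipeline {
  upload(file: Blob): Promise<string>;                         // store clip, return a video ID
  index(videoId: string): Promise<void>;                       // TwelveLabs indexing
  analyze(videoId: string, sport: string): Promise<Insight[]>; // sport-specific prompts
  coach(insights: Insight[]): Promise<Feedback>;               // Gemini synthesis
  ask(question: string, history: string[]): Promise<string>;   // chat grounded in the footage
}
```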
How we built it
- Frontend: React + Vite + TypeScript, Tailwind CSS, hosted on Vercel
- Backend: Convex for database, file storage, and real-time reactive queries
- Video Intelligence: TwelveLabs API (v1.3) with Pegasus 1.2
- Coaching AI: Google Gemini 3.1 for structured feedback and multi-turn chat
- Computer Vision: MediaPipe running in-browser for real-time pose landmark extraction
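To give a flavor of the pose-derived signals that can be handed to the language model, here is a hypothetical helper (not from the PlayLog codebase) that computes the angle at a joint from three MediaPipe-style landmarks:

```typescript
// Hypothetical helper: angle (in degrees) at joint b, formed by landmarks a-b-c.
// Landmarks are assumed to be normalized {x, y} points, as MediaPipe emits.
type Landmark = { x: number; y: number };

function jointAngle(a: Landmark, b: Landmark, c: Landmark): number {
  const u = { x: a.x - b.x, y: a.y - b.y }; // vector b -> a
  const v = { x: c.x - b.x, y: c.y - b.y }; // vector b -> c
  const dot = u.x * v.x + u.y * v.y;
  const mag = Math.hypot(u.x, u.y) * Math.hypot(v.x, v.y);
  return (Math.acos(dot / mag) * 180) / Math.PI;
}

// e.g. jointAngle(shoulder, elbow, wrist) approximates elbow extension
```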
Challenges we ran into
TwelveLabs v1.3 API migration: Field names changed silently between versions.
name became index_name in list responses, and engines became models in
index creation. The API still returned 200s with empty data, so our app was
spinning up duplicate indexes on every call.
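Our fix was essentially defensive reading of both field spellings before deciding whether an index already exists. A simplified sketch, where the field names reflect what we observed across v1.2 and v1.3 and should be treated as assumptions:

```typescript
// Simplified sketch: tolerate both the legacy ("name") and v1.3 ("index_name")
// spellings when checking whether an index already exists.
type RawIndex = { _id?: string; id?: string; name?: string; index_name?: string };

function findIndex(indexes: RawIndex[], wanted: string): string | null {
  for (const ix of indexes) {
    const name = ix.index_name ?? ix.name; // v1.3 spelling first, then legacy
    if (name === wanted) return ix._id ?? ix.id ?? null;
  }
  return null; // only now is it safe to create a new index
}
```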
Authentication architecture: We hit a wall when we realized our Convex backend
had no auth provider wired up at all. The ConvexProvider was missing from the
React tree entirely, meaning every query and mutation was silently failing.
Mutations were also accepting userId as a raw client argument, a security hole
that forced us to rethink our identity flow.
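The fix was to wire ConvexProvider at the React root and derive identity on the server. A minimal sketch of the server-side half, assuming the shape of Convex's ctx.auth.getUserIdentity(); the AuthCtx type here is a stand-in, not the real Convex context type:

```typescript
// Sketch: derive the user ID from the authenticated context instead of
// trusting a userId argument sent by the client. AuthCtx is a stand-in
// for the ctx object Convex passes to queries and mutations.
type AuthCtx = {
  auth: { getUserIdentity: () => Promise<{ subject: string } | null> };
};

async function requireUserId(ctx: AuthCtx): Promise<string> {
  const identity = await ctx.auth.getUserIdentity();
  if (identity === null) throw new Error("Not authenticated");
  return identity.subject; // stable per-user ID from the auth provider
}
```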
Accomplishments that we're proud of
The integration between MediaPipe and TwelveLabs is something we're extremely proud of. Pose landmarks extracted entirely in-browser, combined with TwelveLabs' semantic video understanding, gives Gemini a more complete picture of athlete performance than either system could produce alone.
The conversational coaching layer is another piece we're proud of. The AI coach maintains full context across the session: it remembers your video, your feedback, and your entire conversation history.
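One simple way to keep that context is to replay the session into each model call. A hypothetical sketch, not the actual PlayLog prompt format:

```typescript
// Hypothetical sketch: fold the video feedback and chat history into one
// prompt so each model call stays grounded in the session. The format is
// illustrative only.
type Turn = { role: "user" | "coach"; text: string };

function buildCoachPrompt(videoSummary: string, history: Turn[], question: string): string {
  const transcript = history
    .map((t) => `${t.role === "user" ? "Athlete" : "Coach"}: ${t.text}`)
    .join("\n");
  return [
    `Video analysis:\n${videoSummary}`,
    transcript ? `Conversation so far:\n${transcript}` : "",
    `Athlete: ${question}`,
  ]
    .filter((part) => part.length > 0)
    .join("\n\n");
}
```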
What we learned
- Plan your data model early. How you store and structure data is far harder to change later than it is to get right upfront.
- Divide and conquer. With a tight hackathon timeline, clearly splitting frontend and backend work between teammates was the only way to ship everything in time.
What's next for PlayLog
- Real-time streaming: coach responses token-by-token via Gemini streaming
- Convex-scheduled polling: so indexing stays consistent across page refreshes
- Multi-sport expansion: basketball, baseball, swimming prompt libraries
Built With
- convex
- gemini
- mediapipe
- react
- tailwindcss
- twelvelabs
- typescript
- vite