Inspiration

We're badminton players ourselves, and we've always wished there was a tool to help us break down our matches and improve. When we looked into it, we were shocked to discover that badminton - the second most played sport on earth, with over 220 million active players - has essentially no AI-powered analytics tools. Tennis has SwingVision. Football has Hudl. Basketball has Second Spectrum. Badminton has nothing. That gap was too big to ignore.

What it does

BadmintonIQ takes standard match footage and automatically extracts actionable insights using computer vision. Upload a video, and it delivers:

  • Player tracking - detect and follow players throughout every rally
  • Shuttlecock detection - track trajectory frame-by-frame
  • Court keypoint mapping - calibrate court geometry from any camera angle
  • Movement heatmaps & rally breakdowns - visualize positioning patterns and tag individual rallies

No special cameras. No hardware. Just footage and AI.
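The court-keypoint calibration boils down to a planar homography: given the court's corners in pixel space, solve for the 3x3 matrix that maps pixels to metric court coordinates (a doubles court is 6.1 x 13.4 m). Here's a minimal NumPy sketch of that idea - the corner pixel values are made up, and a real pipeline would use the CNN's detected keypoints:

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 3x3 homography H mapping src -> dst (4+ point pairs) via DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A: last row of V^T from the SVD.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def pixel_to_court(H, px):
    """Project a pixel coordinate onto the court plane (metres)."""
    p = H @ np.array([px[0], px[1], 1.0])
    return p[:2] / p[2]

# Hypothetical corner detections (pixels) -> metric court corners (metres).
corners_px = [(120, 80), (520, 90), (600, 420), (40, 410)]
court_m = [(0, 0), (6.1, 0), (6.1, 13.4), (0, 13.4)]
H = homography_from_points(corners_px, court_m)
```

Once H is known, every player and shuttle detection can be re-expressed in court coordinates, which is what makes camera-angle-independent heatmaps possible.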

How we built it

  • Computer Vision Pipeline: YOLOv8 for player and shuttlecock detection, a custom CNN for court keypoint extraction
  • Tracking & Interpolation: Custom algorithms to smooth shuttlecock trajectories and detect shot events
  • Frontend: Next.js with a presentation-style landing page and an interactive demo page that simulates the analysis flow
  • Motion Filtering: Added post-processing to discard static false positive detections
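The motion-filtering step can be sketched simply: a "shuttlecock" detection that sits in nearly the same spot for many consecutive frames is probably a static object (a line marking, a logo), so we drop it. This is an illustrative sketch, not our production code - `window` and `min_disp` are made-up thresholds:

```python
from collections import deque

def motion_filter(detections, window=10, min_disp=5.0):
    """Drop detections that stay (nearly) still over `window` frames.

    detections: list of (x, y) centroids per frame, or None for a miss.
    Returns the same list with static false positives replaced by None.
    """
    history = deque(maxlen=window)
    out = []
    for det in detections:
        if det is None:
            out.append(None)
            history.clear()
            continue
        history.append(det)
        if len(history) == window:
            xs = [p[0] for p in history]
            ys = [p[1] for p in history]
            # Total bounding-box spread of the recent track; a real shuttle moves.
            spread = max(xs) - min(xs) + max(ys) - min(ys)
            out.append(det if spread >= min_disp else None)
        else:
            out.append(det)  # not enough history yet; keep the detection
    return out
```

A stationary detection survives only until the history window fills, after which it is suppressed, while a moving shuttle passes through untouched.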

Challenges we ran into

Fine-tuning computer vision models is hard. The shuttlecock is tiny, fast, and frequently occluded - getting reliable detection required retraining with higher-resolution inputs and careful threshold tuning. Getting a working MVP demo under hackathon time pressure meant constantly triaging between accuracy and "good enough to show." Video codec compatibility was another unexpected rabbit hole - our processed videos wouldn't render in-browser until we debugged the encoding.
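Occlusion is also why interpolation matters: when the shuttle disappears for a few frames behind a player, we can bridge the gap linearly between the last and next confident detections. A rough sketch of that gap-filling idea (`max_gap` is an illustrative threshold, not our tuned value):

```python
def fill_gaps(track, max_gap=5):
    """Linearly interpolate short runs of missed shuttle detections.

    track: list of (x, y) or None per frame. Gaps longer than max_gap
    (likely a rally break rather than occlusion) are left as None.
    """
    track = list(track)
    i = 0
    while i < len(track):
        if track[i] is None:
            start = i
            while i < len(track) and track[i] is None:
                i += 1
            gap = i - start
            # Only bridge gaps bounded by detections on both sides.
            if 0 < start and i < len(track) and gap <= max_gap:
                (x0, y0), (x1, y1) = track[start - 1], track[i]
                for k in range(gap):
                    t = (k + 1) / (gap + 1)
                    track[start + k] = (x0 + (x1 - x0) * t,
                                        y0 + (y1 - y0) * t)
        else:
            i += 1
    return track
```

The `max_gap` cutoff doubles as a cheap rally-boundary heuristic: a long run of misses usually means the rally ended, not that the shuttle was hidden.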

Accomplishments that we're proud of

We built a functional end-to-end pipeline that takes raw match footage and produces tracked, annotated output - a capability that doesn't yet exist for badminton. The interactive demo on our website lets anyone experience the analysis flow firsthand.

What we learned

  • How much effort goes into making CV models work on domain-specific data vs. generic benchmarks
  • The importance of motion filtering and post-processing over raw model output
  • Building a polished demo under time constraints forces you to make hard prioritization calls

What's next for BadmintonIQ

  • Shot classification - identify smashes, drops, clears, and net shots
  • Tactical pattern detection - surface recurring serve-return sequences and attacking tendencies
  • Multi-match comparison - track player improvement over time
  • Academy dashboard - let coaches manage multiple players from one interface

Built With

  • YOLOv8
  • Next.js
