Inspiration

Our inspiration came from the early and mid-2000s era of music players and videos — think Winamp visualizers and neon-soaked effects — combined with the cyberpunk theme of this hackathon. We wanted to reimagine those nostalgic visuals using modern AI, creating something that feels both retro and futuristic.

What it does

The Music Visualizer takes an uploaded song and generates AI-driven, real-time visuals based on its frequency spectrum and user-chosen colors. The result is an immersive display where sound, color, and motion merge into dynamic artwork.

How we built it

- Frontend (UI): Built with HTML and CSS to handle song uploads, color selection, and style choices.
- Server: Used FastAPI + Socket.IO for real-time communication between the UI, the AI engine, and the visualizer (see the sketch after this list).
- AI Engine: Integrated the Google Gemini API (with a mock fallback) to analyze audio spectrum data and produce visualization instructions in JSON format.
- Visualizer: Implemented with Pygame to render different styles (radial lines, bars, circles, waves) that respond dynamically to the music.
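For reference, here's a minimal sketch of how a FastAPI + python-socketio relay like ours can be wired up. The event names (`spectrum`, `visuals`) and the `analyze()` stub are illustrative assumptions, not our exact code:

```python
# Minimal FastAPI + Socket.IO relay (sketch, not our exact code).
# Event names "spectrum"/"visuals" and analyze() are assumptions.
# Run with: uvicorn server:app  (assumes this file is server.py)
import socketio
from fastapi import FastAPI

sio = socketio.AsyncServer(async_mode="asgi", cors_allowed_origins="*")
fastapi_app = FastAPI()
app = socketio.ASGIApp(sio, other_asgi_app=fastapi_app)

async def analyze(frame):
    # Placeholder for the AI engine call; returns mock instructions.
    return {"style": "bars", "intensity": max(frame, default=0.0)}

@sio.on("spectrum")
async def on_spectrum(sid, frame):
    # One spectrum frame in from the UI, one instruction set out
    # to every connected visualizer client.
    instructions = await analyze(frame)
    await sio.emit("visuals", instructions)
```

Mounting Socket.IO and FastAPI on the same ASGI app keeps everything on one port, which is what lets the HTML UI and the Pygame visualizer talk to a single server.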

Challenges we ran into

- Sockets not working: Early on, connections between the UI, server, and visualizer failed, and it took time to debug the networking.
- Visualizer issues: Rendering initially didn’t display inside the UI, and optimizations were needed to keep everything running smoothly.
- Computation delays: Sending every chunk of audio to the AI made things too slow and hit API rate limits, so we had to redesign around batching and mock data (sketched below).
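The batching redesign boiled down to buffering spectrum frames and calling the AI only once per batch, with a local mock as the fallback. Here's a rough sketch of that idea; the batch size, model name, prompt, and JSON shape are all placeholder assumptions:

```python
# Batching + mock fallback for the AI engine (sketch, not our exact code).
import json
import os

BATCH_SIZE = 8  # hypothetical: spectrum frames accumulated per AI call

class AIEngine:
    def __init__(self):
        self.buffer = []
        self.use_mock = "GEMINI_API_KEY" not in os.environ

    def push_frame(self, frame):
        """Buffer frames; return instructions once a full batch is ready."""
        self.buffer.append(frame)
        if len(self.buffer) < BATCH_SIZE:
            return None  # not enough data yet, so no API call is made
        batch, self.buffer = self.buffer, []
        return self._mock(batch) if self.use_mock else self._gemini(batch)

    def _mock(self, batch):
        # Cheap local stand-in: derive intensity from average low-band energy.
        bass = sum(f[0] for f in batch) / len(batch)
        return {"style": "radial", "intensity": bass, "color": [255, 0, 128]}

    def _gemini(self, batch):
        import google.generativeai as genai
        genai.configure(api_key=os.environ["GEMINI_API_KEY"])
        model = genai.GenerativeModel("gemini-1.5-flash")  # placeholder model
        prompt = f"Return visualization instructions as JSON for: {batch}"
        try:
            return json.loads(model.generate_content(prompt).text)
        except Exception:
            return self._mock(batch)  # rate limit or bad JSON: fall back
```

Falling back to the mock on any API failure is also what let us keep demoing when quotas ran out.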

Accomplishments that we're proud of

- Built a fully connected pipeline from song upload → AI analysis → real-time visuals.
- Created a system flexible enough to run with Gemini or a mock AI mode, so we could keep working even when quotas ran out.
- Got Socket.IO connections working across the entire system, from the visualizer backend into the HTML UI, after a lot of trial and error. Making the visuals actually appear inside the website (instead of popping up separately) was a big breakthrough.

What we learned

- How to integrate real-time sockets across multiple Python processes and a web frontend.
- The importance of mocking AI services to keep development moving when APIs hit rate limits.
- Techniques for optimizing visual rendering so the output feels smooth and responsive (see the loop sketched below).
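On the rendering side, the biggest smoothness win is capping the frame rate and keeping per-frame work cheap. A minimal Pygame loop in that spirit, where the synthetic spectrum stands in for frames that would arrive over Socket.IO in the real app:

```python
# Frame-rate-capped Pygame "bars" loop (sketch, not our exact code).
import math
import time

import pygame

def draw_bars(screen, spectrum, color):
    """Render one 'bars' frame from a list of magnitudes in 0..1."""
    w, h = screen.get_size()
    bar_w = w // len(spectrum)
    screen.fill((0, 0, 0))
    for i, mag in enumerate(spectrum):
        bar_h = int(mag * h)
        pygame.draw.rect(screen, color,
                         pygame.Rect(i * bar_w, h - bar_h, bar_w - 2, bar_h))
    pygame.display.flip()

pygame.init()
screen = pygame.display.set_mode((800, 400))
clock = pygame.time.Clock()
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    # Synthetic stand-in for a real spectrum frame.
    spectrum = [abs(math.sin(time.time() * 3 + i * 0.4)) for i in range(32)]
    draw_bars(screen, spectrum, (0, 255, 200))
    clock.tick(60)  # cap at 60 FPS so rendering stays smooth and cheap
pygame.quit()
```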

What's next for Music Visualizer

- Add more visual styles (e.g., particle effects, neon grids, VR support).
