Inspiration

SynthCity was inspired by the idea of turning music into something you can see and build.
We wanted to create a world where players don’t just listen to music — they architect it. By combining procedural audio with generative AI and interactive visuals, we imagined a metaphorical city where every beat becomes a building and every melody becomes a skyline.

We drew inspiration from rhythm games, simulation sandboxes, and visualizers such as Audiosurf and Cities: Skylines, but with a twist: this world composes itself through AI.


What it does

SynthCity is an interactive music sandbox that lets players construct a vibrant city where architecture and sound are deeply connected.
Every structure the player adds — towers, factories, plazas, neon roads — generates its own musical pattern. The entire city becomes a living composition that evolves in real time.

The app features:

  • AI-driven music generation using the Gemini API
  • Procedural sound layers powered by the Web Audio API
  • Animated 2D/3D visuals that move, pulse, and glow to the beat
  • Player-controlled city-building tools that influence rhythm, tempo, and harmony
  • A sleek neon-cyber aesthetic that ties visuals and audio together

It’s part game, part visual music instrument, part generative art experience.


How we built it

We built SynthCity as a browser-based application using:

  • Frontend: JavaScript, HTML, CSS
  • Graphics: p5.js / Three.js for animated city visuals
  • AI Engine: Google Gemini API for generating musical structures and pattern logic
  • Audio: Web Audio API for layered sound synthesis and rhythm sequencing
  • Backend: Node.js for API routing and project data handling

The workflow:

  1. Player places a “building” or “district” on the grid.
  2. The system sends a structured prompt to Gemini describing the city’s current layout and musical context.
  3. Gemini returns a pattern (tempo, rhythm, instrument type).
  4. Web Audio API generates the sound in real time.
  5. The city visually reacts to each musical element (lights, motion, pulse, glow).

This created a tight feedback loop: player action → AI generation → audio → visuals.
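The prompt-and-parse half of this loop can be sketched roughly as follows. The function names, city fields, and JSON shape here are illustrative assumptions, not the actual SynthCity codebase:

```javascript
// Hypothetical sketch of steps 2–3: summarize the grid for Gemini,
// then parse the structured pattern out of its reply.

function buildPrompt(city) {
  // Describe the current layout in musical context, not raw pixels.
  const summary = city.buildings
    .map(b => `${b.type} at (${b.x},${b.y}), height ${b.height}`)
    .join("; ");
  return (
    `City layout: ${summary}. ` +
    `Return JSON only: {"tempo": <bpm>, "pattern": [<16 values, 0 or 1>], "instrument": "<name>"}`
  );
}

function parsePattern(modelText) {
  // The model is asked for strict JSON; guard against extra prose anyway.
  const match = modelText.match(/\{[\s\S]*\}/);
  if (!match) throw new Error("No JSON found in model response");
  const data = JSON.parse(match[0]);
  if (!Array.isArray(data.pattern) || data.pattern.length !== 16) {
    throw new Error("Pattern must be a 16-step array");
  }
  return data;
}

// Example round trip with a mocked model reply:
const city = { buildings: [{ type: "tower", x: 2, y: 5, height: 8 }] };
const prompt = buildPrompt(city);
const reply =
  '{"tempo": 120, "pattern": [1,0,0,0,1,0,0,0,1,0,0,0,1,0,1,0], "instrument": "synth-bass"}';
const music = parsePattern(reply);
```

Validating the pattern length before playback keeps a malformed AI reply from derailing the sequencer.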


Challenges we ran into

  • Synchronizing visuals and audio smoothly without lag.
  • Building structured prompts that let Gemini generate predictable music data instead of full compositions.
  • Creating a readable yet stylish UI that fits the synthwave aesthetic.
  • Managing timing precision in the Web Audio API to avoid off-beat loops.
  • Optimizing performance, as city visuals + audio synthesis can be heavy in real time.
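The timing-precision problem has a well-known workaround: schedule notes a short window ahead of the audio clock rather than triggering them directly from a JavaScript timer. Below is a generic sketch of that lookahead pattern with the scheduling logic kept pure (the real version would pass `audioCtx.currentTime` and call `osc.start(t)`); it is not SynthCity's exact code:

```javascript
// Lookahead scheduling: collect every note start time that falls inside
// a small window ahead of the audio clock, so playback stays on the
// sample-accurate Web Audio timeline instead of the jittery JS event loop.

const LOOKAHEAD = 0.1;    // seconds of audio to schedule ahead
const STEPS_PER_BEAT = 4; // 16th-note resolution

function scheduleWindow(currentTime, nextNoteTime, tempo) {
  const secondsPerStep = 60 / tempo / STEPS_PER_BEAT;
  const times = [];
  let t = nextNoteTime;
  while (t < currentTime + LOOKAHEAD) {
    times.push(t); // in real code: start an oscillator/sample at exactly t
    t += secondsPerStep;
  }
  // Resume from t on the next timer tick.
  return { times, nextNoteTime: t };
}

// At 120 BPM a 16th note lasts 0.125 s, so a 0.1 s window holds one note.
const { times, nextNoteTime } = scheduleWindow(0, 0, 120);
```

A `setInterval` of ~25 ms then repeatedly calls `scheduleWindow`; even if a tick arrives late, the notes were already queued on the audio clock.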

Accomplishments that we're proud of

  • Developed a fully functional audiovisual sandbox that feels alive and reactive.
  • Created a unique fusion of AI, sound design, and creative city-building.
  • Designed a visual identity (neon cyber-streets, glowing blocks, pulsing towers) that reinforces the music theme.
  • Built a system where AI meaningfully influences gameplay, not just text.
  • Achieved tight synchronization between AI → audio → visuals.

What we learned

  • How to use the Web Audio API for real-time composition and synthesis.
  • How to prompt Gemini to return structured, minimalistic musical data.
  • How to optimize continuous animations and reduce frame drops.
  • Ways to make an interface intuitive even in a complex sandbox environment.
  • That players are amazed when sound and visuals form feedback loops — and when AI enhances creativity rather than replacing it.
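One simple way an audio-to-visual feedback loop like this can work is mapping the active step of the rhythm pattern to a glow value that flashes on each hit and decays. This is a hypothetical sketch, not the project's actual rendering code:

```javascript
// Map a 0/1 rhythm pattern to a per-frame glow intensity:
// active steps flash to full brightness and decay; inactive steps
// sit at a dim baseline so the building never goes fully dark.

function glowIntensity(pattern, stepDuration, t) {
  const step = Math.floor(t / stepDuration) % pattern.length;  // current step
  const phase = (t % stepDuration) / stepDuration;             // 0..1 within it
  return pattern[step] ? 1 - 0.8 * phase : 0.2;
}

// A two-step pattern at 0.5 s per step: bright at the downbeat,
// dim during the silent step.
const onBeat = glowIntensity([1, 0], 0.5, 0);
const offBeat = glowIntensity([1, 0], 0.5, 0.6);
```

Driving visuals from the same step counter as the audio sequencer is what keeps lights and sound locked together.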

Here’s a simple model of how our rhythm system operates:

$$ Beat(t) = Instrument \times Pattern(t) \times \sin(2\pi f t) $$

Where:

  • \(Beat(t)\) = the sound amplitude at time \(t\)
  • \(Instrument\) = tone generator based on building type
  • \(Pattern(t)\) = AI-generated sequence of pulses (0 or 1)
  • \(f\) = frequency determined by the building's “height” or “complexity”
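The model above translates almost directly into code. This is a minimal sketch (parameter names assumed, not from the codebase), treating the instrument term as a simple gain:

```javascript
// Beat(t) = Instrument × Pattern(t) × sin(2π f t)
// pattern gates the sine on or off depending on which step t falls in.

function beat(t, pattern, stepDuration, f, instrumentGain = 1) {
  const step = Math.floor(t / stepDuration) % pattern.length;
  const gate = pattern[step]; // Pattern(t): 0 or 1
  return instrumentGain * gate * Math.sin(2 * Math.PI * f * t);
}

// With f = 1 Hz and 0.25 s steps: at t = 0.25 the sine is at its peak,
// but the output is silent if that step's pattern value is 0.
const active = beat(0.25, [1, 1], 0.25, 1);
const gated = beat(0.25, [1, 0], 0.25, 1);
```

In the real app the sine term would be an `OscillatorNode` and the gate a `GainNode` envelope, but the structure is the same.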

What's next for SynthCity

We plan to expand SynthCity with:

  • Multiplayer cities, where multiple players create collaborative soundscapes
  • AI-generated instruments with custom timbres
  • Save/load systems for cities and compositions
  • Playable performance mode for live audio shows
  • More building types with unique rhythm and harmony styles
  • Mobile version with touch-based building controls

Ultimately, we envision SynthCity as an evolving creative tool — a place where anyone can sculpt music through architecture and explore the fusion of sound, AI, and generative art.

Built With

  • JavaScript, HTML, CSS
  • p5.js / Three.js
  • Google Gemini API
  • Web Audio API
  • Node.js
