Inspiration

I've always loved rhythm games, but I kept thinking - why can't players make their own levels? Most rhythm games give you a fixed setlist, and that's it. I wanted to build something where the community creates the content, not just consumes it. RiffRivals is my attempt at making that happen on Reddit, where people already share and remix creative content all the time.

What It Does

RiffRivals lets you create musical challenges in two ways. You can either record yourself playing virtual instruments (drums, piano, bass, synth) and post it for others to replicate, or use the Chart Creator to design falling-tile levels with precise timing. Once posted, anyone can attempt your challenge and compete on the leaderboard.

How I Built It

The project runs on Reddit's Devvit platform, which handles the post system and user authentication. I built the interactive parts with React and TypeScript, and wrote a custom audio engine using the Web Audio API to synthesize sounds in real-time.

Technical Architecture

The app is split into three main parts that work together:

Client (Frontend)

This is where all the gameplay happens. I built a canvas-based game loop that runs at 60fps, rendering four lanes where notes fall from top to bottom. Players hit keys when notes reach the hit line at the bottom. The input system captures keyboard presses (A/S/D/F or 1/2/3/4) and touch events for mobile, recording timestamps down to the millisecond.
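To make that concrete, here is a minimal sketch of what a key-to-lane mapping with timestamp capture could look like. The names (`KEY_TO_LANE`, `laneForKey`) are illustrative, not the actual RiffRivals code, and the real handler also covers touch events:

```typescript
// Map both key layouts (A/S/D/F and 1/2/3/4) onto lanes 0-3.
const KEY_TO_LANE: Record<string, number> = {
  a: 0, s: 1, d: 2, f: 3,
  "1": 0, "2": 1, "3": 2, "4": 3,
};

interface LaneHit {
  lane: number;
  timestamp: number; // milliseconds, e.g. from performance.now()
}

// Returns null for keys that aren't bound to a lane.
function laneForKey(key: string, now: number): LaneHit | null {
  const lane = KEY_TO_LANE[key.toLowerCase()];
  return lane === undefined ? null : { lane, timestamp: now };
}
```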

interface FallingNote {
  id: string;
  note: DrumType | PianoNote;
  lane: number;        // 0-3 for four lanes
  startTime: number;   // Millisecond timestamp when note spawns
  hitTime: number;     // When note should be hit
  duration: number;
  velocity: number;
}

For audio, I used Tone.js wrapped in a custom engine that manages four different instruments. It handles real-time synthesis and keeps the audio synced with the visuals, which is crucial for rhythm games.

Server (Backend)

The Node.js backend is pretty straightforward. It stores scores, manages leaderboards using Redis sorted sets, and handles challenge data. When a player completes a challenge, the server validates the score and updates the leaderboard:

interface ChallengeScore {
  userId: string;
  accuracy: number;
  timing: number;
  perfectHits: number;
  greatHits: number;
  goodHits: number;
  missedNotes: number;
  completedAt: number;
}
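Server-side validation could look something like the sketch below. The specific checks (hit counts summing to the chart's note count, accuracy matching the hit ratio) are my assumptions about what "validates the score" might involve, not the actual implementation:

```typescript
interface ChallengeScore {
  userId: string;
  accuracy: number;      // 0-100
  timing: number;
  perfectHits: number;
  greatHits: number;
  goodHits: number;
  missedNotes: number;
  completedAt: number;   // Unix ms timestamp
}

// Reject scores whose hit counts don't add up to the chart's note
// count, or whose reported accuracy doesn't match the hit ratio.
function isPlausibleScore(score: ChallengeScore, totalNotes: number): boolean {
  const attempted =
    score.perfectHits + score.greatHits + score.goodHits + score.missedNotes;
  if (attempted !== totalNotes) return false;
  if (score.accuracy < 0 || score.accuracy > 100) return false;
  const hit = totalNotes - score.missedNotes;
  const expectedAccuracy = (hit / totalNotes) * 100;
  return Math.abs(score.accuracy - expectedAccuracy) < 0.01;
}
```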

I also implemented data compression for larger compositions since Redis storage isn't unlimited. The API has endpoints for submitting scores, fetching leaderboards, and retrieving challenge analytics. Everything runs serverless on Devvit, so there are no persistent connections or long-running processes.
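One simple way to shrink a composition before writing it to Redis is delta-encoding note timestamps, since the gaps between notes are much smaller numbers than absolute times and serialize to fewer characters. This is a sketch of the idea, not necessarily the scheme RiffRivals uses:

```typescript
// Delta-encode absolute millisecond timestamps: store the gap to the
// previous note instead of the absolute time.
function deltaEncode(timestamps: number[]): number[] {
  const out: number[] = [];
  let prev = 0;
  for (const t of timestamps) {
    out.push(t - prev);
    prev = t;
  }
  return out;
}

// Reverse the encoding by accumulating the gaps back into absolute times.
function deltaDecode(deltas: number[]): number[] {
  const out: number[] = [];
  let acc = 0;
  for (const d of deltas) {
    acc += d;
    out.push(acc);
  }
  return out;
}
```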

Shared (Types & Constants)

This layer keeps the client and server in sync. All the TypeScript type definitions live here, plus shared constants:

const HIT_WINDOW = 100;     // ±100ms timing window for hits
const LANE_COUNT = 4;       // Four vertical lanes
const LANE_WIDTH = 80;      // Pixel width per lane

Having shared types meant I could catch data structure mismatches at compile time instead of debugging weird runtime errors.

Timing Precision

Getting the timing right was probably the hardest part. Rhythm games need millisecond-level precision or they feel off.

Notes spawn based on the song data, converted to milliseconds:

// Check if it's time to spawn the next note
if (elapsed >= currentSongNote.startTime) {
  const newNote: FallingNote = {
    id: `${now}-${songNotesIndexRef.current}`,
    lane: getLaneForNote(currentSongNote.note),
    startTime: now,
    hitTime: now + settings.speed * 1000,
    // ... other properties
  };
}

When a player hits a key, I calculate the difference between the key press timestamp and when the note should've been hit:

function judgeHit(now: number, note: FallingNote): string {
  const timeDiff = Math.abs(now - note.hitTime);

  // Score based on timing precision
  if (timeDiff <= 25) return "PERFECT!";   // ±25ms
  if (timeDiff <= 50) return "GREAT!";     // ±50ms
  if (timeDiff <= 100) return "GOOD!";     // ±100ms
  return "MISS!";
}

The tricky part is keeping everything synchronized. The canvas renders at 60fps, which means each frame is about 16ms. Notes need to fall smoothly while maintaining precise hit detection:

// Calculate note position based on time elapsed
const progress = (now - note.startTime) / 1000;
const position = progress * settings.speed;

// Check if note is within hit window (the hit line sits at y = 400px)
const distanceToHitLine = Math.abs(position - 400);
const inWindow = distanceToHitLine <= HIT_WINDOW;

Browser audio latency is also a pain. The Web Audio API has inherent delays that vary by device and browser. I had to compensate for this to keep the visual and audio in sync. When you hit a note, it needs to feel instant, even though there's processing happening behind the scenes.
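The Web Audio API's `AudioContext` reports its latency in seconds via `baseLatency` and `outputLatency` (support for the latter varies by browser). A pure helper that turns those readings into a millisecond offset for the visuals might look like this; the function names are mine, not the engine's:

```typescript
// Convert the AudioContext's reported latency (seconds) into a
// millisecond offset applied to visual hit times, so what the player
// hears lines up with what they see.
function visualOffsetMs(baseLatencySec: number, outputLatencySec: number): number {
  return (baseLatencySec + outputLatencySec) * 1000;
}

// In the game loop, the offset shifts the comparison point for a note.
function effectiveHitTime(hitTime: number, offsetMs: number): number {
  return hitTime + offsetMs;
}
```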

The scoring system tracks everything - perfect hits, great hits, good hits, and misses. The combined score uses weighted calculations:

const accuracyScore = (notesHit / totalNotes) * 100;
const combinedScore = (timingScore * 0.7) + (accuracyScore * 0.3);

This level of precision is what makes competitive play possible. Without it, the game would feel floaty and unresponsive, and leaderboards would be meaningless.
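Putting those pieces together, the combined score could be computed like this. The 70/30 blend matches the snippet above, but the per-tier timing weights (1.0 / 0.7 / 0.4) are an illustrative assumption, not the game's actual values:

```typescript
interface HitCounts {
  perfect: number;
  great: number;
  good: number;
  missed: number;
}

// Weight each judgment tier into a timing score, then blend timing
// quality (70%) with raw accuracy (30%). Tier weights are illustrative.
function combinedScore(c: HitCounts): number {
  const totalNotes = c.perfect + c.great + c.good + c.missed;
  if (totalNotes === 0) return 0;
  const notesHit = totalNotes - c.missed;
  const timingScore =
    ((c.perfect * 1.0 + c.great * 0.7 + c.good * 0.4) / totalNotes) * 100;
  const accuracyScore = (notesHit / totalNotes) * 100;
  return timingScore * 0.7 + accuracyScore * 0.3;
}
```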

Challenges I Ran Into

Web audio is painful. Browsers won't let you play sound until the user clicks something, which meant adding initialization flows I didn't originally plan for. I also hit a nasty bug where notes would all spawn at once instead of spreading out over time. Turns out I was mixing up seconds and milliseconds when loading saved charts - spent hours debugging that one.
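A normalization guard like the one below would have caught that seconds-vs-milliseconds mixup when loading saved charts. The heuristic threshold is an assumption for illustration, not a value from the actual codebase:

```typescript
// Heuristic: if every start time in a chart is under this cutoff, the
// values are almost certainly seconds (even a 10-second chart would be
// 10000 in milliseconds), so convert them. Cutoff is illustrative.
const LIKELY_SECONDS_THRESHOLD = 1000;

function normalizeToMs(startTimes: number[]): number[] {
  const max = Math.max(...startTimes);
  return max < LIKELY_SECONDS_THRESHOLD
    ? startTimes.map((t) => t * 1000)
    : startTimes;
}
```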

Making it work smoothly on both desktop and mobile was harder than expected. Touch events behave differently than keyboard events, and I had to optimize the canvas rendering heavily to keep 60fps while drawing falling notes and handling audio.

What I'm Proud Of

Getting real-time audio synthesis working felt like a huge win. When you press a key and hear a drum hit or piano note with barely any delay, that's satisfying. The Chart Creator also turned out better than I hoped - placing notes visually, testing immediately, then posting to Reddit all flows pretty naturally.

The scoring system actually works too. It feels fair when you nail the timing and get rewarded for it.

What I Learned

This project taught me way more about web audio than I expected. Properly managing audio contexts, optimizing canvas rendering, and synchronizing visual elements with audio playback are all trickier than they seem. I also learned a lot about working within Devvit's constraints - you can't use websockets, so everything has to work around the request-response model.

What's Next

There's plenty I want to add. More instruments would be cool. I'd like to implement actual remix challenges where you can layer new tracks on existing posts, and I want to expand the feature where users post themselves playing something like a riff, beat, or melody so others can build on top of it. The idea is to turn it into a true collaborative song-building experience, where each contribution adds a new layer to the track. Over time, a simple riff could evolve into a full piece made by multiple creators jamming together.

The foundation is solid though. It works, people can create and share, and the gameplay feels good. Everything else is just building on top of that.

Built With

Devvit, Node.js, React, Redis, Tone.js, TypeScript, Web Audio API
