Inspiration

Honestly? I was feeling lonely. Coding is creative, but it's often just you, a keyboard, and a screen that's way too bright at 2 AM. When LLMs started getting good, a thought hit me: what if coding felt less like typing and more like a jam/gaming session?

I imagined a future where you could just... vibe. Use your phone as a controller to speak, draw, and navigate, turning any screen into a shared coding console. The goal was to build something fun, collaborative, and a great excuse to not be hunched over a keyboard.

What it does

Vibe Console turns any screen into a multiplayer code editor and your phone into the ultimate controller.

  • Phone-as-Controller: Join a lobby with a simple code, and your phone becomes a smart controller.
  • Multi-Modal Input: You're not just typing. Use a D-pad for navigation, a touch canvas for drawing ideas, and your voice for commands, all sent from your phone.
  • Instant Co-op: Players join a shared session via a QR code. The host locks the lobby, everyone picks a cloud editor like Bolt.new or Lovable, and the session begins.
  • No "It Works on My Machine": Everyone is in the same cloud environment. What you see is what they see.
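
To make joining painless, the lobby code has to be short and easy to read aloud. Here's a minimal sketch of how such a code could be generated (the 4-character length and the alphabet are my assumptions for illustration, not the actual implementation):

```typescript
// Hypothetical sketch: generating a short, human-friendly lobby code.
// Ambiguous characters (0/O, 1/I/L) are excluded so codes are easy to
// read off a screen and type on a phone.
const CODE_ALPHABET = "ABCDEFGHJKMNPQRSTUVWXYZ23456789";

function generateJoinCode(length = 4): string {
  let code = "";
  for (let i = 0; i < length; i++) {
    code += CODE_ALPHABET[Math.floor(Math.random() * CODE_ALPHABET.length)];
  }
  return code;
}
```

The same string can be embedded in the QR code, so scanning and typing are just two doors into the same lobby.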

How I built it

This project is a high-wire act of modern web tech, and I'm honestly surprised it all works (minus a few bugs caused by token limits)! I built it with Bolt.new, and yes, through vibe coding.

  • Frontend: The console display and the landing page are a React + TypeScript app, built with Vite and styled with Tailwind CSS. The multi-stage UI, from the lobby to the editor selection, is handled by react-router-dom and a bunch of stateful components.
  • Backend & Real-time DB: I went all-in on Supabase. It handles the entire backend, from the PostgreSQL database to user sessions. The database schema is the backbone, tracking sessions, devices, and all player inputs.
  • The Magic: Communication Stack: This is the fun part. I built a hybrid system:
    • Supabase Realtime: For lobby management. When a new player joins or the host locks the session, Supabase's realtime channels push updates to everyone instantly.
    • WebRTC: For the actual in-game controls. Once the lobby is locked, I establish direct peer-to-peer connections with a custom WebRTC hook. All the signaling (offers, answers, ICE candidates) is passed through the Supabase webrtc_signals table, but the control data itself travels with super low latency directly between browsers.
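
One detail a table-based signaling path has to get right: ICE candidates can arrive before the offer/answer they belong to, so they have to be buffered until the remote description lands. A simplified sketch of that buffering idea (class and field names are hypothetical, not the actual WebRTC hook):

```typescript
// Hypothetical sketch of candidate buffering on the signaling path.
// Candidates relayed through a table like webrtc_signals may be delivered
// out of order, so "too early" candidates are queued until an offer or
// answer arrives, then flushed in order.
type Signal =
  | { kind: "offer" | "answer"; sdp: string }
  | { kind: "candidate"; candidate: string };

class SignalBuffer {
  private haveRemoteDescription = false;
  private pending: string[] = [];
  readonly applied: string[] = []; // candidates actually handed to the peer connection

  receive(signal: Signal): void {
    if (signal.kind === "candidate") {
      if (this.haveRemoteDescription) {
        this.applied.push(signal.candidate);
      } else {
        this.pending.push(signal.candidate); // too early: hold it
      }
      return;
    }
    // An offer/answer arrived: flush everything queued, in arrival order.
    this.haveRemoteDescription = true;
    this.applied.push(...this.pending);
    this.pending.length = 0;
  }
}
```

In the real app, "applied" would mean calling addIceCandidate on the peer connection; the queueing logic is the part worth getting right.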

Challenges I ran into

Let's be real, this was ambitious. The biggest challenge was the communication stack. Juggling WebRTC for speed and Supabase for reliability was a puzzle. I had to build a robust WebRTCManager that could handle failed connections and fall back gracefully, so the user never feels a hiccup.
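
At its core, the fallback boils down to a per-message decision: use the peer-to-peer data channel only when it's actually open, otherwise route through Supabase. A tiny sketch of that decision (hypothetical names, not the real WebRTCManager):

```typescript
// Hypothetical sketch of the per-message fallback decision.
// The states mirror RTCDataChannel.readyState; null models a channel
// that was never created because the connection failed outright.
type ChannelState = "connecting" | "open" | "closing" | "closed";

function pickTransport(dataChannelState: ChannelState | null): "webrtc" | "supabase" {
  // Only an open data channel is trustworthy; anything else routes
  // through the reliable Supabase fallback so input is never dropped.
  return dataChannelState === "open" ? "webrtc" : "supabase";
}
```

Because the decision is made per message, a connection that dies mid-session degrades to the fallback on the very next input instead of stalling.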

Another headache was designing the InputRouter. It needed to process events from both WebRTC messages and the Supabase device_inputs table (the fallback) and treat them identically. My migration history is a testament to realizing halfway through that I needed to add 'voice' and 'canvas' as valid input types!
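
The router's job is easier to see in code: two differently shaped events collapse into one canonical type. A simplified sketch (the field names are my guesses at the shapes involved, not the actual InputRouter):

```typescript
// Hypothetical sketch of normalizing the two event sources.
// WebRTC messages arrive as parsed JSON from the data channel; rows from
// the device_inputs fallback table carry the same information in
// snake_case columns. Both collapse to one ControllerInput.
type InputType = "dpad" | "button" | "voice" | "canvas";

interface ControllerInput {
  deviceId: string;
  type: InputType;
  payload: unknown;
}

interface WebRTCMessage { from: string; type: InputType; data: unknown }
interface DeviceInputRow { device_id: string; input_type: InputType; payload: unknown }

function fromWebRTC(msg: WebRTCMessage): ControllerInput {
  return { deviceId: msg.from, type: msg.type, payload: msg.data };
}

function fromSupabaseRow(row: DeviceInputRow): ControllerInput {
  return { deviceId: row.device_id, type: row.input_type, payload: row.payload };
}
```

Everything downstream consumes ControllerInput only, so the editor logic never knows (or cares) which transport delivered an event.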

Finally, getting the Supabase Row Level Security policies just right took a lot of trial and error. I needed it to be bulletproof but also allow for the dead-simple, anonymous "jump-in-and-play" experience.
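
For flavor, here's the general shape such policies take in Postgres/Supabase (the table names match this write-up, but the policy bodies are invented for illustration, not the real rules):

```sql
-- Hypothetical sketch of the policy shape, not the actual rules.
-- Anonymous players need to find a lobby without logging in, but inputs
-- should only land in sessions that are still live.
alter table sessions enable row level security;
alter table device_inputs enable row level security;

-- Anyone (including anonymous clients) can look up a lobby to join it.
create policy "anyone can find a lobby"
  on sessions for select
  using (true);

-- Inputs may only be inserted into a session that is still active.
create policy "inputs only for live sessions"
  on device_inputs for insert
  with check (
    exists (
      select 1 from sessions s
      where s.id = device_inputs.session_id
        and s.status = 'active'
    )
  );
```

The tension is exactly the one described above: the select policy has to stay wide open for the jump-in-and-play flow, while the write policies do the real gatekeeping.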

Accomplishments that I'm proud of

I'm incredibly proud of the seamless flow from the landing page to the actual editor. The state management in ConsoleDisplay.tsx and PhoneController.tsx, which handles all the different stages (joining, waiting, selecting, playing), came together beautifully. It just works.
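
Under the hood, that flow is essentially a tiny linear state machine. A sketch of the stage transitions described above (hypothetical code, not the actual components):

```typescript
// Hypothetical sketch of the stage flow both screens walk through.
// Transitions are whitelisted so a stray event can't skip a stage.
type Stage = "joining" | "waiting" | "selecting" | "playing";

const NEXT: Record<Stage, Stage | null> = {
  joining: "waiting",    // player entered the lobby code
  waiting: "selecting",  // host locked the lobby
  selecting: "playing",  // everyone picked a cloud editor
  playing: null,         // terminal stage
};

function advance(stage: Stage): Stage {
  const next = NEXT[stage];
  if (next === null) throw new Error(`cannot advance past "${stage}"`);
  return next;
}
```

Keeping the transitions in one table means the console display and the phone controller can't drift out of sync about what "next" means.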

And honestly, the hybrid WebRTC + Supabase architecture. It feels like I picked the right tool for every job, and it resulted in an app that's both powerful and resilient.

What I learned

The biggest lesson was that the future of interfaces is multi-mode. A keyboard is great, but combining it with voice, touch, and gestures unlocks a whole new level of interaction.

I also learned that Supabase is an absolute beast for projects like this. It handled my database, real-time sync, and WebRTC signaling without me having to write a single line of server-side code. It's the ultimate enabler for ambitious-but-lazy developers.

What's next for Vibe Console

World domination, obviously. But first:

  • More Editors: Integrating with more cloud-based development environments, starting with the Zed editor.
  • Richer Inputs: What can we do with the accelerometer? Or haptic feedback? Let's get weirder.
  • AI Code Gen: Actually hooking up the voice commands to an LLM, using ElevenLabs or OpenAI, to generate code in the selected editor. That was the original dream, after all!
  • Public Beta: Getting this into more hands and seeing the wild ways people use it.

Built With

  • react
  • typescript
  • vite
  • tailwind-css
  • supabase
  • webrtc
  • bolt.new
