Inspiration

I wanted to blend the physical world with casual mobile gaming. Everyday objects become game pieces, and the camera becomes the controller. It started with a simple question: what if you had to literally “catch” a color in the wild?

What it does

The app spawns a random target color, then challenges the player to find a matching real-world object through their camera. A tap triggers a real-time color check; if the sampled color is close enough to the target, a celebratory demon cheers you on. Miss, and a grumpy demon appears instead.

How I built it

I built a single-page React + TypeScript app with Vite and Tailwind CSS. The camera feed streams in via getUserMedia, and a hidden HTML canvas samples RGB values at the tap point. Zustand manages game state, while i18next powers Japanese, English, and Chinese localization.
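The hidden-canvas sampling step can be sketched roughly as below. Function names, the patch size, and the noise-averaging detail are my own illustration, not the app's actual code:

```typescript
type Rgb = [number, number, number];

// Average a small patch of pixels from getImageData output to smooth
// over camera sensor noise (pure logic, independent of the DOM).
function averageRgb(data: Uint8ClampedArray): Rgb {
  let r = 0, g = 0, b = 0;
  const pixels = data.length / 4; // RGBA, 4 bytes per pixel
  for (let i = 0; i < data.length; i += 4) {
    r += data[i];
    g += data[i + 1];
    b += data[i + 2];
  }
  return [Math.round(r / pixels), Math.round(g / pixels), Math.round(b / pixels)];
}

// Browser-only side: draw the current video frame onto a hidden canvas
// and read a small patch around the tap point. (Edge clamping omitted
// for brevity; coordinates here are already in video-pixel space.)
function sampleAtTap(
  video: HTMLVideoElement,
  canvas: HTMLCanvasElement,
  x: number,
  y: number,
  patch = 5 // illustrative patch size, not the app's actual value
): Rgb {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  const ctx = canvas.getContext("2d")!;
  ctx.drawImage(video, 0, 0);
  const half = Math.floor(patch / 2);
  const img = ctx.getImageData(x - half, y - half, patch, patch);
  return averageRgb(img.data);
}
```

Averaging a patch instead of reading a single pixel is a common trick to make the reading less sensitive to noise from a single sensor cell.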

Challenges I ran into

Calibrating color tolerance was tricky because lighting conditions wildly shift RGB readings. I solved it by tiering the tolerance per mode: loose for Easy, strict for Hard. Another hurdle was aligning tap coordinates across differing screen and camera aspect ratios.
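The aspect-ratio alignment boils down to inverting the crop-and-scale that a full-screen camera view applies. Here is a minimal sketch assuming the video is rendered with CSS `object-fit: cover`; the function and parameter names are illustrative, not taken from the app:

```typescript
interface Size { width: number; height: number; }

// Map a tap (in CSS pixels, relative to the video element) back to pixel
// coordinates in the native camera frame, assuming `object-fit: cover`.
function tapToVideoPixel(
  tapX: number, tapY: number, // tap position within the element
  element: Size,              // rendered element size, CSS px
  frame: Size                 // native camera resolution, device px
): { x: number; y: number } {
  // `cover` scales the frame up until it fills the element, cropping overflow.
  const scale = Math.max(element.width / frame.width, element.height / frame.height);
  // The crop is centered, so half the overflow is hidden on each side.
  const offsetX = (frame.width * scale - element.width) / 2;
  const offsetY = (frame.height * scale - element.height) / 2;
  return {
    x: Math.round((tapX + offsetX) / scale),
    y: Math.round((tapY + offsetY) / scale),
  };
}
```

For example, a tap dead center on a 360×640 portrait screen showing a 1280×720 landscape camera frame should map to the frame's center, (640, 360).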

Accomplishments that I'm proud of

I shipped a fully localized, camera-driven game that runs smoothly on mobile browsers without any native app installation. Seeing friends frantically scan rooms for “lavender” or “hot pink” and laughing at the demon effects was the real payoff.

What I learned

Real-world color recognition is noisier than expected; simple Euclidean distance works, but context-aware thresholds make or break the experience. I also learned the importance of responsive overlays so the UI never obstructs the camera view.
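The distance check described above can be sketched as follows. The mode names and numeric tolerances are assumptions for illustration, not the values the app ships with:

```typescript
type Rgb = [number, number, number];
type Mode = "easy" | "normal" | "hard";

// Illustrative per-mode thresholds: a wider tolerance forgives lighting
// shifts, a tighter one demands a near-exact match.
const TOLERANCE: Record<Mode, number> = { easy: 120, normal: 80, hard: 40 };

// Plain Euclidean distance in RGB space (maximum possible ≈ 441.7).
function colorDistance(a: Rgb, b: Rgb): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

function isMatch(sampled: Rgb, target: Rgb, mode: Mode): boolean {
  return colorDistance(sampled, target) <= TOLERANCE[mode];
}
```

A dull red like (200, 40, 40) sits about 79 units from pure red, so under these sample thresholds it would count as a hit on Easy but a miss on Hard.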

What's next for Color Tag

I'm planning AR-style directional hints that guide players toward the target color, a multiplayer “capture the hue” battle mode, and seasonal color packs so the hunt never gets old.

Built With

  • medo