Inspiration

There's a person who bikes to work every day, buys secondhand, skips the meat, and turns off the lights when they leave a room. They do all the right things — and they still feel like it doesn't matter. Like one person's choices are a drop in an ocean of industrial emissions and corporate indifference. That feeling kills momentum. It's why people stop trying.

Live, Laugh, Plant is my thank-you note to that person.

I am not here to guilt anyone into recycling. I am not building another carbon footprint calculator that makes you feel terrible about flying. I built this for the people who are already doing the work but can't see the forest for the trees — because the forest takes years to grow.

The truth is, good deeds compound. The CO₂ you didn't emit by biking today is real. The tree you plant this spring will still be pulling carbon from the air in 2050. The habit you build now shapes what feels normal to the people around you. None of it shows up on a single Tuesday. But it shows up.

My app shows up for you. Every action you log is the truth, told back to you with numbers: you have saved 47.3 kg of CO₂. That is a real thing that happened because of you. Your pixel world isn't just a game mechanic — it's a visual record of what your choices have built over time. Barren land becomes greenery becomes a thriving ecosystem. Because that's what actually happens, just slowly enough that most people never get to see it.

I built this because the people doing good deserve to feel it.


What It Does

Most sustainability apps are built for the environmentally anxious — people who want to know how bad things are. I built this for the environmentally consistent — people who already make good choices and just want to know it's adding up.

You log what you did. The app tells you exactly what it's worth: in XP, in kilograms of CO₂, in the visible state of a world that grows because you fed it. The pixel world isn't decoration — it's evidence. Every zone you unlock, every tier you reach, every streak day is the app saying: this is what you built. This is real.

Your Living Pixel World
A living 3D world that evolves with your actions — EcoVerse starts as barren brown land rendered in quasi-isometric 3D. As players log real-world sustainability actions (biking, recycling, planting, secondhand shopping, etc.), XP accumulates and the world visually transforms through 6 tiers: Barren → Stirring → Growing → Thriving → Flourishing → Eden. Grass patches spawn procedurally across the yard, and trees, bushes, flowers, and garden beds unlock at specific XP and category thresholds — so 3 transport actions might spawn a specific garden bed, while reaching tier 3 reveals new trees.

Players walk a GLB character model around the world with WASD, approach glowing task zones (each category has its own color and floating label), and press E to log a sustainability action — immediately seeing the reward: a +XP toast pops, and at tier thresholds a confetti celebration modal announces the new world state. There's also a fully-modeled home interior you can physically walk into through the door, with hardwood floors, beds, sofas, kitchen cabinets, fridge, TV, and desks — all 3D GLB assets serving as the logging environment for indoor actions.

AI Product Scanner
Point your phone camera at any product label or packaging. Gemini 2.5 Flash reads it and returns a carbon grade (A–F), an estimated CO₂ footprint, a breakdown of emissions across manufacturing, transport, and end-of-life, greenwashing flags with explanations, and eco-friendly alternatives. You also earn XP for every scan — because learning is an act of care too.

Action Tracker
20+ predefined sustainable actions across 6 categories, each with real CO₂-savings data. Biked instead of drove: +15 XP, −2.3 kg CO₂. Planted a tree: +25 XP, −10 kg CO₂. Went vegan for a day: +10 XP, −1.8 kg CO₂. Every number is grounded in reality. Every log feeds your world.
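
A hypothetical sketch of how such a catalog might be modeled — the type name, ids, and helper are illustrative, with the XP and CO₂ values taken from the examples above:

```typescript
// Illustrative catalog entry shape; ids and categories are assumptions.
type EcoAction = {
  id: string;
  category: "transport" | "food" | "waste" | "energy" | "shopping" | "nature";
  label: string;
  xp: number;    // XP awarded per log
  co2Kg: number; // estimated CO2 avoided per log, in kg
};

const ACTIONS: EcoAction[] = [
  { id: "bike-commute", category: "transport", label: "Biked instead of drove", xp: 15, co2Kg: 2.3 },
  { id: "plant-tree",   category: "nature",    label: "Planted a tree",         xp: 25, co2Kg: 10 },
  { id: "vegan-day",    category: "food",      label: "Went vegan for a day",   xp: 10, co2Kg: 1.8 },
];

// Sum XP and CO2 across a list of logged action ids, skipping unknown ids.
function totalImpact(loggedIds: string[]): { xp: number; co2Kg: number } {
  return loggedIds.reduce(
    (acc, id) => {
      const a = ACTIONS.find((x) => x.id === id);
      return a ? { xp: acc.xp + a.xp, co2Kg: acc.co2Kg + a.co2Kg } : acc;
    },
    { xp: 0, co2Kg: 0 }
  );
}
```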

Real-time Air Quality Monitoring
Users can check live pollution levels for any location worldwide, either by GPS or by typing a city name. The page shows an interactive dark-themed map centered on the location with an AQI-colored marker, a 1–5 Air Quality Index card with color-coded status, and a full pollutant breakdown (PM2.5, PM10, Ozone, NO₂, CO, SO₂) with individual level bars. A health legend lays out who's affected at each AQI level and what precautions to take — from "enjoy outdoors freely" to "wear an N95 mask" — so users can make informed decisions about outdoor activities.

iMessage Sustainability Coach
Text a number. That's it. Say "I composted my food scraps today" and the app replies:

♻️ WASTE — You're on a roll! Composting is nature's recycling — you just saved your scraps from a landfill future! 🌱

No app install. No account. Just a number you text when you do something good, and something that texts back to tell you it mattered.


How I Built It

Layer | Technology
--- | ---
Frontend + API | Next.js 16 (App Router)
2D World | PixiJS + Canvas 2D API
AI / OCR / Agent | Google Gemini 2.5 Flash
iMessage Agent | spectrum-ts + Photon cloud relay (gRPC)
Database | SQLite (better-sqlite3)
Hosting | Vercel at livelaughplant.garden

The scanner uses getUserMedia for a live camera viewfinder — capturing a frame to canvas, compressing it, and sending it to the scan API. Gemini receives the image as inline base64 and extracts product details, carbon estimates, and greenwashing signals purely from the visual — no barcode required.
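
The request body for that last step can be sketched as follows — the payload shape follows the public generateContent REST schema for inline image parts; the function name and prompt string are illustrative:

```typescript
// Build a generateContent request carrying the captured frame as inline
// base64 JPEG plus the grading prompt. Field names (contents/parts/
// inlineData/mimeType) follow the Gemini REST schema.
function buildScanRequest(base64Jpeg: string, prompt: string) {
  return {
    contents: [
      {
        parts: [
          { inlineData: { mimeType: "image/jpeg", data: base64Jpeg } },
          { text: prompt },
        ],
      },
    ],
  };
}
```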

The iMessage agent is a standalone Node.js process using Spectrum's gRPC stream. Every incoming text is passed to a Gemini prompt that returns structured JSON with a category, emoji, and a plant-pun reply. The entire agent is ~70 lines.
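
The JSON-parsing step can be sketched like this — a defensive parser, since models sometimes wrap JSON in a markdown code fence. The field names (category, emoji, reply) match the reply format described above; the fallback emoji is an assumption:

```typescript
type CoachReply = { category: string; emoji: string; reply: string };

// Strip any markdown code fences, then parse and validate the model output.
// Returns null on malformed output so the agent can fall back gracefully.
function parseCoachReply(raw: string): CoachReply | null {
  const cleaned = raw.replace(/```(?:json)?/g, "").trim();
  try {
    const obj = JSON.parse(cleaned);
    if (typeof obj.category === "string" && typeof obj.reply === "string") {
      return { category: obj.category, emoji: obj.emoji ?? "🌱", reply: obj.reply };
    }
  } catch {
    // fall through to null
  }
  return null;
}
```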

3D rendering is powered by Three.js (0.184) through React Three Fiber (9.6) with @react-three/drei (10.7) for GLB model loading (useGLTF), floating HTML labels, and asset preloading. The WebGL canvas is dynamically imported with ssr: false so the heavy runtime never ships to the server. The player uses a GLB character model that rotates toward the movement vector and has a procedural walking animation — a sine-wave bob at 15 Hz with a subtle rotational tilt — no skeletal animation needed. Movement uses independent-axis collision checking against hardcoded wall AABBs, which gives smooth wall-sliding rather than sticking on corners. The camera is quasi-isometric: a fixed offset of [player.x, 12, player.z + 12] with a locked parallel lookAt and per-frame lerp smoothing.
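
The bob and camera math can be sketched as plain functions, assuming an illustrative amplitude and smoothing speed; the 15 Hz frequency and the [x, 12, z + 12] offset come from the description above:

```typescript
const BOB_HZ = 15;    // from the text
const BOB_AMP = 0.05; // illustrative amplitude

// Vertical bob while walking: a sine wave at 15 Hz, zero when idle.
function bobOffset(timeSec: number, walking: boolean): number {
  return walking ? BOB_AMP * Math.sin(2 * Math.PI * BOB_HZ * timeSec) : 0;
}

// Quasi-isometric camera target: always the player position plus (0, 12, 12).
function cameraTarget(px: number, pz: number): [number, number, number] {
  return [px, 12, pz + 12];
}

// Per-frame smoothing toward the target, clamped so large dt never overshoots.
function lerp(current: number, target: number, dt: number, speed = 5): number {
  return current + (target - current) * Math.min(1, dt * speed);
}
```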

Tier progression is data-driven through a worldElements.js mapping file — each of 30+ decorations (trees, bushes, flowers, garden beds, fence segments) is registered against either a tier ID OR a per-category action threshold. A computeWorldState() function evaluates user XP and category counts against this map and returns the unlocked element set, so the world re-renders with the newly available decorations after every action.
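
A minimal sketch of that mapping and evaluation, assuming illustrative element names, tier thresholds, and category counts (computeWorldState mirrors the function named above):

```typescript
// Each decoration unlocks on either a tier OR a per-category action count.
type Unlock =
  | { kind: "tier"; tier: number }
  | { kind: "category"; category: string; count: number };

// Illustrative entries; the real map registers 30+ elements.
const WORLD_ELEMENTS: Record<string, Unlock> = {
  oakTree:    { kind: "tier", tier: 3 },
  flowerBed:  { kind: "tier", tier: 2 },
  gardenBed1: { kind: "category", category: "transport", count: 3 },
};

const TIER_MIN_XP = [0, 50, 150, 300, 500, 800]; // illustrative thresholds

function tierForXp(xp: number): number {
  let tier = 1;
  TIER_MIN_XP.forEach((min, i) => { if (xp >= min) tier = i + 1; });
  return tier;
}

// Evaluate XP and per-category counts against the map; return unlocked names.
function computeWorldState(xp: number, counts: Record<string, number>): string[] {
  const tier = tierForXp(xp);
  return Object.entries(WORLD_ELEMENTS)
    .filter(([, u]) =>
      u.kind === "tier" ? tier >= u.tier : (counts[u.category] ?? 0) >= u.count
    )
    .map(([name]) => name);
}
```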

For atmosphere, I layered a PixiJS 2D sky canvas behind the 3D world: three drifting cloud sprites whose color and opacity are tied to tier, plus a translucent smog overlay whose alpha literally equals max(0, (4 - tier) * 0.18) — the air mathematically clears as you level up. Ground grass uses a single textured plane with a canvas-generated alpha map painting floor(tier * 33) patches — one draw call, no z-fighting, scales perfectly. Interactive task zones are stacked colored rings (outer glow + inner fill), with Drei helpers plus GSAP animating the bobbing labels above them. A CRT scanline overlay ties the whole thing together with a retro-arcade feel.
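
The two tier-driven formulas quoted above, written out as plain functions (the constants are taken directly from the text):

```typescript
// Smog overlay alpha: fades linearly, fully clear at tier 4 and above.
function smogAlpha(tier: number): number {
  return Math.max(0, (4 - tier) * 0.18);
}

// Number of grass patches painted into the alpha map at a given tier.
function grassPatchCount(tier: number): number {
  return Math.floor(tier * 33);
}
```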

The air quality feature is powered by a Next.js server-side API route that proxies three OpenWeatherMap endpoints — Geocoding (to convert typed location names to coordinates), Air Pollution (for AQI and pollutant concentrations), and Current Weather (for contextual temperature/humidity). The API key lives server-side only, never exposed to the client. The interactive map is rendered with Leaflet.js using CartoDB Dark Matter tiles to blend with the app's retro aesthetic, with a custom HTML marker that takes the AQI color (green → red) and a translucent radius circle showing the affected zone.
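
Two of the server-side pieces can be sketched like this. The endpoint path matches OpenWeatherMap's public Air Pollution API; the color ramp for the 1–5 AQI scale is an illustrative assumption:

```typescript
// Build the Air Pollution request URL server-side, keeping the key out of
// client code (in the real route it would come from an env var).
function airPollutionUrl(lat: number, lon: number, apiKey: string): string {
  const params = new URLSearchParams({
    lat: String(lat),
    lon: String(lon),
    appid: apiKey,
  });
  return `https://api.openweathermap.org/data/2.5/air_pollution?${params}`;
}

// Map OpenWeatherMap's 1-5 AQI to a marker color, clamping out-of-range input.
function aqiColor(aqi: number): string {
  const ramp = ["#2ecc71", "#f1c40f", "#e67e22", "#e74c3c", "#8e44ad"]; // 1..5
  return ramp[Math.min(Math.max(aqi, 1), 5) - 1];
}
```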


Challenges I Ran Into

PixiJS + React 19 incompatibility. @pixi/react didn't support React 19. I rewrote the entire world renderer using native Canvas 2D, then migrated back to PixiJS using direct imperative API calls outside the React tree — no wrapper library.

iMessage gRPC relay. Photon's shared relay connects and authenticates cleanly but the Spectrum SDK uses @repeaterjs/repeater for lazy async streams that silently fail to deliver if the consumer isn't attached at exactly the right time. Getting reliable message delivery required deep debugging of the stream lifecycle.

Serverless SQLite. better-sqlite3 writes to disk. On Vercel, process.cwd() is a read-only mounted filesystem. Every write silently failed until I redirected the DB path to /tmp in production — a one-liner fix that took an hour to diagnose.

Camera on mobile. Safari requires explicit permission prompts for getUserMedia with facingMode: 'environment', and older devices fall back differently. Building a graceful degradation path from live viewfinder → native file picker with capture attribute took multiple passes to get right across devices.
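
The degradation decision reduces to feature detection; a minimal sketch, with the strategy names as assumptions (the actual getUserMedia call and the file-input fallback with the capture attribute happen in the UI layer):

```typescript
type CaptureStrategy = "live-viewfinder" | "file-picker";

// Pick the live camera viewfinder only when getUserMedia is available;
// otherwise fall back to <input type="file" capture="environment">.
function chooseCaptureStrategy(hasGetUserMedia: boolean): CaptureStrategy {
  return hasGetUserMedia ? "live-viewfinder" : "file-picker";
}

// In the browser the flag would come from feature detection, roughly:
// typeof navigator !== "undefined" && !!navigator.mediaDevices?.getUserMedia
```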


Accomplishments I'm Proud Of

  • A genuinely playable 3D world that reacts to real-world behavior — not a dashboard with a gamification skin bolted on
  • End-to-end OCR pipeline that grades any physical product from a phone photo in ~3 seconds, no barcode needed
  • Greenwashing detection that calls out brands making misleading eco claims — something no mainstream shopping app does
  • A conversational iMessage coach that works over SMS with zero app install friction
  • The whole thing is live at livelaughplant.garden and works on mobile

But most of all: the framing. I'm proud that this app exists to validate people, not correct them. That's a harder problem than building features.


What I Learned

Gemini's multimodal API is remarkably capable at reading product packaging — it correctly identified ingredients, materials, and manufacturing origin from blurry phone photos of random household products. The vision understanding is better than I expected at zero-shot.

Building game mechanics on top of a utility app creates a completely different engagement dynamic. The moment I added tier progression and the confetti celebration on tier-up, the app stopped feeling like a tracker and started feeling like something worth returning to.

And the most important thing: the story matters as much as the technology. The same features framed as "log your carbon" versus "see what you've built" are completely different products emotionally — even if the code is identical.


What's Next for Live, Laugh, Plant

  • Persistent cloud database (Neon Postgres) so XP and world state survive across devices and cold starts
  • Barcode scanning via ZXing for instant product lookup against a CO₂ database, so grading takes under a second
  • Third-person camera toggle, a day/night cycle synced to the player's real local time, weather effects (rain/snow) that affect movement speed, and co-op multiplayer where friends can visit each other's worlds and see their tier states side-by-side
  • WhatsApp + Android Messages support via Spectrum's additional providers so the coaching agent reaches everyone
  • Weekly impact texts sent automatically: your CO₂ saved this week, your streak, and one personalized challenge for next week
  • Seasonal world events — a wildfire that threatens your world if the global community's logged actions fall below a threshold, reforestation events that reward collective effort
