Inspiration

We've landed on the Moon. We've mapped 6,150+ confirmed exoplanets. We've taken photos of galaxies so far away that the light left them before Earth had oxygen in its atmosphere. And yet, the way almost everyone experiences space is the same way they experience a TikTok: a flat 2D rectangle, usually a few inches wide. That gap bothered us. Space is the most three-dimensional thing that exists, and we've flattened it.

The obvious fix is VR/AR headsets, but those cost $300 to $3,500, require accounts, require calibration, and end up sitting in a drawer. None of that scales to a classroom in Lagos or a science museum in Indianapolis.

StarkHacks is hosted by the Humanoid Robot Club at Purdue, the cradle of astronauts. Walking into the Armory made us ask: what would it take to put a planet in the palm of someone's hand, with nothing between them and it? That was the spark for Pony Stark. The name is a wink at Tony Stark projecting the Iron Man suit in mid-air and flicking through schematics with his hands. We wanted that, but for planets. And we wanted it to cost under $50 in parts.

What it does

Pony Stark is three things stitched into one experience:

  1. A gesture-controlled 3D holographic display for space, using Pepper's ghost. Wave a hand and the solar system spins. Pinch to zoom into a planet. Tap-in-air to select it. The planet appears to float inside a transparent acrylic pyramid, visible from all four sides with no headset, no glasses, no app install.
  2. Detailed information about planets, with gaps filled in by Gemini. The moment you select a planet, we stream its context from NASA and European planetary databases into the Gemini API and get back a complete, readable summary.
  3. A decentralized data layer for the gaps NASA can't fill.

Here's the part that surprised even us. NASA's Exoplanet Archive is peer-reviewed and incomplete by design: it only ingests parameters from refereed publications, and for most of those 6,150 confirmed exoplanets, fields like atmospheric composition, surface imagery, and habitability estimates are blank. There's also no real incentive for anyone outside academia to fill them in.

So when Gemini returns "insufficient data" for a planet, Pony Stark lets the user claim a data point on-chain using a Solana smart contract. Other users can support or challenge the claim. Claims that survive community review get a small SOL payout from a shared pool. Bad claims get slashed. Peer review, decentralized, with skin in the game. Think Wikipedia meets Chainlink, aimed at the parts of the sky we haven't finished mapping.
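The claim lifecycle above can be sketched as a small state machine. This is an illustrative C++ model, not our on-chain Anchor program; the names (`Claim`, `resolve`, the lamport amounts) are hypothetical, but the rules match the description: claims that survive review earn from the pool, and bad claims get slashed into it.

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Illustrative model of the claim lifecycle. The real version is an
// Anchor program on Solana devnet; this sketch only mirrors its rules.
enum class ClaimState { Open, Accepted, Slashed };

struct Claim {
    std::string planet;
    std::string field;        // e.g. "atmospheric_composition"
    uint64_t    stake;        // lamports staked by the claimant
    int         supports   = 0;
    int         challenges = 0;
    ClaimState  state      = ClaimState::Open;
};

// Resolve after the review window: majority support returns the stake
// plus a reward from the shared pool; majority challenge slashes the
// stake into the pool. Returns what the claimant receives.
uint64_t resolve(Claim& c, uint64_t& pool, uint64_t reward) {
    if (c.supports > c.challenges) {
        c.state = ClaimState::Accepted;
        uint64_t payout = reward < pool ? reward : pool;
        pool -= payout;
        return c.stake + payout;
    }
    c.state = ClaimState::Slashed;
    pool += c.stake;          // slashed stake refills the pool
    return 0;
}
```

The asymmetry is the point: supporting a good claim costs nothing, but making a bad one costs real stake, which is what makes the peer review hard to game.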

How we built it

We had 36 hours, one Armory, and a very large pile of acrylic.

The hardware stack:

The display: A four-sided acrylic pyramid mounted over a high-brightness LCD, each face cut at exactly 45° to the base. This is the Pepper's ghost principle: light from the screen hits the acrylic at 45° and reflects toward the viewer, and each face shows the scene from that side, so the four reflections read as one floating 3D object. The geometry matters: for a viewer at horizontal line-of-sight, the reflection obeys θ_incidence = θ_reflection = 45°, which places the virtual image at the centroid of the pyramid regardless of where the viewer stands. That is what gives it the "real object" feel.
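The 45° geometry is just a mirror reflection, so it can be checked with the standard virtual-image formula. A toy check, assuming the screen lies in the z = 0 plane and one pyramid face is tilted 45° with unit normal (0, -1, 1)/√2 (the coordinate convention here is ours, for illustration):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Reflect point p across a plane through the origin with unit normal n:
// p' = p - 2 (p . n) n. For a mirror, p' is where the virtual image of
// a point at p appears to sit.
Vec3 reflect(const Vec3& p, const Vec3& n) {
    double d = p[0]*n[0] + p[1]*n[1] + p[2]*n[2];
    return { p[0] - 2*d*n[0], p[1] - 2*d*n[1], p[2] - 2*d*n[2] };
}
```

With this convention, a pixel drawn at (0, d, 0) on the screen reflects to (0, 0, d): straight up the pyramid's vertical axis. That is why the image appears pinned to the center of the pyramid no matter which face you look through.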

The sensor: an iPhone, using its LiDAR scanner to classify gestures (swipe, pinch, grab, tap-in-air) in real time.

The compute: A laptop running a C++ OpenGL scene rendered to four camera viewports, one per pyramid face, each viewport rotated 90° from the last so that the reflections stitch into a coherent 3D object. This was the part that took the longest to get right.
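The four-viewport layout can be sketched without any GL calls: each face gets a square region of the LCD arranged around the screen center, and a camera yawed 90° from the previous one. This is a hypothetical layout helper (the function name, the clockwise face order, and the square-viewport assumption are ours); in the real render loop each entry would feed a `glViewport` call plus a view matrix rotated by `yawDeg` about the vertical axis.

```cpp
#include <cassert>

// Hypothetical layout helper: one square viewport per pyramid face,
// arranged around the screen centre, with each camera yawed 90 degrees
// from the previous one so the four reflections line up.
struct Viewport { int x, y, w, h; double yawDeg; };

// screenW/screenH: LCD resolution; side: edge length of each viewport.
Viewport faceViewport(int face, int screenW, int screenH, int side) {
    int cx = screenW / 2, cy = screenH / 2;
    Viewport v{0, 0, side, side, 90.0 * face};
    switch (face & 3) {
        case 0: v.x = cx - side / 2;  v.y = 0;               break; // near face
        case 1: v.x = screenW - side; v.y = cy - side / 2;   break; // right face
        case 2: v.x = cx - side / 2;  v.y = screenH - side;  break; // far face
        case 3: v.x = 0;              v.y = cy - side / 2;   break; // left face
    }
    return v;
}
```

One subtlety the sketch leaves out: because each face is seen in a mirror, the on-screen image for each viewport also has to be flipped so the reflection reads the right way round.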

The software stack:

  - C++ and OpenGL for the planetary rendering (textured spheres, orbital mechanics, procedural starfields)
  - An iPhone Swift app using LiDAR data for gesture recognition
  - A WebSocket bridge between the gesture server and the renderer
  - The Gemini API for planetary info
  - Solana devnet for the data-claim protocol, written in Rust using the Anchor framework; claims, stakes, challenges, and payouts all live on-chain
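On the renderer side of the WebSocket bridge, gesture frames arrive as text and get decoded into events. The wire format below is purely illustrative (simple `key=value;` pairs we invented for this sketch; the real bridge's format may differ), but it shows the shape of the glue between the Swift gesture server and the C++ scene:

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Illustrative decoder for gesture frames coming over the WebSocket
// bridge. Assumed frame format (not the real protocol):
//   "type=pinch;value=1.25"
struct GestureEvent {
    std::string type;    // "swipe", "pinch", "grab", "tap"
    double value = 0.0;  // e.g. pinch scale factor or swipe velocity
};

GestureEvent parseFrame(const std::string& frame) {
    GestureEvent ev;
    std::istringstream in(frame);
    std::string pair;
    while (std::getline(in, pair, ';')) {
        auto eq = pair.find('=');
        if (eq == std::string::npos) continue;  // skip malformed pairs
        std::string key = pair.substr(0, eq), val = pair.substr(eq + 1);
        if (key == "type") ev.type = val;
        else if (key == "value") ev.value = std::stod(val);
    }
    return ev;
}
```

Keeping the bridge this dumb is deliberate: all gesture classification stays on the phone, and the renderer only ever sees already-labeled events.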

The gaming prototype that came first: We actually started on the Gaming and Entertainment track with a holographic Connect 4 you could play with gestures. The pitch: a single physical setup replaces every board game you own. Chess today, Connect 4 tomorrow, Settlers of Catan next week, and the pieces never get lost under the couch. Midway through Saturday we realized the exact same hardware and gesture engine could point at the sky instead of a game board, and the space angle was a much bigger swing. We pivoted, kept the game as a working demo for the "future applications" section, and re-skinned its UI with a cosmic theme so it felt like one coherent product.

Challenges we ran into

We cut a small but just-sufficient piece of transparent acrylic at the Bechtel Center into a pyramid to recreate the Pepper's ghost effect. However, the acrylic pieces were on the small side, and the harsh overhead lighting in the Armory washed out the projection, making it hard to appreciate the true beauty of our holographic projector. We remedied this by finding a cardboard box and nesting it over the laptop-and-acrylic assembly as a light shield.

The pivot. Switching from Connect 4 to a full space explorer 18 hours in was scary. We had working hardware and a working game. Throwing away a polished thing to build a riskier thing is the part of hackathons that nobody tells you about.

Accomplishments that we're proud of

We built a working sub-$50 holographic display from raw acrylic and the physics of a 163-year-old stage illusion. No VR headset required. Gesture control feels good. People who tried it instinctively reached out without being told how it worked.

The Solana claim protocol actually runs. Claims, stakes, challenges, and payouts all execute on-chain in our demo. We didn't quit the pivot. The space version is dramatically better than the game-only version would have been, and the game is still in there as a future-applications demo.

We respected the physics. We never call it a "hologram" in our docs without the asterisk. It's a Pepper's Ghost reflective display, and that distinction matters to the people judging this.

What we learned

Optics is a branch of engineering, not decoration. Getting four reflections to fuse into one object is a geometry problem, a materials problem, and a lighting problem simultaneously. We learned to design for all three at once.

Scarcity of incentives is often the real bottleneck, not scarcity of data. NASA has better telescopes than anyone on Earth. What it doesn't have is a mechanism to reward a high schooler in Mumbai for correctly identifying something the telescope missed. Smart contracts are genuinely good at this specific thing, and we learned how to design one that's hard to game.

Pivot early or pivot never. We almost didn't pivot. The Connect 4 demo was working and safe. The space version was speculative and half-built. If we'd waited four more hours, we wouldn't have had time. The lesson is that in a 36-hour window, the cost of pivoting grows exponentially as the time remaining shrinks.
