Inspiration

E-commerce today is still fundamentally 2D: scrolling grids, static images, and disconnected checkout flows. Yet in physical retail, what drives purchase decisions is interaction: picking up a product, rotating it, inspecting details, and receiving contextual guidance from a salesperson.

We wanted to explore what happens when you combine spatial interaction with AI-guided selling, product comparisons, and verifiable checkout flows.

What it does

Paytial is a spatial commerce experience where users can:

  • Pull products into a 3D model stage
  • Rotate and inspect items naturally, as in a physical store
  • Receive AI-generated sales guidance based on interaction context
  • Complete a checkout flow that returns structured, proof-style settlement metadata

How we built it

We built Paytial using Next.js, React, and TypeScript, combined with the WebSpatial SDK and Spatial Multitasking on PICO OS 6. Product listings are rendered as floating UI “slates.” When a user selects an item, a mapped .glb model is dynamically loaded. We implemented a shared interaction system where rotation velocity (v) is determined by the change in user drag coordinates over time.
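The drag-to-rotation mapping can be sketched roughly as follows. This is a minimal illustration of the velocity formula described above (v = change in drag coordinates over time); the names `PointerSample` and `rotationVelocity`, and the sensitivity constant, are illustrative assumptions, not the actual SDK API.

```typescript
interface PointerSample {
  x: number; // screen-space X (px)
  y: number; // screen-space Y (px)
  t: number; // timestamp (ms)
}

// Angular velocity per axis: v = Δdrag / Δt, scaled by a tuning sensitivity.
function rotationVelocity(
  prev: PointerSample,
  curr: PointerSample,
  sensitivity = 0.005, // rad per px — tuning assumption
): { yaw: number; pitch: number } {
  const dt = (curr.t - prev.t) / 1000; // seconds
  if (dt <= 0) return { yaw: 0, pitch: 0 }; // guard against duplicate samples
  return {
    yaw: ((curr.x - prev.x) * sensitivity) / dt,   // horizontal drag → yaw
    pitch: ((curr.y - prev.y) * sensitivity) / dt, // vertical drag → pitch
  };
}
```

Because the output is a velocity rather than an absolute angle, the model can keep spinning with momentum after the user releases the drag.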

Technical Stack & API Rationale

  • Solana (Blinks): Used for the checkout layer to enable one-click, in-context transactions. Blinks allow us to bypass traditional multi-step forms, returning a verifiable settlement object.
  • Snowflake Cortex: Powers our AI Sales Agent. We utilized Snowflake Cortex with the mistral-large model to process interaction data and generate real-time pitches.
  • ElevenLabs: Provides the voice synthesis for our AI agent. Responses are streamed through the /api/voice/stream endpoint for live voice feedback.
  • Vultr: Facilitates Edge Deployment. We containerized the application via Docker and deployed on Vultr to ensure low-latency 3D interaction and fast AI responses.
  • Azure: Utilized for robust cloud infrastructure, scalability, and handling the high-bandwidth requirements of spatial assets.
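A "proof-style" settlement object from the checkout layer might be shaped like the sketch below. The interface and field names here are assumptions for illustration, not the actual Solana/Blinks schema; only the explorer URL format is a known convention.

```typescript
// Hypothetical shape of the verifiable settlement metadata returned
// after a Blinks checkout completes.
interface SettlementProof {
  signature: string;      // on-chain transaction signature
  slot: number;           // slot the transaction landed in
  amountLamports: number; // amount paid, in lamports
  payer: string;          // base58 wallet address of the buyer
  verifyUrl: string;      // explorer link for independent verification
}

function buildSettlementProof(
  signature: string,
  slot: number,
  amountLamports: number,
  payer: string,
): SettlementProof {
  return {
    signature,
    slot,
    amountLamports,
    payer,
    // Solana Explorer links transactions as /tx/<signature>
    verifyUrl: `https://explorer.solana.com/tx/${signature}`,
  };
}
```

Returning a structured object like this (rather than a bare confirmation page) is what lets the UI render an in-context, verifiable receipt.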

Challenges we ran into

The biggest hurdle was mapping user intent to 3D movement: slight mismatches between drag input and the model's transformation matrix were enough to break immersion.
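One fix for this class of mismatch is to apply each incremental drag rotation in camera/world space by premultiplying (R_new = ΔR · R_old), so a horizontal drag always spins the model about the viewer's vertical axis regardless of its current orientation. The matrix helpers below are an illustrative sketch, not the library code we used.

```typescript
type Mat3 = number[]; // row-major 3×3 rotation matrix

// Standard 3×3 matrix product.
function mul(a: Mat3, b: Mat3): Mat3 {
  const r = new Array(9).fill(0);
  for (let i = 0; i < 3; i++)
    for (let j = 0; j < 3; j++)
      for (let k = 0; k < 3; k++) r[3 * i + j] += a[3 * i + k] * b[3 * k + j];
  return r;
}

// Rotation about the world Y (vertical) axis.
function rotY(rad: number): Mat3 {
  const c = Math.cos(rad), s = Math.sin(rad);
  return [c, 0, s, 0, 1, 0, -s, 0, c];
}

// Premultiplying keeps the drag axis screen-aligned; postmultiplying
// (orientation · ΔR) would rotate about the model's own, already-rotated axis.
function applyDrag(orientation: Mat3, dYaw: number): Mat3 {
  return mul(rotY(dYaw), orientation);
}
```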

Another challenge was bridging UI events with AI context. We needed to transform low-level signals like gaze and dwell time into structured inputs without overwhelming the LLM.
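The condensation step can be sketched as an aggregation that reduces a stream of raw events to a small structured payload before prompting the model. Event and field names here are assumptions for illustration, not our production schema.

```typescript
// Raw, low-level interaction signal (gaze, rotation, zoom).
interface InteractionEvent {
  type: "gaze" | "rotate" | "zoom";
  part: string; // product part under focus, e.g. "strap"
  durationMs: number;
}

// Compact context handed to the LLM instead of the raw event stream.
interface AgentContext {
  focusPart: string; // part with the longest cumulative dwell
  dwellMs: number;   // total dwell time on that part
  eventCount: number;
}

function summarize(events: InteractionEvent[]): AgentContext {
  // Accumulate dwell time per product part.
  const dwell = new Map<string, number>();
  for (const e of events) {
    dwell.set(e.part, (dwell.get(e.part) ?? 0) + e.durationMs);
  }
  // Pick the part with the highest cumulative dwell.
  let focusPart = "";
  let dwellMs = 0;
  for (const [part, ms] of dwell) {
    if (ms > dwellMs) { focusPart = part; dwellMs = ms; }
  }
  return { focusPart, dwellMs, eventCount: events.length };
}
```

Sending a few fields like these keeps prompts short and stable, instead of flooding the model with every pointer sample.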

Accomplishments that we're proud of

We built a fully interactive spatial commerce flow where AI dynamically responds to real user behavior. We also reimagined checkout by introducing proof-style settlement outputs, showcasing next-generation payment UX.

What we learned

We learned that spatial computing is not just about bringing existing interfaces into 3D, but about rethinking where interaction creates value. Moving beyond flat screens requires fundamentally redefining the user journey.

What's next for Paytial

As AR/VR hardware evolves, Paytial will evolve into a persistent, real-time shopping layer where users can interact with products, AI agents, and transactions seamlessly within their environment.

Built With

  • Next.js, React, TypeScript
  • WebSpatial SDK (PICO OS 6)
  • Solana (Blinks)
  • Snowflake Cortex
  • ElevenLabs
  • Vultr, Azure, Docker