Inspiration

Have you ever felt alone or stressed, or wished you could easily talk to a close friend, a professional therapist, or even just a comforting, playful puppy? We feel you, and that's exactly what inspired us. We believe augmented reality (AR) has the power to create real emotional connections with others through our senses. Our goal was to build an immersive experience that offers users emotional support anytime, anywhere, right in front of their eyes.

What it does

ARmigo is an augmented reality experience built for Snapchat Spectacles that alleviates anxiety through multiple channels. We want to highlight three key features of our AR application:

  • One-on-One Chats: Engage in conversation with a virtual companion, a reliable friend and listener.
  • Professional Mental Health Advice: Some companions are also equipped with rich domain knowledge in psychology and mental health, powered by our comprehensive agentic system.
  • Corgi Companion: Interact with an adorable animated Corgi that can shake hands, play dead, roll over, and offer delightful companionship.

How we built it

Development Environment: We built and animated the AR experience in Lens Studio, a powerful tool that gives AR developers a gamified development experience.

Virtual Human Interaction: We leveraged built-in and external toolkits to establish one-on-one communication with our custom virtual human agents. The pipeline begins by capturing the user's audio and video through the Snap Spectacles hardware; the audio is converted to natural language with Lens Studio's native speech recognition support. The text and image inputs are then passed to LLM-based AI agents for research and reasoning. The agent draws on a diverse knowledge base to produce an informative, relevant answer, which is returned to the user as synthesized audio.
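To make the round trip concrete, here is a minimal sketch of a single conversational turn. On Spectacles the Lens issues this request from JavaScript; the Python below is purely illustrative, and the endpoint path and field names (/chat, text, image, reply) are assumptions rather than our exact schema.

```python
import requests

# Illustrative client call: on Spectacles this request is issued from Lens
# Studio JavaScript. The URL, route, and field names are placeholders.
resp = requests.post(
    "https://<your-ngrok-url>/chat",
    json={
        "text": "I feel overwhelmed today",          # speech-to-text transcript
        "image": "<optional base64-encoded frame>",  # shared-vision context
    },
    timeout=30,
)
print(resp.json()["reply"])  # reply text, spoken back on-device via TTS
```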

Agent Framework: We deployed a Fetch.ai agentic workflow adapted from the LangGraph framework, built around cyclic interactions between the AI agent, the user agent, and its toolkit. The workflow uses ASI-1 mini as the LLM and Tavily as the search engine.
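The sketch below shows the shape of that cyclic loop, assuming LangGraph's StateGraph API and the tavily-python client. call_asi1 is a hypothetical stand-in for the ASI-1 mini call, and the "SEARCH:" routing convention is an illustrative protocol, not our exact prompt format.

```python
# Minimal sketch of the cyclic agent loop (assumptions noted above).
from typing import List, TypedDict

from langgraph.graph import END, StateGraph
from tavily import TavilyClient

tavily = TavilyClient(api_key="YOUR_TAVILY_KEY")  # assumed: key from config


class AgentState(TypedDict):
    messages: List[dict]  # running chat history: user / assistant / tool turns


def call_asi1(messages: List[dict]) -> str:
    """Hypothetical helper wrapping the ASI-1 mini chat endpoint."""
    raise NotImplementedError("plug in the Fetch.ai / ASI-1 client here")


def agent_node(state: AgentState) -> AgentState:
    # The LLM either answers directly or asks for a web search.
    reply = call_asi1(state["messages"])
    state["messages"].append({"role": "assistant", "content": reply})
    return state


def search_node(state: AgentState) -> AgentState:
    # Run the requested Tavily search and feed the snippets back to the agent.
    query = state["messages"][-1]["content"].removeprefix("SEARCH:").strip()
    results = tavily.search(query)
    snippets = "\n".join(r["content"] for r in results.get("results", []))
    state["messages"].append({"role": "tool", "content": snippets})
    return state


def route(state: AgentState) -> str:
    # Illustrative convention: a reply starting with "SEARCH:" triggers the tool.
    last = state["messages"][-1]["content"]
    return "search" if last.startswith("SEARCH:") else "end"


graph = StateGraph(AgentState)
graph.add_node("agent", agent_node)
graph.add_node("search", search_node)
graph.set_entry_point("agent")
graph.add_conditional_edges("agent", route, {"search": "search", "end": END})
graph.add_edge("search", "agent")  # the cycle: tool output returns to the agent
companion_graph = graph.compile()
```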

Server Infrastructure: We developed and hosted a lightweight Flask API in Python to bridge the Lens Studio JavaScript environment and our backend agent framework. We then used ngrok to expose the Flask server at a public endpoint, since Lens Studio can only fetch external URLs.
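Here is a minimal sketch of that bridge, reusing the hypothetical /chat contract above and the compiled companion_graph from the agent sketch (our real server has more routes and error handling). Running `ngrok http 5000` then publishes the local port at a public URL that Lens Studio is able to fetch.

```python
# Minimal Flask bridge between the Lens and the agent graph (sketch only).
from flask import Flask, jsonify, request

from agent_graph import companion_graph  # hypothetical module holding the
                                         # compiled graph from the sketch above

app = Flask(__name__)


@app.post("/chat")
def chat():
    data = request.get_json(force=True)
    # Hand the transcript to the agent graph sketched in the previous section.
    result = companion_graph.invoke(
        {"messages": [{"role": "user", "content": data["text"]}]}
    )
    return jsonify({"reply": result["messages"][-1]["content"]})


if __name__ == "__main__":
    app.run(port=5000)  # then expose it with: ngrok http 5000
```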

3D Modeling: Animated models for the human therapist and the Corgi were sourced from Mixamo and Sketchfab. We customized and edited these models in Blender to meet Lens Studio's requirements.

Interaction Design: Lens Studio’s built-in Interactable buttons and hand gesture interpreters enabled user actions (e.g., choosing a companion, initiating handshakes).

Challenges we ran into

Lens Studio Limitations: We initially struggled with local server connections: Lens Studio does not support direct fetches from localhost, which kept our application from reaching locally hosted resources such as our Fetch.ai agent backend. To mitigate this, we used ngrok to expose the backend at a public endpoint and fetched it from within the Lens.

AR Platform Learning Curve: None of our team members had prior experience developing for Snap Spectacles or other AR platforms. We had to learn the Lens Studio environment, prefab modeling, interaction scripting, and Spectacles deployment from scratch, all within the hackathon timeline.

Model Compatibility: Integrating Mixamo and Sketchfab models through Blender into Lens Studio demanded significant re-rigging, optimization, and troubleshooting to maintain both quality and performance. Through trial and error, we quickly adapted and built the prefab objects needed to display what we envisioned for the project.

Accomplishments that we're proud of

We strengthened the interactions and senses available in augmented reality for users' mental health: all of our logic is triggered by natural interactions such as voice, shared vision, and hand gestures. In addition, we connected our platform to an AI-driven search engine and agent to give it deeper domain knowledge. Finally, we adapted prefab models into a polished, cohesive design.

What we learned

This was our first major hackathon, and we were excited to explore this unique opportunity to develop with Snap's Spectacles hardware in Lens Studio. We reinforced our knowledge of rapidly prototyping and deploying end-to-end AR experiences for the Lens community. Although we were completely new to AR development and 3D modeling, we quickly adapted to the project's demands, worked tirelessly to resolve integration issues between different services and platforms, and ultimately created an immersive emotional support system that combines vision, voice, and interaction between users and the digital world.

Built With

blender · fetch.ai · flask · javascript · langgraph · lens-studio · ngrok · python · snap-spectacles · tavily
