Inspiration

This project was inspired by a collaboration with Roxbury-based mural artist Rob “Problak” Gibbs and his Breathe Life mural series across Boston, which centers joy, kinship, and affirmation in Black public space. These murals are not static images. They are living reflections of community history, care, and resilience.

We were motivated by a core question:
How can AI and AR extend the life of public art in a way that is accessible to artists, not just technologists, while preserving artistic intent?

What it does

Breathe Life Mural AR transforms murals into interactive, location-based AR experiences using an AI workflow designed to be approachable for non-technical users.

Artists and collaborators can:

  • Select which mural elements animate
  • Control pacing, intensity, and interaction ranges
  • Layer ambient spatial audio
  • Preview changes in real time without writing code

The experience combines visual motion, spatial audio, and user interaction to create murals that feel alive while remaining grounded in place and meaning.

We demoed Breathe Life Mural AR with youth from the Roxbury Flagship Clubhouse, inviting them to experience the mural in augmented reality for the first time. Seeing the mural animate, respond, and breathe sparked immediate excitement and curiosity. Many participants were encountering AR and immersive tools in this context for the first time, which opened conversations about art, technology, and creative possibility in their own communities. Youth were also introduced to adjacent immersive technologies, including Snap Spectacles and Oculus VR headsets, and engaged eagerly with how these tools could be used creatively rather than passively.

How we built it

We developed an artist-first, low-friction AI workflow deployed through the Black Terminus AR platform. The pipeline was intentionally designed so that non-technical artists can meaningfully participate in shaping the AR experience.

Mural capture and layering
Murals are digitized and segmented into semantic layers using AI-assisted tools, with manual overrides to maintain artistic control.
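As a rough illustration of the "manual overrides" idea, a layer record might carry both an AI-proposed mask and an optional artist-supplied one, with the artist's version always winning. This is a hypothetical sketch, not the Black Terminus AR data model; the class and field names are invented for clarity.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MuralLayer:
    """One semantic layer of a digitized mural (hypothetical model)."""
    name: str
    ai_mask: str                        # path to the AI-generated mask
    artist_mask: Optional[str] = None   # manual override, takes precedence

    @property
    def active_mask(self) -> str:
        # The artist's override, when present, always wins over the AI output.
        return self.artist_mask or self.ai_mask

# An AI-segmented layer the artist has corrected by hand:
sky = MuralLayer("sky", ai_mask="sky_ai.png", artist_mask="sky_fixed.png")
```

The key design point is that AI output is treated as a proposal: compositing only ever reads `active_mask`, so artistic control is preserved by construction.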

Constrained AI motion presets
To give artists autonomy in how they collaborate with large multimodal models, we created guided, prompt-based motion behaviors that let artists shape animation outcomes without any technical setup.
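One way to picture a "constrained preset" is a small set of tunable parameters with hard guardrails, so that an AI-suggested animation can never exceed artist-approved limits. This is a minimal sketch with invented names and bounds, not the actual platform code.

```python
from dataclasses import dataclass

@dataclass
class MotionPreset:
    """A guided motion behavior with bounded parameters (hypothetical)."""
    name: str
    pacing_s: float          # seconds per animation cycle
    intensity: float         # 0.0 (subtle) .. 1.0 (pronounced)
    interaction_range_m: float  # how close a viewer must be to trigger

    def clamped(self) -> "MotionPreset":
        # Guardrails: whatever the model (or user) suggests is clamped
        # into ranges the artist has approved in advance.
        return MotionPreset(
            name=self.name,
            pacing_s=min(max(self.pacing_s, 0.5), 30.0),
            intensity=min(max(self.intensity, 0.0), 1.0),
            interaction_range_m=min(max(self.interaction_range_m, 0.5), 10.0),
        )

# An over-eager AI suggestion is pulled back into the safe range:
suggestion = MotionPreset("breathe", pacing_s=0.1, intensity=2.5,
                          interaction_range_m=50.0)
safe = suggestion.clamped()
```

Clamping rather than rejecting keeps the workflow low-friction: the artist always gets a usable result, just never one outside the guardrails.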

Accessible audio controls
Spatial audio layers are added using simple controls for volume, proximity, and direction, making sound design intuitive for non-audio engineers.
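The "proximity" control described above can be thought of as a simple distance-based fade: full volume inside a near radius, silence beyond a far radius, and a linear falloff in between. The function below is an illustrative sketch with assumed default radii, not the platform's actual audio engine.

```python
def proximity_volume(distance_m: float, near_m: float = 1.0,
                     far_m: float = 8.0) -> float:
    """Linear volume falloff between a near (full) and far (silent) radius.

    Hypothetical model of a simple proximity control: returns a gain
    in [0.0, 1.0] for a listener `distance_m` meters from the mural.
    """
    if distance_m <= near_m:
        return 1.0   # close enough: full volume
    if distance_m >= far_m:
        return 0.0   # too far: silent
    return (far_m - distance_m) / (far_m - near_m)

# A viewer 4.5 m away hears the layer at half volume with these defaults.
gain = proximity_volume(4.5)
```

Exposing only `near_m` and `far_m` keeps the mental model to "where the sound starts and where it ends," which is the kind of control a non-audio-engineer can reason about.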

Artist-in-the-loop iteration
Changes can be tested and refined visually and in context, reducing reliance on engineering intervention.

On-site AR deployment
Experiences are deployed as location-based AR, ensuring murals remain tied to their physical and cultural context.

Challenges we ran into

Designing for simplicity without losing depth
Creating an interface that was intuitive for non-technical users while still offering expressive range required multiple iterations.

Preventing over-animation
Giving users control also meant building guardrails to prevent effects that could undermine the mural’s message.

Balancing automation and agency
Determining what AI should automate versus what artists should explicitly control was an ongoing design challenge.

Accomplishments that we’re proud of

  • Designed a user-friendly AI workflow accessible to non-technical artists
  • Enabled artist-led control over motion, interaction, and audio
  • Transformed 25 murals into living AR experiences
  • Integrated multimodal AR without sacrificing cultural integrity
  • Demonstrated a scalable model for community-centered AR creation
  • Successfully demoed the experience with Roxbury Flagship Clubhouse youth, expanding access to AR and immersive technologies through public art

What we learned

Working directly with youth audiences reinforced the importance of accessibility and delight. During the Roxbury Flagship Clubhouse demo, participants were eager to explore not only the mural AR experience but also adjacent technologies such as Snap Spectacles and Oculus VR headsets. Their excitement, questions, and rapid intuition with these tools affirmed that when immersive technology is introduced through culturally resonant art, it becomes inviting rather than intimidating.

  • Accessibility is an ethical design choice, not just a UX feature
  • Artists engage more deeply when AI tools are transparent and constrained
  • Simple controls paired with strong defaults outperform complex systems
  • Multimodal AR works best when artists, not algorithms, lead decisions

What’s next for Breathe Life AR

Next, we plan to:

  • Expand the no-code toolset for muralists and community artists
  • Develop workshops to train artists in AI-assisted AR creation
  • Share this workflow as an open model for ethical public art augmentation

Breathe Life Mural AR points toward a future where AI lowers barriers instead of raising them, enabling more artists to bring their work to life in ways that honor place, history, and intent.

Built With

  • blackterminusar
  • grok