Inspiration

Inspired by the work of Brené Brown in 'Atlas of the Heart', the work of Alan Cowen in mapping emotion, and semantic space theory, we wanted to extend these ideas using the behavioural science of intent and habit loops to help people form everyday emotional wellbeing and personal reflection practices. We envisage the immersive XR headset playing the same role in a wellbeing practice as putting on your running shoes does for running: making the habit more attainable and helping to overcome the intention-action gap.

What it does

With Project Luna you can experience a new kind of reflective journaling. Instead of journaling with pen and paper, you respond to questions about your day and watch a responsive visualisation reflect your emotional sentiment as you speak. By leveraging a sentiment analysis model, we update the immersive XR visuals to encourage reflective thought from the speaker and develop emotional literacy. This encourages perspective on how we each interpret everyday life events and a deeper understanding of the inner worlds we all inhabit. We envisage this becoming a daily habit for our audience, building a wider reflection practice and a diary that shows change over time.

Example: You have an argument with a colleague about a piece of work you failed to complete and you feel angry for the rest of the day.

Using Project Luna, you speak your story and the sentiment analysis model suggests emotions beyond anger that you may be feeling: a mixture of guilt, envy, shame, anxiety, and pride you had not recognised. By engaging with the immersive experience, you can begin to label these emotions and better understand your emotional self, recognising triggers and shifting undesirable behavioural patterns, making you more resilient in future encounters.

Over time you build up a diary of emotional data that can be used to identify situations and contexts that result in improved wellbeing.
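A diary like this could be aggregated into per-emotion trends. As a minimal sketch, the function below averages each emotion's score across saved sessions; the session format (`{ date, scores }`) is a hypothetical shape for illustration, not Luna's actual storage.

```javascript
// Sketch: average each emotion's score across saved reflection sessions.
// The session shape ({ date, scores }) is hypothetical.
function emotionTrends(sessions) {
  const totals = {};
  const counts = {};
  for (const session of sessions) {
    for (const [emotion, score] of Object.entries(session.scores)) {
      totals[emotion] = (totals[emotion] ?? 0) + score;
      counts[emotion] = (counts[emotion] ?? 0) + 1;
    }
  }
  const averages = {};
  for (const emotion of Object.keys(totals)) {
    averages[emotion] = totals[emotion] / counts[emotion];
  }
  return averages;
}
```

Averages are the simplest summary; a real trend view would likely bucket by week and plot change, but the aggregation step is the same.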

How we built it

Project Luna is built in Zapworks' Mattercraft and leverages the Empathic Voice Interface (EVI) API from HumeAI for the sentiment analysis component.

The user flow:

  • User intends to reflect
  • User takes action by putting on the XR headset and launching Project Luna
  • The user launches a reflection session and is prompted to share how they are feeling today
  • The XR headset mic records the user's response and streams it over a WebSocket connection to the HumeAI EVI API.
  • The HumeAI EVI API returns sentiment data to the WebXR project on the headset.
  • This data is then interpreted to affect the appearance and activity of a set of emotional visualisations.
  • As the user continues to speak, the visuals will adapt to their sentiment as they move deeper into their storytelling.
  • When the user is finished, the session is saved for review at a later time.
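As a rough illustration of the streaming steps above, the handler below parses an EVI-style WebSocket message into a ranked emotion list that could drive the visuals. The payload shape (`type`, `models.prosody.scores`) is an assumption modelled on Hume's documented user-message events, not a verified excerpt from the project's code.

```javascript
// Sketch: turn a Hume EVI-style message into a ranked emotion list.
// The payload shape (type, models.prosody.scores) is an assumption.
function topEmotions(message, limit = 3) {
  if (message.type !== "user_message") return [];
  const scores = message.models?.prosody?.scores ?? {};
  return Object.entries(scores)
    .map(([name, score]) => ({ name, score }))
    .sort((a, b) => b.score - a.score)
    .slice(0, limit);
}

// Wiring it to the socket (handler names are hypothetical):
// socket.addEventListener("message", (event) => {
//   const ranked = topEmotions(JSON.parse(event.data));
//   updateVisuals(ranked); // project-specific visual update
// });
```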

Challenges we ran into

Working with new software presents a good opportunity to identify and overcome challenges:

  • Becoming familiar with a new tool and workflow.
  • Small bugs with Mattercraft that blocked progress.
  • Debugging the experience through Meta Quest Developer Hub.
  • Dealing with hardware & software awkwardness; the devices themselves didn't work out of the box.
  • Developing a clear understanding of the opportunities & constraints of Mattercraft as a tool.
  • Finding & adapting an API designed for traditional web experiences into an immersive experience on the Mattercraft platform.

Accomplishments that we're proud of

  • Successfully integrating HumeAI into a WebXR experience in Mattercraft.
  • Custom visuals built to respond to this sentiment data.
  • An immersive experience reflective of speech based sentiment data.

What we learned

  • How to develop in Mattercraft.
  • Working with hardware & software in a WebXR context.
  • Interpreting sentiment data into a WebXR three.js visualisation.
  • Collaborating on a new workflow using both Mattercraft & Git.
  • Proving the Project Luna concept in WebXR.
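One way to interpret sentiment data as a three.js visualisation is to assign each emotion a hue and let its score drive saturation and brightness. This is a generic sketch under invented assumptions: the emotion-to-hue table and the constants are illustrative, not Luna's actual palette or mapping.

```javascript
// Sketch: map an emotion score to HSL parameters that could be applied
// to a three.js material, e.g. material.color.setHSL(h, s, l).
// The hue assignments are illustrative, not Luna's palette.
const EMOTION_HUES = { Anger: 0.0, Joy: 0.14, Calm: 0.55, Sadness: 0.66 };

function emotionToHSL(name, score) {
  const clamped = Math.min(1, Math.max(0, score));
  return {
    h: EMOTION_HUES[name] ?? 0.8, // fallback hue for unmapped emotions
    s: 0.4 + 0.6 * clamped,       // stronger emotion => more saturated
    l: 0.3 + 0.4 * clamped,       // and brighter
  };
}
```

Keeping the mapping a pure function makes it easy to test and to swap for a different palette without touching the rendering code.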

What's next for Frontier - Project Luna

  • Exploring further visualisations from user sentiment
  • Integrating other sentiment based datasources
  • Trend analysis over time to improve the benefits of journaling
  • Improved journaling interactions
  • Streamlining visuals
  • Extending the Mobile WebXR compatibility
