Inspiration
While photos and videos capture life's precious moments, they often lack the emotional depth and dimensionality needed for virtual and mixed reality spaces. We explored a new method of capturing memories, particularly the subjective emotions tied to life events, through the capabilities of the Quest 3. Our MR app reimagines diaries as personal, ever-evolving spaces that enable self-reflection and allow users to relive memories on a multi-sensory level.
What it does
Our app is an immersive diary that translates user memories and emotions into flora. A charming AI companion engages users in conversations about their recent experiences and analyzes their shared memories to create procedurally generated plants. These plants visually reflect the users' words through their distinct shapes and gradually populate their physical surroundings. Users grow a virtual garden of memories as they use the app over time.
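To illustrate the idea of translating emotions into plant shapes, here is a minimal sketch of an emotion-to-parameter mapping. Everything here is an assumption for illustration: the emotion labels, value ranges, and parameter names (`petal_count`, `stem_droop`, etc.) are hypothetical, not the actual generation logic, and it's shown in Python rather than the project's Unity/C#.

```python
# Hypothetical sketch: map 0..1 emotion scores (e.g. from LLM analysis of a
# shared memory) onto procedural plant-shape parameters. All names and
# ranges are invented for illustration.

def emotion_to_plant(emotions: dict[str, float]) -> dict:
    """Translate emotion scores into parameters a plant generator could use."""
    joy = emotions.get("joy", 0.0)
    calm = emotions.get("calm", 0.0)
    sadness = emotions.get("sadness", 0.0)
    return {
        "petal_count": 3 + round(joy * 9),          # happier memories bloom wider
        "stem_droop": round(sadness, 2),            # sadness bends the stem
        "leaf_spread": round(0.3 + calm * 0.7, 2),  # calm opens the foliage
    }
```

A real generator would feed such parameters into mesh construction; the point is only that distinct emotional profiles yield visibly distinct plants.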
How we built it
We're using our established dev pipeline: asset creation in Blender, and Unity / C#, together with the OpenAI API and the Meta Presence Platform SDK. User input is transcribed from voice to text and then analyzed with a custom ChatGPT Assistant. The bot's reply is converted back to audio via text-to-speech. We combine voice commands with pre-set expressions for the bot.
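The flow above (speech-to-text, assistant, text-to-speech) can be sketched as a single conversational turn. This is a hedged illustration, not our actual implementation: the stage functions are injected as plain callables so the flow is clear without tying it to specific API signatures, and it's in Python rather than the C# used in the app.

```python
# Sketch of one voice -> text -> LLM -> speech turn. Stage functions are
# injected so the flow can be shown (and tested) without real API calls;
# in practice each stage would wrap a transcription, chat, or TTS endpoint.

from typing import Callable

def dialogue_turn(
    audio_in: bytes,
    transcribe: Callable[[bytes], str],  # speech-to-text stage
    respond: Callable[[str], str],       # conversational assistant stage
    speak: Callable[[str], bytes],       # text-to-speech stage
) -> bytes:
    """One conversational turn: user audio in, companion audio out."""
    user_text = transcribe(audio_in)
    reply_text = respond(user_text)
    return speak(reply_text)
```

Keeping the stages decoupled like this also makes it easy to swap providers or insert an emotion-analysis step between transcription and reply.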
Challenges we ran into
Crafting the right prompts for our assistant's LLM to engage effectively with users and accurately gauge their emotions proved challenging. Sequencing commands and responses into fluid dialogue was also difficult.
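One common way to tame this kind of prompt problem is to ask the model for structured output and parse it strictly. The sketch below is a hypothetical example of that pattern, not our actual prompt: the wording, emotion labels, and JSON shape are assumptions.

```python
import json

# Hypothetical system prompt: constrain the assistant to reply with strict
# JSON so the emotion scores can be parsed reliably downstream.
SYSTEM_PROMPT = (
    "You are a gentle diary companion. After the user shares a memory, "
    'reply with JSON only: {"reply": "<one warm follow-up question>", '
    '"emotions": {"joy": 0-1, "calm": 0-1, "sadness": 0-1}}'
)

def parse_reply(raw: str) -> tuple[str, dict[str, float]]:
    """Split model output into the spoken reply and the emotion scores."""
    data = json.loads(raw)
    emotions = {k: float(v) for k, v in data["emotions"].items()}
    return data["reply"], emotions
```

Failing fast on malformed JSON (and re-prompting) tends to be more robust than trying to regex emotions out of free-form text.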
Accomplishments that we're proud of
Our app offers a truly immersive experience that harnesses the unique intimacy of Mixed Reality and the technical capabilities of the Meta Presence Platform, along with the latest advancements in conversational Large Language Models (LLMs).
What we learned
We learned the importance of nuanced language processing to accurately reflect users' emotions through procedurally generated flora. This project underscored the potential of blending AI with immersive technology for a more personalized and emotionally resonant experience, suggesting broader applications in therapeutic and artistic virtual spaces.
What's next for Limbic Creations
We are refining our demo and progressing towards an MVP that will be released in Beta to gather feedback for further iteration. Additionally, we plan to enhance the procedural plant generation by incorporating more sensory data, such as voice tonality and gesturing, to enrich the user experience.