Inspiration

Learning a language can be challenging, and it's hard to motivate yourself to learn, say, Spanish during a cold winter in Germany. Wouldn't it be nicer to just go to Spain and learn it there?

What it does

VisuLingo makes language learning immersive. You step into AI-generated worlds — like a Japanese market or a French café — and the environment becomes your teacher. Look around, tap what you’re curious about, hear the translation, and then speak it yourself. An AI assistant gives instant feedback so you improve naturally. VisuLingo connects words to real places and moments, making learning intuitive, memorable, and fun.

How we built it

We built VisuLingo in Unity with the Meta XR SDK, plus APIs for the AI tools: ElevenLabs for text-to-speech and OpenAI for feedback on user answers.
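In-engine the calls are made from C#, but the shape of the two API requests can be sketched in a few lines. This is a minimal illustration, not our actual code: the model IDs, voice ID, and prompt wording are assumptions, and only the request payloads are built here (sending them requires valid API keys).

```python
# Hedged sketch of the AI pipeline: ElevenLabs turns the tapped word's
# translation into speech; OpenAI grades the learner's spoken attempt.
# Endpoint paths match the public REST APIs; model/voice choices are examples.
import json

ELEVENLABS_TTS_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"
OPENAI_CHAT_URL = "https://api.openai.com/v1/chat/completions"


def build_tts_request(text, voice_id, api_key):
    """Build the ElevenLabs text-to-speech request (url, headers, body)."""
    url = ELEVENLABS_TTS_URL.format(voice_id=voice_id)
    headers = {"xi-api-key": api_key, "Content-Type": "application/json"}
    body = json.dumps({
        "text": text,
        "model_id": "eleven_multilingual_v2",  # example model; supports Spanish
    })
    return url, headers, body


def build_feedback_request(expected, spoken, api_key):
    """Build the OpenAI chat request that gives feedback on the learner's answer."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": "gpt-4o-mini",  # assumption; any chat model works
        "messages": [
            {"role": "system",
             "content": "You are a language tutor. Compare the learner's "
                        "answer to the expected phrase and reply with one "
                        "sentence of encouraging feedback."},
            {"role": "user",
             "content": f"Expected: {expected}\nLearner said: {spoken}"},
        ],
    })
    return OPENAI_CHAT_URL, headers, body
```

In the app, the TTS response (MP3 audio) is played back on the tapped object, and the chat response is shown by the AI assistant.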

Challenges we ran into

Getting Gaussian splats to render properly in VR (especially on Quest standalone) was a significant technical challenge, especially as non-Unity developers with limited time!

Accomplishments that we're proud of

We learned some Spanish and will keep using VisuLingo to learn more ourselves. Essentially, we built a product that solves our own problem.

What we learned

Don't try to optimize for standalone during a hackathon when you really only want to prove that an idea provides value. Optimization takes time and can always be done once the idea has proven itself.

What's next for VisuLingo

We want to support camera passthrough and object detection so users can walk around in the real world, tap objects they want to learn, and get the same functionality we provide in the (AI-generated) VR version.

Built With

  • elevenlabs
  • gaussiansplatting
  • metaquest
  • metaxr
  • openai
  • unity
  • worldlabs