JARVIS: WHAT THEY YAPPIN ABT 👓🌍

🚀 Inspiration

Inspired by Jarvis from Iron Man - the intelligent assistant capable of voice interaction, contextual awareness, and immersive control - we set out to build a real-world version that helps people connect across language barriers. With the rise of mixed reality, we imagined JARVIS not just as a voice assistant, but as a wearable translator that works in real time through both AR and VR.

💬 What It Does

JARVIS enables real-time translated conversations through the Meta Quest 3. Users speak naturally, and live translations appear within their mixed reality view, breaking down language barriers.

🛠️ How We Built It

We built the system in Unity for seamless integration with the Meta Quest 3:

  • Developed interactive 3D elements within Unity
  • Scripted functionality using C#
  • Integrated multimodal inputs to enhance user experience
  • Leveraged OpenAI’s Whisper model for accurate speech-to-text transcription and translation
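As a rough illustration of the pipeline above, here is a minimal C# sketch of how recorded microphone audio might be posted from Unity to OpenAI's Whisper audio endpoint. The endpoint URL and multipart fields follow OpenAI's public audio API, but the class name, callback shape, and wiring are hypothetical, and error handling is kept to a bare minimum - this is a sketch, not our exact implementation.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical helper: takes WAV bytes captured from the Quest 3 microphone,
// sends them to OpenAI's Whisper endpoint, and hands the resulting text
// to a callback (e.g. a text label anchored in the mixed reality view).
public class WhisperTranslator : MonoBehaviour
{
    // "translations" returns English text; swap in "transcriptions"
    // to keep the speaker's original language.
    private const string Endpoint = "https://api.openai.com/v1/audio/translations";

    [SerializeField] private string apiKey; // set in the Inspector, never committed

    public IEnumerator Translate(byte[] wavBytes, System.Action<string> onResult)
    {
        var form = new WWWForm();
        form.AddBinaryData("file", wavBytes, "speech.wav", "audio/wav");
        form.AddField("model", "whisper-1");

        using (UnityWebRequest req = UnityWebRequest.Post(Endpoint, form))
        {
            req.SetRequestHeader("Authorization", "Bearer " + apiKey);
            yield return req.SendWebRequest();

            if (req.result == UnityWebRequest.Result.Success)
                onResult(req.downloadHandler.text); // JSON body: {"text": "..."}
            else
                Debug.LogError("Whisper request failed: " + req.error);
        }
    }
}
```

In practice the coroutine would be started after a clip from Unity's `Microphone` API is encoded to WAV, with the returned text rendered into the headset's passthrough view.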

🚧 Challenges We Ran Into

  • First time working with hardware (Meta Quest 3), especially in only 24 hours
  • New to Unity and C#

🏆 Accomplishments We're Proud Of

  • Built a working prototype from scratch within the 24-hour time limit
  • Successfully integrated software and hardware
  • Gained hands-on experience with AR/VR development

📚 What We Learned

  • Navigating and building in Unity
  • Basics of C# scripting for interactive environments
  • Using Whisper for multilingual transcription
  • How to rapidly prototype with unfamiliar tools and platforms

🔮 What’s Next for JARVIS?

We envision evolving JARVIS into a wearable, AI-powered assistant - something like the EDITH glasses from the Spider-Man and Iron Man films. Our goal is to enable global, real-time communication on future hardware, bringing sci-fi dreams closer to reality.

Built With

  • Unity
  • C#
  • OpenAI Whisper
  • Meta Quest 3