Originally, we wanted to make a Pokemon Go-style game, using the Oculus Rift headset and Myo armband to simulate battles like those in the classic Pokemon games. When we found out that Disney was releasing its Marvel API, we decided to forgo Pokemon for Marvel characters instead. However, seeing that the API was better suited to informational lookups than to gameplay data like abilities and stats, we shifted gears and decided to build a Pokedex-like application instead.

What it does

We kept the Oculus Rift and Myo, but instead of battling each other, we built a virtual reality database for looking up information about various characters in the Marvel universe. The "home page" of the virtual reality view is a list of thumbnails for characters that have been accessed, and the Myo controls a cursor used to select a character and expand that character's details, such as a description and a list of the different series the character appears in. Those details show up on the "back wall", 180 degrees around from the current view.
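To illustrate the layout described above (a sketch of the idea, not our actual Unity code): the "back wall" panel sits directly behind the user's current heading, which is just the current yaw offset by 180 degrees and wrapped back into the 0-360 range.

```python
def back_wall_yaw(current_yaw_degrees: float) -> float:
    """Yaw of the "back wall" panel: directly behind the current view,
    wrapped into the [0, 360) range."""
    return (current_yaw_degrees + 180.0) % 360.0
```

For example, a user facing 90 degrees would see the detail panel rendered at 270 degrees.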

We also decided to include speech recognition as a way to search for other characters in the database. The "search page" is accessed by turning to the right within the virtual reality sphere; the user then speaks the name of the character they wish to look up. This performs the same action as clicking a character on the main page: it queries the Marvel database and brings up the selected character's details on the "back wall".
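The character lookup behind both the click and the voice search boils down to one authenticated request against Marvel's public characters endpoint. A minimal sketch of building that request (in Python rather than our C#; the keys here are placeholders, and response parsing is omitted) using Marvel's documented auth scheme, an md5 hash of a timestamp plus the private and public keys:

```python
import hashlib
import time
import urllib.parse

MARVEL_CHARACTERS = "https://gateway.marvel.com/v1/public/characters"

def build_character_query(name: str, public_key: str, private_key: str,
                          ts: str = None) -> str:
    """Build an authenticated URL to look up a Marvel character by name.

    The Marvel API expects a timestamp, the public key, and an md5
    hash of ts + privateKey + publicKey on every server-side call.
    """
    ts = ts or str(int(time.time()))
    digest = hashlib.md5((ts + private_key + public_key).encode()).hexdigest()
    params = urllib.parse.urlencode(
        {"name": name, "ts": ts, "apikey": public_key, "hash": digest}
    )
    return MARVEL_CHARACTERS + "?" + params
```

Fetching the resulting URL returns JSON with the character's description and series list, which is what gets rendered on the "back wall".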

How we built it

Once we had a clear idea of what we wanted to do, it was obvious that there were four main tasks: the Oculus and environment built in Unity, the Myo gesture control, speech recognition, and the Marvel API. Since there were four of us, we each took a different task and developed against our respective APIs independently until we needed input from the others.

Once everyone had their part of the program roughly working independently, we started putting it all together, with the Oculus and Unity as the foundation. We first added the Myo gesture controls to the VR environment, to ensure we could navigate around the application, then added Marvel API support on top of that, rendering the character thumbnails in Unity. Speech recognition was saved for last, since the application starts with a set of default characters and the majority of the functionality was there without it.

Challenges we ran into

One of our biggest challenges was that Oculus had recently dropped support for OS X, so the newest version of Unity no longer supported many of the plugins Oculus used; we had to downgrade the version of Unity on our MacBook. Microsoft's Speech API also crashed consistently on the MacBook, so that work had to be done on a PC, which only one member of our group had. We ran into smaller issues as well: only two of us had used C# before, so there was a learning curve when programming against the Myo and Marvel APIs. In a similar vein, only one member of our group had used Unity before, yet almost everybody had to work in Unity at some point or another.

Overall, we worked through the issues by splitting resources, sharing the one PC we had, and renting a Surface from the hardware table. We found that we couldn't install the Oculus runtime on either PC, so we had to abandon the physical Oculus and run a simulation in Unity instead. Most of the C# and Unity issues were resolved through a combination of Google and help from mentors, and though we ran into a slight time shortage at the end, we came out with a relatively cohesive program.

Accomplishments that we're proud of

  • learning new tools and languages, like C#, Unity, and REST APIs
  • getting the different parts of each individual project to work and coordinate with each other
  • being able to work around problems and trouble spots without hitting a roadblock that stops all progress

What we learned

  • to be flexible with using different kinds of software and different languages we're not used to
  • that sometimes things won't work and we have to go with plan B...or C
  • approximately 4 hours of sleep over two nights probably isn't the best idea but we'll do it anyway

What's next for MARVIS

  • getting the full project up and working
  • implementing machine learning to make suggestions for characters to look up
  • functionality to be able to move the thumbnails around on the screen, allowing the user to organize the database

Built With
