How we built it
There are two types of NPC: stationary and moving. A stationary NPC explains what the player can do at its location when the player approaches, while a moving NPC walks around and waves when the player gets close. Because we packed several features into one space, the NPCs that walk around and talk to the player act as guides: when the player approaches an NPC to try one of our features, it gives a brief introduction to how that feature is used.
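The proximity behavior both NPC types share can be sketched as a simple distance check in Unity. This is a minimal illustration, not our exact script; the `player` reference, `greetRadius` value, and the `"Wave"` animator trigger are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch: when the player comes within a radius of the NPC,
// the stationary NPC would show its guidance and the moving NPC would wave.
public class NpcGreeter : MonoBehaviour
{
    public Transform player;        // assigned in the Inspector (assumption)
    public float greetRadius = 3f;  // greeting distance in metres (assumption)
    public Animator animator;       // animator with a "Wave" trigger (assumption)

    void Update()
    {
        float dist = Vector3.Distance(transform.position, player.position);
        if (dist < greetRadius)
            animator.SetTrigger("Wave");  // play the greeting animation
    }
}
```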
The first application in our project is a live translator in virtual reality. Built on the Azure speech-to-text service, this grabbable window instantly converts the user's speech into text and can additionally translate that text into multiple languages. Its purpose is to remove the language barriers between people and thereby make the metaverse open and interactive for anyone around the world.
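The speech-to-translation flow can be sketched with the Azure Speech SDK's translation recognizer. This is a hedged sketch rather than our exact code: the subscription key, region, and language codes are placeholders, and how the result is written into the window text is left as a comment.

```csharp
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Translation;
using UnityEngine;

// Sketch: continuous speech recognition with translation via the Azure Speech SDK.
public static class LiveTranslator
{
    public static async System.Threading.Tasks.Task StartTranslation()
    {
        // "<key>" and "<region>" are placeholders for real Azure credentials.
        var config = SpeechTranslationConfig.FromSubscription("<key>", "<region>");
        config.SpeechRecognitionLanguage = "en-US";
        config.AddTargetLanguage("es");  // additional target languages can be added

        var recognizer = new TranslationRecognizer(config);
        recognizer.Recognized += (s, e) =>
        {
            if (e.Result.Reason == ResultReason.TranslatedSpeech)
                foreach (var kv in e.Result.Translations)
                    Debug.Log($"{kv.Key}: {kv.Value}");  // update the window text here
        };
        await recognizer.StartContinuousRecognitionAsync();
    }
}
```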
The second application is background music controlled via a virtual jukebox. The interface lets the user change the track and displays the album corresponding to the song being played. The audio is configured so that the background music can be heard anywhere in the virtual world.
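One way to get audio that is "heard anywhere" in Unity is to play it as 2D (non-spatialized) sound. The sketch below assumes one `AudioClip` per album and a `Play(index)` method wired to the jukebox UI; those names are illustrative, not our actual API.

```csharp
using UnityEngine;

// Sketch of the jukebox audio: a single AudioSource with spatialBlend = 0
// plays the track as 2D audio, so its volume does not fall off with distance.
public class Jukebox : MonoBehaviour
{
    public AudioClip[] tracks;  // one clip per album (assumption)
    AudioSource source;

    void Awake()
    {
        source = gameObject.AddComponent<AudioSource>();
        source.spatialBlend = 0f;  // 0 = fully 2D, heard everywhere
        source.loop = true;
    }

    // Called by the jukebox UI when the user picks a track (assumption).
    public void Play(int index)
    {
        source.clip = tracks[index];
        source.Play();
    }
}
```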
The third application is a VR keyboard paired with a YouTube player on a virtual screen. Using the YouTube Data API v3, users can type and search keywords, and the video player then shows a list of matching videos.
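The search behind the VR keyboard can be sketched as a request to the YouTube Data API v3 `search` endpoint. This is a minimal illustration assuming a `UnityWebRequest` GET and a placeholder API key; parsing the JSON into the on-screen video list is left as a comment.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch: send the typed query to the YouTube Data API v3 search endpoint.
public class YoutubeSearch : MonoBehaviour
{
    const string ApiKey = "<YOUR_API_KEY>";  // placeholder credential

    public IEnumerator Search(string query)
    {
        string url = "https://www.googleapis.com/youtube/v3/search"
                   + "?part=snippet&type=video&maxResults=10"
                   + "&q=" + UnityWebRequest.EscapeURL(query)
                   + "&key=" + ApiKey;
        using (var req = UnityWebRequest.Get(url))
        {
            yield return req.SendWebRequest();
            if (req.result == UnityWebRequest.Result.Success)
                Debug.Log(req.downloadHandler.text);  // parse JSON into the video list
        }
    }
}
```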
The final application is a drawing board with three interactive pens. The player can grab markers of different colors and draw lines on the board, which we implemented by studying Unity's line rendering system.
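The core of the drawing mechanic can be sketched with Unity's `LineRenderer`: start a stroke when the pen touches the board, then append points as the tip moves. The `inkMaterial` field and the method names are illustrative assumptions.

```csharp
using UnityEngine;

// Sketch of the marker drawing: each stroke is a LineRenderer that grows
// point by point while the pen tip stays on the board.
public class Marker : MonoBehaviour
{
    public Material inkMaterial;  // per-colour ink material (assumption)
    LineRenderer line;

    // Called when the pen tip first touches the board (assumption).
    public void BeginStroke(Vector3 start)
    {
        var stroke = new GameObject("Stroke");
        line = stroke.AddComponent<LineRenderer>();
        line.material = inkMaterial;
        line.startWidth = line.endWidth = 0.01f;  // stroke thickness in metres
        line.positionCount = 1;
        line.SetPosition(0, start);
    }

    // Called each frame while the tip stays on the board (assumption).
    public void AddPoint(Vector3 point)
    {
        line.positionCount++;
        line.SetPosition(line.positionCount - 1, point);
    }
}
```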
Challenges we ran into
We built the virtual environment by having each member work on an individual feature and then merging everything into one program. However, most of us did not have an Oculus Quest or another device capable of running the environment, so being unable to test our features individually in VR was a huge challenge.
Accomplishments that we're proud of
For all of us, this was our first Hackathon, and we successfully completed the tasks we assigned to one another, resulting in a project we are proud of.
What we learned
Through this Hackathon, we developed our teamwork skills and deepened our knowledge of Unity. The features we learned to implement include stationary and moving NPCs, a video-surfing platform, a jukebox, a drawable whiteboard, and live language translation.
What's next for Into the Metaverse
This integration of interactive features helps people become more familiar with virtual reality. It also suggests a model for a VR tutorial that lets the player try out different features, whereas Oculus only gives a brief introduction on how to use the controllers. Despite the challenges of implementing a virtual reality environment, our goal of introducing the Metaverse to more people kept us going. From this experience, we hope to give people a better understanding of what is, and can be, available within the Metaverse.