As someone interested in both computer science and game design, I like developing tools that expand what we can do with games. Play any RPG today and enter a dialogue: you'll have two, maybe three or four choices. You click one, and the game continues. This breaks immersion -- you're pulled out of the game and back into a user interface. I wanted to create a simple tool that anyone, even non-programmers, can use, so that game designers can keep their players engaged in an immersive environment by letting players' voices drive dialogue.
What it does
This is a dialogue engine that lets designers create branching, nonlinear game dialogues driven by the player's voice. The player speaks to an NPC, and the engine uses Rev.ai to transcribe the speech into a form the NPC can understand. Designers use a visual editor and XML files to configure what the NPC can respond to, and the NPC responds accordingly: an algorithm takes the transcribed player input and searches the NPC's preconfigured dialogue lines for the response that best addresses the player. Designers can also attach optional audio to NPC lines, so voice acting can be integrated by simply dragging and dropping.
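The writeup doesn't include a sample configuration, but a dialogue entry in the XML format described above might look something like this (the element and attribute names here are hypothetical, not RevNPC's actual schema):

```xml
<!-- Hypothetical RevNPC dialogue configuration. Element and attribute
     names are illustrative only. -->
<npc name="Blacksmith">
  <line id="greeting" keywords="hello,hi,greetings" audio="blacksmith_hello.ogg">
    Welcome to my forge, traveler!
  </line>
  <line id="quest" keywords="quest,work,job" audio="blacksmith_quest.ogg">
    I could use some iron ore from the mines to the north.
  </line>
  <fallback audio="blacksmith_confused.ogg">
    Hm, I'm not sure what you mean.
  </fallback>
</npc>
```

The idea is that the matching algorithm compares the transcribed player speech against each line's trigger keywords and plays the best match, falling back to a default line when nothing matches.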
Want an NPC to give your players quests? This can do that. What about an NPC that answers player questions and guides them around the village, or to the next quest? This can do that. NPCs can be configured to remember what players have told them in past conversations, and with scripting, NPCs can also respond to events that occur outside of dialogue.
With the power of RevNPC, you have a fully-functional dialogue engine you can include in your game. All you need is a Rev.ai Access Token.
How I built it
Unity and C# for the game-related functionality; Rev.ai for player voice transcription.
I built the demo project on top of the Unity Standard Assets First Person example that ships with the engine. The focus of this project was AI and dialogue, not gameplay, so for the demo I reused the Unity starter project and its assets.
Challenges I ran into
I ran into a bug with the Unity Engine. It is supposed to support Ogg Vorbis audio clips, but when I tried to import voice acting for the demo, Unity displayed an error saying it was unable to import the file. The same error occurred with every other audio format I tried. As a result, the voice acting planned for the demo could not be integrated.
Accomplishments that I'm proud of
Getting Rev.ai to work with Unity. It's a really cool tool, but Unity isn't always easy to integrate with new services. With help from the Rev.ai team, I was able to get it working.
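For context, the core of the integration is submitting recorded player audio to Rev.ai's asynchronous speech-to-text REST API from inside Unity. A minimal sketch might look like the following; the class name, serialized field, and file name are illustrative rather than RevNPC's actual code, and a real client would also poll the job status and fetch the transcript as described in the Rev.ai docs:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch of submitting recorded player audio to Rev.ai's asynchronous
// speech-to-text API from Unity. Names here (RevAiClient, accessToken)
// are illustrative, not part of RevNPC's actual code.
public class RevAiClient : MonoBehaviour
{
    const string JobsUrl = "https://api.rev.ai/speechtotext/v1/jobs";

    [SerializeField] string accessToken; // the Rev.ai Access Token entered in Unity

    public IEnumerator SubmitAudio(byte[] wavBytes)
    {
        // Rev.ai accepts the audio as a multipart form field named "media".
        var form = new WWWForm();
        form.AddBinaryData("media", wavBytes, "player_speech.wav", "audio/wav");

        using (var request = UnityWebRequest.Post(JobsUrl, form))
        {
            request.SetRequestHeader("Authorization", "Bearer " + accessToken);
            yield return request.SendWebRequest();

            if (request.result == UnityWebRequest.Result.Success)
                Debug.Log("Job submitted: " + request.downloadHandler.text);
            else
                Debug.LogError("Rev.ai request failed: " + request.error);
        }
    }
}
```

Because the transcription job is asynchronous, the response to this request contains a job ID that you later use to retrieve the finished transcript.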
What I learned
Rev.ai -- it was my first time working with it, and the transcriptions were very accurate.
What's next for RevNPC
Right now, RevNPC is on GitHub. You can clone it and get started in your own project: just get a Rev.ai Access Token, and you will see a prompt inside Unity to enter it. In the future, I'm going to put this on the Asset Store to make it simpler to import into Unity.
Also, I think this would be a super fun way to play RPGs in a VR setting. I'd like to make a fully featured VR game with this. I know RevNPC is capable of it; I'll just need to set aside enough time to make a full game.