Inspiration

American Sign Language seemed like a natural fit for the LeapMotion, which can easily track hand movement. Oculus VR integration was a natural addition, since the LeapMotion's tracked hands are well suited to interacting with virtual environments.

What it does

MutePoint models the user's hands with the LeapMotion and displays them through the Oculus Rift. In a Unity VR environment, the user calibrates gestures for specific words or actions; once calibrated, a gesture is triggered simply by performing it. Gestures can be tracked separately for each hand or defined as two-handed, and they can be used to spell words letter by letter, with the output displayed live on screen.
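The write-up doesn't show the project's data model, but one plausible shape for a calibrated gesture is a label plus a per-hand snapshot of palm-relative fingertip offsets and the palm normal, roughly as sketched below. All names here are hypothetical, not taken from the project's code.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: each calibrated gesture maps a hand pose
// (palm-relative fingertip offsets plus palm normal, per hand)
// to the word or letter it should produce.
public class CalibratedGesture
{
    public string Label;                 // e.g. "A" for fingerspelling, or "hello"
    public bool TwoHanded;               // true if both hands are required
    public Vector3[] LeftFingertips;     // palm-relative offsets, null if unused
    public Vector3 LeftPalmNormal;
    public Vector3[] RightFingertips;
    public Vector3 RightPalmNormal;
}

// A small registry the recognizer can search when a live pose comes in.
public class GestureLibrary
{
    public readonly List<CalibratedGesture> Gestures = new List<CalibratedGesture>();
}
```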

How we built it

We used C# scripting in Unity with the LeapMotion API libraries. Environments were built in Unity, with objects created as needed. Scripts handle gesture calibration, recognition, and the resulting actions, from text output to 3D object interaction.
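As a rough illustration of the kind of script involved, the sketch below polls the LeapMotion for the latest frame on each Unity update and extracts palm-relative fingertip offsets, the sort of data calibration and recognition would consume. It assumes the classic Leap C# API (Controller, Frame, Hand, Finger); the class itself is a minimal illustration, not the project's actual code.

```csharp
using UnityEngine;
using Leap;

// Minimal sketch of reading LeapMotion hand data from a Unity script.
public class LeapHandReader : MonoBehaviour
{
    private Controller controller;

    void Start()
    {
        controller = new Controller();      // connect to the LeapMotion service
    }

    void Update()
    {
        Frame frame = controller.Frame();   // latest tracking frame
        foreach (Hand hand in frame.Hands)
        {
            string side = hand.IsLeft ? "left" : "right";
            foreach (Finger finger in hand.Fingers)
            {
                // Palm-relative fingertip offsets are what calibration and
                // recognition would typically operate on.
                Vector3 offset = ToUnity(finger.TipPosition) - ToUnity(hand.PalmPosition);
                Debug.Log(side + " fingertip offset: " + offset);
            }
        }
    }

    private static Vector3 ToUnity(Vector v)
    {
        return new Vector3(v.x, v.y, v.z);  // Leap units are millimetres
    }
}
```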

Challenges we ran into

Getting the Oculus and LeapMotion to work together with our initial hardware proved difficult. Along the way, developing an effective heuristic to evaluate gesture similarity based on hand position and angle was challenging. LeapMotion mounts for the Oculus were also unavailable, so we attached the two with duct tape.
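The project's exact heuristic isn't described here, but a similarity measure "based on hand position and angle" could look something like the sketch below: it scores a live pose against a calibrated one by summing fingertip-offset distances and adding the angle between palm normals, then treats scores below a threshold as a match. The weights and threshold are illustrative guesses, not the project's actual values.

```csharp
using UnityEngine;

// Hypothetical gesture-similarity heuristic: lower score = more similar.
public static class GestureSimilarity
{
    const float PositionWeight = 1.0f;    // weight per metre of fingertip error
    const float AngleWeight    = 0.01f;   // weight per degree of palm-normal error
    const float MatchThreshold = 0.5f;    // scores below this count as a match

    // fingertips are palm-relative offsets, one per finger, in the same order
    // for both poses; palm normals are unit vectors.
    public static float Score(Vector3[] liveTips, Vector3 livePalmNormal,
                              Vector3[] calibTips, Vector3 calibPalmNormal)
    {
        float positionError = 0f;
        for (int i = 0; i < liveTips.Length && i < calibTips.Length; i++)
        {
            positionError += Vector3.Distance(liveTips[i], calibTips[i]);
        }

        float angleError = Vector3.Angle(livePalmNormal, calibPalmNormal);

        return PositionWeight * positionError + AngleWeight * angleError;
    }

    public static bool IsMatch(Vector3[] liveTips, Vector3 livePalmNormal,
                               Vector3[] calibTips, Vector3 calibPalmNormal)
    {
        return Score(liveTips, livePalmNormal, calibTips, calibPalmNormal) < MatchThreshold;
    }
}
```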

Accomplishments that we're proud of

Finally getting the project working end to end, with a heuristic that is fairly accurate at recognizing hand gestures.

What we learned

How to use Oculus, C#, Unity, LeapMotion, and how to have a good time.

What's next for ASL MutePoint

The world. The future of interactive environments will be enriched by enhanced body recognition, breaking down the barriers between traditional controllers and media.
