After giving thousands of demos and attending VR conferences, meetups, and meetings around the world, I've realized that first-time Vive users would benefit from easing into VR experiences.

What it does

We didn't quite finish the project, but it correctly identifies a Vive controller using Watson Visual Recognition.

Ultimately, we wanted a button-layout tutorial to trigger once the controller is recognized.

How we built it

We started with a Watson SDK example and built from there.

Challenges we ran into

The headset's USB port, and the fact that the front camera is only unofficially supported.

Accomplishments that we're proud of

Learning how to trigger calls to Visual Recognition based on the front camera buffer data.
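The core idea was throttling: you can't fire off a Visual Recognition request for every camera frame, so recognition calls are rate-limited against the frame stream. A minimal sketch of that trigger logic, assuming a generic classify callback (the class and parameter names here are ours, not from the Watson SDK):

```python
import time


class RecognitionTrigger:
    """Fire a classify callback at most once per `interval` seconds,
    so a 30+ fps camera buffer doesn't turn into 30+ API calls/sec.
    Sketch only; the real project wired this into the Vive's front
    camera buffer inside Unity."""

    def __init__(self, classify_fn, interval=1.0, clock=time.monotonic):
        self.classify_fn = classify_fn  # e.g. a Watson classify wrapper
        self.interval = interval        # minimum seconds between calls
        self.clock = clock              # injectable for testing
        self._last = float("-inf")

    def on_frame(self, frame_bytes):
        """Call on every camera frame; returns the classify result
        when a call was made, or None when throttled."""
        now = self.clock()
        if now - self._last >= self.interval:
            self._last = now
            return self.classify_fn(frame_bytes)
        return None
```

Injecting the clock makes the throttle easy to test without real time passing, and keeps the camera loop decoupled from the network call.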

What we learned

How to use Watson's Visual Recognition.
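A `classify()` call to Visual Recognition v3 returns JSON listing classes and confidence scores per image; a small helper can pull out the top match above a confidence threshold. The response below is a trimmed sketch of that shape, with a made-up classifier ID and scores, not output from our actual model:

```python
# Roughly the shape of a Visual Recognition v3 classify() response
# (trimmed; classifier_id, class names, and scores are placeholders).
SAMPLE_RESPONSE = {
    "images": [{
        "classifiers": [{
            "classifier_id": "vive_controller_abc123",  # hypothetical
            "classes": [
                {"class": "vive_controller", "score": 0.92},
                {"class": "hand", "score": 0.31},
            ],
        }],
    }],
}


def top_class(result, threshold=0.5):
    """Return the highest-scoring class name above threshold, or None."""
    best_name, best_score = None, threshold
    for image in result.get("images", []):
        for classifier in image.get("classifiers", []):
            for cls in classifier.get("classes", []):
                if cls["score"] > best_score:
                    best_name, best_score = cls["class"], cls["score"]
    return best_name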

What's next for Front Camera Fun

The next step is making it easy to drop this in as a tutorial at the beginning of any VR app, so anyone can learn the Vive's button layout.
