Who doesn't want to see futuristic sci-fi tech become reality?

What it does

A futuristic virtual touch HUD with hand-motion and voice control that scours the "world" for popular venues using the Foursquare API.

How we built it

We used the Unity Engine, the Foursquare API, Leap Motion hardware, and Windows 10 speech-to-text.
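The Foursquare side boils down to a venue search around the user's location, triggered by a voice or gesture command. A minimal sketch of that request shape, in Python for brevity (the actual project code runs in Unity/C#; the credential values and sample response here are placeholders, while the endpoint and `ll`/`query`/`v` parameters follow Foursquare's v2 venues/search API):

```python
# Sketch of the venue-search request the HUD issues to Foursquare.
# CLIENT_ID / CLIENT_SECRET values below are placeholders, not real keys.
from urllib.parse import urlencode

FOURSQUARE_SEARCH = "https://api.foursquare.com/v2/venues/search"

def build_venue_query(lat: float, lng: float, query: str,
                      client_id: str, client_secret: str) -> str:
    """Build a v2 venues/search URL for venues near a point."""
    params = {
        "ll": f"{lat},{lng}",   # latitude,longitude of the user
        "query": query,          # e.g. "coffee", taken from voice input
        "limit": 10,
        "v": "20180323",         # version date required by the v2 API
        "client_id": client_id,
        "client_secret": client_secret,
    }
    return f"{FOURSQUARE_SEARCH}?{urlencode(params)}"

def venue_names(response_json: dict) -> list:
    """Pull display names out of a venues/search response body."""
    return [v["name"] for v in response_json["response"]["venues"]]

# Illustrative usage with a hand-written sample response:
url = build_venue_query(40.7, -74.0, "coffee", "ID", "SECRET")
sample = {"response": {"venues": [{"name": "Blue Bottle"}]}}
print(venue_names(sample))  # -> ['Blue Bottle']
```

In the HUD itself, the recognized voice command supplies `query`, and the parsed venue names are rendered as interactive panels in 3D space.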

Challenges we ran into

We wanted to incorporate the Google Cloud Speech API, but it would not work with Unity. We also wanted to attach the HUD to a camera feed to emulate future AR glasses, but compatibility complications made that impossible.

Accomplishments that we're proud of

We developed immersive, aesthetic space visuals with working live data fetching from Foursquare, plus custom input via voice control and dynamic hand gestures.

What we learned

How to work with APIs, and the dangers of feature creep.

What's next for Starship Enterprise HUD

The HUD can scale to almost any setting, from AR devices to car windshields.
