Inspiration

We believe that wearables such as Focals by North could be the future of human-computer interaction. With the rapid growth of machine learning technologies such as image detection and hand gesture recognition, we believe there are many unexplored possibilities. This inspired us to push the limits of our imagination and explore what such interaction could look like in the future.

What it does

Our application lets users control the camera with hand gestures: zoom, capture an image, and run image detection on the result. We envision this feature being used with wearables such as smart glasses, which lack the typical touch-screen interface. When using smart glasses, a user would strongly prefer seamless interactions such as hand gestures over additional physical peripherals.

How we built it

  • Built a mobile application using React Native with Expo
  • Connected the application to Firebase to enable the use of the Google Cloud Vision API
  • Used the Clarifai API to train a custom model to recognize specific hand gestures
  • Integrated the Clarifai API into the application
  • Designed a way to use the detected hand gesture to trigger the capture of a screenshot, which becomes the input for image detection using Cloud Vision (see the sketches after this list)
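
For the gesture-recognition step, here is a minimal sketch of how a frame from the live camera could be classified by our custom model. The API key placeholder, the `hand-gestures` model ID, and the confidence threshold are illustrative assumptions, not the exact values from our code; the Clarifai v2 predict endpoint and expo-camera's `takePictureAsync` are the real APIs we built on.

```javascript
// Sketch: capture a still frame with expo-camera and classify it with a
// custom Clarifai model. Key, model ID, and threshold are placeholders.
const CLARIFAI_API_KEY = 'YOUR_CLARIFAI_API_KEY'; // assumption: kept in app config
const GESTURE_MODEL_ID = 'hand-gestures';         // assumption: our custom model's ID

async function detectGesture(cameraRef) {
  // Grab a frame from the live camera as base64.
  const photo = await cameraRef.takePictureAsync({ base64: true, quality: 0.5 });

  // Send the frame to Clarifai's v2 predict endpoint.
  const res = await fetch(
    `https://api.clarifai.com/v2/models/${GESTURE_MODEL_ID}/outputs`,
    {
      method: 'POST',
      headers: {
        Authorization: `Key ${CLARIFAI_API_KEY}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        inputs: [{ data: { image: { base64: photo.base64 } } }],
      }),
    }
  );
  const json = await res.json();

  // Clarifai returns concepts sorted by confidence (a value in [0, 1]);
  // accept the top one only if it clears a simple threshold.
  const best = json.outputs[0].data.concepts[0];
  return best.value > 0.85 ? best.name : null;
}
```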
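
For the image-detection step, the captured screenshot can be sent to Cloud Vision for label detection. In our app the Vision API is enabled through our Firebase project; the sketch below simplifies this by calling the REST endpoint directly with an API key, which is a placeholder.

```javascript
// Sketch: send a captured frame (base64) to the Google Cloud Vision REST API
// for label detection. The API key is a placeholder.
const VISION_API_KEY = 'YOUR_GOOGLE_CLOUD_VISION_API_KEY';

async function detectObjects(base64Image) {
  const res = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${VISION_API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        requests: [
          {
            image: { content: base64Image },
            features: [{ type: 'LABEL_DETECTION', maxResults: 5 }],
          },
        ],
      }),
    }
  );
  const json = await res.json();

  // Return the human-readable labels, e.g. ['Shoe', 'Sneaker', ...].
  return json.responses[0].labelAnnotations.map((label) => label.description);
}
```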
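
Putting the two together, the trigger logic from the last bullet could look like the following. The `'capture'` gesture name and the polling entry point are hypothetical; they just show how a recognized gesture leads to a screenshot that feeds Cloud Vision.

```javascript
// Sketch of the trigger: when the gesture model reports the (hypothetical)
// 'capture' gesture, grab a fresh frame and run it through Cloud Vision.
async function onGestureTick(cameraRef) {
  const gesture = await detectGesture(cameraRef);
  if (gesture === 'capture') {
    const photo = await cameraRef.takePictureAsync({ base64: true, quality: 0.7 });
    const labels = await detectObjects(photo.base64);
    console.log('Objects detected:', labels.join(', '));
  }
}
```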

Challenges we ran into

  • Deciding how to train a model to detect hand gestures given the time constraint and a limited dataset
  • Integrating the Clarifai API with the expo-camera component to detect hand gestures from the live camera feed
  • Finding ways to incorporate different Expo components to complement the camera functionality in the app
  • Limited sleep

Accomplishments that we're proud of

We produced a functioning MVP that is able to:
1/ Send images to the Google Cloud Vision API and return the objects detected
2/ Recognize trained hand gestures
We are also proud of our teamwork throughout the project.

What we learned

  • Using Firebase for data storage
  • Using the Google Cloud Vision API for image detection
  • Training a custom image detection model using Clarifai
  • Using Expo and its libraries to build React Native mobile apps
  • Perseverance

What's next for Snapp

While our demo is built as a mobile application, we ultimately want this feature to become a basic interaction for advanced wearables and smart glasses. We would love to one day use Snapp to quickly look up a stylish car we pass on the street, or to find out where to buy an awesome pair of shoes we spotted in a shop window. All of it through a few simple hand gestures.

Built With

React Native, Expo, Firebase, Google Cloud Vision API, Clarifai