Inspiration

The inspiration for Gesture comes from both team members having grown up in very busy families. Jibo is designed to feel like a family member with its emotes, but what happens when a parent is busy and running late? We wanted a product that handles these situations with hand gestures, alerting the people near Jibo even when the person making the gestures is far away.

What it does

Gesture lets you perform a stream of gestures, each of which maps to a task for Jibo. This could be emoting "I love you" or simply telling everyone to be ready to leave in 5 minutes. Beyond pure utility, we also offer fun tools, such as real-time tracking of how fast you spin your finger and having Jibo point in the direction you move your hand.

How we built it

We built the Gesture backend with NodeJS, which let us read gestures from the Leap Motion and decide how to act on them. From an HTML webpage, we connect via WebSocket to our Heroku server and push the relevant information for Jibo. Jibo then receives the message over the WebSocket and follows a decision tree to the correct behavior.
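The decision-tree step on Jibo's side can be sketched roughly as below. This is a minimal illustration, not our exact code: the gesture names, message fields, and behavior names are all hypothetical stand-ins for whatever arrives over the WebSocket.

```javascript
// Hypothetical sketch: map an incoming gesture message (received over
// the WebSocket) to a Jibo behavior. All names here are illustrative.
function decideBehavior(message) {
  switch (message.gesture) {
    case "love":      // e.g. an "I love you" hand sign
      return { behavior: "emote", emote: "heart" };
    case "circle":    // finger-spinning gesture
      return { behavior: "trackSpin", speed: message.speed };
    case "swipe":     // hand moved in a direction
      return { behavior: "point", direction: message.direction };
    case "countdown": // "be ready to leave" signal
      return { behavior: "announce", text: "Be ready to leave in 5 minutes" };
    default:          // unrecognized gesture: do nothing
      return { behavior: "idle" };
  }
}
```

Keeping the mapping in one pure function like this makes it easy to test without a Leap Motion or a live socket connection.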

Challenges we ran into

Setting up our WebSocket was extremely difficult, as neither of us had worked with one before. It took a lot of trial and error, and plenty of reworked syntax, before we could reliably send and receive messages.

Another challenge was making sure we picked up only one gesture at a time. The Leap Motion loops constantly within the program, so recognition has to pause once a gesture is detected. Ultimately, this required hard-coding a minimum time delay between gestures, so each gesture is recognized as long as you pause briefly between them.
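The minimum-delay trick above amounts to a simple cooldown gate. Here is a small sketch of the idea, assuming an illustrative delay constant and a clock injected for testability; the real Leap Motion loop would call the gate on every recognized frame.

```javascript
// Sketch of the gesture debounce: ignore any gesture that arrives
// within minDelayMs of the last accepted one. The delay value is
// illustrative, not the one we actually tuned.
function makeGestureGate(minDelayMs, now = Date.now) {
  let lastAccepted = -Infinity; // timestamp of the last accepted gesture
  return function acceptGesture() {
    const t = now();
    if (t - lastAccepted < minDelayMs) return false; // still in cooldown
    lastAccepted = t;
    return true; // gesture accepted; handle it downstream
  };
}
```

With a fake clock, the gate accepts a gesture, rejects another one 400 ms later, and accepts again after the cooldown has passed.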

Accomplishments that we're proud of

We are extremely proud of the product we put together in under 24 hours. We can successfully transmit to Jibo from 60 feet away with minimal latency, and our gestures, though finicky at times, are fairly easy to perform. Gesture adds a whole new dimension to Jibo that its current sensor setup does not easily support.

What's next for Gesture

The limiting factor in the product right now is the Leap Motion, both in which gestures it recognizes and in how accurately it recognizes them. In the future, we would like to experiment with upgrading this component, as well as expanding the library of accepted gestures. We would also like to make the motion-sensing device portable, as it is currently limited by needing to be at a computer.
