Expression goes beyond words alone. With Express Me, you can express yourself through gestures as well as through spoken text.

What it does

The user performs gestures, which are captured by the Leap Motion controller; each gesture has an associated phrase. When the user performs a matching gesture, its phrase is read aloud via a text-to-speech API. In the companion Android app, the user can edit the phrase listed for each gesture, linking anything they want to the gestures they or someone else performs. The app also has options to enable contextual information for certain gestures, pulled from the Walmart and Capital One APIs.
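
The core flow above can be sketched in a few lines of Python. Everything here is hypothetical (the function names, the default phrases, and the `speak` callback standing in for the text-to-speech call are all assumptions, not the project's actual code):

```python
# Hypothetical sketch of the gesture -> phrase -> speech flow.
# A recognized gesture name is looked up in a user-editable mapping,
# and the matched phrase is handed to a text-to-speech callback.

DEFAULT_PHRASES = {
    "wave": "Hello!",
    "thumbs_up": "Yes, I agree.",
    "fist": "No, thank you.",
}

def phrase_for(gesture, phrases=DEFAULT_PHRASES):
    """Return the phrase linked to a gesture, or None if unmapped."""
    return phrases.get(gesture)

def handle_gesture(gesture, speak, phrases=DEFAULT_PHRASES):
    """Look up the phrase for a gesture and speak it if one exists."""
    phrase = phrase_for(gesture, phrases)
    if phrase is not None:
        speak(phrase)  # in the real app, a text-to-speech API call
    return phrase
```

Because the phrase table is just a mapping, edits made in the Android app only need to replace entries in it for new phrases to take effect.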

How I built it

The Leap Motion portion of the project is written in Python, and the Android app in Java. Data is shared between the two through Firebase, and text is converted to speech using a text-to-speech API.
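
One way the Firebase link could work on the Python side is to fetch a JSON snapshot of the gesture-to-phrase mapping and merge it into the local table. This is a minimal sketch under that assumption; the function name and data shape are mine, not the project's:

```python
import json

def apply_remote_phrases(local, remote_json):
    """Merge a JSON snapshot of gesture -> phrase pairs (e.g. fetched
    from a Firebase Realtime Database) into the local mapping.
    Keys absent from the snapshot are left untouched; an empty or
    'null' snapshot changes nothing."""
    remote = json.loads(remote_json) or {}
    local.update(remote)
    return local
```

In this design, phrase edits made in the Android app propagate to the Leap Motion script as soon as the next snapshot is applied, with no schema beyond a flat key-value mapping.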

Challenges I ran into

Creating custom Leap Motion gestures and playing media files from Python.
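
A custom gesture typically boils down to a rule over hand-tracking data. As an illustration of the idea (the threshold, function name, and units are assumptions for the sketch, not taken from the project), here is a classifier that labels a horizontal swipe from a short window of palm x-velocity samples:

```python
def classify_swipe(velocities, threshold=500.0):
    """Classify a horizontal swipe from per-frame palm x-velocities
    (mm/s, as Leap Motion reports them).

    Returns 'swipe_right' if the average velocity exceeds the
    threshold, 'swipe_left' if it falls below the negative threshold,
    and None otherwise (including for an empty window)."""
    if not velocities:
        return None
    avg = sum(velocities) / len(velocities)
    if avg > threshold:
        return "swipe_right"
    if avg < -threshold:
        return "swipe_left"
    return None
```

Averaging over a window rather than reacting to a single frame is what keeps a rule like this from firing on jittery tracking data.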

Accomplishments that I'm proud of

Combining four technologies I had little or no experience with into one comprehensive hack.

What I learned

Firebase is great and easy to use for a lot of real-time data, and persistently combining hardware and software makes for an interesting project.

What's next for Express Me

I probably won't have a Leap Motion after this event ends, so retirement is on the line for this project.
