What it does

Adds gesture capabilities to existing voice services, specifically to help mute users interact with voice assistants.

How we built it

We used the Leap Motion gesture tracker to capture and identify gestures, which are then translated into Alexa voice commands using a text-to-speech system. This enables people who would otherwise be unable to communicate with the voice services that are ubiquitous today, perhaps due to a disability, to use them.
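The pipeline above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the gesture labels, the command mapping, and the `pyttsx3` text-to-speech engine are all assumptions standing in for whatever the team actually used with the Leap Motion SDK.

```python
# Hypothetical sketch of the gesture -> speech bridge.
# Assumes gesture labels arrive from the Leap Motion tracker as strings;
# the mapping and command phrases below are illustrative only.

GESTURE_TO_COMMAND = {
    "swipe_left": "Alexa, previous song",
    "swipe_right": "Alexa, next song",
    "circle": "Alexa, what time is it",
    "key_tap": "Alexa, stop",
}


def command_for(gesture):
    """Translate a recognized gesture label into an Alexa voice command,
    or return None if the gesture has no mapping."""
    return GESTURE_TO_COMMAND.get(gesture)


def speak(text):
    """Speak the command aloud so a nearby Echo can hear it.
    Uses pyttsx3 (an offline TTS library) if installed; otherwise
    falls back to printing the command."""
    try:
        import pyttsx3
        engine = pyttsx3.init()
        engine.say(text)
        engine.runAndWait()
    except ImportError:
        print(text)


if __name__ == "__main__":
    cmd = command_for("swipe_right")
    if cmd is not None:
        speak(cmd)
```

The key design point is that the Echo is driven acoustically: instead of integrating with any Amazon API, the system simply speaks a well-formed voice command out loud, so it works with an unmodified off-the-shelf Echo.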

Challenges we ran into

Our original idea was to expand beyond the use case of helping mute users by bypassing the Alexa Voice Service when required and calling the Echo backend directly. It turns out this isn't as straightforward as it sounds on paper, partly because of the closed nature of the existing Amazon skills.

Accomplishments that we're proud of

We never shied away from pivoting rapidly, even in the middle of the night. After pivoting a gazillion times, we still managed to execute the original idea we set out with.

What we learned

A lot about using and interfacing with different types of hardware (and how bad we are at using them). We also learned about the opportunities and limits of gesture-based technology.

What's next for alexa-accessible
