In sci-fi movies, characters don't use trackpads; they navigate their computers with gesture commands. We have the technology, so why not bring a piece of sci-fi to life?

What it does

With built-in commands, users can open custom webpages, scroll through browsers, switch tabs, and open custom playlists with nothing but hand gestures.

How I built it

We collected our own dataset of hand images labeled with their respective gestures, trained a classifier on it using TensorFlow, an open-source machine learning framework, and mapped each recognized gesture to a command on the laptop.
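A classifier like this can be sketched as a small convolutional network in TensorFlow/Keras. The input size, layer sizes, and five-class output below are illustrative assumptions, not the project's actual architecture:

```python
# Sketch of a small gesture classifier, assuming 64x64 grayscale hand
# images and five gesture classes (e.g. one per raised-finger count).
# Layer sizes are illustrative, not the project's actual architecture.
import tensorflow as tf

NUM_GESTURES = 5  # assumed: one through five raised fingers

def build_model():
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(64, 64, 1)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```

In practice the model would be trained with `model.fit` on the labeled hand images, then run frame by frame on the webcam feed, with the argmax of the softmax output selecting which command to fire.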

Challenges I ran into

We initially tried to build it so writing letters in the air would open webpages (draw "T" to open Twitter, "R" to open Reddit), but the poor accuracy rate made us scrap that approach. Recognizing different numbers of raised fingers proved a far more reliable approach.
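To illustrate why finger counts are such a robust signal: even without a trained model, raised fingers can be counted with a simple geometric heuristic over hand landmarks. This sketch assumes a MediaPipe-style 21-point landmark layout as an alternative to the project's trained classifier, purely for illustration:

```python
# Illustrative heuristic for counting raised fingers from 21 hand
# landmarks laid out like MediaPipe Hands (wrist=0, index tip=8, ...).
# This is NOT the project's method (which used a trained TensorFlow
# classifier); it only shows why finger counts are easy to detect.
FINGER_TIPS = (8, 12, 16, 20)   # index, middle, ring, pinky fingertips
FINGER_PIPS = (6, 10, 14, 18)   # corresponding middle joints

def count_raised_fingers(landmarks):
    """landmarks: list of (x, y) in image coords, y increasing downward."""
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]  # tip above joint => raised
    )
```

Each distinct count (one through four here, since the thumb needs a separate x-axis check) can then be bound to one command, which sidesteps the ambiguity that doomed the air-writing approach.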

Accomplishments that I'm proud of

We didn't have the most time, but we got the program to work!

What I learned

How useful machine learning is, and how it could lead to more innovations in the future.

What's next for Gesture Commands for Windows

At the moment it already has all the utility I envisioned: it can cycle through tabs, scroll through websites, and open my most frequently used webpages and music playlists. One idea I've been considering is adding facial authentication to autofill credentials on certain websites (like Gmail and Facebook).
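The gesture-to-command wiring can be sketched as a simple dispatch table. The URLs, shortcuts, and gesture assignments below are placeholder assumptions; in the real program a library such as pyautogui (assumed, not shown) would send the keyboard and scroll events:

```python
# Hypothetical mapping from recognized finger counts to commands.
# webbrowser is stdlib; keyboard/scroll events would be sent with a
# library like pyautogui (assumed) in the real program.
import webbrowser

GESTURE_ACTIONS = {
    1: ("open_webpage", "https://news.ycombinator.com"),  # placeholder URL
    2: ("hotkey", ("ctrl", "tab")),   # switch browser tab
    3: ("scroll", -300),              # scroll down
    4: ("scroll", 300),               # scroll up
    5: ("open_webpage", "https://open.spotify.com"),      # placeholder URL
}

def dispatch(finger_count, dry_run=True):
    """Look up the action for a gesture; perform it unless dry_run.

    Returns the action kind (e.g. "hotkey") or None for unknown gestures.
    """
    action = GESTURE_ACTIONS.get(finger_count)
    if action is None:
        return None
    kind, arg = action
    if not dry_run and kind == "open_webpage":
        webbrowser.open(arg)  # other kinds would call pyautogui here
    return kind
```

Keeping the table separate from the dispatcher makes it easy to rebind gestures to new commands without touching the recognition code.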

Built With

tensorflow