Head-mounted displays mean you no longer need to look at your phone, but you still need a way to input data into it. I wanted to see whether it was possible to build a chorded keyboard (or, failing that, at least do gesture recognition) using a FLIR thermal camera.

How it works

A FLIR Lepton thermal camera is connected to a BeagleBone. Both are mounted on a wristband, with the Lepton pointing at the hand. The BeagleBone runs a computer vision algorithm to recognize the pose of the hand, and hence the keypress.
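The writeup doesn't spell out the preprocessing steps, but any pipeline like this needs to turn the Lepton's raw sensor counts (14-bit values in a 16-bit container) into a normalized image before classification. A minimal sketch of that step, with the function name and per-frame min/max stretch being my own choices rather than the project's:

```python
import numpy as np

def normalize_frame(raw):
    """Stretch a raw Lepton frame to the full 8-bit range.

    `raw` is a 2D array of 16-bit sensor counts (a Lepton 2 frame is
    60x80). A per-frame min/max stretch keeps the hand visible even as
    the absolute scene temperature drifts between frames.
    """
    raw = raw.astype(np.float32)
    lo, hi = raw.min(), raw.max()
    if hi == lo:  # flat frame: avoid divide-by-zero
        return np.zeros(raw.shape, dtype=np.uint8)
    scaled = (raw - lo) / (hi - lo) * 255.0
    return scaled.astype(np.uint8)

# Synthetic 60x80 "frame" of plausible raw counts
frame = np.arange(4800, dtype=np.uint16).reshape(60, 80) + 7000
img = normalize_frame(frame)
```

The per-frame stretch matters on a thermal camera because the raw values shift with ambient and body temperature, so a fixed scaling would drift.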

I experimented with both an Eigenfaces-based algorithm (which I implemented myself) and OpenCV's contour/inflection-point algorithms. I eventually settled on Eigenfaces, with a bit of tweaking.
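The core of the Eigenfaces approach is PCA: project each flattened frame onto a small learned basis of "eigen-images" and classify by nearest neighbour in that low-dimensional space. The sketch below is my own minimal version of the standard algorithm (the project's actual tweaks aren't described), using NumPy's SVD rather than an explicit covariance matrix:

```python
import numpy as np

def train_eigenposes(frames, n_components=4):
    """Learn an eigen-basis from training frames.

    `frames` is (n_samples, h*w), one flattened frame per row.
    Returns the mean frame and the top principal components
    (each row of `basis` is one eigen-image).
    """
    mean = frames.mean(axis=0)
    centered = frames - mean
    # SVD of the centered data; rows of vt are the principal axes
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def project(frame, mean, basis):
    """Weights of a flattened frame in the eigen-basis."""
    return basis @ (frame - mean)

def classify(frame, mean, basis, gallery):
    """Nearest neighbour in eigenspace.

    `gallery` maps pose label -> reference weight vector.
    """
    w = project(frame, mean, basis)
    return min(gallery, key=lambda lbl: np.linalg.norm(gallery[lbl] - w))
```

A gallery would typically hold one averaged weight vector per hand pose; at runtime each incoming frame is projected and matched against it.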

Challenges I ran into

Cold fingertips. Fingertips aren't as warm as your face or the palms of your hands, so they don't stand out against the background as well as I'd hoped.
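The writeup doesn't describe a fix, but the problem is easy to illustrate: a threshold high enough to isolate the warm palm drops the cooler fingertips entirely. One hypothetical mitigation (my own, not the project's) is to threshold at a percentile of the frame rather than at a fixed level, so anything warmer than the background survives:

```python
import numpy as np

def hand_mask(img, percentile=60):
    """Segment the hand by thresholding at a percentile of the frame.

    Because the cut is relative to the frame's own distribution rather
    than a fixed temperature, fingertips that are cooler than the palm
    but still warmer than the background are kept in the mask.
    """
    thresh = np.percentile(img, percentile)
    return img > thresh
```

With a fixed threshold tuned for palm temperature, the fingertips fall below the cut; the percentile version keeps them as long as the background dominates the pixel count.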

Accomplishments that I'm proud of

I designed, and had 3D printed, a mounting bracket for the camera. This was the first time I'd designed for a 3D printer (and the first time I'd done any CAD since my GCSEs). Apart from the mounting posts coming out slightly thicker than I'd have liked (fixable with a file), it all went smoothly.

I also implemented a (slightly simplified) version of the Eigenfaces algorithm for the pose detection. Although I'd read about the algorithm, I'd never previously implemented it, and I'm glad I got it working.

What I learned

Lots. About...

  • the FLIR Lepton - this was the first time I'd used one
  • the BeagleBone - again, I hadn't used one before
  • 3D printing (and FreeCAD)
  • the Eigenfaces computer vision algorithm

...and also that the idea works in principle (although quite a lot of refinement is still required).

What's next for flirkey

I've shown that the idea works in principle, but a lot more refinement is needed to make it more reliable and to recognize more gestures. I'm keen to continue working on this, although I suspect I'll need to take a step back and make sure the foundations I'm building on are solid.
