Surely You Gest

The Open-Source Toolkit for Radar-Based Gesture Recognition


Surely, you must... Communication is key, and gestures are a big part of conveying information. So I say again: surely you must gest(ure). Improvements in modern-day radar have put the sensor in a special position, allowing people to discover entirely new applications. One currently experimental application is gesture recognition, which takes advantage of the superior range and velocity resolution radar provides. However, unlike a camera, radar offers no straightforward route from beginning to end: the data must be carefully processed with complex mathematics, including methods like beamforming, detection theory, and FFTs. Because no step in the process is set in stone, we can customize the pipeline based on our preferences and needs.
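To make the FFT part of that concrete, here is a minimal sketch of the classic range-Doppler processing step on one radar frame. The frame dimensions and the random stand-in data are assumptions for illustration; the real pipeline works on actual TI mmWave ADC captures.

```python
import numpy as np

# Hypothetical frame shape: (num_chirps, num_samples) of complex ADC data.
num_chirps, num_samples = 64, 128
rng = np.random.default_rng(0)
frame = rng.standard_normal((num_chirps, num_samples)) \
    + 1j * rng.standard_normal((num_chirps, num_samples))

# Range FFT: transform each chirp's fast-time samples into range bins
# (a Hann window tames spectral leakage).
range_fft = np.fft.fft(frame * np.hanning(num_samples), axis=1)

# Doppler FFT: transform across chirps (slow time) into velocity bins,
# with fftshift so zero velocity sits in the middle.
range_doppler = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)

# Magnitude map, the usual input to detection and visualization.
rd_map = np.abs(range_doppler)
print(rd_map.shape)  # (64, 128): velocity bins x range bins
```

Each frame produces one such range-Doppler map; a sequence of maps over time is what carries the gesture's motion signature.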

What it does

Right now, the project goes from raw analog-to-digital converter (ADC) data from a Texas Instruments mmWave radar all the way to predicting from a set of gestures, with real-time capabilities. The idea is that there is an organized, modular base pipeline, and developers and companies can alter it by removing, adding, or changing steps to fit their own applications.
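A rough sketch of what that modularity could look like; the stage names below are hypothetical stand-ins, not the project's actual steps:

```python
from typing import Callable, List
import numpy as np

Step = Callable[[np.ndarray], np.ndarray]

class Pipeline:
    """An ordered list of processing steps; each step maps an array to an array."""

    def __init__(self, steps: List[Step]):
        self.steps = list(steps)

    def run(self, data: np.ndarray) -> np.ndarray:
        for step in self.steps:
            data = step(data)
        return data

    def replace(self, index: int, step: Step) -> None:
        """Swap out one stage, e.g. a different filter or detector."""
        self.steps[index] = step

# Hypothetical stages standing in for the real FFT / detection / feature steps.
range_fft = lambda x: np.fft.fft(x, axis=-1)
magnitude = lambda x: np.abs(x)

pipeline = Pipeline([range_fft, magnitude])
out = pipeline.run(np.ones((4, 8), dtype=complex))
print(out.shape)  # (4, 8)
```

Because each stage is just a function from array to array, removing, adding, or replacing a step is a one-line change rather than a rewrite.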

How I built it

This project combines radar signal processing with deep learning. The signal-processing side was hastened by the OpenRadar package, which handles some of the more complex mathematics.
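Rather than reproduce OpenRadar's API here, the following plain-NumPy sketch shows the flavor of one of those "complex mathematics" pieces: a simplified 1-D cell-averaging CFAR detector, a standard radar technique for flagging targets against a noise floor. The function name, parameters, and toy signal are my own illustration, not the project's exact configuration.

```python
import numpy as np

def ca_cfar_1d(power: np.ndarray, guard: int = 2, train: int = 8,
               scale: float = 3.0) -> np.ndarray:
    """Simplified 1-D cell-averaging CFAR: flag cells whose power exceeds
    scale * (mean of the training cells around them, excluding guard cells)."""
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    for i in range(n):
        lo = max(0, i - guard - train)
        hi = min(n, i + guard + train + 1)
        # Training window: neighbors outside the guard band on both sides.
        window = np.concatenate([power[lo:max(0, i - guard)],
                                 power[min(n, i + guard + 1):hi]])
        if window.size and power[i] > scale * window.mean():
            detections[i] = True
    return detections

# A flat noise floor with one strong target at bin 50.
signal = np.ones(100)
signal[50] = 20.0
hits = ca_cfar_1d(signal)
print(np.flatnonzero(hits))  # [50]
```

Applying a detector like this along the range axis of each range-Doppler map is how raw spectra get reduced to discrete targets before the deep-learning stage.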

Challenges I ran into

Data collection for training and demonstration purposes took a while, but it was crucial for showing the validity of the project. The overall process was also hard to settle on, since I wanted to show something working within the given time; I spent a lot of time planning before I could even start building.

Accomplishments that I'm proud of

Collecting my own dataset was a last-minute decision, and it ended up working very well.

What's next for Surely You Gest

I'm thinking big, and this is just a start. I would like Surely You Gest to become the go-to open-source platform for solving gesture recognition problems. With radar becoming a mainstream sensor, imagine the time when phones, watches, and other devices commonly have one casually built in. We as developers can broaden their uses and even customize them based on what we want. Personal gestures? Secret gestures? Helping improve sign language translation? Done.

As for a real use case, imagine integrating Surely You Gest into every phone with the Google Soli radar. This could help the open-source community explore new features and improve the user experience of the phone itself.

Surely You Gest would let somebody with an idea build what they want, with or without knowing everything about radar signal processing, ML/DL, or perhaps even without knowing how to code.
