Typing websites and competitions can only do so much to help you learn to type quickly: they can see that you're pressing the right keys, but what's to stop you from using the same finger to handle half the keyboard? Bad habits like these are not only hard to break, but also hard to notice and track in the first place.
Keypo uses an optical hand-tracking module, the Leap Motion Controller, together with the touch-typing method, to detect finger-press gestures while you type, helping you correct bad habits and learn to type faster, faster.
What it does
Keypo tracks the position, velocity, and acceleration of your fingers to detect key-press motions as you type in the air. No keyboard is necessary to learn touch typing with Keypo, but a mock-keyboard setup is provided for users who are just learning to type and need a reference. Phrases to type are shown on the screen, and you advance by typing each character with the correct finger. Additionally, the 5 keys you type slowest (on average) are displayed so that you know what to improve on.
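The "5 slowest keys" statistic above amounts to keeping a running average press time per key and reporting the keys with the largest averages. Here is a minimal sketch of that idea; the class and method names are our own illustration, not Keypo's actual code:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: track average press time per key, then list the
// slowest n keys. Names (KeyStats, record, slowest) are assumptions.
public class KeyStats {
    // key -> {sum of press times in ms, number of presses}
    private final Map<Character, double[]> stats = new HashMap<>();

    public void record(char key, double millis) {
        double[] s = stats.computeIfAbsent(key, k -> new double[2]);
        s[0] += millis;
        s[1] += 1;
    }

    // Keys sorted by average press time, slowest first, capped at n.
    public List<Character> slowest(int n) {
        List<Character> keys = new ArrayList<>(stats.keySet());
        keys.sort((a, b) -> Double.compare(
                stats.get(b)[0] / stats.get(b)[1],
                stats.get(a)[0] / stats.get(a)[1]));
        return keys.subList(0, Math.min(n, keys.size()));
    }
}
```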
How we built it
Finger position data were obtained from the Leap Motion Controller via the Leap Motion SDK and the Leap Motion for Processing library. We used a sliding-window approach to compute stable velocity and acceleration values for each finger at a given time, and used those values to detect key-tapping motions from specific fingers. We also built a user interface in Processing to display the detected finger positions along with a prompt for the user to practice typing. Constant feedback, including moving and color-changing letters, finger animations, and key-press statistics, lets the user feel fully immersed in the typing experience and learn from it.
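The sliding-window smoothing above can be sketched roughly as follows. This is a simplified illustration under our own assumptions (window size, ~60 fps frame rate, vertical position only); the actual Keypo implementation may differ:

```java
import java.util.ArrayDeque;

// Hypothetical sketch of sliding-window velocity/acceleration smoothing.
// WINDOW_SIZE and FRAME_DT are illustrative assumptions, not Keypo's values.
public class FingerTracker {
    static final int WINDOW_SIZE = 8;       // samples per window (assumed)
    static final float FRAME_DT = 1f / 60f; // frame interval, assumed ~60 fps

    private final ArrayDeque<Float> positions = new ArrayDeque<>();
    private final ArrayDeque<Float> velocities = new ArrayDeque<>();

    // Feed one vertical-position sample (mm); returns smoothed velocity
    // (mm/s): displacement across the whole window over its duration,
    // which is far less noisy than a single frame-to-frame difference.
    public float addSample(float y) {
        positions.addLast(y);
        if (positions.size() > WINDOW_SIZE) positions.removeFirst();
        float dt = (positions.size() - 1) * FRAME_DT;
        float v = dt > 0 ? (positions.peekLast() - positions.peekFirst()) / dt : 0f;
        velocities.addLast(v);
        if (velocities.size() > WINDOW_SIZE) velocities.removeFirst();
        return v;
    }

    // Smoothed acceleration: change in windowed velocity over the window.
    public float acceleration() {
        float dt = (velocities.size() - 1) * FRAME_DT;
        return dt > 0 ? (velocities.peekLast() - velocities.peekFirst()) / dt : 0f;
    }
}
```

Averaging over the whole window trades a little latency for much more stable estimates, which matters when the motions being measured are only a few millimeters.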
Challenges we ran into
It was our first time using any kind of motion controller, so we initially had trouble setting up the Leap Motion controller and interfacing it with our code, especially since the newer SDK doesn't support our computers or many programming languages. After following some tutorials and troubleshooting, however, we were able to obtain data from the Leap Motion controller.
We also ran into several challenges processing the motion data. The Leap Motion SDK and library do come with methods to get a finger's velocity or detect tapping gestures, but while testing we realized they were too inaccurate and noisy for a small-scale application like key tapping, which made it difficult to identify the finger with the greatest motion. To resolve this, we developed our own method of calculating velocity and acceleration using a sliding-window approach with queues. This let us compute the change in position and velocity over a longer time period (for greater accuracy) while still producing a smooth, continuous stream of instantaneous data points. We also made several adjustments and additions to our pipeline while testing, such as tuning an acceleration cutoff to account for noise, and using a queue to make sure a finger has been accelerating significantly for at least 3 cycles before registering a true key press.
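The confirmation step described above (an acceleration cutoff plus a 3-cycle queue) could look something like this. The threshold value and class structure are our own illustrative assumptions:

```java
import java.util.ArrayDeque;

// Minimal sketch of press confirmation: a finger only counts as pressing
// after its downward acceleration exceeds a noise cutoff for 3 consecutive
// frames. ACCEL_CUTOFF is an assumed value, not Keypo's tuned constant.
public class PressDetector {
    static final float ACCEL_CUTOFF = 500f; // mm/s^2, assumed noise floor
    static final int CONFIRM_CYCLES = 3;

    private final ArrayDeque<Boolean> recent = new ArrayDeque<>();
    private boolean pressed = false; // latch so one tap fires one event

    // Feed the latest vertical acceleration (downward = negative).
    // Returns true exactly once per detected key press.
    public boolean update(float accelY) {
        recent.addLast(accelY < -ACCEL_CUTOFF);
        if (recent.size() > CONFIRM_CYCLES) recent.removeFirst();
        boolean confirmed = recent.size() == CONFIRM_CYCLES
                && !recent.contains(false);
        if (confirmed && !pressed) { pressed = true; return true; }
        if (!confirmed) pressed = false;
        return false;
    }
}
```

Requiring several consecutive over-threshold frames filters out one-frame sensor spikes, and the latch keeps a single tap from being reported multiple times while the finger is still decelerating.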
Accomplishments that we're proud of
This was our first time using a Leap Motion controller and Processing, so it was a new experience, and we were excited by everything we were able to accomplish in such a short amount of time. We resolved the many challenges we came across while learning and using these new technologies and produced a user-friendly application.
What we learned
We learned a lot about integrating motion detectors like the Leap Motion controller into our projects. As this was our first time using one, we were blown away by the amount of data we could get from it and its range of applications. Besides interfacing the motion controller with our project, we also learned a lot about data processing and visualization, especially the challenges that come with real-time data processing.
What's next for Keypo
- More accurate key-press detection, perhaps by training an ML model
- Since the current system can determine which finger pressed a key, it would be cool to also determine whether that finger was in the correct position on a keyboard (of course, this would require much more calibration and precision).
- Support for more actions, like pressing multiple keys at once (e.g. shift+letter to capitalize)
- Support for progressive difficulty and personalized training in the UI.