Inspiration

We came to hackBCA with no idea what to build, so we picked up some interesting hardware: the Myo armband. Looking through its capabilities, we discovered that, when properly calibrated, it can recognize hand gestures. Making the gestures was trivial for us, but I recalled seeing many stroke victims in the senior homes I had visited struggle with simple hand-eye coordination activities. It occurred to me that stroke victims might improve their mapping of sensory inputs to motor outputs by using a program centered around these gestures.

What it does

The application displays a series of images indicating which of the five simple, well-defined Myo gestures should be performed. Each image appears in a different part of the screen, which we believe raises the level of hand-eye coordination required, since the user cannot simply stare at one spot. When the user performs a gesture, the image corresponding to that gesture appears in the bottom right of the screen. If the gesture was correct, a success sound plays, the on-screen score increases, and the armband vibrates for a longer period; if the gesture was incorrect, an error sound plays, the score decreases, and the armband vibrates only briefly. Together, these feedback channels engage three of a stroke victim's five senses - hearing, touch, and sight. Thus, we improve not only the user's hand-eye coordination, but his or her overall sensorimotor coordination.
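The gesture-checking and feedback logic described above can be sketched roughly as follows. This is a minimal illustration, not our actual implementation; the enum values, struct fields, and constants are all hypothetical:

```cpp
#include <string>

// Hypothetical names for the five Myo gestures (illustrative only).
enum class Gesture { Fist, WaveIn, WaveOut, FingersSpread, DoubleTap };

// The three feedback channels: score (sight), vibration (touch), sound (hearing).
struct Feedback {
    int scoreDelta;     // +1 on a correct gesture, -1 on an incorrect one
    int vibrationMs;    // longer vibration on success, short buzz on failure
    std::string sound;  // which audio cue to play
};

// Compare the performed gesture against the prompted one and
// produce the feedback to deliver on all three channels.
Feedback evaluate(Gesture prompted, Gesture performed) {
    if (performed == prompted)
        return {+1, 600, "success.wav"};  // durations are made-up values
    return {-1, 150, "error.wav"};
}
```

The key design point is that every attempt, right or wrong, produces feedback on all three channels at once.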

Another use for this project that we had not planned is teaching young children these same sensorimotor skills. The gestures are not difficult to perform, but children may find it entertaining to keep up with the pace of the application while also improving their coordination and strengthening their wrists.

How we built it

We built this application mainly in C++, with a few parts in C that interface with the lower-level functions of the armband. We wrote the GUI with the Qt framework to ensure portability.
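As a rough illustration of the C/C++ split, the low-level C layer can report gesture events through a plain function pointer that trampolines into C++ code. All names here are hypothetical sketches, not identifiers from our codebase:

```cpp
// --- C side: the low-level armband layer speaks in function pointers ---
extern "C" {
    typedef void (*gesture_cb)(int gesture_id, void *user_data);

    // Stand-ins for the C layer's registration and event hooks.
    static gesture_cb g_cb = nullptr;
    static void *g_user = nullptr;

    void armband_set_callback(gesture_cb cb, void *user_data) {
        g_cb = cb;
        g_user = user_data;
    }

    // The C layer would call this when the hardware reports a gesture.
    void armband_emit(int gesture_id) {
        if (g_cb) g_cb(gesture_id, g_user);
    }
}

// --- C++ side: a handler object the callback trampolines into ---
struct GestureHandler {
    int lastGesture = -1;
    void onGesture(int id) { lastGesture = id; }
};

static void trampoline(int id, void *user) {
    static_cast<GestureHandler *>(user)->onGesture(id);
}
```

In the real application, the C++ handler is where the Qt GUI gets updated.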

Challenges we ran into

The armband was difficult to calibrate, and gestures were sometimes missed or misrecognized. In addition, it was difficult to accurately read the other data the armband provides while also reading gestures, so we could not combine the two.

What we learned

We learned that we should have written cleaner code from the start; it would have saved us a lot of time in debugging.

What's next for StrokeRehab

We could perhaps find a way to incorporate the armband's other capabilities into the app, such as its acceleration and orientation readings.
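For instance, assuming the armband reports orientation as a unit quaternion, it could be converted to roll/pitch/yaw angles for use in future exercises. This is a standard conversion sketch; the struct and function names are illustrative, not from our code:

```cpp
#include <algorithm>
#include <cmath>

struct Euler { double roll, pitch, yaw; };  // angles in radians

// Convert a unit quaternion (w, x, y, z) to Euler angles.
Euler toEuler(double w, double x, double y, double z) {
    Euler e;
    e.roll = std::atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y));
    double s = 2.0 * (w * y - z * x);
    s = std::max(-1.0, std::min(1.0, s));  // clamp for numerical safety
    e.pitch = std::asin(s);
    e.yaw = std::atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z));
    return e;
}
```

An exercise could then track, say, how far the user can rotate the wrist, rather than only which gesture was made.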
