At our school, we had a friend whose fine motor skills weren't very advanced, so pressing certain keys was difficult because neighbouring keys could be hit by accident. So we created Motionboard to let someone without controlled fine motor skills play these games using their arms, in a more fluid motion.
What it does
The Leap Motion checks the pitch, roll and yaw of your hand to identify certain hand gestures. Those hand gestures are then used to simulate key presses, and those simulated key presses initiate actions in the game exactly the way normal keyboard input would.
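The mapping from hand orientation to keys can be sketched roughly like this. The threshold values and the key assignments here are illustrative assumptions, not the exact ones we used; in practice they would be tuned per player:

```python
# Hypothetical thresholds in radians -- the real values would be tuned per user.
PITCH_UP = 0.5
PITCH_DOWN = -0.5
ROLL_LEFT = 0.5
ROLL_RIGHT = -0.5

def gesture_to_key(pitch, roll, yaw):
    """Map a hand's pitch/roll/yaw to the key a keyboard player would press."""
    if pitch > PITCH_UP:
        return "up"       # hand tilted back  -> move up / jump
    if pitch < PITCH_DOWN:
        return "down"     # hand tilted forward -> move down / crouch
    if roll > ROLL_LEFT:
        return "left"     # hand rolled one way -> move left
    if roll < ROLL_RIGHT:
        return "right"    # hand rolled the other way -> move right
    return None           # neutral pose -> no key press
```

Because the function is pure, each Leap Motion frame can simply be fed through it and the result (if any) forwarded to the key-press simulator.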
How we built it
We used the Leap Motion SDK to read the gestures. Then pyautogui was used to simulate the key presses. The key presses were then processed by pygame, which determined the mechanics and the actual gameplay.
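The simulation step is essentially one pyautogui call per recognized gesture. This is a minimal sketch; the `presser` parameter is our own addition so the wiring can be demonstrated (and tested) without a display attached:

```python
def send_key(key, presser=None):
    """Simulate one OS-level key press for a recognized gesture.

    `presser` lets a fake be injected for demonstration; by default we fall
    back to pyautogui.press, which emits a real key event that any focused
    game window receives just like keyboard input.
    """
    if presser is None:
        import pyautogui  # imported lazily so the rest stays testable headless
        presser = pyautogui.press
    if key is not None:  # a neutral pose maps to None -> send nothing
        presser(key)

# Demonstration with an injected fake instead of real OS key events:
pressed = []
send_key("left", presser=pressed.append)
send_key(None, presser=pressed.append)   # neutral pose: nothing sent
```

On the receiving side, pygame then picks these presses up through its normal event queue (`pygame.KEYDOWN` events), which is why the game needs no special-casing for gesture input.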
Challenges we ran into
- Getting the Leap Motion working
- Recognizing certain gestures reliably
- Simulating the key presses outside of the terminal
- Receiving the simulated key presses and initiating actions
- Making the whole process smoother
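For the last challenge, one way to smooth things out is to require a gesture to persist for a few consecutive frames before firing its key, so single-frame sensor jitter doesn't trigger spurious presses. This is a minimal sketch of that idea (the `GestureDebouncer` class and its `hold` parameter are illustrative, not code from our project):

```python
class GestureDebouncer:
    """Fire a key only after it is seen for `hold` consecutive frames.

    Raw Leap Motion frames are noisy, so a one-frame spike in pitch or roll
    should not count as a deliberate gesture. The key fires exactly once
    when the hold threshold is first reached.
    """

    def __init__(self, hold=3):
        self.hold = hold
        self.current = None   # gesture seen in the previous frame
        self.count = 0        # how many consecutive frames it has lasted

    def update(self, key):
        if key == self.current:
            self.count += 1
        else:
            self.current = key
            self.count = 1
        # Fire once, on the frame the gesture reaches the hold threshold.
        if key is not None and self.count == self.hold:
            return key
        return None
```

Each frame's recognized gesture is passed to `update`, and only non-`None` results are forwarded to the key-press simulation.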
Accomplishments that we're proud of
We're very proud that we were able to use the Leap Motion SDK. We are also very happy that the key presses work.
What we learned
We learned how to implement SDKs properly and how motion tracking works. We also learned how to simulate key presses and how to use GitHub.
What's next for Motionboard
Motionboard can learn to recognize more gestures and provide more inputs. It can become more efficient so that there is less latency. It can implement the key presses so that the clicking motion isn't required, and it can be extended to more games.