Inspiration

Sometimes it's hard to sit down and work at your desk. We wanted to be able to control our cursor without physically touching our computer, so we created HelpingHands.

What it does

HelpingHands lets the user control the cursor with their right hand, with an action bound to each finger. A finger is activated when it bends past a threshold angle. The fingers map to the following controls:

  • Thumb: Locks User Inputs
  • Index Finger: Left Click
  • Middle Finger: Right Click
  • Ring Finger: Scroll Down
  • Pinky: Scroll Up
  • Fist: Quit Program
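The bend detection above can be sketched as a simple angle check at a finger's middle joint. This is an illustrative reconstruction, not the project's exact code: the landmark points are assumed to be (x, y) pairs as MediaPipe reports them, and the threshold value is a placeholder.

```python
import math

# Illustrative sketch: a finger counts as "bent" when the angle at its
# middle (PIP) joint, formed by the knuckle (MCP) and fingertip (TIP)
# landmarks, drops below a threshold. Landmarks are (x, y) pairs.

BEND_THRESHOLD = 120.0  # degrees; placeholder value, tuned per hand in practice

def joint_angle(a, b, c):
    """Angle at point b (in degrees) between segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return 180.0  # degenerate case: treat as fully extended
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a))

def is_bent(mcp, pip, tip):
    """True when the finger is bent past the activation threshold."""
    return joint_angle(mcp, pip, tip) < BEND_THRESHOLD
```

A straight finger gives an angle near 180 degrees and stays inactive; curling the fingertip toward the palm shrinks the angle until the action fires.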

How we built it

The hand's position is tracked with the computer's default webcam. We used Google's MediaPipe package to detect joints ('landmarks') on the hand, and the cursor moves according to the hand's position relative to a bounding box in the camera frame. We used the 'pyautogui' package to perform the cursor and click actions when fingers are activated.
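The bounding-box mapping can be sketched as a linear rescale: a landmark position inside a smaller box in the camera frame is stretched to cover the full screen, so the hand never has to leave the camera's view. The box coordinates and screen size below are illustrative assumptions (MediaPipe reports landmarks in normalized 0–1 coordinates; in the real program the screen size would come from `pyautogui.size()`).

```python
# Illustrative sketch: map a normalized landmark position (0..1, as
# MediaPipe reports it) inside a bounding box to screen coordinates.

SCREEN_W, SCREEN_H = 1920, 1080       # placeholder; pyautogui.size() in practice
BOX = (0.2, 0.2, 0.8, 0.8)            # (x_min, y_min, x_max, y_max) in camera space

def to_screen(x, y, box=BOX, screen=(SCREEN_W, SCREEN_H)):
    """Linearly map a point inside the bounding box to the full screen,
    clamping positions outside the box to its edges."""
    x_min, y_min, x_max, y_max = box
    u = (min(max(x, x_min), x_max) - x_min) / (x_max - x_min)
    v = (min(max(y, y_min), y_max) - y_min) / (y_max - y_min)
    return int(u * (screen[0] - 1)), int(v * (screen[1] - 1))
```

The result would then be handed to something like `pyautogui.moveTo(x, y)` each frame.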

Challenges we ran into

Our cursor movement was jittery and inconsistent in our first prototype, so we developed a smoothing algorithm that produced cleaner motion. We had also planned to use an eye tracker to control the cursor, but we found it highly inaccurate and unfriendly to use.
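One common way to smooth a jittery cursor signal is an exponential moving average, where each new position is pulled only part of the way toward the raw measurement. This is a generic sketch of that technique, not the project's actual algorithm; the class name and alpha value are assumptions.

```python
# Generic smoothing sketch (exponential moving average): each raw
# position nudges the displayed cursor only a fraction of the way,
# trading a little lag for much less jitter.

class CursorSmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # lower alpha = smoother but laggier cursor
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Feed a raw (x, y) sample; return the smoothed position."""
        if self.x is None:
            self.x, self.y = float(raw_x), float(raw_y)
        else:
            self.x += self.alpha * (raw_x - self.x)
            self.y += self.alpha * (raw_y - self.y)
        return self.x, self.y
```

With alpha = 0.5, a jump from (100, 100) to (200, 200) moves the smoothed cursor only halfway per frame, which is what damps the jitter.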

Accomplishments that we're proud of

We are proud that our program can be used as a realistic alternative to a mouse. We developed an algorithm to smooth out cursor movement and created a functional command for each finger.

What we learned

We took our first steps into computer vision, applying linear algebra and coordinate math to build the program.

What's next for HelpingHands

We plan to extend the program to additional assistive inputs, such as eye tracking, facial recognition, and body mapping.
