One of our members brought two giant keyboards, a mouse, and a ton of cables to the hackathon. Relying on that much hardware to control a computer is cumbersome and sometimes outright difficult. We wanted a way to control our computers that was lightweight and intuitive.
Also, we're too poor to buy touchscreen devices.
What it does
Using only computer vision, the program allows you to move the cursor by moving a stylus, type by drawing characters with the stylus, and click by closing your hand (as if to grab something).
How we built it
We used OpenCV to process the images (remove background noise, dilate them, etc.) and detect the objects. We used PyTesser to detect characters in the movement of the stylus, and we used PyAutoGUI to control the keyboard and cursor.
Challenges we ran into
We tried to train our own cascade classifier for the closed hand, but we could not create an accurate one. We instead had to find a pre-trained classifier.
Accomplishments that we're proud of
- Successfully using OpenCV to manipulate and analyze the images
- Coming up with a way to detect characters drawn by an always-moving stylus
What we've learned
Python isn't our go-to language, so we learned a lot about the language and libraries throughout the project. We also learned about multi-threaded programming and its uses.
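One place threading helped is splitting the pipeline into a producer (frame processing) and a consumer (input control). A minimal sketch with the standard library's `queue`, with the actual PyAutoGUI call stubbed out:

```python
import threading
import queue

# A bounded queue so a slow consumer applies backpressure to the producer.
positions = queue.Queue(maxsize=1)

def producer(points):
    """Stand-in for the vision thread: push detected stylus positions."""
    for p in points:
        positions.put(p)
    positions.put(None)  # sentinel: no more frames

def consumer(moves):
    """Stand-in for the control thread: consume positions and move the cursor."""
    while True:
        p = positions.get()
        if p is None:
            break
        moves.append(p)  # in the real program: pyautogui.moveTo(*p)
```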
What's next for jAIden
- Make it compatible with OS X, or even mobile devices