Inspiration
The future is here! Keyboards and mice are a thing of the past, and hand gestures have the potential to expand our capabilities! Tony Stark showed what this kind of interface could do in the iconic Iron Man scene where he engineered his suit without a keyboard or mouse. That motivated us to bring the technology we take for granted in movies into the real world.
What it does
We developed three main functions: Mouse Mode, Draw Mode, and Short-Cut Mode. All three rely on AI hand recognition through the webcam.
- Mouse Mode: use a finger to move the pointer, click, drag, and multi-select
- Draw Mode: draw letters, numbers, and special characters in the air, which are converted to text and typed out
- Short-Cut Mode: hand signals map to keyboard shortcuts
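At its core, Mouse Mode is a mapping from the webcam's normalized fingertip coordinate (MediaPipe hand landmarks come in the range [0, 1]) to a screen pixel. Here is a minimal sketch of that mapping, not our exact shipped code; the `margin` parameter is a hypothetical convenience so the user never has to reach the very edge of the camera frame:

```python
def landmark_to_screen(x_norm, y_norm, screen_w, screen_h, margin=0.15):
    """Map a normalized landmark coordinate ([0, 1] in the camera frame)
    to a screen pixel, using only the central (1 - 2*margin) band of the
    frame so the fingertip can cover the whole screen comfortably.
    """
    # Rescale the central band [margin, 1 - margin] onto [0, 1].
    span = 1.0 - 2.0 * margin
    x = (x_norm - margin) / span
    y = (y_norm - margin) / span
    # Clamp so fingertips outside the band pin to the screen border.
    x = min(max(x, 0.0), 1.0)
    y = min(max(y, 0.0), 1.0)
    return int(x * (screen_w - 1)), int(y * (screen_h - 1))

# A fingertip at the centre of the frame lands at the centre of the screen.
print(landmark_to_screen(0.5, 0.5, 1920, 1080))  # (959, 539)
```

The resulting pixel coordinate can then be fed to something like pyautogui's `moveTo` to drive the real cursor.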
How we built it
- Pure Python! Why? It's the best!
- Used Miniconda to keep a consistent environment across our group during development
- We only used Python and a ton of its libraries; Python has libraries for practically everything in one language: UI, screen capture, mouse/key interaction, AI models, etc.
Challenges we ran into
- Having the AI model turn our hand signals into the appropriate actions (many symbols overlap)
- Drawing straight lines with a finger
- Converting the drawn image into text with the AI model
- Making the mouse move smoothly with the user's hand
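Smooth mouse movement is harder than it sounds: raw hand-landmark coordinates jitter from frame to frame. One common fix, sketched here as an illustration rather than our exact shipped code, is an exponential moving average over the cursor position:

```python
class CursorSmoother:
    """Exponential moving average over (x, y) cursor positions.
    alpha near 1.0 tracks the hand tightly but passes jitter through;
    alpha near 0.0 is glassy-smooth but lags behind the hand.
    """
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self._pos = None  # no position seen yet

    def update(self, x, y):
        if self._pos is None:
            self._pos = (x, y)  # first sample: jump straight there
        else:
            px, py = self._pos
            # Move a fraction `alpha` of the way toward the new sample.
            self._pos = (px + self.alpha * (x - px),
                         py + self.alpha * (y - py))
        return self._pos

smoother = CursorSmoother(alpha=0.5)
print(smoother.update(100, 100))  # (100, 100)
print(smoother.update(200, 100))  # (150.0, 100.0) — halfway toward the new x
```

Each smoothed position would then be handed off to the mouse-control layer (e.g. pyautogui's `moveTo`) instead of the raw landmark.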
Accomplishments that we're proud of
- Learning how to use AI visual recognition with different points on the hand
- Implementing computer vision and deep learning models
- Giving the user a fully functioning keyboard and mouse using only their hand motions
What we learned
- Double-checking that AI models match their intended purpose will prevent us from misusing them in the future
- Free-rein brainstorming produces many more creative ideas than filtering new ideas too early
- Planning out what we want to do in manageable sections keeps us organized and focused
What's next for Air Board
- Package the project into an .exe and make the app run on startup, so you can use your hands from the moment you boot your computer
- Introducing sign language in order to expand the number of short-cuts
- Offer multi-language text detection
- Support moving gestures in addition to the still gestures we have now
- Custom-made gestures that the user can create for a more personalized experience
Built With
- cv2
- mediapipe
- pyautogui
- pyqt5
- python
- trocrprocessor
