Inspiration

We realized the difficulty educators face when drawing on projector boards and in Zoom lectures. A mouse feels too clunky, and not every smart board or projector setup includes a drawing pad. We decided to create something useful for drawing on smart boards and/or in Zoom calls without the hassle of new gear.

What it does

Maps hand movement one-to-one from a laptop/computer camera onto a virtual whiteboard, simulating an actual whiteboard.
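The one-to-one mapping can be sketched as a small coordinate transform. This is a hypothetical helper (not the project's actual code), assuming MediaPipe-style normalized [0, 1] landmark coordinates; the x axis is mirrored so the cursor follows the hand naturally on a webcam image:

```python
def to_canvas(norm_x: float, norm_y: float, width: int, height: int) -> tuple[int, int]:
    """Map a normalized [0, 1] camera coordinate to a whiteboard pixel.

    The x axis is mirrored because webcams show a mirror image, so
    without the flip the cursor would move opposite to the hand.
    """
    px = int(round((1.0 - norm_x) * (width - 1)))
    py = int(round(norm_y * (height - 1)))
    # Clamp in case the detected landmark drifts slightly outside the frame.
    px = max(0, min(width - 1, px))
    py = max(0, min(height - 1, py))
    return px, py
```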

How we built it

We used OpenCV to capture camera frames, which are fed into MediaPipe to detect the position of the hand. Using the captured hand position, we determine whether it meets the criteria of predefined hand gestures that trigger certain actions (drawing, erasing, changing ink color). Drawing works by mapping a single point per frame onto the whiteboard; each new point is connected to the previous one, producing legible strokes.
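The gesture-criteria step can be sketched in pure Python. The rules below are hypothetical (the original gestures and thresholds are not specified): assuming MediaPipe's 21-landmark hand model, where image y grows downward and the fingertip/PIP joint indices are 8/6, 12/10, 16/14, 20/18, a finger counts as extended when its tip sits above its PIP joint:

```python
# Hypothetical gesture rules over MediaPipe's 21-point hand model.
TIP = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
PIP = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}

def extended_fingers(landmarks):
    """Return the fingers whose tip is above (smaller y than) its PIP joint."""
    return {name for name in TIP
            if landmarks[TIP[name]][1] < landmarks[PIP[name]][1]}

def classify_gesture(landmarks):
    """Map a list of (x, y) landmarks to a whiteboard action (illustrative mapping)."""
    up = extended_fingers(landmarks)
    if up == {"index"}:
        return "draw"
    if up == {"index", "middle"}:
        return "change_color"
    if up == {"index", "middle", "ring", "pinky"}:
        return "erase"
    return "idle"
```

In the real pipeline the landmark list would come from MediaPipe's hand-detection result for each OpenCV frame.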

Challenges we ran into

Creating constraints that accurately capture the gestures we want to act on (we wanted unique hand gestures for drawing, erasing, and color changing); sometimes these commands fired anyway because the constraints were not restrictive enough.
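One common way to suppress this kind of false positive (a sketch, not necessarily what the project did) is to debounce: only commit to a gesture after it has been classified identically for several consecutive frames, so a one-frame misclassification never triggers an action:

```python
from collections import deque

class GestureDebouncer:
    """Commit to a gesture only after N identical consecutive classifications."""

    def __init__(self, hold_frames: int = 5):
        self.hold_frames = hold_frames          # frames a gesture must persist
        self.history = deque(maxlen=hold_frames)
        self.current = "idle"

    def update(self, gesture: str) -> str:
        """Feed one per-frame classification; return the debounced gesture."""
        self.history.append(gesture)
        if (len(self.history) == self.hold_frames
                and len(set(self.history)) == 1):
            self.current = gesture
        return self.current
```

The trade-off is latency: with `hold_frames = 5` at 30 fps, an action fires roughly 170 ms after the hand settles into the gesture.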

Accomplishments that we're proud of

Creating a website that delivers on the main goal of our project: smooth line drawing. We got video capture working simultaneously with the WebSocket video feed for our website, and built restrictive gesture controls with a minimal number of false positives.
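One way to get smooth lines from a jittery per-frame fingertip position (a sketch of a standard technique, with a made-up tuning value) is exponential smoothing: each drawn point blends the raw detection with the previous smoothed point, damping frame-to-frame noise:

```python
def smooth_path(points, alpha=0.5):
    """Exponentially smooth a sequence of (x, y) fingertip points.

    alpha in (0, 1] controls responsiveness: lower values give smoother
    but laggier strokes. The value here is a hypothetical default.
    """
    if not points:
        return []
    out = [points[0]]
    for x, y in points[1:]:
        px, py = out[-1]
        out.append((px + alpha * (x - px), py + alpha * (y - py)))
    return out
```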

What we learned

We learned how to use the MediaPipe Hands framework, OpenCV, multithreading for video capture through Flask, and how to manipulate the data MediaPipe extracts from OpenCV frame captures.
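The threaded-capture pattern can be sketched as follows. This is an illustrative version, not the project's code: a background thread keeps only the latest frame under a lock, so a slow HTTP client never stalls the camera loop; the camera read is stubbed with a timestamp here so the sketch runs anywhere:

```python
import threading
import time

class FrameGrabber:
    """Background capture thread that always holds only the latest frame."""

    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None
        self._running = False

    def _read_camera(self):
        # Stand-in for something like cv2.VideoCapture(0).read();
        # returns a timestamp so this sketch has no camera dependency.
        return time.monotonic()

    def start(self):
        self._running = True
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while self._running:
            frame = self._read_camera()
            with self._lock:
                self._frame = frame   # overwrite: stale frames are dropped
            time.sleep(0.01)

    def latest(self):
        """Called from e.g. a Flask route; returns the newest frame."""
        with self._lock:
            return self._frame

    def stop(self):
        self._running = False
```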

What's next for AR WhiteBoard

Improve hand-gesture capture, and add new features that indicate command changes to the user.

Built With

opencv, mediapipe, flask