Inspiration
We wanted to make an interactive, educational game for younger students that still provides an enjoyable learning experience. Our inspiration comes from people who have a hard time learning; we wanted to make learning more accessible worldwide. We believe our product will help numerous young students and teachers, and we are inspired by their continuous effort despite many challenges.
What it does
It detects hand movement, and users can draw just by putting their index finger and thumb together (like holding an invisible pen). The drawing is displayed on the canvas in real time, and the app predicts the object the user drew. So whatever users draw, Draw It Right knows!
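The "invisible pen" gesture can be sketched as a distance check between the thumb tip and index-finger tip. The landmark indices (4 and 8) follow MediaPipe's hand-landmark numbering; the threshold value and the example coordinates below are illustrative assumptions, not values from the project.

```python
import math

# MediaPipe hand-landmark indices (fixed by the library):
THUMB_TIP = 4
INDEX_TIP = 8

# Illustrative threshold; landmark coordinates are normalized to [0, 1],
# so in practice this would be tuned for the camera and hand size.
PINCH_THRESHOLD = 0.05

def is_pinching(landmarks, threshold=PINCH_THRESHOLD):
    """Return True when the thumb tip and index tip are close enough
    together to count as 'holding the invisible pen'."""
    tx, ty = landmarks[THUMB_TIP]
    ix, iy = landmarks[INDEX_TIP]
    return math.hypot(tx - ix, ty - iy) < threshold

# Hypothetical normalized landmark positions:
open_hand = {THUMB_TIP: (0.40, 0.50), INDEX_TIP: (0.60, 0.30)}
pinched = {THUMB_TIP: (0.50, 0.50), INDEX_TIP: (0.52, 0.51)}
```

Drawing is then enabled only on frames where `is_pinching` returns True, which is what lets the user "lift the pen" by separating the fingers.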
How we built it
We started with computer vision by experimenting with the features that OpenCV has to offer. We then used MediaPipe for hand and finger detection, and gradually moved to a more advanced approach: drawing on a canvas by tracking the moving fingertip. We used Python's math library for basic calculations, such as measuring hand movement and determining when to draw. After that, we trained our model on a large dataset using TensorFlow.
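The canvas step above can be sketched as painting a stroke between consecutive fingertip positions onto a NumPy image. This is a minimal sketch that samples points along the segment; the actual build would more likely call `cv2.line` on the OpenCV frame, and the coordinates here are hypothetical.

```python
import numpy as np

def draw_stroke(canvas, p0, p1, value=255):
    """Paint a line segment from p0 to p1 (x, y pixel coords) onto a
    grayscale canvas by sampling points along the segment."""
    (x0, y0), (x1, y1) = p0, p1
    # One sample per pixel along the longer axis keeps the stroke unbroken.
    n = int(max(abs(x1 - x0), abs(y1 - y0))) + 1
    for t in np.linspace(0.0, 1.0, n):
        x = int(round(x0 + t * (x1 - x0)))
        y = int(round(y0 + t * (y1 - y0)))
        canvas[y, x] = value  # row = y, column = x
    return canvas

canvas = np.zeros((64, 64), dtype=np.uint8)
# Hypothetical fingertip positions from two consecutive pinched frames:
draw_stroke(canvas, (10, 10), (50, 30))
```

Per frame, the previous fingertip position is remembered and a segment is drawn to the current one whenever the pinch gesture is active, so fast hand movement still produces a continuous line.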
Challenges we ran into
We had to come up with a solution that draws only when the two fingers are pinched together, not on every hand movement. Training a model on a large amount of data was itself a big challenge because of how compute-intensive it is.
Accomplishments that we're proud of
We managed to track hand movement accurately and to draw only on the specific pinch gesture. We are also proud of training our TensorFlow model, which is meant to predict whatever users draw.
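Before the trained model can predict the drawing, the canvas has to be shrunk to the model's input size. A minimal sketch of that step, assuming a 28x28 grayscale input (an illustrative choice, not the project's confirmed input shape), using block averaging in plain NumPy:

```python
import numpy as np

def preprocess(canvas, size=28):
    """Downsample a square grayscale canvas to size x size by averaging
    pixel blocks, then scale values to [0, 1] for the model."""
    h, w = canvas.shape
    bh, bw = h // size, w // size
    small = canvas[:bh * size, :bw * size].reshape(size, bh, size, bw).mean(axis=(1, 3))
    return (small / 255.0).astype(np.float32)

canvas = np.zeros((280, 280), dtype=np.uint8)
canvas[100:180, 100:180] = 255  # a hypothetical drawn blob
x = preprocess(canvas)
```

The result would then be batched (e.g. `x[None, ..., None]`) and passed to the model's predict call.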
What we learned
We learned about computer vision and various Python libraries, including TensorFlow, MediaPipe, OpenCV, Streamlit, NumPy, and PIL.
What's next for Draw It Right
Advancing our model by training it on larger datasets, so that it reaches higher accuracy in the shortest possible time.