Who Are We?
We are a team of four Electrical Engineering students: three specializing in Machine Learning and one in Circuits and Systems. The team consists of three fourth-year students and one third-year student.
Our Project
Using computer vision and machine learning, we developed a plate dispenser that dispenses plates and sets up a dining table with a simple hand gesture. This could be a thumbs up, a peace sign, or any other realistic hand gesture you'd like. Servo motors drive not only the plate-dispensing mechanism but also the cart itself, moving it across the table so that plates are dispensed along your dining table for easy setup.
Our Inspiration
Our main inspiration came from one of our members, Anton, who proposed using computer vision and machine learning to detect hand gestures and perform different tasks in response to them. We wanted a project that mixed software with hardware, specifically mechanical/physical movements that perform actions for convenience or for health-related reasons. We ultimately decided to build something that isn't too mechanically demanding while also complementing our skill sets.
What it does
A driving cart that dispenses plates across a table in response to hand-gesture inputs.
How we built it
We used an Arduino Uno, an L298N motor driver, and multiple servo motors to drive the mechanical components.
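The L298N steers each motor channel through two direction inputs. A minimal sketch of that truth table in Python (the pin names IN1/IN2 follow the L298N datasheet, but the command names are our illustration, not the exact firmware):

```python
# Hedged sketch: map high-level drive commands to the logic levels of one
# L298N motor channel's direction inputs (IN1, IN2). The command names are
# illustrative placeholders, not our actual firmware constants.

def l298n_channel_state(command: str) -> tuple[int, int]:
    """Return (IN1, IN2) logic levels for one L298N motor channel."""
    states = {
        "forward": (1, 0),  # IN1 high, IN2 low -> motor spins forward
        "reverse": (0, 1),  # IN1 low, IN2 high -> motor spins in reverse
        "stop":    (0, 0),  # both low -> motor coasts
        "brake":   (1, 1),  # both high -> motor brakes
    }
    if command not in states:
        raise ValueError(f"unknown drive command: {command}")
    return states[command]
```

On the actual board, these two outputs would be written to the Arduino pins wired to the L298N, with a PWM signal on the enable pin controlling speed.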
Challenges we ran into
- Inconsistent plate dispensing (multiple plates dispensed at once, or no plates dispensed at all)
- Servo problems, especially when getting the cart to move, along with other hardware/component issues
- Difficulty implementing a WiFi module to communicate with the Arduino, translating hand gestures into commands and then using that information to perform our functions
Accomplishments that we're proud of
Anton trained a computer vision algorithm that detects different hand gestures, which we then mapped to commands executed by the Arduino.
What we learned
We learned how to get this computer vision algorithm to communicate with the Arduino by using an ESP32 to relay commands to our board.
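A minimal sketch of the host side of that link, assuming newline-terminated ASCII commands sent to the ESP32 over a TCP socket (the host address, port, and framing are assumptions for illustration, not our exact protocol):

```python
# Hedged sketch: send one framed command from the vision host to an ESP32
# over TCP. The address, port, and newline framing are assumptions.
import socket

def frame_command(command: str) -> bytes:
    """Encode a command as a newline-terminated ASCII frame."""
    return (command.strip().upper() + "\n").encode("ascii")

def send_command(command: str, host: str = "192.168.4.1", port: int = 8080) -> None:
    """Open a TCP connection to the ESP32 and send one framed command."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(frame_command(command))
```

On the other end, the ESP32 would read up to the newline and forward the command to the Arduino.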
What's next for Gesturatric
You can be as creative as you want with the hand gestures, as long as each gesture is trainable and usable by the algorithm. One idea was using the "calling on the phone" hand gesture to place a call without having to dial a number or even ask Siri.
Built With
- arduino
- computer-vision
- machine-learning
- python
- servo-motors