Background

The personal application of at-home robotics has always remained niche, as tried-and-true home appliances have dominated consumer interest. While this has fostered strong competition and innovation in the market, companies rarely bring less familiar fields of robotics into home products. We wanted to challenge this norm and show what is possible through an underutilized robotics application for the home: the robotic arm. Our goal was to create a robotic arm that the average consumer could reproduce cheaply and efficiently, using 3D-printed parts that adapt easily to different motor hardware options. Through this, we aim to showcase what a Helping Hand can really do for you.

What it does

Our Helping Hand was engineered with a focus on kitchen use, which earned it the name Gordon. Gordon is a lightweight, three-link robotic arm controlled by mirroring its user's actions solely through computer vision. Our control solution is unusual: robotic arm manipulators are typically piloted via dedicated control pads or haptic hardware, which raises costs and the barrier to entry. By relying solely on computer vision, anyone with a camera can freely control our arm. In the kitchen, this form of control is especially useful for individuals with hand tremors, allowing safe, precise use of kitchen utensils and tools they previously could not handle. Controlling a robotic arm without costly external hardware is a largely unexplored field, and our mirroring application lowers the barrier to entry for everybody.

How we built it

For the hardware, we built a four-degree-of-freedom, three-link arm driven by five servos of varying torques. The servos are controlled by an Arduino Uno over PWM, and our Python backend translates kinematic values into servo commands. On the software side, we built a simple camera-based UI in Python using OpenCV (cv2). Voice commands let the user switch between off and mirroring mode, which we achieved with Faster-Whisper, Silero VAD, and sentence embeddings compared via cosine similarity for real-time audio transcription and understanding. In mirroring mode, we track the user's right arm and hand joints with MediaPipe, letting us compute the kinematic equations needed to move and control the arm through our main PC + Arduino Uno control loop.
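The embedding-comparison step for voice commands can be sketched as follows. This is a minimal illustration of cosine-similarity matching, not our actual pipeline: the command names, threshold, and embedding vectors here are made up for the example (real vectors would come from a sentence-embedding model applied to the Faster-Whisper transcript).

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_command(utterance_vec, command_vecs, threshold=0.7):
    """Return the best-matching command name, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    for name, vec in command_vecs.items():
        score = cosine_similarity(utterance_vec, vec)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Illustrative embeddings (a real system would embed reference phrases
# like "start mirroring" / "stop" with a sentence-embedding model).
commands = {
    "mirror_on": np.array([0.9, 0.1, 0.0]),
    "mirror_off": np.array([0.1, 0.9, 0.0]),
}
heard = np.array([0.85, 0.15, 0.05])  # embedding of the transcribed speech
print(match_command(heard, commands))  # → mirror_on
```

The threshold keeps unrelated chatter from triggering a mode switch: if no reference command is similar enough, the function returns None and the arm's state is left unchanged.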

Challenges we ran into

We came into this project with limited knowledge, limited pre-planning, and, most importantly, limited tools and hardware. A large majority of our collective time was spent learning and debugging how our hardware actually worked (mainly our servo motors), which led to many issues while building and setting up controls for our manipulator. With more prep time and access to more resources, we could have implemented more software features within our 36-hour window and shown more of the potential our Helping Hand Gordon has.

Accomplishments that we're proud of

Despite these challenges, we are extremely proud to have built a functional three-link robotic arm within a 36-hour period, with limited prior resources and experience, while staying true to our goal of making it cost-friendly and reproducible. It was a rocky road, but we continuously pushed through our doubts and inexperience to achieve something we could not have done alone. Additionally, using pure computer vision with pose estimation, together with voice commands, was an unexplored mode of control for robotic arms that we had to extensively tweak and configure to reach a usable state. We constantly worked through murky, unknown territory, and we take pride in what came out of it.

What we learned

All of us worked with technologies that were previously unfamiliar to us. We had to learn how to drive our servo motors from the Arduino and properly map absolute positions into our limited manipulator workspace. Building the arm itself also posed a challenge, as we had to work within various power and physical constraints imposed by our limited resources. We also learned how to translate pose-estimated joint coordinates into kinematic motion equations, estimating the angles needed to mirror user motion on our arm.
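The coordinates-to-angles step above can be illustrated with a small, generic example: computing the angle at a joint (say, the elbow) from three 2D landmark positions, such as shoulder, elbow, and wrist. This is a sketch of the underlying math only, not our exact pipeline, and the coordinates are invented for the example.

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (in degrees) formed by segments b->a and b->c.

    a, b, c are (x, y) landmark coordinates, e.g. shoulder, elbow, wrist.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to guard against floating-point values slightly outside [-1, 1].
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

# Fully extended arm: shoulder, elbow, wrist collinear.
print(joint_angle((0, 0), (1, 0), (2, 0)))  # ≈ 180 degrees
# Arm bent at a right angle.
print(joint_angle((0, 0), (1, 0), (1, 1)))  # ≈ 90 degrees
```

An angle like this can then be clamped to the servo's reachable range before being sent over the PC-to-Arduino link.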

What's next for us?

Gordon is a step toward making Helping Hands both better known and more accessible to the general public. Our focus was on the kitchen, but any future use cases for our lightweight domestic robotic arm will help shape how widely our idea of a Helping Hand is adopted. On our end, we would love to further develop Gordon to record and play back actions at specified intervals and durations, making Gordon truly autonomous and helping users multitask in the kitchen.
