Inspiration
While brainstorming, we drew inspiration from Doctor Octopus from Spider-Man, who uses robotic limbs to maneuver and assist himself in everyday tasks. That led to a simple idea: if two arms limit what you can do, adding another could expand your capability. Instead of relying on a second person, why not have an additional arm that works alongside you to give more freedom and control?
What it does
We built a wearable robotic limb mounted on the chest that adds an extra arm to the body. It is controlled through voice commands and powered by a vision-language-action (VLA) model, allowing it to understand intent and act in real time, helping you hold, assist, and accomplish more tasks simultaneously.
How we built it
Our robotic limb system is built on the SO-101 robot arm mounted on a 3D-printed chest plate. It is driven by an AMD AI Mini PC running a VLA model through LeRobot, which translates user intent into limb motion.
The VLA model was trained on an AMD MI300X GPU accessed through AMD's cloud service.
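The pipeline above (observation in, VLA inference, joint command out) can be sketched as a simple control loop. This is an illustrative sketch only: the function names below are stand-ins we made up, and the real system does camera/servo I/O through LeRobot and the SO-101 drivers rather than the stubs shown here.

```python
# Illustrative sketch of the control loop; get_observation and vla_policy
# are hypothetical stubs standing in for LeRobot camera/servo I/O and the
# fine-tuned VLA model respectively.
import numpy as np

def get_observation():
    # Stand-in for a camera frame plus current joint angles from the arm.
    return {"image": np.zeros((224, 224, 3), dtype=np.uint8),
            "joints": np.zeros(6)}

def vla_policy(obs, instruction):
    # Stand-in for the VLA model: maps (observation, voice command)
    # to target joint positions. Here it just echoes the current joints.
    return obs["joints"]

def control_step(instruction):
    obs = get_observation()
    action = vla_policy(obs, instruction)
    # A real step would also clip to joint limits before sending to servos.
    return np.clip(action, -np.pi, np.pi)

action = control_step("hold the bottle")
print(action.shape)  # one target per joint
```

In the real system this loop runs continuously, with the voice command held fixed until a new one arrives.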
Challenges we ran into
- We broke a servo, leaving us with almost no demo for judging (we can't confirm the full impact at the time of writing).
- We also found that fine-tuning a policy for the third arm is extremely difficult, since the wearer is constantly moving, forcing the arm through a wide range of orientations.
Accomplishments that we're proud of
The demo worked pretty well and was super cool: the arm was able to manipulate and grasp objects on its own.
What we learned
Be careful with robots...they can break very easily
What's next for ZeroShot ExoArm
Repair the arm and continue developing it for an upcoming competition endorsed by AMD.
Built With
- amd
- lerobot
- python
- so-101
- vla