Inspiration

One of the biggest inspirations for this project was watching the workspaces I worked in gather clutter, including right now, with parts scattered all over the table. This extends to environments where people lack the time to clear the space around them, or should not touch it at all, such as a commercial kitchen.

What it does

Our project is a robot arm that picks up tools and reorganizes them. It uses a machine learning model to recognize each tool and place it in its dedicated home.
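As a minimal sketch of the sorting idea (all names and coordinates here are hypothetical, not from our actual code), the behavior reduces to mapping a detected class label to a fixed drop-off pose:

```python
# Hypothetical sketch of the tool-sorting logic: map each class label
# the object detector reports to a fixed "home" drop-off position.
# Labels and coordinates are illustrative examples only.

HOME_POSITIONS = {
    "screwdriver": (120, 40, 10),   # (x, y, z) in mm, arbitrary values
    "wrench":      (120, 90, 10),
    "pliers":      (120, 140, 10),
}

def home_for(label, default=(0, 0, 50)):
    """Return the drop-off pose for a detected tool, or a safe default."""
    return HOME_POSITIONS.get(label, default)
```

For example, `home_for("wrench")` returns the wrench's home pose, and an unrecognized label falls back to the safe default rather than crashing mid-motion.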

How we built it

Our CAD designers modeled and 3D printed the arm parts. Our software team downloaded a pretrained YOLOv5 object detection model and fine-tuned it on our own data; using Edge Impulse, the model was flashed to a Rubik Pi 3 board. Assembly happened afterwards, with integration of the software and hardware happening throughout the hackathon.
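As a rough illustration of the vision step (assuming a YOLOv5-style list of detections; the field names below are hypothetical, not our real pipeline), the arm only needs the single most confident detection above a threshold:

```python
# Hypothetical post-processing for YOLOv5-style detections.
# Each detection is a dict with a class label, a confidence score,
# and a bounding box; keep the most confident one above a threshold.

def best_detection(detections, min_conf=0.5):
    """Return the highest-confidence detection, or None if all are weak."""
    candidates = [d for d in detections if d["conf"] >= min_conf]
    return max(candidates, key=lambda d: d["conf"], default=None)

detections = [
    {"label": "wrench",      "conf": 0.42, "box": (10, 10, 50, 50)},
    {"label": "screwdriver", "conf": 0.88, "box": (60, 20, 90, 70)},
]
```

Here `best_detection(detections)` picks the screwdriver, and an empty or all-low-confidence frame yields `None` so the arm can simply wait for the next frame.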

Challenges we ran into

3D printing parts takes a long time, and if a mistake is made the entire part has to be reprinted. Our original idea was a belt-driven design for the robot arm, but limitations on certain hardware resources forced us to pivot quickly to a more gear-based design.

Accomplishments that we're proud of

We 3D printed our own robot arm, trained our own object recognition model, and designed our own arm base.

What we learned

Iterating on the design let us combine successful features across revisions while eliminating ineffective components.

What's next for ToolBot

Add more user customization for different tools and workspaces. Make the robot arm more mobile and easier to install. Expand the general dataset so the system works outside a mechanical or software engineering workspace.

Built With

  • edgeimpulse
  • nx
  • onshape
  • python
  • rubixpi