The goal of Mike Robot is to turn everyday objects into smart objects through personal robotics.

What it does

Mike Robot recognizes intuitive gestures to control the movement of a robotic arm.

How we built it

Mike uses a Leap Motion controller for motion capture. A Node.js client reads data from the sensor and sends it over the network to a Node.js server on a Raspberry Pi, which controls a robotic arm via USB. A 3D-printed case mounted on an RC car provides locomotion.
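The pipeline above can be sketched as a tiny client/server pair. Everything here is a hypothetical illustration, not the actual project code: the message format, the workspace bounds, and the servo mapping are all assumptions, and the Leap frame is mocked so the sketch runs without hardware.

```javascript
// Client side: reduce a (mocked) Leap Motion frame to the fields the arm
// needs. A real Leap frame exposes hands[0].palmPosition as an [x, y, z]
// triple in millimeters and a grabStrength in [0, 1].
function encodeFrame(frame) {
  const [x, y, z] = frame.hands[0].palmPosition;
  return JSON.stringify({ x, y, z, grab: frame.hands[0].grabStrength });
}

// Server side (Raspberry Pi): map palm coordinates to servo angles
// (0-180 degrees). The workspace bounds (+/-200 mm horizontally,
// 50-350 mm vertically) are illustrative values, not measured ones.
function toServoAngles(msg) {
  const { x, y, grab } = JSON.parse(msg);
  const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));
  return {
    base: Math.round(clamp((x + 200) / 400, 0, 1) * 180),
    shoulder: Math.round(clamp((y - 50) / 300, 0, 1) * 180),
    gripper: grab > 0.5 ? 180 : 0, // close past half grab strength
  };
}

// Demo with a mocked frame (no sensor attached): palm centered, hand closed.
const mockFrame = { hands: [{ palmPosition: [0, 200, 50], grabStrength: 0.9 }] };
const angles = toServoAngles(encodeFrame(mockFrame));
console.log(angles); // { base: 90, shoulder: 90, gripper: 180 }
```

In the real setup the encoded message would travel over a socket between the laptop and the Pi; the pure encode/decode functions above keep that transport swappable.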

Challenges we ran into

Combining multiple simultaneous sensor inputs to control the robot precisely. Recovering from 3D-printing errors.
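One standard way to tame jittery input from multiple simultaneous readings is an exponential moving average per axis, so sensor noise doesn't twitch the servos. This is a sketch of that general technique, not the team's actual filter; the smoothing factor is an assumed value.

```javascript
// Exponential moving average over [x, y, z] samples: each update moves
// the state a fraction (alpha) of the way toward the new sample.
class Smoother {
  constructor(alpha = 0.3) {
    this.alpha = alpha; // 0..1; lower = smoother output but more lag
    this.state = null;
  }
  update(sample) {
    if (this.state === null) {
      this.state = sample.slice(); // seed with the first reading
    } else {
      this.state = this.state.map(
        (prev, i) => prev + this.alpha * (sample[i] - prev)
      );
    }
    return this.state;
  }
}

// A constant reading drives the filter toward that value step by step.
const s = new Smoother(0.5);
s.update([0, 0, 0]);
s.update([100, 100, 100]);             // halfway: [50, 50, 50]
const out = s.update([100, 100, 100]); // then [75, 75, 75]
console.log(out);
```

Tuning alpha trades responsiveness against smoothness, which is exactly the tension between precise control and "feel" described above.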

Accomplishments that we're proud of

High-fidelity motion control is intuitive and "feels" good - as though you are using "the force" to control an object. The motion capture is highly accurate and responsive.

What we learned

How to combine different sensor inputs. How to 3D print custom parts.

What's next for Mike Robot

Upgrade all hardware components, including the mechanical arm and the robotic base. Add "eyes" - a 3D sensor with object recognition so that Mike can interact automatically with real-world objects.

Built With
