Our love for natural human-machine gestures (Leap Motion)

What it does

The rover's arm is controlled by natural hand movements detected by a Leap Motion sensor. The arm can extend and retract at two elbow joints, and its claw can grab and release objects. The rover itself can be translated and rotated with a wireless controller, and an on-board camera lets us see the rover's surroundings.
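As a rough illustration of the grab/release gesture, the Leap Motion tracking data includes a `grabStrength` value per hand (0.0 for an open hand, 1.0 for a fist), which can be mapped linearly onto a claw servo angle. The `open_deg`/`closed_deg` angles below are hypothetical placeholders, not the values we used:

```python
def claw_angle(grab_strength, open_deg=10.0, closed_deg=80.0):
    """Map Leap Motion grabStrength (0.0 = open hand, 1.0 = fist)
    linearly onto a claw servo angle in degrees.

    open_deg / closed_deg are illustrative limits; real values depend
    on the claw's mechanical range.
    """
    s = max(0.0, min(1.0, grab_strength))  # clamp to the valid range
    return open_deg + s * (closed_deg - open_deg)
```

Closing your hand then drives the servo smoothly from the open angle toward the closed angle.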

How we built it

We used the Leap Motion WebSocket service, accessed by a Raspberry Pi mounted on the rover. The Raspberry Pi reads tracking data from the Leap Motion service and forwards it to an Arduino microcontroller, which drives three servo motors and two DC motors.
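A minimal sketch of the Raspberry Pi side of this pipeline: each WebSocket message from the Leap Motion service is a JSON frame whose `hands` array carries a `palmPosition` in millimetres, which can be reformatted into a compact line-based command for the Arduino. The `"P,x,y,z"` command format and the serial settings here are illustrative assumptions, not our exact protocol:

```python
import json

def frame_to_command(frame_json):
    """Extract the first hand's palm position from a Leap Motion
    WebSocket frame and format a compact serial command for the
    Arduino. Returns None when no hand is tracked.

    The "P,x,y,z" format is a made-up example protocol.
    """
    frame = json.loads(frame_json)
    hands = frame.get("hands", [])
    if not hands:
        return None
    x, y, z = hands[0]["palmPosition"]  # millimetres, Leap coordinates
    return "P,%d,%d,%d\n" % (round(x), round(y), round(z))

# On the rover, each command string would then be written out over
# serial (device path and baud rate are assumptions), e.g. with pySerial:
#   port = serial.Serial("/dev/ttyACM0", 115200)
#   port.write(cmd.encode())
```

The Arduino then parses each line and updates the servo and DC motor outputs accordingly.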

Challenges we ran into

Communication between the Leap Motion service and the Raspberry Pi was hard to set up. We also had less material for the physical parts than we thought we would.

Accomplishments that we're proud of

Making all the systems communicate, and successfully grabbing an object using only human hand input while watching the result on the robot arm.

What we learned

Leap Motion, Arduino, SSH

What's next for 17 - MasterSpace Robot

Increase the robot's abilities so it can repair an object without physical human intervention.

Built With

Leap Motion, Raspberry Pi, Arduino
