Inspiration

We do not want anything to be out of reach. If someone (e.g. an elderly or disabled person) cannot get to an object, they can always use our robot to fetch and transport it remotely.

How it works

The Leap Motion controller is used to drive the robot with hand gestures. With the planned addition of an autonomous mode, the robot will eventually be able to drive itself to the desired object. By the end of this hackathon, the robot features hand-motion control, object recognition through a webcam (fully functioning and programmed, but it needs the autonomous mode to be used to its full potential), and the ability to pick up and put down small, light objects. A rough sketch of the hand-motion control loop is shown below.
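The snippet below is a minimal sketch of how palm position from the Leap Motion could be mapped to drive commands. It assumes the Leap Motion v2 Python SDK (the `Leap` module) and a hypothetical HTTP endpoint on the robot (`ROBOT_URL`); neither the exact control mapping nor the network protocol we used is documented here, so treat this as illustrative only.

```python
import time
import requests  # used here only to illustrate sending commands over the network
import Leap      # Leap Motion v2 Python SDK

ROBOT_URL = "http://192.168.1.50:8080/drive"   # hypothetical robot server address

controller = Leap.Controller()

while True:
    frame = controller.frame()
    if not frame.hands.is_empty:
        hand = frame.hands.frontmost
        pos = hand.palm_position               # Leap.Vector, in millimetres
        command = {
            # Sideways palm offset steers, forward/back offset sets speed.
            "turn":  max(-1.0, min(1.0, pos.x / 100.0)),
            "speed": max(-1.0, min(1.0, -pos.z / 100.0)),
            "grip":  hand.grab_strength,       # 0.0 = open hand, 1.0 = fist
        }
        requests.post(ROBOT_URL, json=command, timeout=0.5)
    time.sleep(0.05)                           # roughly 20 updates per second
```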

Challenges I ran into

Hardware hacking was a big challenge because only one of our team members had experience with it, forcing the rest of us to step outside our usual fields to keep the robot's development moving. Also, the servos only turned 180 degrees, so we had to disassemble them and adjust them manually. Programming went smoothly overall, although it took us some time to learn about the new servers and internet connections involved.

Accomplishments that I'm proud of

Each member brought different strengths and experience, but by the end of the day we had all learned something new and proved to be quick learners. We had to learn and apply brand-new APIs and robotics techniques (sensors, actuators, etc.), and through this experience we gained the confidence to try something novel.

What I learned

Everything from a new language (for some of us Python, for others C++), to mechanical engineering concepts, to using IBM Watson. A sketch of the webcam recognition path follows below.
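As a rough illustration of the webcam object-recognition path mentioned above, the sketch below grabs a frame with OpenCV and sends it to IBM Watson Visual Recognition. The service version, credentials, and result handling are assumptions for the example, not the exact setup we built.

```python
import cv2
from ibm_watson import VisualRecognitionV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials; substitute a real API key to run this.
visual_recognition = VisualRecognitionV3(
    version="2018-03-19",
    authenticator=IAMAuthenticator("YOUR_API_KEY"),
)

# Grab a single frame from the webcam and write it to disk.
camera = cv2.VideoCapture(0)
ok, frame = camera.read()
camera.release()

if ok:
    cv2.imwrite("frame.jpg", frame)
    with open("frame.jpg", "rb") as image:
        result = visual_recognition.classify(
            images_file=image, threshold=0.6
        ).get_result()
    # Print the highest-confidence label Watson returns for the frame.
    classes = result["images"][0]["classifiers"][0]["classes"]
    print(classes[0]["class"] if classes else "no object recognized")
```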

What's next for Edisee

With the addition of a camera and faster motors, users will be able to operate Edisee fully remotely. Think of it as a ground-based counterpart to the drones already helping us in the air: Edisee brings a drone's capabilities down to the ground.

We do not want anything to be out of reach, and Edisee will see that you are connected.
