We originally wanted to do camera pitch/yaw integration with an IOIO board, but switched to a Raspberry Pi and a Leap Motion sensor. The setup could also be used for image tracking if the Leap Motion sensor were replaced with a camera.
What it does
Reads the right hand's position above the Leap Motion sensor, calculates its pitch and yaw, and maps that data to servo positions in radians.
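The core mapping can be sketched as a clamp-and-scale from a hand angle in radians to a servo PWM duty cycle. This is a minimal illustration, not the project's actual code; the duty-cycle endpoints below are typical hobby-servo values at 50 Hz, not measured from this build.

```python
import math

def angle_to_duty_cycle(angle_rad, min_angle=-math.pi / 2, max_angle=math.pi / 2,
                        min_duty=2.5, max_duty=12.5):
    """Map a hand pitch or yaw angle (radians) to a servo PWM duty cycle (%).

    Assumes a hobby servo driven at 50 Hz where ~2.5% duty is one end of
    travel and ~12.5% is the other; these endpoints are illustrative.
    """
    # Clamp the angle to the servo's usable range
    angle = max(min_angle, min(max_angle, angle_rad))
    # Linearly interpolate into the duty-cycle range
    frac = (angle - min_angle) / (max_angle - min_angle)
    return min_duty + frac * (max_duty - min_duty)
```

A hand held level (pitch 0) lands at the midpoint duty cycle, and angles beyond the servo's travel are clamped rather than wrapped.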
How I built it
A soda can and paper clip structure provides support. A laser pointer is powered by the Raspberry Pi's GPIO pins and enabled via a Python script. Communication happens over Wi-Fi using the PubNub API, which sends data from the computer running the Leap Motion sensor to the Raspberry Pi. A motor shield allows additional servos to be driven.
Challenges I ran into
The Intel Edison board did not allow for OS installation; the IOIO board was difficult to develop with due to its outdated API and repositories; we did not have a Wi-Fi USB dongle for the Raspberry Pi, so we attempted a bridge connection among other methods of getting the Pi online; and we had no Arduino Bluetooth sensor.
Accomplishments that I'm proud of
An innovative approach to servo mounting, hardware hacking for the power supply and motor shield connection, and learning how API usage works.
What I learned
How to use APIs, Python and some Java, transistor connections for powering high-current devices, and Python debugging.
What's next for Laser-Can-On
OpenCV or another computer vision library for object tracking. Adding a projectile, and possibly a moving base with motor controls for WASD movement, was planned but not achieved in the 24-hour window. More to come!
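The planned camera-based tracking could reuse the existing servo pipeline: find the target's centroid in each frame and steer pitch/yaw toward it. As a hedged sketch, the function below computes a centroid from a plain list-of-lists binary mask standing in for a thresholded frame; with OpenCV this step would typically be done with image moments on the real camera image.

```python
def centroid(mask):
    """Return the (row, col) centroid of nonzero pixels in a binary mask.

    `mask` is a list of lists of 0/1 values standing in for a thresholded
    camera frame. Returns None if no target pixels are found. An OpenCV
    version would compute the same thing from image moments.
    """
    count = row_sum = col_sum = 0
    for r, row in enumerate(mask):
        for c, v in enumerate(row):
            if v:
                count += 1
                row_sum += r
                col_sum += c
    if count == 0:
        return None  # no target in frame; hold the servos steady
    return row_sum / count, col_sum / count
```

The offset between the centroid and the frame center would then drive the same servo-angle mapping the Leap Motion data feeds today.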