What it does

It's an app that relays relevant information to the user about whatever they point their phone at. It determines what they are pointing at via small, cheap, 3D-printed Arduino AR tokens that can be placed around the world or integrated into other objects (we built one into a coat hanger). That information can be anything, from web pages to native content (like Braintree integration!).

In the future, this technology could be combined with Google Glass to provide anything from directions along a path, to controlling home appliances (stare at your heater and its controls appear in front of you virtually), to overlaying information around the world. Imagine going to an F1 race and watching name tags and stats follow the cars around the track, so you're never wondering who just rushed past you.

How I built it

Blood, sweat and tears. It's an iOS app (designed for iPad, but easily ported to iPhone in the future). Standard Xcode for an IDE, with UIKit, Braintree, and OpenCV as the only libraries. I experimented with some graph libraries for debugging (measuring pixel channels in real time), but none offered the performance I needed, so I chucked my own together in raw Quartz. It was a solo project, so everything was done by me.
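The real-time channel graph above boils down to keeping a rolling window of recent samples per colour channel. This Python sketch is an assumption about that structure (the actual debug view was written in Quartz, not Python); the `ChannelTrace` name is hypothetical.

```python
# Hedged sketch (assumption, not the actual Quartz code): a rolling window
# of per-channel pixel samples, as you'd feed to a live debug graph.
from collections import deque


class ChannelTrace:
    """Keep the most recent N samples for one colour channel."""

    def __init__(self, window: int = 64):
        # deque with maxlen silently drops the oldest sample when full
        self.samples = deque(maxlen=window)

    def push(self, value: int) -> None:
        self.samples.append(value)

    def latest(self) -> list:
        return list(self.samples)


trace = ChannelTrace(window=3)
for v in (10, 20, 30, 40):  # the oldest sample falls off once the window fills
    trace.push(v)
print(trace.latest())  # [20, 30, 40]
```

A fixed-size `deque` keeps the per-frame cost constant, which is the property a per-frame debug graph needs.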

The 3D printing was all designed in OpenSCAD and printed on a ME3D, which was a first-time pickup for me. I had some help from Cameron on the actual execution once I had my designs down.

For the electronics I used an Arduino Gemma. I've had a little experience with Arduinos in the past, but this was the first time I've used the Gemma, which is considerably weaker, but also smaller and has amazing battery life.
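The token's job is essentially to blink out an identifying colour code for the camera to read. The writeup doesn't describe the actual encoding, so this Python model is an assumption (the real firmware runs on the Gemma in Arduino C); the base-3 scheme and `encode_token_id` helper are hypothetical.

```python
# Hypothetical sketch (assumption, not the shipped firmware): encode a token
# ID as a fixed-length sequence of LED colours, one colour per blink frame.

# A small palette the LED can display reliably, used as base-3 digits.
PALETTE = {0: (255, 0, 0), 1: (0, 255, 0), 2: (0, 0, 255)}  # red, green, blue


def encode_token_id(token_id: int, length: int = 4) -> list:
    """Encode a token ID as `length` palette colours, least significant digit first."""
    frames = []
    for _ in range(length):
        frames.append(PALETTE[token_id % 3])
        token_id //= 3
    return frames


# Token 5 is "12" in base 3, so LSB-first the frames are blue, green, red, red.
print(encode_token_id(5))
```

A tiny palette of saturated colours is the natural choice here, since the camera only has to distinguish a handful of hues rather than exact RGB values.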

Challenges I ran into

Learning how to 3D print at 2am was a particular highlight, but it was surprisingly easy with OpenSCAD; the code-style design it uses was pretty easy to pick up.

The main challenge was just getting the detection into a working state. Tracking an LED that is: A) flashing different colours, B) moving, and C) imperfectly mapped from its RGB output to the iPad camera's colour space was not an easy task.
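One common way to cope with that imperfect RGB mapping is to classify each sampled pixel as the nearest colour in a known palette rather than expecting exact values. This sketch is an assumption, not the app's actual OpenCV pipeline; `classify_pixel` and the palette are hypothetical.

```python
# Illustrative sketch (assumption): nearest-palette-colour classification,
# which tolerates washed-out or colour-shifted camera samples.

PALETTE = {"red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255)}


def classify_pixel(rgb):
    """Return the palette colour name closest (squared Euclidean) to the sample."""

    def dist_sq(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return min(PALETTE, key=lambda name: dist_sq(rgb, PALETTE[name]))


# A washed-out reddish camera sample still classifies as red.
print(classify_pixel((201, 87, 60)))  # red
```

Squared Euclidean distance in RGB is crude (a perceptual space like HSV or Lab would separate hues more robustly under varying brightness), but with only three well-separated palette colours it is usually enough.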

Accomplishments that I'm proud of

Working a full week, then getting on a plane, getting to my hostel at 12:30, then getting through a complete weekend and getting the project to a good state. When the LED tracker nailed its first code, that was the moment I was most proud of. Getting the Arduino and the iPad in sync took a few hours, but getting over that first step was amazing.

What I learned

With hard work, patience, and enough Coke Zero to put down a small cow, anything is possible.

Seriously, I found that my focus has improved since I began full-time work. I was much more productive at this Hackagon than I was at last year's.

What's next for The Beaconing

I think the project has a ton of potential. In a future with Google Glass everywhere, Augmented Reality is going to be incredibly useful. I think the service has the potential to evolve into a sort of "DNS" for everything.

I'll be continuing with the project, because it has a ton of opportunities for learning inside it. The next step, I think, will be to replace the tracking code with some sort of neural-network system, but that might take a while.
