Inspiration

Ever looked at a light and wanted it to be dimmer? Even once you have your IoT devices connected to your network, controlling them or getting information back from them can be a chore. Which light am I trying to control? I can see it, so why do I have to dig through a menu to interact with it?

What it does

Using the eye-tracking and localization features of the Magic Leap AR headset, the locations of IoT devices are recorded and then made interactive via eye gestures. End result: fixate your gaze on a smart lightbulb and a menu will appear floating near it. Fixate on one of the new floating control buttons and the physical light will respond!
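
The fixation logic itself will live in Unity on the headset; as a rough sketch of the idea, here is the same gaze-plus-dwell test in Python. The device positions, cone angle, and dwell time are made-up illustrative values, not measurements from the project.

```python
# Sketch of gaze-fixation selection: which recorded device is the user
# looking at, and have they held their gaze long enough to "click"?
import time
import numpy as np

# World-space positions recorded for each IoT device (illustrative).
DEVICE_POSITIONS = {
    "desk_lamp": np.array([1.2, 1.0, 3.0]),
    "ceiling_light": np.array([0.0, 2.4, 2.0]),
}

FIXATION_ANGLE_DEG = 5.0   # gaze must stay within this cone of a device
DWELL_SECONDS = 1.0        # how long a fixation must last to trigger


def gaze_target(eye_origin, gaze_dir):
    """Return the name of the device the gaze ray points at, or None."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_angle = None, FIXATION_ANGLE_DEG
    for name, pos in DEVICE_POSITIONS.items():
        to_device = pos - eye_origin
        to_device = to_device / np.linalg.norm(to_device)
        angle = np.degrees(np.arccos(np.clip(gaze_dir @ to_device, -1.0, 1.0)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best


class DwellDetector:
    """Fire once when the same target is held for DWELL_SECONDS."""

    def __init__(self):
        self.target, self.since, self.fired = None, 0.0, False

    def update(self, target):
        # Call once per frame with the current gaze target (or None).
        if target != self.target:
            self.target, self.since, self.fired = target, time.monotonic(), False
        elif target and not self.fired and time.monotonic() - self.since >= DWELL_SECONDS:
            self.fired = True
            return target  # fixation confirmed: show the floating menu
        return None
```

The same test works for the floating menu buttons: each button is just another gaze target with its own position.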

How I built it

The Magic Leap headset casts eye-gaze vectors into world space, Unity keeps track of where the IoT devices are located and turns fixations into JSON requests, a Python script converts those requests into device-specific commands, and your IoT device executes them.
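
As a rough illustration of that last hop, here is a minimal sketch of the bridge script. It assumes a Philips Hue bulb driven through the phue library; the port, device names, and JSON schema are hypothetical stand-ins, not the project's actual protocol.

```python
# Minimal JSON-to-device bridge sketch: Unity POSTs a JSON command,
# this script translates it into a Hue API call via phue.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

from phue import Bridge  # pip install phue

bridge = Bridge("192.168.1.10")  # assumed Hue bridge IP

# Map the names Unity knows about to Hue light IDs (illustrative).
DEVICES = {"desk_lamp": 1, "ceiling_light": 2}


class CommandHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expects JSON like {"device": "desk_lamp", "brightness": 128}.
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length))

        light_id = DEVICES[request["device"]]
        if "brightness" in request:
            # Hue brightness runs 0-254; turn the light on and dim it.
            bridge.set_light(light_id, {"on": True, "bri": request["brightness"]})
        elif request.get("power") == "off":
            bridge.set_light(light_id, "on", False)

        self.send_response(200)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), CommandHandler).serve_forever()
```

Swapping in a different smart bulb or plug only means changing the command translation; the headset and Unity side stay the same.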

Challenges I ran into

None yet; I'm looking forward to running into them while building it!

Accomplishments that I'm proud of

I've been using Python libraries for home automation through 2D interfaces. Now I'm going to bring those controls into AR.

What's next for Ion Eye

Building the eye-tracked AR interface at the hackathon!
