Inspiration

Using connected IoT devices often requires a separate application, setup, and account registration for each one. This makes them hard to deploy in public or shared spaces where there is more than one user. Breaking down these barriers would allow for a much more connected and usable service.

What it does

We created an IoT framework that uses AWS Greengrass to host a core server with MQTT topics that our IoT devices subscribe to. Each IoT device is connected to a Raspberry Pi that updates a QR code, shown on an e-paper display, containing the device's current status. A user can then open the ARmote application, where a camera view appears that scans valid QR codes. Once a QR code is scanned, the device's current status is displayed and the user can interact with that device: turning it on or off, changing the color of the light, adding songs to a queue, and so on. Anchors are created in 3D space that let the user interact with previously scanned devices. We implemented this service with a LIFX smart light bulb and also tested an Adafruit BMP280 temperature sensor.
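As an illustration, the status blob a device's Pi publishes and renders into its QR code could be built like this. This is a minimal sketch: the topic layout and field names are our hypothetical examples, not a documented ARmote schema.

```python
import json


def build_status_payload(device_id: str, device_type: str, state: dict) -> str:
    """Build the JSON status blob that a device's Pi would publish to its
    MQTT status topic and encode into the QR code on its e-paper display.
    Topic layout and field names here are illustrative assumptions."""
    return json.dumps(
        {
            "device_id": device_id,
            "type": device_type,
            # Hypothetical per-device topic the ARmote app would subscribe to:
            "topic": f"armote/devices/{device_id}/status",
            "state": state,
        },
        sort_keys=True,
    )


payload = build_status_payload("bulb-01", "lifx_bulb", {"power": "on", "hue": 200})
print(payload)
```

The same string would then be handed to a QR library (e.g. the `qrcode` package on the Pi) and pushed to the display whenever the state changes.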

Challenges Faced

Permissions in AWS IoT Core and Greengrass were difficult to get right when integrating the pieces of our system. It was also Andrew's first time coding in Swift, so there were some bugs while developing the iOS application. Our project used hardware as well, so we had to figure out how to interface with the e-paper displays over SPI to correctly display a QR code, and with the temperature sensor over I2C.
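Getting readable temperatures out of the BMP280 also means applying the compensation formula from the Bosch datasheet to the raw 20-bit reading pulled over I2C. A sketch of that step, using the example calibration values from the datasheet:

```python
def bmp280_compensate_temp(adc_t: int, dig_t1: int, dig_t2: int, dig_t3: int) -> float:
    """Convert a raw BMP280 temperature reading to degrees Celsius,
    following the floating-point compensation formula in the Bosch datasheet.
    dig_t1..dig_t3 are per-chip calibration words read over I2C."""
    var1 = (adc_t / 16384.0 - dig_t1 / 1024.0) * dig_t2
    var2 = ((adc_t / 131072.0 - dig_t1 / 8192.0) ** 2) * dig_t3
    t_fine = var1 + var2
    return t_fine / 5120.0


# Example values from the BMP280 datasheet; result is about 25.08 degrees C
print(round(bmp280_compensate_temp(519888, 27504, 26435, -1000), 2))
```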

What we're proud of

We're really proud that we got a real working demo with the smart light. We can communicate with the bulb to change its color and brightness. The demo pulled together all of our work on the back end and the front end, and it took a lot of coordination between the two.
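A rough sketch of the kind of color-change command the app might publish for the core to forward to the bulb. The topic and field names are hypothetical; the 16-bit HSBK scaling reflects how LIFX represents color (hue, saturation, and brightness as 0-65535 plus a kelvin value):

```python
import json


def build_color_command(device_id: str, hue_deg: float, saturation: float,
                        brightness: float, kelvin: int = 3500) -> dict:
    """Build a color-change command message. Hue is given in degrees (0-360)
    and saturation/brightness as 0.0-1.0, scaled to the 16-bit HSBK fields
    LIFX uses. Topic and field names are illustrative assumptions."""
    return {
        # Hypothetical per-device command topic:
        "topic": f"armote/devices/{device_id}/commands",
        "payload": json.dumps({
            "action": "set_color",
            "hsbk": [
                int(hue_deg / 360.0 * 65535),
                int(saturation * 65535),
                int(brightness * 65535),
                kelvin,
            ],
        }),
    }


cmd = build_color_command("bulb-01", 240, 1.0, 0.5)  # blue at half brightness
```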

Future steps

In the future we hope to replace the QR code with a LiFi interface that will allow us to interact with devices over longer distances in a non-intrusive way. LiFi transmits data by blinking an LED at a high frequency and capturing the blinks with a high-frame-rate camera, like the one on a smartphone. The blinking is too fast for the human eye to notice, but the camera can decode it into meaningful information about the device.
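The idea can be sketched as on-off keying with Manchester encoding, where each bit becomes an on-off or off-on pair so the LED's average brightness stays constant. The convention below is our own illustration, not an implemented protocol:

```python
def manchester_encode(data: bytes) -> list:
    """Encode bytes as a sequence of LED states (1 = on, 0 = off), MSB first.
    Here bit 1 -> [on, off] and bit 0 -> [off, on], so every bit period
    contains one transition and the average brightness stays constant."""
    states = []
    for byte in data:
        for i in range(7, -1, -1):
            bit = (byte >> i) & 1
            states += [1, 0] if bit else [0, 1]
    return states


def manchester_decode(states: list) -> bytes:
    """Recover bytes from sampled LED states (the first half of each
    pair carries the bit value under the convention above)."""
    bits = [states[i] for i in range(0, len(states), 2)]
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)


assert manchester_decode(manchester_encode(b"hi")) == b"hi"
```

In practice the camera would sample the LED at the frame rate, threshold each frame to an on/off state, and feed that sequence into the decoder.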

Team members:

Andrew Tu, Megan Sapack, Tom Harmon, Jason Booth. We are located in the Project conference room along the back wall.
