We can all get pretty lazy when it comes to everyday tasks around the house. Maybe it's too early to deal with the coffee maker, or maybe we're too lazy to go turn off that light. With technology rapidly making our daily tasks simpler, why not simplify them even more with something we all have: a phone? We figured that if we connected our smartphones to the other technology around us, something awesome could happen.
What it does
ARemote is an Augmented Reality scanner. It scans a QR code placed on an object and then remotely presents command options for that object. Take the lights in a room: with a smartphone you scan the code, and a menu appears on the screen listing options for that specific object, such as dim lights, turn the lights off, or turn the lights on. For demonstration purposes we set up multiple QR codes on objects that the user can scan and interact with. Once the scan has identified the object, a menu appears and the Arduino lights up, signaling that the command has completed successfully.
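The scan-to-menu flow above can be sketched as a simple lookup from a QR payload to that object's command options. This is a minimal illustration, not the app's actual code; the object IDs and command names are made up for the example.

```python
# Hypothetical mapping from a scanned QR payload to the menu shown for
# that object. The keys and command lists here are illustrative only.
OBJECT_COMMANDS = {
    "light_livingroom": ["turn on lights", "turn off lights", "dim lights"],
    "speaker_kitchen": ["play", "pause", "volume up"],
}

def menu_for(qr_payload: str) -> list[str]:
    """Return the menu options for a scanned QR payload.

    Unrecognized payloads yield an empty menu rather than an error,
    since the camera may pick up unrelated QR codes.
    """
    return OBJECT_COMMANDS.get(qr_payload, [])
```

Once the user picks an option from the returned menu, the app would send that command on to the server, which relays it to the Arduino.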
How we built it
The Android app is compiled and uploaded to the device through ADB. The Arduino code is then imported into the IDE, and the server runs in any Python environment.
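The Python server side can be sketched with just the standard library. This is a hedged, minimal version, not the project's actual server: the URL scheme (a command name in the request path) and the plain-text `ok:` reply are assumptions made for the example.

```python
# Minimal sketch of a Python command server, standard library only.
# In the real project, the handler would forward the received command
# to the Arduino over serial instead of just echoing it back.
from http.server import BaseHTTPRequestHandler, HTTPServer

class CommandHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        command = self.path.lstrip("/")  # e.g. GET /lights_on
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(f"ok:{command}".encode())

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def run(port: int = 8000) -> None:
    """Serve commands forever on the given port."""
    HTTPServer(("", port), CommandHandler).serve_forever()
```

The phone app would then issue a request like `GET /lights_on` whenever the user taps a menu option.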
Challenges we ran into
Setting up the Amazon server hosting. Getting serial communication working between Python and the Arduino. Identifying the QR code's location within the camera frame.
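The Python-to-Arduino serial link mentioned above typically goes through the third-party pyserial package. Below is a hedged sketch of what that side could look like; the port name, baud rate, and one-byte command codes are all assumptions, not the project's actual protocol.

```python
# Hypothetical command codes the Arduino sketch might listen for over
# serial. These byte values are illustrative assumptions.
COMMAND_CODES = {
    "lights_on": b"1",
    "lights_off": b"0",
    "dim": b"d",
}

def encode_command(name: str) -> bytes:
    """Translate a command name into the single byte sent to the Arduino."""
    try:
        return COMMAND_CODES[name]
    except KeyError:
        raise ValueError(f"unknown command: {name}")

def send_command(name: str, port: str = "/dev/ttyACM0") -> None:
    """Open the serial port and write the encoded command."""
    # pyserial is imported lazily so the encoding logic above can be
    # exercised without the package or the hardware attached.
    import serial  # pip install pyserial
    with serial.Serial(port, 9600, timeout=1) as link:
        link.write(encode_command(name))
```

Keeping the encoding separate from the port I/O makes the protocol easy to test without an Arduino plugged in, which helps when debugging the serial handshake.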
Accomplishments that we're proud of
Getting the QR code tracking working and setting up the Python web server.
What we learned
How to set up a web server using Python, and how QR codes work within Android.
What's next for ARemote
For now, ARemote scans QR codes; eventually these would ideally be replaced with simpler codes or symbols that could be incorporated aesthetically throughout the house. ARemote currently needs to be relatively close to a QR code in order to recognize and scan it, so it would be great if codes could eventually be picked up at greater distances, such as from across the room. In the future, a more complete app would let users set up their own options for objects: when scanning a speaker, for example, a user could add an option to immediately play their favorite song. For now the application works on touch commands; eventually it could offer voice commands and even hand gestures.