We were inspired by the ability of a smart home hub to assist users facing accessibility needs.

What it does

Using camera vision through OpenCV, we detect the different frequencies at which LEDs blink. Depending on the query the user sends us, we select the blinking LED that corresponds to the queried object. Through our database, we match the object with its location, which Alexa then reports.
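The frequency-detection step could be sketched roughly as follows. This is a minimal, illustrative example, not the project's actual code: the function name `estimate_blink_hz` and the thresholding approach are our own assumptions, and the brightness samples here are synthetic, whereas the real pipeline would read them from an OpenCV capture loop.

```python
# Hypothetical sketch: estimate an LED's blink frequency from per-frame
# brightness samples. In the real system these would come from an OpenCV
# webcam capture loop; here we use a synthetic signal instead.

def estimate_blink_hz(brightness, fps, threshold=128):
    """Count on/off transitions and convert them to blinks per second."""
    states = [b >= threshold for b in brightness]
    # A full blink cycle contains two transitions (on->off and off->on).
    transitions = sum(1 for a, b in zip(states, states[1:]) if a != b)
    duration_s = len(brightness) / fps
    return (transitions / 2) / duration_s

# Synthetic 30 fps signal: the LED toggles every 5 frames, i.e. a 3 Hz blink.
frames = [255 if (i // 5) % 2 == 0 else 0 for i in range(60)]
print(estimate_blink_hz(frames, fps=30))  # roughly 3 Hz (finite-window edge
                                          # effects make it slightly low)
```

Each tracked object can then be assigned its own blink frequency, so several LEDs can share one camera view.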

Example: "Hey Alexa, where are my keys?" Using the OpenCV program, Amazon's DynamoDB database, and an Arduino that blinks the LEDs, Alexa can give the following response: "Your keys are at your nightstand in your bedroom."
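The lookup behind that exchange could look something like the sketch below. The in-memory dict, field names, and helper `answer_query` are illustrative stand-ins for the project's DynamoDB table, which maps each tracked object to an LED and a last-seen location.

```python
# Hypothetical stand-in for the DynamoDB table: object -> LED colour + location.
ITEM_TABLE = {
    "keys":   {"led": "red",   "location": "your nightstand in your bedroom"},
    "wallet": {"led": "green", "location": "the desk in your office"},
}

def answer_query(obj):
    """Build the Alexa-style response for a 'where are my X?' query."""
    item = ITEM_TABLE.get(obj)
    if item is None:
        return f"Sorry, I am not tracking your {obj}."
    return f"Your {obj} are at {item['location']}."

print(answer_query("keys"))  # -> Your keys are at your nightstand in your bedroom.
```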

How we built it

Through a computer interface, we used computer vision to detect blinking lights with a Logitech webcam. Using query calls from Amazon Alexa, we locate a user's object by observing which colour of light is associated with the user's particular request (refer to the description above).
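The colour-association step could be sketched as follows. This is an assumption-laden illustration: the colour-to-object mapping and function names are ours, and the mean BGR values stand in for what an OpenCV region measurement would report.

```python
# Hypothetical colour -> object mapping (illustrative, not the project's table).
LED_OBJECTS = {"blue": "keys", "green": "wallet", "red": "remote"}

def classify_led(mean_bgr):
    """Return the colour name of the strongest channel in a mean BGR triple
    (OpenCV orders pixel channels as blue, green, red)."""
    channels = dict(zip(("blue", "green", "red"), mean_bgr))
    return max(channels, key=channels.get)

def object_for_region(mean_bgr):
    """Map a detected blinking region's mean colour to the tracked object."""
    return LED_OBJECTS[classify_led(mean_bgr)]

print(object_for_region((30.0, 45.0, 210.0)))  # strongest channel is red -> remote
```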

Challenges we ran into

Configuring Alexa to read queries from the database we set up was difficult, as the Alexa Skills Kit did not specify how to achieve this. Further, reading queries this way is slower than a typical Alexa command.

Accomplishments that we're proud of

Our camera, controlled through OpenCV (the Open Source Computer Vision Library), is able to detect blinking LEDs as a means of identifying different objects.

What we learned

We learned the various software and hardware required to use camera vision to distinguish between blinking lights.

What's next for Alexeye

Extending the feature to work for objects tagged with an IR emitter instead of visible LEDs. Further, we could improve the computer vision to recognize objects without any blinking emitter at all; objects would instead be recognized against a database of object information.
