Inspiration

We always manage to lose things that were in our hands a minute ago, so we made glasses to remember for us.

What it does

Shopifind uses computer vision to track and identify tools held by the user, and it remembers where each tool was set down. At the touch of a button, LEDs mounted on the glasses then guide the user back to the spot where they left the tool.

How we built it

We first assembled the general circuit layout for the glasses on a breadboard, then 3D-modeled the mounting pieces for the electrical components in SolidWorks. While those pieces were printing, we used the Arduino IDE to read orientation from a gyroscopic sensor and drive a non-obstructive LED display on the glasses. Meanwhile, we tracked objects in the environment by streaming a live camera feed into Python for processing and object identification with OpenCV.
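The orientation step boils down to integrating the gyroscope's angular-rate readings over time. Our firmware isn't reproduced here, but a minimal Python sketch of the idea looks like this (the sample values and 10 Hz rate are illustrative, not from our actual code):

```python
def integrate_heading(rates_dps, dt_s, start_deg=0.0):
    """Integrate gyroscope yaw rates (degrees/second) into a heading.

    rates_dps: sequence of angular-rate samples in degrees per second
    dt_s:      time between samples in seconds
    Returns the heading in degrees, wrapped to [0, 360).
    """
    heading = start_deg
    for rate in rates_dps:
        heading = (heading + rate * dt_s) % 360.0
    return heading

# A steady 90 deg/s turn sampled at 10 Hz for one second
# ends up 90 degrees from the starting heading.
print(integrate_heading([90.0] * 10, 0.1))  # → 90.0
```

On the glasses this same accumulation runs in the Arduino loop, with the resulting heading deciding which LED to light.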

Challenges we ran into

The Arduino Uno's lack of wireless connectivity forced us to communicate with the glasses over a serial connection to a laptop.
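With commands traveling over that serial link, each message needs enough structure for the Arduino to validate it. The byte layout below is a hypothetical reconstruction rather than our exact protocol; in practice the resulting bytes would be written to the port with pySerial (`serial.Serial(port, 9600).write(frame)`):

```python
START = 0xAA  # hypothetical start-of-frame marker

def make_frame(command, payload=b""):
    """Build a serial frame: start byte, command, length, payload, checksum."""
    body = bytes([command, len(payload)]) + payload
    checksum = sum(body) % 256  # simple additive checksum
    return bytes([START]) + body + bytes([checksum])

def parse_frame(frame):
    """Validate a frame, returning (command, payload) or None if corrupt."""
    if len(frame) < 4 or frame[0] != START:
        return None
    body, checksum = frame[1:-1], frame[-1]
    if sum(body) % 256 != checksum or body[1] != len(body) - 2:
        return None
    return body[0], bytes(body[2:])
```

The checksum lets the receiving side silently drop frames garbled in transit instead of acting on them.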

Extremely high latency on the live camera feed meant we sometimes had to fall back to a laptop webcam for computer vision.

Accomplishments that we're proud of

We successfully completed everything we set out to build for the Shopifind MVP!

What we learned

What's next for Shopifind

Fully wireless and more compact electronics.

Voice commands to request tools to be found.

Incorporation of TensorFlow for more robust object identification.

Built With
