Inspiration

With the global COVID-19 pandemic raging on, there is a renewed emphasis on solutions that flatten the curve. Since grocery shopping remains an essential errand, we were inspired to make self-checkout lines as contactless and safe as possible.

What it does

Enter Touch Out, which uses computer vision to let grocery store customers navigate checkout UIs and machines without making physical contact. Our solution tracks each customer's hand and uses it as the "cursor".

How we built it

We used the CVZone, OpenCV, and Mediapipe libraries in Python to track users' hands, hand gestures, and hand movements through the webcam. To connect those hand movements and gestures to our mock checkout UI (which we built with HTML, CSS, and JavaScript), we used the PyAutoGUI library.
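The core loop can be sketched roughly as follows. This is a minimal illustration rather than our exact code: it assumes cvzone's HandDetector API, a single webcam at index 0, and Mediapipe's landmark numbering (landmark 8 is the index fingertip). The map_to_screen helper and its margin parameter are hypothetical names we chose for this sketch; the margin crops the frame edges so the cursor can reach the whole screen without the hand leaving the camera's view.

```python
def map_to_screen(x, y, cam_w, cam_h, screen_w, screen_h, margin=100):
    """Map a webcam-frame point to screen coordinates.

    Points inside a margin-cropped region of the frame are scaled
    linearly onto the full screen; points outside are clamped.
    """
    nx = min(max(x - margin, 0), cam_w - 2 * margin) / (cam_w - 2 * margin)
    ny = min(max(y - margin, 0), cam_h - 2 * margin) / (cam_h - 2 * margin)
    return int(nx * screen_w), int(ny * screen_h)


if __name__ == "__main__":
    # Requires a webcam plus the opencv-python, cvzone, mediapipe,
    # and pyautogui packages; run locally, not headless.
    import cv2
    import pyautogui
    from cvzone.HandTrackingModule import HandDetector

    cap = cv2.VideoCapture(0)
    detector = HandDetector(detectionCon=0.8, maxHands=1)
    screen_w, screen_h = pyautogui.size()

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror so movement feels natural
        hands, frame = detector.findHands(frame)
        if hands:
            # Landmark 8 is the index fingertip in Mediapipe's scheme.
            x, y, _ = hands[0]["lmList"][8]
            h, w = frame.shape[:2]
            pyautogui.moveTo(*map_to_screen(x, y, w, h, screen_w, screen_h))
        cv2.imshow("Touch Out", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```

Mirroring the frame before tracking keeps the on-screen cursor moving in the same direction as the user's hand, which makes the control feel natural.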

Challenges we ran into

Our team had very little experience with computer vision libraries, so there was a steep learning curve in implementing our solution.

Accomplishments that we're proud of

We are especially proud that we wrote a Python program that uses the webcam to track hand movements and gestures and to interact with a graphical overlay.

What we learned

  1. A functional understanding of CVZone, OpenCV, and Mediapipe
  2. How to use PyAutoGUI to let our hands interact with our UI
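To illustrate the second point: turning a gesture into a UI click reduces to a tiny state machine that fires once when the gesture appears, instead of clicking on every frame the gesture is held. cvzone's HandDetector.fingersUp() returns a list of five 0/1 flags (thumb to pinky); the ClickGesture class and the "index and middle fingers up" trigger below are hypothetical choices for this sketch, not necessarily our exact gesture.

```python
class ClickGesture:
    """Fire a single click on the rising edge of a gesture."""

    def __init__(self):
        self._was_active = False

    def should_click(self, fingers):
        # fingers: [thumb, index, middle, ring, pinky] 0/1 flags,
        # in the shape cvzone's HandDetector.fingersUp() returns.
        # Here the click gesture is "index and middle fingers up".
        active = fingers == [0, 1, 1, 0, 0]
        fire = active and not self._was_active
        self._was_active = active
        return fire
```

On each frame we would call should_click(detector.fingersUp(hand)) and, when it returns True, issue a pyautogui.click() at the current cursor position.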

What's next for Touch Out

As we only had the time and resources to test our project on a mock self-checkout system, we hope to apply our work to actual checkout machines. We are also looking for ways to make our project more versatile, so it can be applied to many more touch-screen applications.
