Inspiration

Covid spreads through breathing and talking, which is why we wear masks. Airborne droplets can survive for up to 3 hours. The other way it can infect us is via our own hands: on most surfaces the virus can survive for 3-5 days! We touch a lot of things and then touch our faces, which gives the virus a great opportunity to make us ill. Research suggests that we touch our face up to 16 times in a single hour!

What it does

Using machine learning, your Tizen OS based smartwatch (e.g. a Samsung Galaxy Watch) notifies you when your hand is getting close to your face.

How I built it

I built a "service" app which is written in pure C#(and runs in the background constantly). The app runs in the background constantly and monitors user's motion(basically the accelerometer's data). On the other hand I trained a machine learning model, which can detect certain changes in accelerometer and classify them as a "face-touch-gesture" or other motion. Basically I used scikit's linear svc training method to create the machine learning model and transpiled it to C then C#, so it can run natively on a battery restricted device(e.g. on my samsung galaxy watch).

Challenges I ran into

  • Currently the most popular machine learning frameworks have no Tizen support, so I had to figure out how to run the trained model on the watch myself (see the sketch after this list)
  • Training the machine learning model is hard, because there are many possible hand gestures that can end in a "face touch" event, and it is not easy to detect them all
  • Tizen provides a lot of movement data, but figuring out which of it is actually useful takes time and a lot of experimentation
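
On the first challenge: a fitted LinearSVC boils down to a weighted sum plus a bias, which is what makes the transpile-to-C-then-C# route feasible on the watch. The sketch below (assuming the fitted `clf` from the training sketch above) shows the whole inference step in plain arithmetic and dumps the learned parameters as a C# snippet; the emitted identifiers are hypothetical.

```python
import numpy as np

# Assumes `clf` is the fitted LinearSVC from the training sketch above.
w = clf.coef_[0]        # one learned weight per feature
b = clf.intercept_[0]   # learned bias

def is_face_touch(features):
    """The entire model at inference time: a dot product and a threshold."""
    return float(np.dot(w, features) + b) > 0.0

# Emit the parameters as C# declarations that can be pasted into the watch app
# (the names `Weights` and `Bias` are made up for this illustration).
print("static readonly double[] Weights = { "
      + ", ".join(f"{v:.6f}" for v in w) + " };")
print(f"static readonly double Bias = {b:.6f};")
```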

Accomplishments that I'm proud of

What I learned

  • Hand gesture recognition
  • Scikit
  • Tizen Service apps
  • Wrestling with Python 2 vs. Python 3

What's next for face touch

  • Improve the model so it can detect every kind of "face touch" gesture