Inspiration
I was watching a local movie (a military movie) in which characters generated Morse code by blinking their eyelids and decoded it manually, and I thought: why not capture that with AI? Capturing Morse code from eyelids is a difficult task, so I decided to start with fingers and improve it to eyelids in the future. I thought it could help many people generate Morse code with their fingers in front of a camera, instead of pressing buttons on a Morse code machine. It can make the work easier and could also help in military activities.
What it does
It tries to capture a Morse code pattern and detect the character based on the observed pattern. When you run the Morse_code_generation.py file, your camera opens and you will see a blue bounding box. When you press b, it first captures the background (without your hand) inside the blue bounding box, and then opens five windows (Mask, Blur, Ori, Predicted, Test):

- **Mask** shows the masked image of your hand with the background removed.
- **Blur** shows the blurred grey image of your hand with the background removed.
- **Ori** shows the thresholded image of your hand.
- **Predicted** shows the predicted number of open fingers.
- **Test** shows the text decoded from the Morse code.
The gestures map to Morse actions as follows:

- **One finger**: short beep (dot in Morse code).
- **Two fingers**: long beep (dash in Morse code).
- **Three fingers**: start capturing a new character, or reset the current one.
- **Four fingers**: add a space between characters.
- **Five fingers**: stop capturing and predict the character from the Morse code pattern.
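As a rough sketch, the gesture-to-action mapping above could look like this (the function name and the partial Morse table are illustrative, not the project's actual code):

```python
# Illustrative mapping of finger counts to Morse actions (hypothetical names;
# only a few Morse table entries are shown for brevity).
MORSE_TABLE = {".-": "A", "-...": "B", "-.-.": "C", "...": "S", "---": "O"}

def handle_gesture(finger_count, pattern):
    """Apply one detected gesture; return (new_pattern, emitted_text)."""
    if finger_count == 1:
        return pattern + ".", None                 # short beep -> dot
    if finger_count == 2:
        return pattern + "-", None                 # long beep -> dash
    if finger_count == 3:
        return "", None                            # reset / start a new character
    if finger_count == 4:
        return pattern, " "                        # space between characters
    if finger_count == 5:
        return "", MORSE_TABLE.get(pattern, "?")   # stop and decode the pattern
    return pattern, None                           # zero fingers: no action
```

For example, opening one finger, then two, then five would accumulate `".-"` and decode it as `A`.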
How I built it
It is a two-step process. First, I built a model to detect the number of open fingers using TensorFlow 2.0 and the Fingers dataset from Kaggle.
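A minimal TensorFlow 2 classifier for this first step might look like the sketch below. The layer sizes are assumptions, and the 128x128 greyscale input shape reflects the Kaggle Fingers dataset; this is not necessarily the project's exact architecture.

```python
import tensorflow as tf

def build_finger_model(num_classes=6):
    """Toy CNN that classifies 0-5 open fingers from a greyscale image."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(128, 128, 1)),        # greyscale input
        tf.keras.layers.Conv2D(32, 3, activation="relu"),  # low-level features
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),  # higher-level features
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),  # 6 classes
    ])
```

The model would be trained with a standard categorical cross-entropy loss and then saved for use in the second step.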
The second step is to generate the Morse code. Here, the first task is to detect the hand in the video; the next is to pass that image to the pre-trained model from step one to predict how many fingers are open.
I used background subtraction and HSV segmentation from OpenCV to detect the hand. Each frame is then passed through two filters: one producing a grey image and the other a threshold image. The threshold image is passed to the pre-trained model to detect the number of open fingers.
Based on the number of open fingers, the Morse code is generated.
Challenges I ran into
There were two main challenges. The first was detecting the hand; lzane's work made that easy for me.
The second was predicting the Morse code from a large number of frames. I solved it by reducing the frame rate and introducing markers such as the start of a new character and the end of the current one, which made it easier to capture the hand gesture and run it through the model. Zero fingers played a major role here in separating two consecutive beeps.
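One way to turn many noisy per-frame predictions into a single gesture, in the spirit of what is described above, is a simple majority-vote stabiliser. This is hypothetical code with illustrative window sizes, not the project's implementation; note how a stable run of zero fingers allows the same count to be emitted again, separating two consecutive beeps.

```python
from collections import Counter, deque

class GestureStabilizer:
    """Emit a finger count only after it dominates a sliding window of frames."""

    def __init__(self, window=10, min_agreement=8):
        self.window = deque(maxlen=window)
        self.min_agreement = min_agreement
        self.last_emitted = None

    def update(self, finger_count):
        """Feed one per-frame prediction; return a newly stable gesture or None."""
        self.window.append(finger_count)
        value, votes = Counter(self.window).most_common(1)[0]
        if votes >= self.min_agreement and value != self.last_emitted:
            self.last_emitted = value
            return value   # a new, stable gesture
        return None        # unstable, or already emitted
```

Feeding the per-frame model outputs through `update` yields one event per held gesture instead of one per frame.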
Accomplishments that I'm proud of
I'm really proud that I built something that isn't on the internet yet: generating Morse code with fingers.
What I learned
I learned a lot about image segmentation and background subtraction methods for detecting the hand, and about convolutional neural networks for detecting the number of open fingers. I also learned more about Morse code and its history.
What's next for Generating Morse code with Fingers
The next steps for this project are to improve the model so it detects patterns faster; to build an Android or web application using TensorFlow Lite or TensorFlow.js, to take this global and enable better communication; and to build a new model that generates Morse code with eyelids.