We have a big love for hardware and programming, and when we can combine both while helping someone, we feel extra excitement and motivation. Over the years we have met several deaf people who were super nice and kind, but there was always the problem of communication: we would have to speak by text, or make exaggerated and often ridiculous gestures to make ourselves understood.

So, to be able to communicate with our own deaf friends and to help people all around the world with their jobs, studies and social lives, we thought about building a glove that could translate sign language.

What it does

We made a glove that translates the fingerspelled Latin alphabet of Catalan Sign Language by detecting the position of your hand and fingers. It prints the corresponding letter in a PC application.

How we built it

That's the part we are most proud of. We designed our own flex sensors and used a couple of old gloves; each sensor costs less than €0.10. The resistive element is a strip of paper shaded with a pencil (graphite, a form of carbon, is conductive), with an aluminium sheet on each side of the paper connected to a cable. When the sensor bends, its resistance increases. By attaching one of these sensors to each finger we can track all the finger movements. Additionally, we had to use a gyroscope to know the orientation of the hand.
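A homemade flex sensor like this is typically read through a voltage divider into an Arduino analog pin. Here is a minimal sketch of how a 10-bit ADC reading can be converted back into the sensor's resistance; the divider layout, the 10 kΩ fixed resistor and the 5 V supply are our assumptions for illustration, not details from the project:

```python
def adc_to_resistance(adc_value, r_fixed=10_000, vcc=5.0, adc_max=1023):
    """Convert a 10-bit Arduino ADC reading to the flex sensor's resistance.

    Assumed voltage divider: VCC -> flex sensor -> analog pin -> r_fixed -> GND,
    so the pin measures the voltage across the fixed resistor.
    """
    v_out = adc_value * vcc / adc_max  # voltage at the analog pin
    if v_out <= 0:
        raise ValueError("an ADC reading of 0 gives no divider information")
    # Divider equation: v_out = vcc * r_fixed / (r_sensor + r_fixed)
    return r_fixed * (vcc - v_out) / v_out

# Bending the finger raises the sensor resistance, which lowers the reading:
print(adc_to_resistance(512))  # close to the 10 kOhm fixed resistor
print(adc_to_resistance(300))  # higher resistance -> finger bent
```

With this arrangement the software only ever sees a resistance value per finger, so the exact sensor construction can change without touching the rest of the pipeline.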

The coding consisted mainly of three parts: reading the sensors, communicating with Python, and the machine-learning model, which was by far the most difficult and challenging part.

Challenges we ran into

Designing the sensors took a long time at the beginning because we wanted to increase their range of values, so we experimented with different widths and lengths and fixed the problems that appeared during the first hours.

Without a doubt, the most difficult part was the artificial intelligence: our prior knowledge was very limited, and we got stuck there for a long time.

Accomplishments that we're proud of

Creating a fully developed AI model. Designing and building our own fully functional flex sensors. And, of course, reaching a final working result.

What we learned

We learned about AI and how to import analog sensor readings from an Arduino into Python 3.
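Getting the Arduino's analog values into Python is usually done over the serial port. A minimal sketch of the Python side, assuming the Arduino prints one comma-separated line of five sensor readings per loop (the pyserial package, the port name and the line format are all our assumptions):

```python
def parse_sensor_frame(line):
    """Parse one serial line like b'512,498,601,377,510\\n' into a list of ints."""
    text = line.decode("ascii", errors="ignore").strip()
    return [int(value) for value in text.split(",")]

if __name__ == "__main__":
    import serial  # pyserial: pip install pyserial

    # The port name varies by OS, e.g. 'COM3' on Windows, '/dev/ttyUSB0' on Linux.
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
        while True:
            values = parse_sensor_frame(port.readline())
            print(values)  # one reading per flex sensor
```

Keeping the parsing separate from the serial I/O makes it easy to feed recorded frames straight into the machine-learning code.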

What's next for GloveWord

  • First of all, add an accelerometer to be able to detect movement and therefore translate more signs.
  • Add a GUI to visualise the translations in a more elegant way.
  • Add a second glove. In this hackathon we didn't have enough hardware resources.
  • Add a speaker to translate sign language to voice.
  • Improve the accuracy of the artificial intelligence.




AI improvement

We kept working on the machine learning and, after trying different algorithms, obtained a maximum accuracy of 80 % with random forest.

Here are the different algorithms we tried:

  • K-nearest neighbours (Accuracy: 71.11 %)
  • Nearest centroids (Accuracy: 38.27 %)
  • Gaussian Naïve Bayes (Accuracy: 66.30 %)
  • Linear Discriminant Analysis (Accuracy: 58.89 %)
  • Decision trees (Accuracy: 70.12 %)
  • Bagging with K-nearest neighbours (Accuracy: 72.22 %)
  • Random Forest (Accuracy: 77.40 %)
  • AdaBoost (Accuracy: 8.89 %)
  • Gradient Boosting (Accuracy: 70.74 %)
  • Support Vector Machines (Accuracy: 74.69 %)
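All of these algorithms are available behind scikit-learn's uniform estimator API, which makes this kind of comparison a short loop. A minimal sketch on synthetic data standing in for the glove's readings (the write-up doesn't name the library, and the feature layout, class count and train/test split here are our assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in: 6 features (5 flex sensors + gyroscope angle),
# 24 classes standing in for fingerspelled letters.
X, y = make_classification(n_samples=600, n_features=6, n_informative=6,
                           n_redundant=0, n_classes=24,
                           n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "k-nearest neighbours": KNeighborsClassifier(),
    "random forest": RandomForestClassifier(random_state=0),
    "support vector machine": SVC(),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: {model.score(X_test, y_test):.2%}")
```

Because every estimator exposes the same `fit`/`score` interface, swapping in the other algorithms from the list above is a one-line change each.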
