Inspiration

Members of our team have had loved ones suffer a speech impairment or speech disability following an injury. This can completely change a person, as it makes them feel like their independence has been taken away. To help these individuals through the start of their rehabilitation, we created VerbaTek's HandUp software to provide them with basic communication through hand gestures.

It is projected that approximately 80,000 Canadians will require the use of an intubation system for assisted breathing between 2018 and 2019. While intubated, they are unable to use words to communicate. Because most incidents requiring intubation happen suddenly, there is relatively little time to prepare for the intubation period by adopting a visual language such as American Sign Language (ASL).

Applications for this technology: intubation/mechanical ventilation patients, stroke victims, smoking-related larynx removal surgery, intellectual disabilities, birth defects of the mouth and/or larynx, selective mutism, and autism.

What it does

VerbaTek's HandUp software provides a simple interface for gesture-controlled communication. The system has been designed to be as easy as possible for the user: with just four natural movements, you can control the interface and scroll through the words.

Using a .tech domain, we created a website that hosts the user-friendly interface for the communication tools. On the website, the user is first brought to their list of word categories, such as food, drinks, medicine, names, and entertainment, with the option to add more categories. Once a category is selected, they are brought to the list of words for that category. For example, the entertainment category would include "Turn TV On", "Turn TV Off", "Sports", and "News", and more words can be added to each category.
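As a rough illustration of how this category-to-word structure can be held in the app, here is a minimal sketch in plain JavaScript; the object shape, the category and word names, and the addWord helper are our own illustrative choices, not the actual HandUp data model.

```javascript
// A minimal sketch, not the actual HandUp data model: categories keyed by name,
// each holding an ordered list of words or phrases. Names here are illustrative.
const categories = {
  Food: ["Hungry", "More", "Done"],
  Medicine: ["Pain", "Nurse", "Medication"],
  Entertainment: ["Turn TV On", "Turn TV Off", "Sports", "News"],
};

// Adding a new category or word is just an update to this structure.
function addWord(cats, category, word) {
  const existing = cats[category] || [];
  return { ...cats, [category]: [...existing, word] };
}

console.log(addWord(categories, "Entertainment", "Music"));
```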

How we built it

Using create-react-app, we built a webpage that displays the company information as well as the list functionality. MyoMapper was used to map motions from the Myo armband's EMG and IMU sensors to interactions with the list on the website. To deploy this, we used Surge, with the .tech domain as our domain name.
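For context, here is a minimal sketch of what the list page could look like as a React component; the component name, props, and key bindings are assumptions for illustration, not the actual VerbaTek source. Because the component only reacts to key events, anything that produces those events (the gesture pipeline or a plain keyboard) can drive it.

```javascript
// A minimal sketch, assuming names (WordList, words, onSelect) not taken from the
// actual VerbaTek source: a React list whose highlight moves on the arrow keys
// and confirms the highlighted word on Enter.
import React, { useState, useEffect } from "react";

export default function WordList({ words, onSelect }) {
  const [index, setIndex] = useState(0);

  useEffect(() => {
    const handleKey = (e) => {
      if (e.key === "ArrowDown") setIndex((i) => Math.min(i + 1, words.length - 1));
      if (e.key === "ArrowUp") setIndex((i) => Math.max(i - 1, 0));
      if (e.key === "Enter") onSelect(words[index]);
    };
    window.addEventListener("keydown", handleKey);
    return () => window.removeEventListener("keydown", handleKey);
  }, [words, index, onSelect]);

  return (
    <ul>
      {words.map((word, i) => (
        <li key={word} style={{ fontWeight: i === index ? "bold" : "normal" }}>
          {word}
        </li>
      ))}
    </ul>
  );
}
```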

Challenges we ran into

The original plan was to use machine learning with TensorFlow. However, we quickly realized that this wasn't feasible within the given time constraints, so we decided to use JavaScript and MyoMapper to provide gesture control for the software. At the beginning, we were faced with the challenge of capturing raw EMG and IMU data; as we weren't familiar with this technology, we did some research and were eventually able to read the raw data from the Myo armband. We also faced challenges when mapping the Myo EMG values to the interface controls and troubleshooting the errors involved. To get through this problem, we mapped the Myo gestures to keyboard input and fed the keyboard input into the website. That way, we could also test the site through keyboard input alone and tell whether it was the Myo band or the website that was creating the inaccuracies.
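Below is a hedged sketch of that troubleshooting split, assuming placeholder gesture names (waveOut, waveIn, fist) rather than MyoMapper's real output: gestures are translated into synthetic key events, so the same website code path can be exercised either by the armband pipeline or by a person at the keyboard.

```javascript
// A sketch of the gesture-to-keyboard shim described above. Gesture names are
// placeholders; the real mapping depends on what the Myo pipeline reports.
const gestureToKey = {
  waveOut: "ArrowDown",
  waveIn: "ArrowUp",
  fist: "Enter",
};

// Turn a recognized gesture into a synthetic key event on the page.
function emitGestureAsKey(gesture) {
  const key = gestureToKey[gesture];
  if (!key) return; // unknown gesture: ignore
  window.dispatchEvent(new KeyboardEvent("keydown", { key }));
}

// Either path produces the same event, so inaccuracies can be isolated:
emitGestureAsKey("waveOut"); // simulated gesture path
// ...or press the arrow key directly to test the website on its own.
```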

Accomplishments that we're proud of

We are proud of being able to integrate the gesture control with a flexible interface. This was done by brushing off old skills, learning new skills and utilizing new technologies.

As an additional note, we're proud of Rohil Gupta for making it on Saturday. We weren't sure if he was going to pull through, but he sure did.

What we learned

We learned the importance of time management at an event like a hackathon, as you sometimes need to compromise on the final vision in order to deliver a viable product within the allotted time frame. We also learned how to read raw values from EMG and IMU sensors. As some of the group members are interested in the biomedical integration of technology, this will be beneficial going into the future.

What's next for VerbaTek

Going forward, we hope to look further into mapping gestures that are unique to each user and storing these gestures under the classification of their meaning. We hope to integrate machine learning in order to accurately recognize user-specific gestures by training the system with sample input. This has applications in the rehabilitation and military fields.
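To make that direction a little more concrete, here is a small hedged sketch of what such a classifier could look like in TensorFlow.js; the window size, layer sizes, and gesture labels are assumptions for illustration and are not part of the current HandUp build.

```javascript
// A hedged sketch of a per-user gesture classifier: a window of raw EMG samples
// is mapped to one of a few gesture labels. Shapes and labels are assumed.
import * as tf from "@tensorflow/tfjs";

const NUM_CHANNELS = 8;  // the Myo armband exposes 8 EMG channels
const WINDOW = 50;       // samples per gesture window (assumed)
const GESTURES = ["scrollDown", "scrollUp", "select"]; // placeholder labels

const model = tf.sequential({
  layers: [
    tf.layers.flatten({ inputShape: [WINDOW, NUM_CHANNELS] }),
    tf.layers.dense({ units: 32, activation: "relu" }),
    tf.layers.dense({ units: GESTURES.length, activation: "softmax" }),
  ],
});

model.compile({
  optimizer: "adam",
  loss: "categoricalCrossentropy",
  metrics: ["accuracy"],
});

// Training would fit the model on windows recorded and labelled by the user:
// await model.fit(xs, ys, { epochs: 20 });
```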

Website: VerbaTek.tech

Challenges That We Are Running For: .tech (Category Prize), Sketch (Project With Best UI Design)
