Inspiration

We want to further break down the communication barrier between American Sign Language users and spoken English speakers.

What it does

Insight translates ASL gestures into written text in real time.

How we built it

We used an open-source machine learning model to translate ASL gestures into text in real time. The HoloLens 2 implementation was built in Unity with MRTK3, and the companion website was developed using HTML, CSS, and JavaScript.
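The specific open-source project we adapted isn't named above, so the following is only a minimal illustrative sketch of a real-time recognition loop of this kind, assuming a MediaPipe-style hand-landmark pipeline feeding a gesture classifier (classify_landmarks is a hypothetical placeholder, not our actual code):

# Minimal sketch (assumptions noted above): capture webcam frames, extract hand
# landmarks with MediaPipe, and pass them to a gesture classifier.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def classify_landmarks(landmarks):
    # Hypothetical placeholder: a real pipeline would feed the 21 (x, y, z)
    # landmark coordinates to a trained model and return the predicted letter.
    return "?"

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            landmarks = results.multi_hand_landmarks[0].landmark
            letter = classify_landmarks(landmarks)
            # Overlay the recognized letter on the video feed.
            cv2.putText(frame, letter, (30, 60),
                        cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 3)
        cv2.imshow("ASL to text", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()

In our build, the recognized text is surfaced on the HoloLens 2 through the Unity/MRTK3 front end rather than an OpenCV window.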

Challenges we ran into

Getting Unity running on macOS; two of our members did not have a device compatible with the HoloLens 2 toolchain; outdated computer hardware; outdated packages; and constant bugs within the source code.

Accomplishments that we're proud of

All members were exposed to new programming languages. We were able to apply Agile methodologies in a project environment and to use new technologies to deliver a working proof of concept.

What we learned

Nelson learned website design, video editing, HTML, and GitHub. Mark gained experience debugging the source code and learned about AI and machine learning. Makenson learned different ways of using Python, learned more about machine learning, and learned about the HoloLens 2 and MRTK3 development in Unity. Christian learned Agile methodologies and applied them to the project, dove into some Python and website design, and helped us keep to set boundaries and schedules. Of our four members, three are first-time hackers.

What's next for Insight

We hope this concept will further open the door to inclusion for the hearing impaired, and that the insight we provide will inspire future developers and engineers to invest in this concept and develop it for the mainstream market.
