Inspiration

One of our team members has a deaf family member who loves sports but cannot fully enjoy them because of the emotional and cultural limitations of closed captions.

What it does

Signify lets users upload a video and converts its speech into American Sign Language (ASL). We hope to develop the project further so that it can translate live through an animated character model that portrays emotion.

How we built it

We built Signify with Python and JavaScript. We used Python libraries and AI tools to transcribe speech to text and then translate the text into ASL.
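The write-up doesn't name the specific libraries, but the text-to-ASL step can be sketched in pure Python. The `to_asl_gloss` helper and its dropped-words list below are illustrative assumptions, not the project's actual code; ASL gloss conventionally uppercases signs and omits articles and copulas:

```python
# Naive sketch of the text -> ASL-gloss step (hypothetical; the real
# speech-to-text stage uses external AI tools not shown here).

DROPPED_WORDS = {"a", "an", "the", "is", "are", "am", "to", "of"}

def to_asl_gloss(transcript: str) -> list[str]:
    """Convert an English transcript into a naive ASL gloss sequence."""
    tokens = transcript.lower().replace(",", "").replace(".", "").split()
    return [t.upper() for t in tokens if t not in DROPPED_WORDS]

gloss = to_asl_gloss("The player is running to the goal.")
# -> ["PLAYER", "RUNNING", "GOAL"]
```

A real gloss translation is far richer (grammar reordering, classifiers, non-manual markers), which is why the project leans on AI tools for this step.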

Challenges we ran into

We didn't have enough time to create a 3D model overlay, so we temporarily used a dictionary of sign-language symbols to convey our project.
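The dictionary fallback described above can be sketched as a lookup from gloss terms to sign images, fingerspelling letter by letter when a word has no entry. The `SIGN_IMAGES` mapping and file paths are hypothetical stand-ins for the project's actual symbol dictionary:

```python
# Hypothetical mapping from gloss terms to sign-image assets.
SIGN_IMAGES = {
    "PLAYER": "signs/player.png",
    "GOAL": "signs/goal.png",
}

def images_for(gloss: list[str]) -> list[str]:
    """Resolve each gloss term to an image, fingerspelling unknown words."""
    out: list[str] = []
    for word in gloss:
        if word in SIGN_IMAGES:
            out.append(SIGN_IMAGES[word])
        else:
            # Fall back to one image per letter (fingerspelling).
            out.extend(f"letters/{ch}.png" for ch in word.lower())
    return out

frames = images_for(["PLAYER", "RUN"])
# -> ["signs/player.png", "letters/r.png", "letters/u.png", "letters/n.png"]
```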

Accomplishments that we're proud of

Our UI is almost fully developed and functional, and our upload feature supports uploading and transcribing videos of any length.

What we learned

We learned how to use AI libraries and how we can apply them in the future to generate relevant text and images.

What's next for Signify

Upgrading our hand-only ASL models to a full upper-body model!

Built With

Python, JavaScript
