American Sign Language is a language that few people think about. By combining the Twilio API with Co:here's natural language processing, we aim to bridge that gap and make ASL more accessible. The user is first prompted to log into an existing account or register a new one with their phone number. They are then directed to a page where they can enter the text they would like translated, and a chatbot texts their number a sequence of images spelling out the message in ASL. The Twilio API powers the chatbot that automates these interactions, Co:here catches any grammatical or spelling errors the user makes, and CockroachDB stores login information as well as the images used to display ASL. Implementing the website's backend and building a functional chatbot were both challenges we spent hours debugging. In the end, we are proud of what we learned: this was our first time building a website that runs JavaScript code, and our first time using a SQL database.
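The core of this flow, mapping a message to ASL fingerspelling images and texting them out, can be sketched with the Twilio Python helper library. This is a minimal sketch, not our exact code: the image URL pattern, credentials, and phone numbers are placeholders (in the real project the images are stored in CockroachDB and served by our backend).

```python
def asl_media_urls(text, base_url="https://example.com/asl"):
    """Map each letter of the message to a hosted ASL fingerspelling image.

    The URL pattern is a placeholder; in our project the images come from
    CockroachDB via the backend.
    """
    return [f"{base_url}/{ch}.png" for ch in text.lower() if ch.isalpha()]

def send_asl_message(to_number, text):
    """Text the ASL images to the user's phone via Twilio MMS."""
    from twilio.rest import Client

    # Placeholder credentials; real values come from the Twilio console.
    client = Client("ACCOUNT_SID", "AUTH_TOKEN")
    return client.messages.create(
        to=to_number,
        from_="+15550001111",            # your Twilio number
        body=f"ASL fingerspelling for: {text}",
        media_url=asl_media_urls(text),  # one image attachment per letter
    )
```

Note that Twilio caps MMS attachments at ten media items per message, so longer messages would need to be split across several sends.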
We also tackled the CBRE challenge. We used the Python Tesseract library to extract text from uploaded images, and Streamlit to build a website where a user can upload an image and get its text back. Since we were all new to these technologies, we spent time familiarizing ourselves with both before we could build the project. Overall, it was a great learning experience, and we all got to experiment with Python libraries and data extraction.
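A minimal sketch of how the pytesseract and Streamlit pieces fit together; the function names and page layout here are our own, not the project's exact code, and Tesseract's OCR engine must be installed on the machine.

```python
def clean_ocr_text(raw: str) -> str:
    """Collapse the stray newlines and spacing in raw OCR output."""
    return " ".join(raw.split())

def extract_text(image_file) -> str:
    """OCR an uploaded image with pytesseract (wraps the Tesseract binary)."""
    import pytesseract
    from PIL import Image

    return clean_ocr_text(pytesseract.image_to_string(Image.open(image_file)))

def main():
    # Launch with: streamlit run app.py
    import streamlit as st

    st.title("Image to Text")
    uploaded = st.file_uploader("Upload an image", type=["png", "jpg", "jpeg"])
    if uploaded is not None:
        st.text_area("Extracted text", extract_text(uploaded))
```

Streamlit reruns the script on every interaction, so the upload widget and the OCR call can live in the same short function with no explicit event handling.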