## Inspiration
Our inspiration for building HackAI was a desire to bring the visually impaired community into fast-moving technology such as ChatGPT.
## What it does
Our app works from voice commands and answers the spoken question. Because it is also designed for people with visual impairments, a button lets the app read its answer aloud.
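The flow above can be sketched roughly as follows. This is a minimal illustration in Python with hypothetical function names; the real app is built from MIT App Inventor blocks, and the chatbot call and speech output are handled by App Inventor components rather than the placeholders shown here.

```python
# Illustrative sketch of the app's flow: voice question -> chatbot answer
# -> optional read-aloud. All names here are hypothetical stand-ins for
# MIT App Inventor blocks, not the actual implementation.

def ask_chatbot(question: str) -> str:
    # Placeholder for the chatbot call; the real app delegates this to
    # an App Inventor ChatBot component.
    canned = {"what is hackai": "HackAI is a voice-driven assistant app."}
    return canned.get(question.strip().lower().rstrip("?"), "Sorry, I don't know.")

def handle_voice_command(question: str, read_aloud: bool = False) -> str:
    """Answer a spoken question; optionally speak the answer back."""
    answer = ask_chatbot(question)
    if read_aloud:
        # In the real app this would trigger the TextToSpeech component;
        # here we simply simulate speaking.
        print(f"[speaking] {answer}")
    return answer
```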
## How we built it
We built the app with MIT App Inventor, using components such as text boxes, buttons, labels, and more.
## Challenges we ran into
While coding the app, one challenge was figuring out how to incorporate the AI into our design. Including our private API key in the code that generates the output was also tricky; in the end, we used the built-in ChatBot component.
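For context on the API-key challenge, here is a hedged sketch of how a private key is typically attached to a chat-completion HTTP request. The endpoint, model name, and environment-variable name are illustrative assumptions; App Inventor's ChatBot component hides these details, which is part of why we adopted it.

```python
# Sketch of building an authenticated chat request (illustrative only;
# the model name and OPENAI_API_KEY variable are assumptions, and the
# real app relies on App Inventor's ChatBot component instead).
import json
import os

def build_chat_request(question: str):
    # The key is read from the environment rather than hard-coded,
    # so it never appears in the app's source.
    api_key = os.environ.get("OPENAI_API_KEY", "sk-placeholder")
    headers = {
        "Authorization": f"Bearer {api_key}",  # key travels in a header, not the URL
        "Content-Type": "application/json",
    }
    payload = json.dumps({
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": question}],
    })
    return headers, payload
```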
## Accomplishments that we're proud of
We are proud of getting the app to run successfully and of its sheer clarity in capturing the user's voice: it rarely misrecognized the words we spoke.
## What we learned
We learned how to incorporate API keys into our code and how to use various UI features and invisible components, such as the TextToSpeech and Speaker components.
## What's next for HackAI
We would like to take HackAI to the next level by developing a model not only for visually impaired people but also for hearing impaired people. Doing so would help advance UN Sustainable Development Goal 10, "Reduced Inequalities". We would also like to make the product available worldwide and eventually support multiple languages in order to reduce unequal opportunities.
