💡 Inspiration
- Our experience with language classes in high school has been that they tend to miss the point of language learning: using the language in everyday life. We wanted a practical, simple way of learning new languages that goes beyond the classroom and has real-world use.
💬 What it does
- The chatbot takes a prompt describing a real-world scenario (in English) from the user, or the user can choose a default prompt provided by the program. Once the bot has a prompt, it asks the user leading questions in whichever language the user chooses to learn, letting the user practice situations they may encounter in the real world, such as ordering coffee or asking about popular tourist destinations. The program delivers the questions through both audio and text, and the user can answer through either audio or text.
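The scenario-to-questions flow described above can be sketched as a system-prompt builder. This is a minimal illustration; the function name, default scenarios, and prompt wording are all assumptions, not the project's actual code.

```typescript
// Default scenarios offered when the user doesn't supply their own prompt
// (illustrative examples drawn from the description above).
const DEFAULT_SCENARIOS: string[] = [
  "ordering coffee at a cafe",
  "asking a local about popular tourist destinations",
];

// Build the system prompt that steers the chatbot: given an English scenario
// and a target language, the bot should ask leading questions in that
// language. The exact wording here is hypothetical.
function buildSystemPrompt(scenario: string | undefined, language: string): string {
  const chosen = scenario ?? DEFAULT_SCENARIOS[0];
  return (
    `You are a friendly language tutor. Role-play this scenario: "${chosen}". ` +
    `Ask the learner short leading questions in ${language}, one at a time, ` +
    `so they can practice responding as they would in real life.`
  );
}
```

A prompt like this would then be sent as the system message of a chat-completion request, with the user's replies appended turn by turn.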
🛠️ How we built it
We used OpenAI, Next.js, and Microsoft Azure's speech services.
- OpenAI was used to generate the questions Bingus Translator asks, based on the prompt the user entered
- Microsoft Azure was used to generate the Text-to-Speech audio for both the user's input and Bingus Translator's output
- Next.js was used to create the UI as well as to connect OpenAI with Microsoft Azure
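One way to wire these pieces together is a Next.js API route that turns a line of generated text into speech via Azure's Text-to-Speech REST endpoint. This is a hedged sketch, not the project's actual code: the route path, environment variable names, and voice are assumptions, though the Azure endpoint shape and headers follow Azure's documented Speech REST API.

```typescript
// Azure's TTS REST API expects an SSML body; this helper builds it for a
// given text, locale, and voice name.
function buildSsml(text: string, locale: string, voice: string): string {
  const escaped = text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
  return (
    `<speak version='1.0' xml:lang='${locale}'>` +
    `<voice xml:lang='${locale}' name='${voice}'>${escaped}</voice>` +
    `</speak>`
  );
}

// Hypothetical Next.js API route (e.g. pages/api/speak.ts). AZURE_REGION and
// AZURE_SPEECH_KEY are assumed env vars, not the project's real config.
export default async function handler(req: any, res: any) {
  const { text, locale, voice } = req.body;
  const ssml = buildSsml(text, locale, voice);
  const azureRes = await fetch(
    `https://${process.env.AZURE_REGION}.tts.speech.microsoft.com/cognitiveservices/v1`,
    {
      method: "POST",
      headers: {
        "Ocp-Apim-Subscription-Key": process.env.AZURE_SPEECH_KEY!,
        "Content-Type": "application/ssml+xml",
        "X-Microsoft-OutputFormat": "audio-16khz-128kbitrate-mono-mp3",
      },
      body: ssml,
    }
  );
  res.setHeader("Content-Type", "audio/mpeg");
  res.send(Buffer.from(await azureRes.arrayBuffer()));
}
```

The front end would POST the chatbot's text here and play the returned MP3, keeping both API keys on the server side of the Next.js app.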
🤔 Challenges we ran into
- Connecting the OpenAI API and Microsoft Azure through Next.js was difficult: we were dealing with three different components (each with completely different syntax) that all had to connect perfectly, across both the UI and the backend.
- Handling custom prompts that users write themselves, rather than only the defaults
🎉 Accomplishments that we're proud of
- Creating a chatbot that can hold a real conversation with a person (in many different languages) through both text and audio was incredible to witness, considering we built it in a single hackathon.
- Successful collaborative coding: every group member knew which parts of the project to work on, resulting in minimal conflicts
📚 What we learned
- Effective ways to troubleshoot and solve problems, as well as how to use completely separate APIs in conjunction with each other to create a cohesive project.
🌟 What's next for Bingus Translator
- Most likely adding more languages, as well as the ability to switch prompts mid-session, so the user does not have to restart the program every time they want a new scenario.
- Being able to switch languages mid-session (though this is a lower priority, since users rarely learn two languages at once).
Built With
- git
- microsoftazure
- next.js
- openai
- typescript
- vscode
