An often overlooked challenge for people who are deaf or hard of hearing is how difficult seemingly trivial tasks can be. In particular, dining in at, or even ordering from, a fast food restaurant or cafe can be tricky. Conventional workarounds such as sign language or pointing at the menu are cumbersome, and the area could benefit from greater technological intervention. Menuvoice, a text-to-speech chatbot mobile app, provides quick and easy access to a restaurant's menu, aiming to empower deaf users by letting them quickly choose menu items and turn their selection into a speech-based order.

What it does

Menuvoice determines which restaurant the user is in by comparing their geolocation coordinates against nearby restaurants, which automates the restaurant selection process. Next, the user is prompted to enter a keyword related to a menu item they want to purchase. For example, if the user is at Tim Hortons and wants to buy a specific variety of Timbits, they only need to type the keyword 'Timbit'. From there, the chatbot lists all menu items related to that keyword. The user can select one of the options and view basic nutritional details, such as calories. Next, the chatbot prompts the user for a couple more details to personalize the order, including the quantity and the packaging method. Menuvoice takes this information and produces a text-to-speech order that the user can play to the waiter or cashier.
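The keyword → suggestion → spoken-order flow above can be sketched roughly as follows. This is a minimal illustration, not the app's actual code: the `menu` array and its field names are hypothetical stand-ins for data the app would fetch for the detected restaurant.

```javascript
// Hypothetical menu data standing in for a fetched restaurant menu.
const menu = [
  { name: "Timbits 10-pack, Assorted", calories: 700 },
  { name: "Timbits 20-pack, Chocolate Glazed", calories: 1400 },
  { name: "Original Blend Coffee, Medium", calories: 0 },
];

// List every menu item whose name contains the user's keyword.
function suggest(menu, keyword) {
  const k = keyword.toLowerCase();
  return menu.filter((item) => item.name.toLowerCase().includes(k));
}

// Turn the selected item plus personalization details (quantity,
// packaging) into a sentence for the text-to-speech engine.
function buildOrder(item, quantity, packaging) {
  return `Hi, could I please get ${quantity} ${item.name}, ${packaging}? Thank you.`;
}

const options = suggest(menu, "Timbit"); // both Timbits entries match
const order = buildOrder(options[0], 1, "to go");
```

In the real app the resulting string would be handed to the speech engine rather than displayed.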

How I built it

The app was built with React Native on top of the Expo platform. The Google Places API collects the geolocation of nearby restaurants, the Nutritionix API supplies menu-item information for a particular restaurant, and the expo-speech package performs the text-to-speech.
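The automated restaurant-selection step boils down to picking the place closest to the user's coordinates. A minimal sketch, where the `places` array is a hypothetical stand-in for results returned by a Places nearby search:

```javascript
// Great-circle distance between two coordinates, in kilometres.
function haversineKm(lat1, lon1, lat2, lon2) {
  const toRad = (d) => (d * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Pick the nearby place with the smallest distance to the user.
function nearestRestaurant(user, places) {
  return places.reduce((best, p) =>
    haversineKm(user.lat, user.lng, p.lat, p.lng) <
    haversineKm(user.lat, user.lng, best.lat, best.lng)
      ? p
      : best
  );
}

// Hypothetical user location and nearby-search results.
const user = { lat: 43.6532, lng: -79.3832 };
const places = [
  { name: "Tim Hortons", lat: 43.6541, lng: -79.3845 },
  { name: "Starbucks", lat: 43.66, lng: -79.39 },
];
```

The real Places API also reports distance-related data itself; computing it client-side as above is just one simple way to break the tie between candidates.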

Challenges I ran into

Using these APIs for the first time presented some challenges, particularly making sure everything was configured correctly.

Accomplishments that I'm proud of

I'm proud that I was able to build something on my own in a short space of time.

What I learned

I learned more about Google's Places API and built a chatbot for the first time.

What's next for menuvoice

Future advancements include supporting users beyond restaurant settings, as well as using machine learning to make suggestions from even vaguer keywords.
