Our inspiration

In New York City, more than 360,000 citizens are visually impaired, and 60,000 of them are legally blind. Ordering food at a restaurant is difficult for them, and they often have to rely on other people for help. Our group was inspired to help them become more independent by building an app, designed with accessibility in mind, that reads a restaurant's menu to them in detail so they can order food by themselves.

What it does

Eyeat is an app that helps visually impaired people order food at restaurants, built on IBM Watson's technologies. The app lets the user search for nearby restaurants using coordinates from NYC Open Data, and returns a list of those restaurants along with details from the same dataset: cleanliness, inspection grades, and so on. Once the user is at a restaurant, the app scans an NFC tag embedded in the menu and reads the menu options out loud, letting the user get food information without assistance from a companion.
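For illustration, here is a minimal sketch of the nearby-restaurant lookup in TypeScript. It assumes NYC's DOHMH restaurant inspection dataset on the Socrata Open Data API; the dataset id (43nn-pn8j) and the field names (dba, grade, latitude, longitude) are our reading of that dataset and may need adjusting.

```typescript
// Sketch only: dataset id and field names are assumptions about the
// DOHMH restaurant inspection dataset on NYC Open Data.
interface Restaurant {
  dba: string;        // "doing business as" restaurant name
  grade?: string;     // latest inspection grade (A, B, C, ...)
  latitude: string;
  longitude: string;
}

// Rough meters-per-degree conversion, good enough for a small search box.
const METERS_PER_DEGREE = 111_000;

export async function findNearbyRestaurants(
  lat: number,
  lng: number,
  radiusMeters = 500,
): Promise<Restaurant[]> {
  const d = radiusMeters / METERS_PER_DEGREE;
  // SoQL query: restrict results to a bounding box around the user.
  const where = encodeURIComponent(
    `latitude between ${lat - d} and ${lat + d} ` +
      `AND longitude between ${lng - d} and ${lng + d}`,
  );
  const url = `https://data.cityofnewyork.us/resource/43nn-pn8j.json?$where=${where}&$limit=50`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Open Data request failed: ${response.status}`);
  }
  return (await response.json()) as Restaurant[];
}
```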

How we built it

We use React Native to build the app so we can target both iOS and Android devices and reach as many people as possible. We use NYC Open Data to get each restaurant's grade, description, and coordinates, and we compare those coordinates with the user's position. Menu information is stored on NFC tags; after a scan, the information is displayed to the user and read out loud using IBM Watson's Text-to-Speech API.
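A condensed sketch of the scan-and-speak flow is below. It assumes the react-native-nfc-manager library for reading tags and Watson's Text-to-Speech HTTP endpoint for synthesis; the service URL handling, the chosen voice, and the decision to store the menu as an NDEF text record are our assumptions, and audio playback is omitted.

```typescript
// Sketch, assuming react-native-nfc-manager; NfcManager.start() should
// be called once at app startup before using these helpers.
import NfcManager, { NfcTech, Ndef } from 'react-native-nfc-manager';

// Read the menu text stored in the first NDEF text record on the tag.
async function readMenuFromTag(): Promise<string> {
  await NfcManager.requestTechnology(NfcTech.Ndef);
  try {
    const tag = await NfcManager.getTag();
    const record = tag?.ndefMessage?.[0];
    if (!record) throw new Error('No NDEF record found on tag');
    return Ndef.text.decodePayload(new Uint8Array(record.payload));
  } finally {
    NfcManager.cancelTechnologyRequest();
  }
}

// Synthesize the menu text with Watson Text-to-Speech and return the
// audio bytes. The voice name is illustrative; btoa may need a polyfill
// in React Native.
async function synthesizeSpeech(
  text: string,
  apiKey: string,
  serviceUrl: string,
): Promise<ArrayBuffer> {
  const url = `${serviceUrl}/v1/synthesize?voice=en-US_MichaelV3Voice`;
  const response = await fetch(url, {
    method: 'POST',
    headers: {
      Authorization: `Basic ${btoa(`apikey:${apiKey}`)}`,
      'Content-Type': 'application/json',
      Accept: 'audio/mp3',
    },
    body: JSON.stringify({ text }),
  });
  if (!response.ok) {
    throw new Error(`Watson TTS failed: ${response.status}`);
  }
  return response.arrayBuffer();
}
```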

Challenges we ran into

We had to understand the pain points someone with a visual impairment might have. Since they use their smartphones differently than we usually do, we had to do research to design an accessible interface that works for them. We also had to learn to use and implement IBM Watson's APIs and combine them with the data we got from NYC Open Data.

Accomplishments that we are proud of

We were able to build a genuinely useful app for visually impaired people, which we hope will help improve their lives.

What we learned

We learned that there are a lot of visually impaired people, and that very little technology currently helps them order food at restaurants. We also learned that visually impaired people use smartphones too, sometimes even more proficiently than sighted people. In addition to learning React Native, we learned how to implement Watson's Text-to-Speech and Speech-to-Text APIs, along with how to use NFC and QR code scanners.

What's next for EYEAT

We are looking forward to adding new features, including one that lets the user adjust the voice speed as needed, making it slower or faster.
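Since Watson Text-to-Speech accepts SSML input, one way this could work is by wrapping the menu text in a prosody element before synthesis. This is a hypothetical sketch; whether our chosen voice honors the rate attribute, and which rate values it accepts, would need verification against Watson's SSML documentation.

```typescript
// Hypothetical: wrap text in SSML to request a speaking speed, assuming
// the voice supports <prosody rate>.
type VoiceSpeed = 'x-slow' | 'slow' | 'medium' | 'fast' | 'x-fast';

function withSpeed(text: string, speed: VoiceSpeed): string {
  // Escape XML-special characters so menu text cannot break the SSML.
  const escaped = text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
  return `<speak><prosody rate="${speed}">${escaped}</prosody></speak>`;
}
```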

We would like to implement Watson's Machine Learning technology to give food suggestions based on each user's preferences and the food they usually order.

We already did some research on options for implementing OCR, but the limited time at a hackathon prevented us from building it. However, we are excited to keep learning about OCR technology and the APIs that could help us automatically read menu data from a photo of a menu. This would let us ask users to take photos of their restaurant's menu and add that menu to our database, so that the next visually impaired visitor can have the same menu read out loud to them immediately.

Another feature we have in mind is letting restaurants subscribe to our service, which would keep their menus constantly updated in our database; we would provide them a bundle of NFC tags to add to their menus, with configuration tutorials included.

We also plan to cooperate with companies like Yelp to offer a "Visually Impaired Friendly" tag as a restaurant perk. This can improve a restaurant's brand and attract more visually impaired customers and their companions. We believe that as more restaurants become aware of how important accessibility for the visually impaired is, we will move toward a world where all restaurants, businesses, and buildings support visually impaired people.
