Inspiration

Imagine you are in a sushi restaurant somewhere in Japan and you can't speak Japanese. As usual, there is no English version of the menu. You try hard to figure out what item 3 on the menu, "焼き鳥", actually is. Hmm, you can't really type it. You might try Google image translation, which will show "yakitori"... but you still have no clue what that is.

Introducing ... MenuAR - know what you order!

Luckily you have MenuAR installed: you simply scan the menu and the app visualises the food. It provides nutritional and allergen information too, in any language you want.

Simply scan the menu and tap an item to show its details. It's as simple as that.

What it does

  • Menu detection in 3D space (see the augmented-image sketch after this list)
  • AR rendering to show useful info
  • Menu interaction: just tap on a menu item
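Detecting the menu in 3D space maps naturally onto ARCore's Augmented Images API. Below is a minimal sketch of that idea, not our exact code: the "menu.jpg" asset, image name, and function wiring are placeholders for illustration.

```kotlin
import android.content.Context
import android.graphics.BitmapFactory
import com.google.ar.core.AugmentedImage
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Register a reference photo of the menu so ARCore can find it in the camera feed.
// "menu.jpg" is a placeholder asset name.
fun configureSession(session: Session, context: Context) {
    val bitmap = context.assets.open("menu.jpg").use { BitmapFactory.decodeStream(it) }
    val db = AugmentedImageDatabase(session).apply { addImage("menu", bitmap) }
    val config = Config(session).apply { augmentedImageDatabase = db }
    session.configure(config)
}

// Called every frame: once the menu is tracked, its pose in 3D space is known
// and the AR overlay (dish models, nutrition info) can be anchored to it.
fun onFrame(frame: Frame) {
    for (image in frame.getUpdatedTrackables(AugmentedImage::class.java)) {
        if (image.trackingState == TrackingState.TRACKING && image.name == "menu") {
            val anchor = image.createAnchor(image.centerPose)
            // attach the info panel / 3D food models to this anchor
        }
    }
}
```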

How we built it

  1. Built with Android Studio
  2. ARCore as the AR library
  3. Google Translate API to translate the menu text on the fly (sketched below)
  4. Some 3D models created in Maya (3D modelling software), other models downloaded from the internet
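The on-the-fly translation step can look roughly like the sketch below, using the Cloud Translation client library. The language codes and function name are illustrative assumptions; credentials handling is omitted for brevity.

```kotlin
import com.google.cloud.translate.Translate
import com.google.cloud.translate.TranslateOptions

// Translate a recognised menu item into the user's language.
// In a real app the API credentials come from the app's configuration;
// here we just use the default instance for brevity.
fun translateMenuItem(text: String, targetLang: String): String {
    val translate: Translate = TranslateOptions.getDefaultInstance().service
    val translation = translate.translate(
        text,
        Translate.TranslateOption.sourceLanguage("ja"),
        Translate.TranslateOption.targetLanguage(targetLang)
    )
    return translation.translatedText
}

// usage: translateMenuItem("焼き鳥", "en")
```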

Challenges I ran into

  1. Image detection in 3D space
  2. 3D model rotation in 3D space (quaternions... still not entirely sure how they work)
  3. User interaction with objects in 3D space, e.g. you tap the menu on your phone screen, but the menu is on the table (see the sketch after this list)
  4. Creating custom 3D models -> menu buttons
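Challenges 2 and 3 in code form, as a simplified sketch rather than our exact implementation, assuming the Sceneform scene graph on top of ARCore (node and variable names are made up): ARCore's hit test casts a ray from the screen tap into the scene to find the menu on the table, and Sceneform's axis-angle helper builds the quaternion rotations.

```kotlin
import android.view.MotionEvent
import com.google.ar.core.Frame
import com.google.ar.core.HitResult
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.math.Quaternion
import com.google.ar.sceneform.math.Vector3

// Challenge 3: the user taps the phone screen, but the menu is on the table.
// frame.hitTest() casts a ray from the tap point into the scene and returns
// what it intersects, giving a real-world pose we can anchor content to.
fun onTap(frame: Frame, event: MotionEvent): AnchorNode? {
    val hit: HitResult = frame.hitTest(event.x, event.y).firstOrNull() ?: return null
    return AnchorNode(hit.createAnchor())
}

// Challenge 2: quaternions. The short version: Quaternion.axisAngle builds a
// rotation of `degrees` around `axis`, and multiplying quaternions composes
// rotations. Here we spin a dish model around the vertical (y) axis.
fun rotateDish(node: AnchorNode, degrees: Float) {
    val spin = Quaternion.axisAngle(Vector3(0f, 1f, 0f), degrees)
    node.localRotation = Quaternion.multiply(node.localRotation, spin)
}
```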

Accomplishments that I'm proud of

  1. AR works!
  2. A clean layout of info about the food: nutrients, allergens, and a visualisation of the dish
  3. Translations into many languages, to make the app helpful for everyone

What I learned

  1. The basics of ARCore; it has cool libraries
  2. How to work with Android Studio

What's next for MenuAR - the ultimate food menu helper

  1. Let restaurants submit their own 3D models of their dishes, to maximise the user experience
  2. Make it work with menus that have different layouts, e.g. a grid, and allow restaurant owners to specify and submit theirs through our portal for restaurant owners