Imagine you are in a sushi restaurant somewhere in Japan and you can't speak Japanese. As usual, there is no English version of the menu. You try hard to figure out what item 3 on the menu, "焼き鳥", actually is. Hmm, you can't really type it. You might use Google Translate's image mode, which will show "yakitori"... but you still have no clue what it is.
Introducing ... MenuAR - know what you order!
Luckily you have MenuAR installed: you simply scan the menu and the app visualises the food. It provides nutritional and allergen information too, in any language you want.
Simply scan the menu and tap an item to see the details. It's as simple as that.
What it does
- Menu detection in 3D space
- AR rendering to overlay useful info
- Menu interaction: just tap a menu item
How we built it
- Built with Android Studio
- ARCore as the AR library
- Google Translate API to translate the menu text on the fly
- Some 3D models created in Maya (3D modelling software); others downloaded from the internet
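The on-the-fly translation step can be sketched roughly like this. A minimal example of building a request for the Google Cloud Translation v2 REST endpoint; the endpoint and parameter names (`q`, `target`, `key`) come from the public v2 API, while `buildQuery` and the placeholder key are ours for illustration — the real app would send this with an HTTP client and parse the JSON response:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class TranslateRequest {
    // Public Google Cloud Translation v2 REST endpoint
    static final String ENDPOINT = "https://translation.googleapis.com/language/translate/v2";

    // Builds the URL-encoded query string: text to translate, target language, API key
    static String buildQuery(String text, String targetLang, String apiKey) {
        return "q=" + URLEncoder.encode(text, StandardCharsets.UTF_8)
             + "&target=" + URLEncoder.encode(targetLang, StandardCharsets.UTF_8)
             + "&key=" + URLEncoder.encode(apiKey, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // e.g. translate the Japanese menu item from the intro into English
        // (API_KEY is a placeholder, not a real credential)
        System.out.println(ENDPOINT + "?" + buildQuery("焼き鳥", "en", "API_KEY"));
    }
}
```

In the app this fires once per detected menu item, so the translations appear as the AR overlay renders.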
Challenges I ran into
- Image detection in 3D space
- 3D model rotation in 3D space (quaternions?!... still not sure how they work)
- User interaction with objects in 3D space, e.g. you tap the menu on your phone screen, but the menu itself is on the table
- Creating custom 3D models for the menu buttons
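On the quaternion confusion above: rotating a point p by a unit quaternion q is just the product q·p·q⁻¹, with p treated as a pure quaternion (w = 0). A plain-Java sketch of the textbook formula — not ARCore's own math types, just enough to see what a quaternion rotation actually does:

```java
public class Quat {
    final double w, x, y, z;
    Quat(double w, double x, double y, double z) { this.w = w; this.x = x; this.y = y; this.z = z; }

    // Hamilton product of two quaternions
    Quat mul(Quat o) {
        return new Quat(
            w*o.w - x*o.x - y*o.y - z*o.z,
            w*o.x + x*o.w + y*o.z - z*o.y,
            w*o.y - x*o.z + y*o.w + z*o.x,
            w*o.z + x*o.y - y*o.x + z*o.w);
    }

    // Conjugate; for a unit quaternion this is also the inverse
    Quat conj() { return new Quat(w, -x, -y, -z); }

    // Rotate point (px,py,pz): q * p * q^-1, with p as a pure quaternion
    double[] rotate(double px, double py, double pz) {
        Quat r = this.mul(new Quat(0, px, py, pz)).mul(this.conj());
        return new double[] { r.x, r.y, r.z };
    }

    // Unit quaternion for a rotation of `angle` radians about axis (ax,ay,az)
    static Quat fromAxisAngle(double ax, double ay, double az, double angle) {
        double n = Math.sqrt(ax*ax + ay*ay + az*az);
        double s = Math.sin(angle / 2) / n;
        return new Quat(Math.cos(angle / 2), ax*s, ay*s, az*s);
    }

    public static void main(String[] args) {
        // 90 degrees about the y axis sends (1,0,0) to (0,0,-1)
        double[] p = fromAxisAngle(0, 1, 0, Math.PI / 2).rotate(1, 0, 0);
        System.out.printf("%.3f %.3f %.3f%n", p[0], p[1], p[2]);
    }
}
```

ARCore poses store their orientation as a unit quaternion like this one, so composing rotations comes down to the Hamilton product above.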
Accomplishments that I'm proud of
- The AR works!
- A clean layout for the food info: nutrients, allergens, and a visualisation of the dish
- Translations into many languages, to make the app helpful for everyone
What I learned
- The basics of ARCore and its libraries
- How to work with Android Studio
What's next for MenuAR - the ultimate food menu helper
- Restaurants can submit their own 3D models of food to improve the user experience
- Support menus with different layouts (e.g. grid), and let restaurant owners specify and submit theirs through our portal for restaurant owners