📱 Menu Translate AI - MenuSnap

🚀 Inspiration

This project was inspired by a need I had while traveling. Last month I went to Japan, and some restaurants had menus only in Japanese, or in English but without pictures to show what you would get. I wanted to address this problem, so I created this app to help me decide what to eat when I don't know what to expect in a foreign country.

🛠️ How I Built It

I built this project using:

  • React Native with Expo.
  • Gemini as the vision model and Flux for image generation.

The app’s main features include:

  • Scan any menu and translate it into any language you want (we support 12 right now).
  • Get estimated calories for each dish.
  • Get generated images of each dish.

The development process involved designing UI components, integrating backend services, and testing across various devices.
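To give a sense of how the vision step can fit together, here is a minimal sketch of parsing the model's reply into typed dish data. This is an illustrative assumption, not the app's actual code: the `Dish` shape, the `parseMenuResponse` name, and the JSON-array response format are all hypothetical.

```typescript
// Hypothetical shape of a translated menu item returned by the vision model.
interface Dish {
  original: string;   // dish name as printed on the menu
  translated: string; // name in the user's target language
  calories: number;   // rough estimate from the model
}

// Vision models often wrap JSON replies in a markdown code fence,
// so strip any fences before parsing.
function parseMenuResponse(raw: string): Dish[] {
  const cleaned = raw.replace(/```(?:json)?/g, "").trim();
  const data = JSON.parse(cleaned);
  if (!Array.isArray(data)) {
    throw new Error("Expected a JSON array of dishes");
  }
  return data.map((d) => ({
    original: String(d.original),
    translated: String(d.translated),
    calories: Number(d.calories) || 0,
  }));
}
```

In practice the raw string would come from the Gemini API response, and the parsed list would drive the translated-menu screen.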

📚 What I Learned

Throughout this project, I learned:

  • How to use EAS to manage builds and submissions.
  • How to develop with React Native and Expo, as it was my first time using them.
  • How to ask for feedback on X and improve my app based on it.

⚔️ Challenges Faced

Some of the biggest challenges were:

  • Configuring Apple provisioning profiles correctly for EAS builds.
  • Managing different build targets for iOS devices vs. the simulator.
  • Optimizing performance to reduce load times.
  • Handling edge cases with Flux.
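The device-vs-simulator challenge above is typically handled with separate EAS build profiles. A minimal `eas.json` sketch under that assumption (profile names here are examples, not the project's actual config):

```json
{
  "build": {
    "development": {
      "developmentClient": true,
      "distribution": "internal",
      "ios": { "simulator": true }
    },
    "production": {
      "autoIncrement": true
    }
  }
}
```

With profiles like these, `eas build --profile development --platform ios` produces a simulator build, while the production profile targets real devices and handles App Store provisioning.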

Each of these challenges taught me valuable lessons and made the final product more robust.

🌟 Conclusion

This project helped me grow as a developer and gave me confidence in deploying production-grade apps with Expo and React Native. I’m excited to keep improving it and sharing it with others!
