Inspiration

In my mother tongue, Misu Bala means "the eye that sees everything." The inspiration for Misu Bala Alimentation came from a simple but alarming observation: I watched my own family members struggle to manage their sugar intake because food labels were too hard to read. We have never had more access to food, yet we have never been more confused about what we are actually eating. Most consumers feel overwhelmed by complex ingredient lists, hidden additives (such as E-numbers), and misleading marketing. We realized that for people managing specific health conditions, like diabetes or hypertension, this confusion isn't just frustrating; it's dangerous.

The idea clicked when we looked at the current tools available. Most nutritional apps rely on scanning barcodes. But what happens when you are at a local market, a bakery, or looking at a product with a damaged label? The barcode isn't enough. We wanted to build a vision-first assistant that could "see" and "understand" food like a human nutritionist would.

Our Core Mission

We were inspired to bridge the gap between complex biochemical data and everyday decision-making. We wanted to create an app that:

Empowers the User: Turning a smartphone camera into a tool for health literacy.

Personalizes Health: Moving away from "one size fits all" nutrition to advice tailored to specific chronic conditions.

Promotes Transparency: Highlighting the impact of ultra-processed additives that often go unnoticed.

What it does

Misu Bala Alimentation is more than just a barcode scanner. It's a powerful educational tool that uses artificial intelligence to instantly decipher the composition of your food from a simple photo. Whether you're grocery shopping or already at the table, it helps you learn to eat mindfully.

How we built it

The architecture of Misu Bala Alimentation is a hybrid ecosystem designed for speed and accessibility:

The Brain: We integrated the Gemini 1.5 Flash API to handle multimodal inputs (image + text). This allows the app to "see" food items and labels without needing a barcode database.

The Backend: We used Google Apps Script as a lightweight serverless backend to securely communicate with the Gemini API and handle data processing.
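As a sketch, the Apps Script relay builds a multimodal request and forwards it to the Gemini REST endpoint. The payload field names follow the public Gemini API, but the handler itself is illustrative of the pattern, not our exact code:

```javascript
// Build a Gemini generateContent request combining a base64 photo and a
// text question. Field names follow the public Gemini REST API (v1beta).
function buildGeminiRequest(base64Image, question) {
  return {
    contents: [{
      parts: [
        { inline_data: { mime_type: "image/jpeg", data: base64Image } },
        { text: question }
      ]
    }]
  };
}

// Apps Script entry point (illustrative). The API key is read from script
// properties so it never ships to the client.
function doPost(e) {
  const body = JSON.parse(e.postData.contents);
  const apiKey = PropertiesService.getScriptProperties()
                                  .getProperty("GEMINI_API_KEY");
  const url = "https://generativelanguage.googleapis.com/v1beta/models/" +
              "gemini-1.5-flash:generateContent?key=" + apiKey;
  const response = UrlFetchApp.fetch(url, {
    method: "post",
    contentType: "application/json",
    payload: JSON.stringify(buildGeminiRequest(body.image, body.question))
  });
  return ContentService.createTextOutput(response.getContentText())
                       .setMimeType(ContentService.MimeType.JSON);
}
```

Keeping the key server-side in script properties is the main reason for this relay: the static GitHub Pages frontend never sees the credential.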

The Frontend: A responsive HTML5/JavaScript interface hosted on GitHub Pages, designed with a "mobile-first" approach for a native app feel.

The Mobile Shell: We developed a custom Android WebView wrapper in Java. This was crucial for accessing native hardware features like the camera and file system while maintaining the flexibility of a web-based UI.

Challenges we ran into

No project is without its "bugs," and our biggest hurdle was hardware interoperability:

The WebView Camera Gap: We discovered that standard WebViews often block camera access for security. We had to implement a custom WebChromeClient and a FileProvider in Java to bridge the gap between the web code and the Android camera hardware.
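The web side of that bridge is a plain file input; the standard `capture` attribute hints that the camera should open directly, and the selection is intercepted by `onShowFileChooser` in the Java wrapper. The helper names below (`openCamera`, `onPhoto`) are illustrative, not our shipped code:

```javascript
// Attributes for the file input that the custom WebChromeClient intercepts.
// "capture: environment" asks for the rear camera; these are standard HTML
// attribute values.
function cameraInputAttributes() {
  return { type: "file", accept: "image/*", capture: "environment" };
}

// Create and click the input. Inside the WebView this fires
// WebChromeClient.onShowFileChooser, which launches the native camera
// intent and hands the photo back through the FileProvider.
function openCamera(doc, onPhoto) {
  const input = doc.createElement("input");
  for (const [name, value] of Object.entries(cameraInputAttributes())) {
    input.setAttribute(name, value);
  }
  input.addEventListener("change", () => onPhoto(input.files[0]));
  input.click();
}
```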

Security Sandboxing: Running an app inside a Google Apps Script iframe created "cross-origin" challenges. We solved this by implementing a precise Permissions-Policy and using the HtmlService.XFrameOptionsMode.ALLOWALL setting.
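The serving side of that fix is small. `HtmlService` and `XFrameOptionsMode` are real Apps Script APIs; the HTML file name below is a placeholder:

```javascript
// Serve the web UI and relax the X-Frame-Options header so the app can be
// embedded in the wrapper's frame. "index" is an illustrative file name.
function doGet() {
  return HtmlService.createHtmlOutputFromFile("index")
      .setXFrameOptionsMode(HtmlService.XFrameOptionsMode.ALLOWALL)
      .setTitle("Misu Bala Alimentation");
}
```

ALLOWALL trades some clickjacking protection for embeddability, which is why the Permissions-Policy on the wrapper side still needs to be precise.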

Prompt Engineering: Fine-tuning the AI to give accurate medical-grade advice (like checking for hidden sugars for diabetics) required multiple iterations of system instructions.
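For example, the system instruction can be assembled per health profile. The rule strings below are illustrative of the kind of constraints we iterated on, not our exact prompt:

```javascript
// Condition-specific constraints (illustrative examples, not shipped text).
const CONDITION_RULES = {
  diabetes:
    "Flag hidden sugars (dextrose, maltodextrin, syrups) and estimate glycemic impact.",
  hypertension:
    "Flag sodium content and salt-based additives against recommended daily limits."
};

// Assemble a condition-aware system instruction for the model.
function buildSystemInstruction(condition) {
  const rule = CONDITION_RULES[condition] ||
               "Give general, balanced nutritional guidance.";
  return [
    "You are a careful nutrition assistant, not a doctor.",
    "Answer in plain language and name the ingredient that triggered each warning.",
    rule,
    "If the label is unreadable, say so instead of guessing."
  ].join("\n");
}
```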

Voice Accessibility: Implementing a text-to-speech bridge from the Android WebView to the JavaScript frontend served by Apps Script, so that elderly users and people with visual impairments can use the app comfortably.
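From the JavaScript side, that voice bridge can be sketched as below, assuming the Java wrapper exposes a native TextToSpeech object via `addJavascriptInterface` (the `Android` object name is an assumption):

```javascript
// Speak text through the best available channel. "Android" is the
// (assumed) object injected by the Java wrapper via addJavascriptInterface;
// ordinary browsers fall back to the standard Web Speech API.
function speak(text) {
  if (typeof Android !== "undefined" && typeof Android.speak === "function") {
    Android.speak(text);        // native Android TextToSpeech
    return "native";
  }
  if (typeof window !== "undefined" && "speechSynthesis" in window) {
    window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
    return "web";
  }
  return "unsupported";         // e.g. very old WebViews
}
```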

Accomplishments that we're proud of

Seamless Integration: We successfully turned a simple web script into a functional Android application that feels like a native tool.

Contextual Intelligence: The app doesn't just say "this is an apple"; it explains why that apple is good for your specific heart-health profile.

Accessibility: By removing the requirement for barcodes, we’ve made nutritional analysis possible for fresh foods and local markets, not just industrial products.

What we learned

We learned that User Experience (UX) is a security challenge. Every time we wanted to make the app easier to use (like opening the camera automatically), we had to find a secure way to grant those permissions. We also gained deep experience in Prompt Engineering, learning how to constrain an AI to be professional, empathetic, and scientifically accurate.

What's next for Misu Bala Alimentation

Monetization & Sustainability: Integrating AdMob or a premium "Pro" tier for detailed weekly health reports.

Offline Mode: Caching common food analyses so users can get basic info even without an internet connection.
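A minimal sketch of how that cache could work, with the storage backend injected so `window.localStorage` can back it in the app (all names are hypothetical, since this feature is still planned):

```javascript
// Cache recent analyses keyed by a normalized food name. Pass
// window.localStorage as "storage" in the app; any object with
// getItem/setItem works, which also makes the cache easy to test.
function makeAnalysisCache(storage) {
  const keyFor = (food) => "analysis:" + food.trim().toLowerCase();
  return {
    get(food) {
      const raw = storage.getItem(keyFor(food));
      return raw === null ? null : JSON.parse(raw);
    },
    put(food, analysis) {
      storage.setItem(keyFor(food), JSON.stringify(analysis));
    }
  };
}
```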

Multilingual Support: Expanding the AI's ability to read food labels in multiple languages for travelers.

Built With

  • ai: gemini-api
  • css3
  • generative-ai
  • mobile: android
  • github-pages
  • backend: google-apps-script
  • google-cloud-ai
  • html5
  • java
  • large-language-models (llm)
  • webview
  • web: javascript

Updates


New Feature: AGASSA Non-Conformity Guide Integration

I have integrated the official Guide des Non-Conformités from AGASSA directly into the app. This feature provides users with a clear, categorized database of food safety violations, ranging from hygiene breaches to cold chain failures. By making these standards accessible, we empower both consumers and operators to recognize risks and maintain the highest level of sanitary safety in Gabon.



Project Update: Integration of the 2024 Gabon Mercuriale.

We have reached a major milestone for the social impact of the application in Gabon. Misu Bala Alimentation now integrates the official price caps regulated by the State, helping users actively fight the high cost of living.
