Traditional e-commerce platforms often lack intelligent, conversational assistance that can truly understand customer intent and deliver personalized recommendations in real time. Our project reimagines online shopping by integrating AI-driven conversational search with a scalable Kubernetes-native microservices architecture.

Inspiration

We were inspired by the rapid shift in consumer behavior toward AI-first interactions—from chatbots to recommendation engines. Traditional e-commerce sites felt static, with rigid search bars and limited personalization.

We asked ourselves: What if shopping online could feel like talking to a trusted store assistant who knows your tastes, context, and intent?

This question led us to combine Google’s Online Boutique demo (a cloud-native microservices reference app) with Gemini AI to create something truly interactive.

What it does

  • Customers can chat naturally with an AI assistant to find products.
  • Gemini AI provides personalized recommendations in real time, based on intent and preferences.
  • A cloud-native microservices architecture ensures scalability, resilience, and enterprise readiness.

How we built it

  1. Foundation – Deployed Google’s Online Boutique reference app on Google Kubernetes Engine (GKE), with 11+ independent microservices.
  2. AI Layer – Built a custom Gemini AI service to handle:
    • Natural language product queries
    • Personalized recommendations
    • Conversational context tracking
  3. Frontend Chat UI – Integrated a real-time chat interface where users interact with the AI assistant.
  4. Cloud-Native Deployment – Containerized all services, orchestrated via Kubernetes, ensuring autoscaling and fault isolation.
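The heart of the AI layer (step 2) is grounding: every Gemini request carries a slice of the product catalog plus the conversation so far, so the model can only recommend items that actually exist. A minimal sketch of that prompt assembly — function and field names are ours, and the sample products only model the Online Boutique catalog; the real service would send the resulting prompt to the Gemini API:

```python
# Sketch of the AI-layer prompt grounding (hypothetical names; the real
# service passes this prompt to Gemini via the google-generativeai SDK).

CATALOG = [
    {"id": "OLJCESPC7Z", "name": "Sunglasses", "category": "accessories", "price_usd": 19.99},
    {"id": "66VCHSJNUP", "name": "Tank Top", "category": "clothing", "price_usd": 18.99},
    {"id": "1YMWWN1N4O", "name": "Watch", "category": "accessories", "price_usd": 109.99},
]

def build_prompt(user_query: str, history: list[tuple[str, str]]) -> str:
    """Ground the LLM in the live catalog so recommendations stay in stock."""
    catalog_lines = "\n".join(
        f"- {p['id']}: {p['name']} ({p['category']}, ${p['price_usd']:.2f})"
        for p in CATALOG
    )
    turns = "\n".join(f"{role}: {text}" for role, text in history)
    return (
        "You are a helpful store assistant. Recommend only products from this catalog:\n"
        f"{catalog_lines}\n\n"
        f"Conversation so far:\n{turns}\n"
        f"user: {user_query}\n"
        "assistant:"
    )

prompt = build_prompt(
    "Something stylish under $25?",
    [("user", "Hi!"), ("assistant", "Welcome to the boutique!")],
)
print(prompt)
```

Keeping the catalog in the prompt (rather than fine-tuning) is what lets the assistant track live inventory and prices without retraining.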

Challenges we ran into

  1. Context Management – Keeping multi-turn conversations relevant to both user preferences and the product catalog.
  2. Deployment – Debugging Kubernetes YAML and IAM roles, and keeping CI/CD pipelines running smoothly.
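One way to frame the context-management challenge above: keep a bounded sliding window of recent turns plus a sticky store of user preferences, so prompts stay small while multi-turn references ("cheaper than the last one") still resolve. A sketch assuming one in-memory context per chat session — class and field names are illustrative, not from our codebase:

```python
from collections import deque

class ConversationContext:
    """Bounded multi-turn context: last N turns plus sticky preferences."""

    def __init__(self, max_turns: int = 6):
        self.turns = deque(maxlen=max_turns)   # sliding window of (role, text)
        self.preferences: dict[str, str] = {}  # e.g. {"budget": "under $25"}

    def add_turn(self, role: str, text: str) -> None:
        self.turns.append((role, text))        # oldest turn drops automatically

    def remember(self, key: str, value: str) -> None:
        self.preferences[key] = value          # preferences survive the window

    def render(self) -> str:
        """Flatten context into the prompt fragment sent with each query."""
        prefs = "; ".join(f"{k}={v}" for k, v in sorted(self.preferences.items()))
        turns = "\n".join(f"{role}: {text}" for role, text in self.turns)
        return f"Known preferences: {prefs or 'none'}\n{turns}"
```

Separating long-lived preferences from the turn window is the key design choice: old chatter can be forgotten without forgetting the user.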

Accomplishments that we're proud of

  1. Built a conversational shopping assistant that feels natural and engaging.
  2. Achieved scalable, resilient deployment on GKE, capable of handling production-level workloads.
  3. Created a solution that bridges AI innovation with enterprise-grade cloud systems.

What we learned

  1. How to bring LLMs and microservices together to solve real business problems.
  2. Why good observability (logs, metrics, tracing) is key when debugging a system with many moving parts.
  3. Ways to make AI responses faster without losing quality.
  4. That designing with an AI-first mindset can make online shopping feel much more natural and human.
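On making AI responses faster (point 3): one simple lever is caching model answers for near-duplicate queries, so repeated questions skip the model round-trip entirely. A hedged sketch — function names are illustrative, and a real deployment would back this with Redis (already in our stack) plus a TTL rather than a process-local dict:

```python
import hashlib

_cache: dict[str, str] = {}  # in production: Redis with an expiry

def normalize(query: str) -> str:
    """Collapse case and whitespace so near-duplicate queries share an entry."""
    return " ".join(query.lower().split())

def cached_answer(query: str, generate) -> str:
    """Serve repeated product questions from cache; call the slow model otherwise."""
    key = hashlib.sha256(normalize(query).encode()).hexdigest()
    if key not in _cache:
        _cache[key] = generate(query)  # e.g. the Gemini call
    return _cache[key]
```

Usage: `cached_answer("show me shoes", call_gemini)` pays the model latency once; `"Show me  SHOES"` then hits the cache.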

What's next for BOTiqAI

  1. Multi-modal AI – Letting shoppers search with images (e.g., “show me shoes like this”).
  2. Smarter personalization – Learning more about each customer’s style and preferences over time.
  3. Enterprise-ready add-ons – Easy integrations with platforms like Shopify and Magento.

Built With

  • Programming Languages: Go (Golang), C# (.NET 9.0), Java, Python, JavaScript
  • Frameworks & Libraries: Flask, ASP.NET Core, Express.js, Node.js, Gorilla Mux, Go templates
  • AI & Machine Learning: Google Gemini AI, Google Generative AI
  • Databases & Caching: Redis, PostgreSQL, Google Cloud Spanner
  • Communication & APIs: gRPC, HTTP/REST, Protocol Buffers, JSON, XML, CORS
  • Containerization & Orchestration: Docker, Docker Compose, Kubernetes, Helm, Kustomize
  • Cloud Platform & Services: Google Cloud Platform (GKE, Cloud Build, Cloud Trace, Cloud Profiler, AI Studio)
  • Security & Configuration: Google Cloud Secret Manager, Secret Manager, environment variables
  • Observability & Monitoring: OpenTelemetry, Logrus (Go), Pino (Node.js)
  • Development & Deployment Tools: Skaffold, Terraform, ngrok
  • Frontend Technologies: HTML5, CSS3, responsive design