Traditional e-commerce platforms often lack intelligent, conversational assistance that can truly understand customer intent and deliver personalized recommendations in real time. Our project reimagines online shopping by integrating AI-driven conversational search with a scalable Kubernetes-native microservices architecture.
Inspiration
We were inspired by the rapid shift in consumer behavior toward AI-first interactions—from chatbots to recommendation engines. Traditional e-commerce sites felt static, with rigid search bars and limited personalization.
We asked ourselves: What if shopping online could feel like talking to a trusted store assistant who knows your tastes, context, and intent?
This question led us to combine Google’s Online Boutique demo (a cloud-native microservices reference app) with Gemini AI to create something truly interactive.
What it does
- Customers can chat naturally with an AI assistant to find products.
- Gemini AI provides personalized recommendations in real time, based on intent and preferences.
- A cloud-native microservices architecture ensures scalability, resilience, and enterprise readiness.
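One way to picture the assistant's job: it takes a free-text message plus lightweight preference signals and returns ranked product suggestions. A minimal sketch of that contract — all names and fields here are illustrative, not the project's actual schema, and the keyword filter is a toy stand-in for what Gemini does in the real system:

```python
from dataclasses import dataclass, field

@dataclass
class ChatTurn:
    """One user message plus any preference signals gathered so far."""
    message: str
    preferences: dict = field(default_factory=dict)  # e.g. {"budget": 50}

@dataclass
class Recommendation:
    product_id: str
    name: str
    reason: str  # why the assistant suggested it

def recommend(turn: ChatTurn, catalog: list[dict]) -> list[Recommendation]:
    """Toy stand-in for the AI layer: naive keyword + budget filter.
    The deployed system asks Gemini to interpret intent instead."""
    words = set(turn.message.lower().split())
    budget = turn.preferences.get("budget", float("inf"))
    hits = []
    for p in catalog:
        if p["price"] <= budget and words & set(p["name"].lower().split()):
            hits.append(Recommendation(p["id"], p["name"],
                                       f"matches '{turn.message}' within budget"))
    return hits
```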
How we built it
- Foundation – Deployed Google’s Online Boutique reference app on Google Kubernetes Engine (GKE), with 11+ independent microservices.
- AI Layer – Built a custom Gemini AI service to handle:
  - Natural language product queries
  - Personalized recommendations
  - Conversational context tracking
- Frontend Chat UI – Integrated a real-time chat interface where users interact with the AI assistant.
- Cloud-Native Deployment – Containerized all services, orchestrated via Kubernetes, ensuring autoscaling and fault isolation.
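The AI layer's core job is grounding Gemini's answers in the live product catalog. A hedged sketch of how such a service might assemble its prompt before calling the model — the function and field names are our illustration, not the project's code, and the actual Gemini call is shown only as a comment:

```python
def build_grounded_prompt(query: str, catalog: list[dict], history: list[str]) -> str:
    """Assemble a prompt that keeps the model's answer tied to real products."""
    catalog_lines = "\n".join(
        f"- {p['name']} (${p['price']}): {p['description']}" for p in catalog
    )
    history_block = "\n".join(history[-6:])  # last few turns for conversational context
    return (
        "You are a shopping assistant for an online boutique.\n"
        "Only recommend items from this catalog:\n"
        f"{catalog_lines}\n\n"
        f"Conversation so far:\n{history_block}\n\n"
        f"Customer: {query}\nAssistant:"
    )

# In the deployed service the prompt would then go to Gemini, e.g.:
#   import google.generativeai as genai
#   genai.configure(api_key=...)
#   reply = genai.GenerativeModel("gemini-1.5-flash").generate_content(prompt).text
```

Constraining the model to an explicit catalog listing is one common way to keep recommendations from drifting to products the store does not carry.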
Challenges we ran into
- Context Management – Keeping multi-turn conversations relevant to both user preferences and product catalog.
- Deployment – Debugging Kubernetes YAML, IAM roles, and ensuring smooth CI/CD pipelines.
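The context-management challenge above comes down to deciding what to carry between turns. One simple pattern — sketched here as an illustration, not the project's actual implementation — is a bounded window of recent turns plus long-lived preference slots that survive eviction:

```python
from collections import deque

class ConversationContext:
    """Keeps a sliding window of recent turns plus long-lived preference slots."""
    def __init__(self, max_turns: int = 8):
        self.turns = deque(maxlen=max_turns)  # only these go into the prompt
        self.preferences = {}                 # survives beyond the window

    def add_turn(self, role: str, text: str, extracted_prefs=None):
        self.turns.append((role, text))
        if extracted_prefs:                   # e.g. slots the LLM extracts per turn
            self.preferences.update(extracted_prefs)

    def render(self) -> str:
        prefs = ", ".join(f"{k}={v}" for k, v in self.preferences.items())
        lines = [f"Known preferences: {prefs or 'none'}"]
        lines += [f"{role}: {text}" for role, text in self.turns]
        return "\n".join(lines)
```

Old messages fall out of the window to keep prompts small, while distilled preferences (budget, color, style) persist — which is what keeps multi-turn conversations relevant to both the user and the catalog.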
Accomplishments that we're proud of
- Built a conversational shopping assistant that feels natural and engaging.
- Achieved scalable, resilient deployment on GKE, capable of handling production-level workloads.
- Created a solution that bridges AI innovation with enterprise-grade cloud systems.
What we learned
- How to bring LLMs and microservices together to solve real business problems.
- Why good observability (logs, metrics, tracing) is key when debugging a system with many moving parts.
- Ways to make AI responses faster without losing quality.
- That designing with an AI-first mindset can make online shopping feel much more natural and human.
What's next for BOTiqAI
- Multi-modal AI – Letting shoppers search with images (e.g., “show me shoes like this”).
- Smarter personalization – Learning more about each customer’s style and preferences over time.
- Enterprise-ready add-ons – Easy integrations with platforms like Shopify and Magento.
Built With
- Programming Languages: Go (Golang), C# (.NET 9.0), Python, Java, JavaScript
- Frameworks & Libraries: Flask, ASP.NET Core, Express.js, Node.js, Gorilla Mux
- AI & Machine Learning: Google Gemini AI, Google Generative AI
- Frontend Technologies: HTML5, CSS3, Go Templates, Responsive Design
- Communication & APIs: gRPC, HTTP/REST, Protocol Buffers, JSON, XML, CORS
- Databases & Caching: Redis, Google Cloud Spanner, PostgreSQL
- Cloud Platform & Services: Google Cloud Platform (GKE, Cloud Build, Cloud Trace, Cloud Profiler, AI Studio)
- Containerization & Orchestration: Docker, Docker Compose, Kubernetes, Helm, Kustomize
- Security & Configuration: Google Cloud Secret Manager, Environment Variables
- Development & Deployment Tools: Skaffold, Terraform, ngrok
- Observability & Monitoring: OpenTelemetry, Logrus (Go), Pino (Node.js)