Inspiration: Manual support triage is a massive bottleneck. We saw that support agents were solving the same technical problems repeatedly because they didn't have a fast way to search historical resolutions. We wanted to build a "long-term memory" for support teams using the native Elastic AI ecosystem.
What it does: SupportFlow-AI is an intelligent agent that:
- Analyzes incoming tickets for category, priority, and sentiment.
- Discovers solutions by using custom ES|QL tools to find similar "Resolved" tickets in Elasticsearch.
- Drafts responses for customers based on historical "Gold Standard" data.
- Provides analytics to managers using real-time aggregations of the support queue.
How we built it: We leveraged the Elasticsearch Serverless platform for high-performance data storage. Instead of a brittle custom backend, we used the Elastic Agent Builder to orchestrate the AI’s reasoning. We engineered two primary tools using ES|QL (sketched below):
- A similarity search tool using the QSTR function for text matching.
- A statistical tool using STATS to aggregate ticket volume.
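To give a feel for the two tools, here is a minimal sketch of the queries behind them. The index name (`support-tickets`) and field names (`status`, `category`, `priority`, `title`, `resolution`, `ticket_id`) are illustrative assumptions, not our actual schema.

```esql
// Similarity search tool (sketch): find previously resolved tickets whose text
// matches the incoming ticket. Index and field names are assumed for illustration.
FROM support-tickets
| WHERE QSTR("payment gateway timeout") AND status == "Resolved"
| KEEP ticket_id, title, resolution
| LIMIT 5
```

```esql
// Statistical tool (sketch): aggregate open ticket volume by category and priority
// so the agent can answer queue-analytics questions. Field names are assumed.
FROM support-tickets
| WHERE status == "Open"
| STATS ticket_count = COUNT(*) BY category, priority
| SORT ticket_count DESC
```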
Challenges we overcame: Our biggest challenge was pivoting from a traditional custom backend (NestJS) to a 100% native Elastic approach. We realized that by using the Agent Builder and ES|QL, we could achieve better performance, lower latency, and a much cleaner architecture than a manual LLM integration.

Accomplishments that we're proud of:
- Creating a "grounded" AI agent that doesn't hallucinate but bases every answer on real ticket data.
- Writing efficient ES|QL queries that handle both semantic search and data aggregation.
- Building a production-ready solution with zero custom middleware.
What we learned: We learned that the Elastic Agent Builder is incredibly powerful for RAG (Retrieval-Augmented Generation) workflows. We also deepened our knowledge of ES|QL, specifically how to use it for complex multi-field searches.
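As an example of what we mean by a multi-field search: QSTR accepts Lucene query string syntax, so a single tool query can target several fields at once. The field names below (`title`, `description`) are assumptions for the sake of the sketch.

```esql
// Multi-field search (sketch): match either the title or the description of
// resolved tickets. Index and field names are illustrative assumptions.
FROM support-tickets
| WHERE QSTR("title:(login AND timeout) OR description:(login AND timeout)") AND status == "Resolved"
| KEEP ticket_id, title, resolution
| LIMIT 5
```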
Built With
- artificial-intelligence
- database
- elasticsearch
- es|ql
- kibana
- search
- vector