K-AI - Conversational Analytics Assistant

One chat for sourced answers and conversational analytics

Owned initiative

Summary

K-AI gives teams one place for sourced answers from internal docs and the dashboard catalogue, plus conversational analytics when the question is really about numbers. Routing is tuned so people land on trustworthy explores, long answers run reliably behind corporate proxies, and configuration lets analytics owners tighten behaviour before wider rollout.

Context

Internal / Enablement

What I built

Delivered a production chat assistant on the company analytics website. For general questions it searches internal documentation and dashboard catalogue material and answers with citations; when the question is about numbers or metrics, it hands off into Looker's conversational analytics in the right context. The assistant combines keyword and meaning-based search, AI-powered routing when a question could belong to more than one business area, and careful matching so users do not land on the wrong dashboard. Long answers run as background jobs with polling so sessions stay reliable behind corporate proxies. It uses standard cloud APIs for orchestration and optional storage for job history, and was rolled out in stages with configuration so behaviour could be tightened over time.
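The submit-then-poll pattern for long answers can be sketched as below. This is a minimal illustrative version, not the production code: the names (`JobStore`, `submit`, `poll`, `slow_answer`) are hypothetical, the job registry is in-memory where the real service optionally persists job history, and the slow LLM call is simulated with a sleep.

```python
# Sketch of submit-then-poll for long-running answers: the client gets a
# job id immediately and polls for status, so no single HTTP request stays
# open long enough to be killed by a corporate proxy timeout.
import threading
import time
import uuid


class JobStore:
    """In-memory job registry; the real service could persist to BigQuery."""

    def __init__(self):
        self._jobs = {}
        self._lock = threading.Lock()

    def submit(self, fn, *args):
        """Start fn in the background and return a job id straight away."""
        job_id = uuid.uuid4().hex
        with self._lock:
            self._jobs[job_id] = {"status": "running", "result": None}

        def run():
            result = fn(*args)
            with self._lock:
                self._jobs[job_id].update(status="done", result=result)

        threading.Thread(target=run, daemon=True).start()
        return job_id

    def poll(self, job_id):
        """Return a snapshot of the job's current status and result."""
        with self._lock:
            return dict(self._jobs[job_id])


def slow_answer(question):
    time.sleep(0.1)  # stand-in for a long LLM / retrieval call
    return f"answer to: {question}"


store = JobStore()
jid = store.submit(slow_answer, "why did throughput dip last week?")
while store.poll(jid)["status"] != "done":
    time.sleep(0.02)  # the web client would poll a status endpoint instead
print(store.poll(jid)["result"])
```

In the deployed assistant the submit and poll calls would be HTTP endpoints rather than direct method calls, but the shape of the pattern is the same.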

Skills demonstrated

  • Built ingestion and embedding pipelines for documentation, Looker-derived metadata, and curated knowledge into LanceDB
  • Implemented background jobs with polling, plus optional BigQuery persistence of job history, for long-running assistant responses
  • Integrated embedded Looker Conversational Analytics alongside knowledge cards on the company analytics portal
  • Implemented source citations and traceability from chat responses back to catalogue and doc chunks
  • Delivered FastAPI orchestration with modular Looker conversational and routing components
  • Implemented BigQuery-backed chat history and preferences with scalable patterns
  • Used Vertex AI (Gemini) for routing, improved questions, and answers with hybrid LanceDB retrieval
  • Applied explore-first planning, stream-aware catalogue filtering, and metadata-driven agent selection for Looker CA
  • Leveraged BigQuery snapshots for agents, explores, fields, optional field usage weighting, and dashboard elements
  • Structured indexed knowledge for fast hybrid retrieval across sources
  • Unified natural-language access so users cite knowledge or open the right CA thread instead of guessing explores
  • Aligned routing behaviour with operational domains (for example warehouse versus last mile) through clarification and gates
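The hybrid retrieval and stream-aware routing bullets above can be illustrated with a toy ranker that blends a keyword score with vector similarity and reads the business stream off the top hit. This is a sketch under assumptions: real embeddings and the LanceDB search call are replaced with hand-written vectors, and `hybrid_rank` and the `stream` field are illustrative names, not the production schema.

```python
# Toy hybrid retrieval: score = alpha * keyword overlap + (1 - alpha) *
# cosine similarity, then use the top hit's stream tag (e.g. warehouse vs
# last mile) to gate which conversational-analytics agent is selected.
import math


def keyword_score(query, doc):
    """Fraction of query terms that appear in the document text."""
    q = set(query.lower().split())
    d = set(doc["text"].lower().split())
    return len(q & d) / max(len(q), 1)


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def hybrid_rank(query, query_vec, docs, alpha=0.5):
    """Rank docs by a blend of keyword and vector similarity scores."""
    scored = [
        (alpha * keyword_score(query, d) + (1 - alpha) * cosine(query_vec, d["vec"]), d)
        for d in docs
    ]
    return [d for _, d in sorted(scored, key=lambda s: -s[0])]


docs = [
    {"text": "warehouse pick rate dashboard", "vec": [1.0, 0.1], "stream": "warehouse"},
    {"text": "last mile delivery SLA docs", "vec": [0.1, 1.0], "stream": "last_mile"},
]
top = hybrid_rank("warehouse pick rate", [0.9, 0.2], docs)[0]
print(top["stream"])  # the routing gate would pick the warehouse CA agent
```

In the real system the vector side is served by LanceDB over embedded documentation and Looker-derived metadata, and an LLM router handles the ambiguous cases this simple blend cannot separate.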

Impact

  • One assistant for sourced answers from docs and conversational analytics for metric questions
  • Question routing tuned so people hit the right topic and avoid misleading chart picks
  • Rollout supported by config and pipelines to load docs and internal knowledge
  • Iteration with analytics and portal owners before wider release

Technologies

FastAPI · Vertex AI · LanceDB · BigQuery · Looker API · Next.js · GCP