Knowledge Assistant Analytics Dashboards: KPI Guide

Get Knowledge Assistant

Do you want access to Knowledge Assistant? Contact your Zingtree Account Manager or our Support Team.

This guide explains every metric across the Knowledge Assistant Analytics dashboards—what it measures, why it matters, and what good performance looks like.

 

How to Read This Guide

This guide is designed for business stakeholders. Each dashboard section explains:

  • What the dashboard is used for
  • The key question it answers
  • A breakdown of each metric (KPI), including targets and filters

Accessing the Knowledge Assistant Dashboards

If you have configured a Knowledge Assistant and it has been used, you can access the dashboards from Knowledge Assistant > Overview:

Data Freshness

| Freshness Label | Meaning |
|---|---|
| Real-time | Reflects events within seconds of occurring |
| Updated hourly | Refreshed approximately once per hour |

Filters

| Filter | Description | Default |
|---|---|---|
| Date range | Restrict results to a calendar date range | Last 7 days |
| Intent | Focus on a specific user intent/topic | All intents |
| Assistant | Focus on a specific Knowledge Assistant | All assistants |
| AI Model | Filter by AI provider/model | All models |

Note: Date filters operate at the day level (not time-of-day). Any time selection is ignored.
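To illustrate the day-level behavior described in the note, the filter can be thought of as truncating each timestamp to its calendar date before comparing. A minimal sketch (the function and variable names are hypothetical, not part of the product):

```python
from datetime import date, datetime

def in_date_range(ts, start, end):
    """Day-level filter: any time-of-day component of ts is ignored."""
    return start <= ts.date() <= end

# A session at 23:59 on the last day of the range still matches.
late_evening = datetime(2024, 6, 9, 23, 59)
print(in_date_range(late_evening, date(2024, 6, 3), date(2024, 6, 9)))  # True
```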


Downloading the Filtered Data

Knowledge Assistant Analytics can be downloaded as a PDF file from the top-right of the Dashboards:

 

Dashboard 1: Executive Summary

Purpose: High-level operating health
Key Question: How is the AI platform performing overall?

| Card | KPI(s) | What It Measures | Target / Benchmark | Data Freshness | Filters |
|---|---|---|---|---|---|
| Total Conversations | Total CAI Sessions | Total AI-assisted customer conversations handled. Indicates adoption and volume trends. | >5% MoM growth | Updated hourly | Date, Org |
| Total KA Sessions | Total Knowledge Assistant Sessions | Number of self-service answers delivered. Proxy for deflection (no human agent needed). | ~30% of conversations | Updated hourly | Date, Org |
| Authenticated Users | Authenticated Users | Number of identified (logged-in) users interacting with the AI. | Growing over time | Updated hourly | Date, Org |
| Anonymous Sessions | Anonymous Sessions | Sessions with no identifiable user. Used alongside authenticated users to estimate reach. | Declining proportion over time | Updated hourly | Date, Org |
| Daily Sessions Trend | CAI, KA, Contained sessions | Day-by-day trends across volume, Knowledge Assistant usage, and containment. | Upward trend; ≥70% containment | Updated hourly | Date, Org |
| Containment Rate | Overall, KA-assisted, non-KA | % of conversations fully resolved by AI without human handoff. | >70% | Updated hourly | Date, Org |
| Organization Performance | Sessions, containment rate | Breakdown by Organization to identify high and low performers. | >70% across all Organizations | Updated hourly | Date, Org |
| AI Error Rate Trend | Error rate % by channel | % of AI calls that returned errors (e.g. throttling or outages). | <1% per channel | Updated hourly | Date, Org |
| KA Containment Impact | KA-assisted vs baseline | Measures whether Knowledge Assistant improves containment vs a no-KA baseline. | KA-assisted > non-KA | Updated hourly | Date, Org |
| Session KPIs by Date | Full KPI table | Detailed table for daily analysis and root-cause investigation. | N/A | Updated hourly | Date, Org |
| Weekly KPI Comparison | WoW % change | Compares the last 7 days with the previous 7 days to identify trends. | Positive growth; declining anonymous share | Updated hourly | Org |
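To make the Containment Rate and Weekly KPI Comparison calculations concrete, here is a minimal Python sketch over hypothetical daily KPI rows. All names and numbers are illustrative, not the product's internals:

```python
from datetime import date, timedelta

# Hypothetical daily KPI rows, in the spirit of the "Session KPIs by Date" table.
daily_kpis = [
    {"day": date(2024, 6, 1) + timedelta(days=i),
     "sessions": 100 + 5 * i,   # total CAI sessions that day
     "contained": 75 + 4 * i}   # sessions resolved without a human agent
    for i in range(14)
]

def containment_rate(rows):
    """Containment Rate = contained sessions / total sessions."""
    total = sum(r["sessions"] for r in rows)
    return sum(r["contained"] for r in rows) / total if total else 0.0

def wow_change(rows, key):
    """Week-over-week % change: last 7 days vs the previous 7 days."""
    prev_total = sum(r[key] for r in rows[-14:-7])
    last_total = sum(r[key] for r in rows[-7:])
    return (last_total - prev_total) / prev_total * 100

print(f"Containment rate: {containment_rate(daily_kpis):.1%}")      # compare to the >70% target
print(f"Sessions WoW: {wow_change(daily_kpis, 'sessions'):+.1f}%")
```

In this made-up data the containment rate clears the >70% benchmark and session volume grows week over week, which is the "good" pattern the Executive Summary dashboard looks for.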

 

Dashboard 2: Intent Performance

Purpose: Evaluates the AI’s ability to understand and classify user requests
Key Question: Does the AI understand what customers are asking?

| Card | KPI(s) | What It Measures | Target / Benchmark | Data Freshness | Filters |
|---|---|---|---|---|---|
| Top Intents by Volume | Recognition rate, satisfaction, low-confidence count | Most common requests and how well they are handled. | >85% recognition | Updated hourly | Date, Org |
| Recognition Rate Trend | % over time | Tracks whether understanding is improving or declining. | ≥85% stable | Updated hourly | Date, Intent, Org |
| Confidence Score Distribution | HIGH / CLARIFY / LOW | How confident the AI is when classifying requests. | HIGH >60%; LOW <10% | Real-time | Date, Intent, Org |
| Response Time by Intent | Avg and p95 time | Speed of classification per intent. | p95 < 2,500 ms | Updated hourly | Date, Intent, Org |
| Intent Satisfaction Rate | Positive feedback % | User-perceived answer quality per intent. | 90% positive | Updated hourly | Date, Intent, Org |
| Hourly Activity Pattern | Volume by hour | When users are most active (capacity planning). | N/A | Real-time | Date, Intent, Org |
| Model Performance Comparison | Accuracy, confidence, response time | Compares AI models to identify the best performer. | Highest accuracy; lowest latency | Real-time | Date, Intent, Org |
| Active Intent Count | Distinct intents used | How many configured intents are actively used. | ~80% active | Updated hourly | Date, Org |
| Intent Category Distribution | % by category | Distribution of traffic across business domains. | N/A | Updated hourly | Date, Org |
| Intent Confirmation Funnel | Auto-confirm vs explicit confirm | Measures friction in interactions. | Auto >50%; explicit <30% | Updated hourly | Date, Intent, Org |
| AI Error Rate Over Time | Error % | Tracks classification stability. | <1% | Updated hourly | Date, Org |
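The Confidence Score Distribution card groups each classification into a Decision Band. The sketch below illustrates the idea with assumed thresholds and an assumed recognition-rate proxy; the actual band boundaries and definitions are internal to the product:

```python
from collections import Counter

# Thresholds below are assumptions for illustration only.
def decision_band(confidence):
    """Map a classifier confidence score in [0, 1] to a Decision Band."""
    if confidence >= 0.80:
        return "HIGH"       # answer directly
    if confidence >= 0.50:
        return "CLARIFY"    # ask a clarifying question first
    if confidence >= 0.20:
        return "LOW"        # low-confidence classification
    return "FALLBACK"       # hand off or give a generic response

scores = [0.95, 0.85, 0.70, 0.40, 0.10, 0.90]   # illustrative scores
bands = [decision_band(s) for s in scores]
distribution = Counter(bands)

# One possible proxy for recognition rate: share of requests not in FALLBACK.
recognized = sum(b != "FALLBACK" for b in bands) / len(bands)
```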

 

Dashboard 3: Knowledge Assistant

Purpose: Measures quality, speed, and reliability of AI-powered self-service answers
Key Question: Is the Knowledge Assistant providing good, fast answers?

| Card | KPI(s) | What It Measures | Target / Benchmark | Data Freshness | Filters |
|---|---|---|---|---|---|
| Assistant Performance Overview | Queries, answer rate, error rate, response time, satisfaction | Overall health and performance of each assistant. | >99% answer; <1% error; >85% satisfaction | Updated hourly | Date, Org, Assistant |
| Daily Assistant Trends | Volume, answered vs failed | Reliability trends over time. | ≥99% answered; failures near zero | Updated hourly | Date, Org, Assistant |
| KA Session Detail | Session ID, turns, duration, status | Detailed session-level troubleshooting and QA view. | N/A | Real-time | Date, Org, Assistant |
| Response Time by Assistant | Avg and p95 time | Speed of answer generation. | p95 < 3,500 ms | Updated hourly | Date, Org, Assistant |
| Model Usage in KA | Volume and latency per model | Which models are used and how they perform. | p95 < 3,500 ms | Real-time | Date, Org, Assistant |
| Top Queries by Assistant | Most frequent queries | Identifies user demand and content gaps. | N/A | Real-time | Date, Org, Assistant |
| Avg Turns per Session | Average turns | Interaction efficiency. | <3 turns | Real-time | Date, Org, Assistant |
| p95 Turns per Session | 95th percentile | Worst-case interaction complexity. | <6 turns | Real-time | Date, Org, Assistant |
| Avg Session Duration | Seconds | Total time to resolution. | <120 seconds | Real-time | Date, Org, Assistant |
| Distinct Assistants Used | Count | How many assistants receive traffic. | All configured assistants active | Real-time | Date, Org |
| AI Error Breakdown | Error type, endpoint, model | Breakdown of failures for troubleshooting. | <1% | Real-time | Date, Org |
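Several targets above are expressed as p95 values: the response time (or turn count) at or below which 95% of observations fall. A minimal nearest-rank implementation, using made-up latency samples:

```python
import math

def p95(values):
    """95th percentile (nearest-rank method): the value at or below
    which 95% of the observations fall."""
    ordered = sorted(values)
    rank = math.ceil(0.95 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Hypothetical per-query response times in milliseconds.
latencies_ms = [420, 810, 950, 1200, 1500, 1700, 2100, 2600, 3100, 4000]
print(p95(latencies_ms))  # → 4000, which would miss the < 3,500 ms target
```

Note that with only 10 samples the nearest-rank p95 is simply the maximum; a production dashboard would compute it over a much larger sample, and possibly with an interpolating percentile method.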

 

Glossary

| Term | Definition |
|---|---|
| Session | A complete interaction (conversation or Knowledge Assistant) |
| Turn | One user message and one AI response |
| Channel | Where AI is used (Conversations, Knowledge Assistant, Shared) |
| Containment | Resolution without a human agent |
| Containment Rate | % of sessions resolved by AI (>70% target) |
| KA Containment Lift | Improvement in containment vs the baseline without KA |
| Authenticated Users | Logged-in users |
| Anonymous Sessions | Sessions with no identifiable user |
| Total Reach | Authenticated users + anonymous sessions |
| Decision Band | Confidence level (HIGH, CLARIFY, LOW, FALLBACK) |
| Auto-confirm | AI acts without user confirmation |
| Explicit Confirm | User confirmation required before action |
| Answer Rate | % of queries answered |
| KA Error Rate | % of failed Knowledge Assistant queries |
| AI Error Rate | % of failed AI API calls |
| p95 Response Time | Response time at or below which 95% of requests complete |
| Week-over-Week (WoW) | Comparison of the current 7 days vs the previous 7 days |
| Recognition Rate | % of requests correctly classified |
| Satisfaction Rate | % of positive feedback |
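Two of the glossary formulas, Total Reach and KA Containment Lift, can be written out directly. All input numbers below are illustrative, not real data:

```python
# Total Reach = authenticated users + anonymous sessions.
authenticated_users = 1200
anonymous_sessions = 300
total_reach = authenticated_users + anonymous_sessions

# KA Containment Lift: KA-assisted containment vs the baseline without KA.
ka_containment = 0.78        # contained share of KA-assisted sessions
baseline_containment = 0.64  # contained share of sessions without KA
lift_pp = (ka_containment - baseline_containment) * 100  # lift in percentage points
```

A positive lift (here 14 percentage points) is what the "KA Containment Impact" card on the Executive Summary dashboard is checking for.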
