Analyze dashboards with AI: summary, anomalies, what to check next

Baseline Apache Superset has no native AI analysis layer — analysts read each chart by eye and wire up insights themselves. Our fork adds an Analyze Data with AI button to the dashboard header that opens a side panel: a structured read of what the dashboard is showing right now, what looks normal, what looks unusual, and what to look at next — plus follow-up questions and conversation history.

Side-by-side

See the difference

This capability has no Apache Superset 4.1 equivalent — there's nothing to slide against. Open the fork screenshot for a pixel-level look.

Guided tour

How the AI Analysis panel is built

Hover or tap a pin to see how each block of the panel maps to a real analyst question. The output is deliberately structured — Summary, What is happening, What looks unusual, What to check next — so the panel reads like a briefing, not a chatbot transcript.

Entry point

Where AI analysis lives in the dashboard header

The trigger is a first-class header action — not a hidden command, not a chatbot bubble. Owners and analysts see the same ANALYZE DATA WITH AI button next to Share, in the same row as the dashboard's other top-level actions. The scope is implicit: the current dashboard, current filters, current time window.
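To make the integration concrete, here is a minimal sketch of what a first-class header action like this can look like in Superset's React/TypeScript frontend. Every name below (AnalyzeWithAiButton, AnalysisScope, onOpenPanel) is an illustrative assumption, not the fork's actual identifier:

```tsx
// Minimal sketch of a first-class header action. All names here
// (AnalyzeWithAiButton, AnalysisScope, onOpenPanel) are illustrative
// assumptions, not the fork's actual identifiers.
import React from 'react';

interface AnalysisScope {
  dashboardId: number;
  filters: Record<string, unknown>; // the filters currently applied
  timeRange: string;                // the time window currently in view
}

interface AnalyzeWithAiButtonProps {
  scope: AnalysisScope;
  onOpenPanel: (scope: AnalysisScope) => void;
}

// Rendered in the same header cluster as Share, so the scope is
// implicit: whatever dashboard state the user is looking at.
const AnalyzeWithAiButton: React.FC<AnalyzeWithAiButtonProps> = ({
  scope,
  onOpenPanel,
}) => (
  <button type="button" onClick={() => onOpenPanel(scope)}>
    ANALYZE DATA WITH AI
  </button>
);

export default AnalyzeWithAiButton;
```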

Inside the header

How the AI trigger sits in the dashboard

Each annotation explains how the trigger integrates with the existing dashboard header. The button is intentionally promoted into the same cluster as Share so analysts treat it as part of the workflow, not a separate tool.

Context

Why this matters

Clicking ANALYZE DATA WITH AI in the dashboard header opens an AI Analysis side panel scoped to the current dashboard, the active filters, and the current time window. The model reads the same data the user is looking at — no SQL, no copy-paste into ChatGPT, no separate dashboard for AI summaries. It produces a four-block answer: Summary, What is happening, What looks unusual, and What to check next, each grounded in concrete numbers from the visible charts.
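A rough sketch of the request such a panel could send, in TypeScript. The endpoint path and field names are assumptions for illustration, not the fork's real API:

```typescript
// Hypothetical request shape: the panel sends only what the user
// already sees. Endpoint path and field names are assumptions.
interface AnalyzeDashboardRequest {
  dashboardId: number;
  filters: Record<string, unknown>; // active native and cross-filters
  timeRange: string;                // e.g. "Last 30 days"
  chartIds: number[];               // visible charts the answer must cite
}

async function requestAnalysis(
  req: AnalyzeDashboardRequest,
): Promise<unknown> {
  const res = await fetch(
    `/api/v1/dashboard/${req.dashboardId}/ai/analyze`, // illustrative path
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(req),
    },
  );
  return res.json();
}
```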

The Summary block is the headline a stakeholder would skim first — week-over-week movement, dominant categories, share-of-mix shifts. What is happening describes behavior across the visible time window: which segment is driving volume, where unit economics are stable, where they're drifting. What looks unusual is the anomaly layer: a metric that broke its band, a conversion drop despite rising leads, a center whose utilization fell out of its normal range. What to check next turns the analysis into a numbered checklist of follow-up cuts — lead sources, SLA breakdowns, cohort splits — so the next dashboard or query is obvious.
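The fixed four-block structure lends itself to a typed response. A hypothetical schema, with field names of our choosing that simply mirror the panel's headings:

```typescript
// Hypothetical response schema mirroring the four fixed blocks.
// Field names are illustrative, chosen to match the panel's headings.
interface AiAnalysis {
  summary: string;            // headline movement a stakeholder skims first
  whatIsHappening: string;    // behavior across the visible time window
  whatLooksUnusual: string[]; // one anomaly per entry
  whatToCheckNext: string[];  // numbered checklist of follow-up cuts
}
```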

Below the analysis, the panel offers one-click follow-up chips that jump to related dashboards (Margin by category, Sales by channel, Suppliers and procurement) and a free-form input for ad-hoc questions. The related dashboards aren't picked by hand or by tags — they come from the AI dashboard search engine (a separate Enterprise feature, AI search and descriptions, listed in the related section below): the same index that powers portal-wide search runs against the current analysis and surfaces neighboring boards by meaning, not by title. A HISTORY button at the top of the panel keeps the running conversation per user and per dashboard, so analysts can return to a previous AI session instead of re-prompting from scratch. The whole flow is a first-class part of the dashboard — not a chatbot bolted onto the side of the product.
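Two of those affordances are easy to sketch. The key format and the search endpoint below are illustrative assumptions, not the fork's actual implementation:

```typescript
// Illustrative sketches of the two panel affordances; key format and
// endpoint path are assumptions, not the fork's implementation.

// Conversation history is kept per user and per dashboard, so
// reopening the panel restores that user's previous session.
function historyKey(userId: number, dashboardId: number): string {
  return `ai-analysis:${userId}:${dashboardId}`;
}

// Related dashboards come from the semantic search index, queried
// with the text of the current analysis rather than tags or titles.
interface RelatedDashboard {
  id: number;
  title: string;
  score: number; // semantic similarity to the current analysis
}

async function fetchRelatedDashboards(
  analysisText: string,
): Promise<RelatedDashboard[]> {
  const res = await fetch('/api/v1/ai/search/dashboards', { // illustrative
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: analysisText }),
  });
  return res.json();
}
```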

Related features

Capabilities that usually ship alongside this one. Package tags tell you where each feature lives in our delivery plans.