The Analytics section gives you a unified view of how your brand performs across AI platforms. It is organized into four tabs — Visibility, Mentions, Sources, and Sentiment — each focusing on a different dimension of your AI presence. A shared filter bar at the top of the page applies to all four tabs, so you can slice the data consistently as you explore. All Analytics data is derived from Mentionpath’s prompt runs: the periodic queries your tracked prompts send to AI platforms across your configured locations.

Shared filters

The filter bar at the top of the Analytics page is persistent across tab switches. Changes you make in one tab carry over when you switch to another.
Date range — Choose from preset windows: 7 days, 14 days, 30 days, 90 days, or 1 year, or use the Custom date picker for a custom range. The comparison label shown on KPI cards (e.g. “vs. previous 30 days”) reflects the selected window.

Visibility tab

Visibility measures how often your brand appears in AI responses as a percentage of total prompt runs. A visibility rate of 40% means your brand was mentioned in 40 out of every 100 AI responses for your tracked prompts.

KPI strip — Six headline metrics at the top:
AI Visibility — Percentage of runs where your brand was mentioned
Share of Voice — Your brand’s share of all brand mentions across responses
Rec Rate — How often AI platforms actively recommend your brand, not just mention it
Citation Rate — How often AI responses include a direct link to your domain
Avg. Position — Where your brand tends to appear in responses (lower is better)
Mention Quality — Composite score reflecting position, citation rate, and recommendation quality
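The two headline rates are simple ratios over prompt runs. As a minimal sketch, assuming each run is a record listing the brands mentioned in that AI response (a hypothetical shape, not Mentionpath’s actual data model):

```python
# Sketch of the AI Visibility and Share of Voice KPIs.
# The run record shape here is an illustrative assumption.

def ai_visibility(runs, brand):
    """Percentage of runs where the brand was mentioned."""
    if not runs:
        return 0.0
    hits = sum(1 for run in runs if brand in run["brands"])
    return 100.0 * hits / len(runs)

def share_of_voice(runs, brand):
    """Brand's share of all brand mentions across responses."""
    total = sum(len(run["brands"]) for run in runs)
    if total == 0:
        return 0.0
    ours = sum(1 for run in runs for b in run["brands"] if b == brand)
    return 100.0 * ours / total

runs = [
    {"brands": ["Acme", "Rival"]},
    {"brands": ["Rival"]},
    {"brands": ["Acme"]},
    {"brands": []},
]
print(ai_visibility(runs, "Acme"))   # 50.0 (mentioned in 2 of 4 runs)
print(share_of_voice(runs, "Acme"))  # 50.0 (2 of 4 total brand mentions)
```

Note the two numbers can diverge: a run that mentions many competitors raises the denominator for Share of Voice but counts only once for Visibility.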
Insights strip — Auto-generated callouts highlight your best-performing platform, biggest location gain, top-driving prompt, and most cited source type.
Visibility over time chart — A line chart showing your visibility rate and competitor rates over the selected date range. Use this to spot trends, identify drops, and correlate changes with events.
Brand Rankings — A ranked table of all brands appearing in AI responses for your tracked prompts. Click any competitor row to open a detail sheet with their visibility trend, platform breakdown, and prompt-level performance.
Platform comparison — A compact table showing visibility rate by AI platform. Use this to identify platforms where you are strongest or where you need to improve.
Share of Voice chart — A horizontal bar chart comparing your brand’s share of all mentions against competitors. The accompanying SoV ranking table provides the same data in sortable tabular form.
Source Domains — A table of the domains most frequently cited by AI platforms in your tracked prompts. Click a source domain to see which specific URLs from that domain are being cited.
Source Types donut — A breakdown of cited sources by content type (blog, documentation, news, etc.).
Top Prompts — The prompts driving the most visibility for your brand, ranked by visibility rate.

Mentions tab

The Mentions tab tracks the raw count of brand mentions in AI responses, alongside citation counts and position data.

KPI strip — Five headline metrics:
Total Mentions — Total runs where your brand appeared
Avg. Position — Average position in responses (lower is better)
Total Citations — Total times your domain was cited as a source
Citation Rate — Citations as a percentage of runs
Citation Coverage — Percentage of prompts where your brand was cited at least once
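Citation Rate and Citation Coverage answer different questions: the rate counts runs, the coverage counts prompts. A minimal sketch, assuming each run records its prompt and whether your domain was cited (field names are illustrative, not Mentionpath’s schema):

```python
# Sketch distinguishing Citation Rate (per run) from
# Citation Coverage (per prompt). Record shape is assumed.

def citation_rate(runs):
    """Citations as a percentage of runs."""
    if not runs:
        return 0.0
    cited = sum(1 for r in runs if r["cited"])
    return 100.0 * cited / len(runs)

def citation_coverage(runs):
    """Percentage of prompts where the brand was cited at least once."""
    prompts = {r["prompt"] for r in runs}
    if not prompts:
        return 0.0
    covered = {r["prompt"] for r in runs if r["cited"]}
    return 100.0 * len(covered) / len(prompts)

runs = [
    {"prompt": "best crm tools", "cited": True},
    {"prompt": "best crm tools", "cited": False},
    {"prompt": "crm pricing", "cited": False},
    {"prompt": "crm reviews", "cited": False},
]
print(citation_rate(runs))      # 25.0 (1 of 4 runs)
print(citation_coverage(runs))  # 33.33... (1 of 3 distinct prompts)
```

A prompt that is cited in only one of its many runs still counts fully toward coverage, which is why coverage can sit well above the rate.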
Mentions over time and Citations over time — Side-by-side line charts showing how mention and citation counts trend over the selected period, with competitor overlays.
Position Distribution — A chart showing where your brand and competitors tend to appear in AI responses (first mention, second, third, etc.). Click a brand row to open a drill-down sheet.
Platform performance — A summary card and detail table showing mention rates and citation counts broken down by AI provider.
Co-occurrence — Which competitor brands most frequently appear alongside yours in the same AI responses. Click a co-occurrence row to see shared prompts and platform breakdown.
New vs. Lost — A card tracking prompts where your brand newly gained or lost mentions and citations compared to the previous period. Click a prompt pill to see the platform-level breakdown.
Topics performance — Mention and citation rates broken down by topic. Each topic card shows your top performing and underperforming prompts, plus the top competitor brands in that topic.
Mentions and Citations table — A full list of your tracked prompts with mention rate, mention count, citation count, and position for each. Toggle between All prompts and Prompts with citations. Sort by any column, search by title, and enable gap analysis to highlight prompts where competitors are mentioned but you are not.

Sources tab

The Sources tab focuses specifically on which URLs from your domain — and from competitor domains — are being cited by AI platforms.

KPI strip — Five headline metrics:
Total Sources — Times your pages were cited as a source
Source Share — Your citations as a percentage of all citations across tracked responses
Unique Pages Cited — Number of distinct pages from your domain cited
Avg. Position — Average position of your cited pages in source lists
Platform Reach — How many of your tracked AI platforms cited your domain
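Source Share and Unique Pages Cited both derive from the list of cited URLs; the first is a ratio over all citation events, the second deduplicates by page. A minimal sketch, assuming citations arrive as full URLs (a hypothetical input shape):

```python
# Sketch of the Source Share and Unique Pages Cited KPIs,
# assuming each citation event is a full URL string.
from urllib.parse import urlparse

def source_share(citations, domain):
    """Your citations as a percentage of all citations."""
    if not citations:
        return 0.0
    ours = sum(1 for url in citations if urlparse(url).netloc == domain)
    return 100.0 * ours / len(citations)

def unique_pages_cited(citations, domain):
    """Number of distinct pages from your domain that were cited."""
    return len({url for url in citations if urlparse(url).netloc == domain})

cites = [
    "https://example.com/pricing",
    "https://example.com/pricing",
    "https://example.com/blog/guide",
    "https://rival.com/docs",
]
print(source_share(cites, "example.com"))        # 75.0 (3 of 4 citations)
print(unique_pages_cited(cites, "example.com"))  # 2 (pricing, blog/guide)
```

Repeated citations of the same page inflate Source Share but not Unique Pages Cited, so comparing the two hints at whether your citation volume is broad or concentrated.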
Sources over time chart — Citation volume over the selected date range, with a trend line for context.
New vs. Lost Sources — Which specific URLs newly appeared in or disappeared from AI citations during the period. Click a URL to open its detail sheet.
Brand Sources table — Your top cited pages, sortable by citation count. Search by URL, enable gap analysis to highlight pages cited for competitors but not for you, and click any row to open the URL detail sheet.
Full Source List — Toggle between a domain-level view (which domains are being cited most) and a URL-level view (specific pages). Both lists support infinite scroll and sortable columns.
Source Types donut — A breakdown of cited source types to understand what content formats AI platforms prefer to cite.

Sentiment tab

The Sentiment tab analyzes the tone and quality of AI responses about your brand, going beyond whether you are mentioned to assess how you are described.

KPI strip — Four headline metrics:
Sentiment — Average sentiment score (0–100) across all mentions
Recommendation — How strongly AI platforms endorse your brand (0–100)
Salience — How central your brand is in responses (0–100)
Positive Share — Percentage of runs classified as positive sentiment
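The Sentiment score averages a continuous 0–100 value, while Positive Share counts discrete labels. A minimal sketch of that distinction, assuming each mention carries both a score and a positive/neutral/negative classification (field names and record shape are illustrative assumptions):

```python
# Sketch of the Sentiment and Positive Share KPIs.
# The mention record shape here is an assumption, not Mentionpath's schema.

def avg_sentiment(mentions):
    """Average sentiment score (0-100) across all mentions."""
    if not mentions:
        return 0.0
    return sum(m["score"] for m in mentions) / len(mentions)

def positive_share(mentions):
    """Percentage of mentions classified as positive."""
    if not mentions:
        return 0.0
    pos = sum(1 for m in mentions if m["label"] == "positive")
    return 100.0 * pos / len(mentions)

mentions = [
    {"score": 80, "label": "positive"},
    {"score": 70, "label": "positive"},
    {"score": 50, "label": "neutral"},
    {"score": 20, "label": "negative"},
]
print(avg_sentiment(mentions))   # 55.0
print(positive_share(mentions))  # 50.0
```

A single strongly negative response can drag the average score down noticeably while barely moving Positive Share, which is why the tab reports both.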
Sentiment distribution — A horizontal bar showing the breakdown of positive, neutral, and negative responses. The best and worst performing platforms are highlighted below the bar, each clickable for a full platform drill-down.
Sentiment over time chart — Your sentiment score trend over the selected date range, with competitor overlays.
Platform comparison — A list of AI platforms sorted by sentiment score, with strengths and weaknesses extracted from their responses. Click a platform to see prompt-level sentiment, the platform’s sentiment trend, and competitor comparisons.
Head-to-Head comparison — A table comparing your brand’s sentiment, recommendation score, visibility, and position against all tracked competitors. Click a competitor row to open a detail sheet with evidence, trends, and a prompt-level breakdown.
Topic Analysis — Sentiment broken down by topic. Each topic card shows sentiment, recommendation, and salience scores, your top prompts within that topic, and the top competitors in that space.
Brand Facts — AI-extracted factual claims about your brand from across all responses. Review each claim and mark it as correct, incorrect (providing the correction), or ignored. Corrections you submit feed back into Mentionpath’s monitoring to flag future inaccuracies.
Use the Sentiment tab’s Brand Facts section regularly to catch and correct inaccurate claims that AI platforms are repeating about your brand. Marking a claim as incorrect and providing the corrected text helps Mentionpath surface related opportunities to fix the underlying content.