Can Scrunch show what sources are being cited by AI models in their responses?

Yes. Scrunch shows exactly which sources AI models cite in their responses—including your brand, competitors, and third-party sites—via its Citations feature. Each time an AI platform (e.g., ChatGPT, Gemini, Perplexity, Claude, Google AI Overviews) answers a tracked prompt, Scrunch records the webpages referenced and analyzes them.

In the Citations tab, you can see citation share by owner (your brand, competitors, third parties) and top-cited domains. For each source (filterable by domain or URL), Scrunch reports:
- Brand or competitor mentions
- Brand-relevant topic coverage
- Unique prompt count
- Total citations
- Citation consistency rate
- Influence Score (percentage of AI responses that cited the source × unique number of prompts)
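The Influence Score formula above can be illustrated with a small arithmetic sketch. The function name and inputs here are hypothetical conveniences, not Scrunch's actual implementation; only the formula (citation rate × unique prompt count) comes from the description above.

```python
def influence_score(cited_responses: int, total_responses: int, unique_prompts: int) -> float:
    """Fraction of AI responses that cited the source, weighted by prompt breadth."""
    citation_rate = cited_responses / total_responses  # e.g., 40 of 100 responses -> 0.4
    return citation_rate * unique_prompts

# A domain cited in 40 of 100 collected responses, across 12 unique prompts:
print(round(influence_score(40, 100, 12), 2))  # 4.8
```

The weighting rewards sources that influence answers across many different prompts, not just one prompt answered many times.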

You can filter citation data by timeframe, branded vs. non-branded prompts, custom prompt tags, persona, country, prompt topic, citation topic, AI platform, funnel stage, and citation owner.

Example

If you want to understand which sources shape AI answers for your tracked prompts, open Citations to:
- Compare citation share by owner and see the top domains/URLs influencing responses
- Sort by Influence Score to prioritize the domains and pages that move the needle most
- Filter by persona, market, topic, funnel stage, and AI platform to get precise, go-to-market-aligned insights

What counts as a citation in Scrunch?

A citation is any URL an AI platform cites when answering a tracked prompt. For every AI response collected, Scrunch captures the full list of cited URLs, identifies which pages contributed to the answer, checks for your brand or competitor presence on those pages, and maps the content to your Key Topics.
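To make the capture step above concrete, here is a hypothetical shape for a single citation record. The field names and class are illustrative assumptions, not Scrunch's actual schema; they simply mirror the description: cited URL, brand/competitor presence check, and mapping to Key Topics.

```python
from dataclasses import dataclass, field

@dataclass
class CitationRecord:
    url: str                  # page the AI platform cited (hypothetical field)
    platform: str             # e.g., "ChatGPT", "Perplexity"
    prompt: str               # tracked prompt that produced the response
    owner: str                # "brand", "competitor", or "third_party"
    brand_detected: bool      # whether brand presence was found on the cited page
    key_topics: list[str] = field(default_factory=list)  # mapped Key Topics

record = CitationRecord(
    url="https://example.com/buyers-guide",
    platform="Perplexity",
    prompt="best tools in our category",
    owner="third_party",
    brand_detected=True,
    key_topics=["category comparison"],
)
print(record.owner, record.brand_detected)
```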

If your brand should be present on a cited page but isn’t detected, it may be due to JavaScript-only content, bot blocking, or a temporary retrieval error.


How Scrunch measures impact on brand visibility vs. competitors

Scrunch provides analytics that let you benchmark and track your position in AI search over time and against competitors:
- Brand presence: Mentions and citations across AI responses, with breakdowns by persona, country, topic, platform, funnel stage, and custom tags
- Citations: Frequency and sources of citations for your key prompts, plus Influence Scores to prioritize which sources to target
- Share of voice: Your brand presence and citations versus competitors on the prompts that matter
- Referral traffic: Traffic referred to your site from AI responses to target prompts
- AI agent traffic: Visits from LLM agents that access your site for training, indexing, or retrieval

To spot real trends, Scrunch recommends monitoring these metrics in consistent 2–3 week windows and comparing periods (e.g., Q2 vs. Q1). You can visualize brand presence alongside competitors and drill into the sources that are shifting results. Learn more about benchmarks and baselines in the related FAQ: What benchmarks or baselines are useful when evaluating AI search performance?

Cross-model coverage and reliability in citation tracking

Scrunch tracks prompts across eight major AI platforms and LLMs, including ChatGPT, Claude, Gemini, Perplexity, Google AI Mode, Google AI Overviews, Meta AI, and Microsoft Copilot (Grok support is coming). Data is collected using a combination of browser automation and official platform APIs, and you can:
- Monitor prompts in aggregate across platforms
- Filter by individual AI platform to see where performance diverges
- Compare citations, brand presence, and share of voice by model

This gives you consistent, cross-model visibility into which sources influence answers—and where to focus improvements.

What Scrunch tracks that traditional SEO tools don’t

Scrunch is purpose-built for AI search. It focuses on how AI systems answer questions and which sources they trust, not just what appears in web SERPs. You can:
- See exactly which domains/URLs AI models cite for your prompts
- Track brand mentions and citations in AI answers by persona, platform, topic, country, and funnel stage
- Measure share of voice versus competitors within AI-generated responses
- Prioritize source outreach and content updates with Influence Score
- Analyze AI agent traffic visiting your site for training, indexing, and retrieval
- Pair monitoring with action using AXP to deliver AI-optimized content to AI agents visiting your website

Traditional keyword tools estimate web search volume and SERP rankings; Scrunch shows how AI systems construct answers and which sources drive those answers.

How Scrunch measures AI search “trend volume” vs. keyword tools

AI search doesn’t have universal “search volume” like traditional keywords. Instead, Scrunch measures trend momentum across your tracked prompts by:
- Monitoring brand presence and citation frequency over time
- Comparing performance periods (e.g., 2–3 week windows) to separate real trends from one-off changes
- Segmenting by branded vs. non-branded prompts, personas, topics, markets, and platforms
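The period-over-period comparison above amounts to simple change arithmetic. This sketch is illustrative only; the helper and the sample counts are assumptions, not part of Scrunch's product or API.

```python
def period_change(previous: int, current: int) -> float:
    """Percent change in a metric (e.g., citation count) between two tracking windows."""
    return (current - previous) / previous * 100

# Brand citations in two consecutive 3-week windows: 50, then 62.
print(round(period_change(50, 62), 1))  # 24.0
```

Comparing like-for-like windows this way helps distinguish a sustained lift from a one-off spike in a single response.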

This approach establishes a baseline for your business-relevant prompts and tracks meaningful changes in how often you appear and which sources AI relies on.

How Scrunch compares to other platforms for optimizing AI visibility

Compared to traditional SEO and analytics tools, Scrunch offers:
- Direct visibility into AI responses and their citations across major models
- Cross-platform comparisons for the same prompts
- Source-level prioritization with Influence Score to guide content and partnership efforts
- Integrated actions via AXP to serve AI-friendly content to visiting agents

If you’re optimizing for AI-driven discovery, Scrunch complements your existing SEO stack by revealing the AI layer—who’s being cited, how often, and where to focus to improve visibility in AI-generated answers.

Want to explore your citations and Influence Scores? Start a 7-day free trial or book a demo with our team.