AI traffic analysis

Understand, measure, and improve how AI platforms interact with your website—both when bots crawl your content and when people arrive after seeing you in AI answers.

What counts as AI traffic?

AI traffic includes two distinct streams:

- AI crawler traffic: Visits from AI agents that train on, index, or retrieve your content to power answers in products like ChatGPT, Perplexity, and Gemini.
- AI-driven referral traffic: Human sessions that originate from AI platforms after a user clicks through from an AI result.
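The two streams can be told apart from request metadata: crawler traffic shows up in the User-Agent header, while referral traffic shows up in the Referer header of an ordinary browser session. Here is a minimal sketch of that split; the token and host lists are illustrative examples, not Scrunch's actual detection rules, and are not exhaustive.

```python
# Substrings commonly seen in the User-Agent headers of AI crawlers
# (illustrative list only; real detection needs more tokens and IP checks).
AI_CRAWLER_TOKENS = ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended")

# Referrer hosts suggesting a human clicked through from an AI answer.
AI_REFERRER_HOSTS = ("chatgpt.com", "perplexity.ai", "gemini.google.com")

def classify_hit(user_agent: str, referrer: str) -> str:
    """Label a request as AI crawler traffic, AI referral traffic, or other."""
    ua = user_agent.lower()
    if any(token.lower() in ua for token in AI_CRAWLER_TOKENS):
        return "ai-crawler"
    ref = referrer.lower()
    if any(host in ref for host in AI_REFERRER_HOSTS):
        return "ai-referral"
    return "other"

print(classify_hit("Mozilla/5.0 (compatible; GPTBot/1.1)", ""))   # ai-crawler
print(classify_hit("Mozilla/5.0 (iPhone)", "https://chatgpt.com/"))  # ai-referral
```

Note the ordering: the crawler check runs first, since bots rarely send a meaningful referrer while human sessions always carry a browser User-Agent.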

Scrunch tracks both so you can see the full picture: which models are using your content and how that activity turns into visits and conversions.

Quick start: Detect AI bot traffic on your site

Use this to verify that AI agents can access your content and to identify which pages and bots drive your AI search visibility. For step-by-step instructions, read the guide: How to track if AI bots are visiting your website.
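As a quick sanity check before or alongside the guide, you can scan your own server access log for known AI crawler user agents. This is a hedged sketch, not Scrunch's method: the bot list is a partial set of publicly documented crawler names, and the `access.log` path is an assumption you should adjust for your server.

```python
from collections import Counter

# Publicly documented AI crawler names (partial, illustrative list).
AI_BOTS = ["GPTBot", "OAI-SearchBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def count_ai_bot_hits(log_lines):
    """Return a Counter of hits per AI bot found in raw access-log lines."""
    hits = Counter()
    for line in log_lines:
        lowered = line.lower()
        for bot in AI_BOTS:
            if bot.lower() in lowered:
                hits[bot] += 1
                break  # attribute each request to at most one bot
    return hits

# Usage (path is an assumption; point it at your real access log):
# with open("access.log") as f:
#     print(count_ai_bot_hits(f))
```

Substring matching on user agents is easy to spoof, so treat these counts as a first signal; robust detection also verifies the crawler's published IP ranges.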

Track AI-driven referral traffic

Get setup details in this FAQ: How does Scrunch track AI search visits to my website?

How Scrunch detects AI traffic vs GA4 or server logs

Learn more in this FAQ: How does Scrunch detect AI bot traffic visiting my site?

Crawler traffic vs referral traffic

Scrunch distinguishes between the two:

- Agent Traffic shows AI crawlers accessing your site content.
- AI Referrals shows human visitors coming from AI platforms via GA4.

Both views work together to connect model behavior to real outcomes.

Supported AI platforms and models

Scrunch currently tracks eight major AI platforms:

- ChatGPT
- Claude
- Gemini
- Perplexity
- Google AI Mode
- Google AI Overviews
- Meta AI
- Microsoft Copilot

Support for Grok is coming soon. See the full list: Which AI platforms and LLMs can Scrunch track and monitor?

Run an AI site audit

Before optimizing content, ensure AI agents can reliably access and understand it.

- Scrunch’s site audit helps you fix access, delivery, and content issues that hurt AI visibility.
- Prioritize essentials like allowing legitimate AI bots in robots.txt and ensuring key pages are fetchable and readable by agents.

This audit focuses on AI agent accessibility and usage signals, complementing a traditional SEO crawl that centers on search engine indexing. Start here: Run a site audit.
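To illustrate the robots.txt essential mentioned above, a policy that explicitly allows common AI crawlers might look like the fragment below. The user-agent tokens shown are examples; confirm each vendor's documented crawler name before relying on it, and scope the Allow rules to the paths you actually want agents to read.

```
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: ClaudeBot
Allow: /
```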

Helpful resources