Top 5 AI Content Optimization Problems

At Scrunch we work with teams across sizes, industries, and stacks to help them understand and improve how their brand shows up in AI platforms. Because many AI systems now retrieve content from the web in real time, success often comes down to how accessible and understandable your site is to AI user agents like ChatGPT, Perplexity, and Google AI Overviews.

AI optimization is fast-evolving, but these five issues come up most often. Fixing them improves AI visibility and often helps traditional SEO and human UX too.

1. Over‑aggressive or misconfigured bot blocking (robots.txt, hosts, WAFs)

To be findable in web search, you need to allow search engine indexing bots like Googlebot and Bingbot, and most platforms allow those by default. AI platforms, however, also use real-time retrieval bots that fetch pages synchronously when a user asks something relevant. These retrieval bots are often blocked unintentionally by default firewall or anti-bot settings.

We also see teams block all bots from providers like OpenAI to avoid model training. OpenAI’s documentation states they do not train on content retrieved on behalf of users, so you can allow their real-time retrieval bot while blocking training crawlers.
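OpenAI currently documents separate crawlers for these purposes: GPTBot for training, and OAI-SearchBot and ChatGPT-User for AI search and user-initiated retrieval. As a sketch (verify current agent names against each provider's documentation), a robots.txt along these lines blocks training while keeping your content retrievable:

```
# Block OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Allow AI search and real-time retrieval on behalf of users
User-agent: OAI-SearchBot
Allow: /

User-agent: ChatGPT-User
Allow: /
```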

How to check: Review firewall rules and robots.txt. Ask brand-specific questions in ChatGPT and see whether your content is cited. Or use Scrunch’s Site Audit to scan and flag bot-blocking patterns across pages.
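For a quick script-level check, request a page with common AI user-agent strings and compare the status codes. This is a minimal sketch: the strings below are illustrative, and some anti-bot systems also verify source IP ranges, so results from a plain HTTP client may differ from what the real bots experience.

```python
import requests

URL = "https://example.com/"  # replace with a page you expect AI to cite

# Illustrative user-agent strings; check each provider's docs for current values
AGENTS = {
    "OpenAI training": "GPTBot",
    "OpenAI search": "OAI-SearchBot",
    "ChatGPT retrieval": "ChatGPT-User",
    "Perplexity": "PerplexityBot",
}

for name, ua in AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    # 403 or 429 responses usually point to a WAF or anti-bot rule
    print(f"{name}: HTTP {resp.status_code}")
```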

How to fix: Update rules in your web host, WAF, or anti-bot provider (e.g., AWS WAF Bot Control, Cloudflare, Datadome, Imperva) to permit reputable AI retrieval user agents.

More detail: Read our overview of AI user agents.

2. Pages that require JavaScript to show content

Current-generation AI retrieval bots don’t execute JavaScript. They consume the “web 1.0” server response—before scripts run. If your primary content is injected by JS (common with SPAs/app shells), AI agents won’t see it and won’t cite you.

Googlebot does execute JS for indexing, but AI retrieval is different: it’s synchronous and time-budgeted. ChatGPT and others fetch several sources in parallel and can’t wait for heavy client-side rendering.

How to check: In Chrome DevTools, enable “Disable JavaScript” and load your pages. Scrunch’s Site Audit compares the “human” (JS-enabled) and “AI” (server-rendered) views to spot gaps.
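To script the same comparison, fetch the raw server response (like an AI retrieval bot, no JavaScript runs) and confirm your key copy is present. The URL and phrases here are placeholders:

```python
import requests

URL = "https://example.com/pricing"       # placeholder
MUST_APPEAR = ["Pro plan", "$29", "FAQ"]  # copy the page should expose without JS

html = requests.get(URL, timeout=10).text  # raw HTML; no scripts are executed
missing = [phrase for phrase in MUST_APPEAR if phrase not in html]

if missing:
    print("Missing from server-rendered HTML (likely injected by JS):", missing)
else:
    print("All key phrases present in the no-JS view.")
```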

How to fix: Prefer server-rendered or pre-rendered HTML for meaningful content (articles, product details, FAQs). If a refactor isn’t feasible—or you want fine-grained control—use Scrunch’s Agent Experience Platform (AXP). AXP detects AI agents and serves clean, server-rendered HTML without JS dependencies.
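This is not AXP's implementation, just the general dynamic-serving pattern it belongs to: a minimal Flask sketch that detects known AI user agents and serves a pre-rendered snapshot instead of the JS app shell. The agent list and file paths are hypothetical.

```python
from pathlib import Path

from flask import Flask, request, send_file

app = Flask(__name__)
AI_AGENTS = ("GPTBot", "OAI-SearchBot", "ChatGPT-User", "PerplexityBot")  # illustrative

@app.route("/<path:page>")
def serve(page: str):
    ua = request.headers.get("User-Agent", "")
    snapshot = Path("prerendered") / f"{page}.html"  # hypothetical snapshot store
    # Sanitize `page` in real code; this sketch skips path-traversal checks.
    if any(bot in ua for bot in AI_AGENTS) and snapshot.is_file():
        return send_file(snapshot)      # clean server-rendered HTML for agents
    return send_file("app/index.html")  # normal JS application shell for humans
```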

3. The wrong amount of on-page text (too little or too much)

AI search thrives on plain, well-structured text. If pages are mostly embedded media (videos, diagrams, audio) with little explanatory copy, AI agents can’t extract useful facts. Conversely, extremely long pages can dilute key information and exceed practical token budgets.

A good rule of thumb: think “feature article,” not “novel.” If your page reads like James Joyce, it’s probably too long for AI retrieval.

How to check: Review pages manually or use Site Audit’s content-length checks.
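As a rough, scriptable proxy for “feature article, not novel,” count the words in the server-rendered text. The thresholds below are arbitrary illustrations, not Site Audit's actual checks:

```python
import re

import requests

URL = "https://example.com/guide"  # placeholder
html = requests.get(URL, timeout=10).text

# Drop script/style blocks, then strip remaining tags (crude but serviceable)
text = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", " ", html)
text = re.sub(r"<[^>]+>", " ", text)
words = len(text.split())

if words < 150:
    print(f"{words} words: likely too thin for AI to extract facts from")
elif words > 4000:
    print(f"{words} words: may exceed practical retrieval token budgets")
else:
    print(f"{words} words: within a reasonable range")
```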

How to fix:
- For thin pages, add relevant prose: summaries, captions, transcripts, and clear FAQs.
- For very long pages, segment logically and ensure prominent summaries with scannable sections. Avoid simple pagination that splits a single article into multiple URLs.

4. Content obscured by technical markup structure

Even when content is server-rendered, some markup patterns fragment meaning when converted to plain text—the form most AI systems use internally. Complex layouts, visual builders, and table-heavy data can get mangled.

Example: neat-looking pricing tables can degrade into jumbled text the model struggles to interpret.
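The failure mode is easy to reproduce with Python's standard-library HTML parser, which collects bare text the way naive HTML-to-text conversion does:

```python
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Collects bare text nodes, discarding all structure."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

table = """
<table>
  <tr><th>Plan</th><th>Pro</th><th>Team</th></tr>
  <tr><th>Price</th><td>$29</td><td>$99</td></tr>
</table>
"""
parser = TextOnly()
parser.feed(table)
print(" ".join(parser.chunks))
# Prints: Plan Pro Team Price $29 $99
# Which plan costs which price is no longer explicit.
```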

How to check: Preview with Reader Mode or examine the DOM with JS disabled. In Scrunch’s Site Audit, the AI Preview shows how agents likely “see” your page, and Content Clarity flags risky patterns.

How to fix:
- Adjust HTML structure and content order so important meaning survives when formatting is stripped.
- Provide prose summaries for structured data that AI might misread (e.g., a short “Pricing at a glance” section), as sketched below.
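A minimal illustration (the markup and prices are invented): put the key facts in a sentence that survives text extraction, right next to the table.

```html
<!-- Hypothetical: a prose line carries the table's key facts -->
<p>Pricing at a glance: the Pro plan is $29/month and the Team plan is $99/month.</p>
<table>
  <!-- full pricing table for human readers -->
</table>
```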

5. Expecting schema alone to fix AI results

Schema (JSON-LD) is valuable for Google’s rich results, but AI search is still driven primarily by unstructured text. Don’t rely on structured data in lieu of readable copy.

How to fix: Keep your schema, but ensure the page itself includes the same facts in clear prose. Review the four items above to make the text AI-friendly.
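For example (product name and price invented), the schema and the visible copy should carry the same facts:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Widget",
  "offers": {"@type": "Offer", "price": "29.00", "priceCurrency": "USD"}
}
</script>
<!-- The visible prose is what AI retrieval actually reads -->
<p>The Acme Widget costs $29.</p>
```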


How Scrunch’s AI Site Audit differs from a traditional SEO crawl

Traditional crawlers focus on search indexing and UX signals for engines like Google: internal links, meta tags, canonicalization, sitemaps, and JavaScript rendering as Googlebot performs it.

Scrunch’s Site Audit is built for AI retrieval and citation:
- Simulates AI user agents and fetches the server-rendered “no-JS” view that ChatGPT, Perplexity, and others consume.
- Identifies robots.txt, WAF, and host-level blocks for AI bots specifically.
- Compares “human” vs. “AI” views to detect content that disappears without JS.
- Checks content length, clarity, and markup that can confuse AI summarization.
- Surfaces page-level issues correlated with citation likelihood across AI platforms.

Pair this with Scrunch Monitoring & Insights to see brand presence, citations, and competitor benchmarks across major AI platforms—data traditional SEO tools don’t provide.

How accurately Scrunch detects JavaScript or rendering issues

Scrunch audits fetch pages the way AI agents do, then compare what agents receive with what humans see. The audit:
- Detects when primary content is missing from the server-rendered output.
- Flags reliance on client-side rendering, dynamic components, and heavy JS that prevents AI access.
- Validates agent access by testing known AI user agents and reporting blocked responses.
- Confirms AXP delivery by verifying that AI agents receive clean, server-rendered HTML.

This approach reliably highlights pages where AI crawlers cannot access or interpret content—even when those pages look fine in a normal browser or pass a traditional SEO crawl.

What to look for in an AI search visibility tool

If you’re evaluating solutions for SEO or content strategy, prioritize:
- Cross-platform monitoring of AI answers and citations (ChatGPT, Perplexity, Google AI Overviews, Gemini, Claude, Copilot, Meta AI).
- Prompt-level tracking with personas and geographies for realistic, decision-stage queries.
- Citation and link analysis, including unlinked brand mentions.
- Auditing for AI-specific blockers: robots/WAF rules, JS dependency, content clarity, and content length.
- An “AI view” or preview of how agents parse your pages.
- Actionable recommendations and page-level fixes.
- Optional delivery of AI-optimized content to agents without changing the human site (e.g., Scrunch's Agent Experience Platform, AXP).
- Change tracking, reporting, and integrations for content ops.

To go deeper on measurement and action, explore Scrunch’s Monitoring & Insights and Agent Experience Platform.