There are four vectors for technically optimizing a page:
| Vector | Definition | Example |
|---|---|---|
| Access controls | Configuring your website to allow AI platforms to access webpages | Fixing your robots.txt file |
| Content delivery | Delivering content to AI with necessary technical hygiene and speed | Serving content to AI without JavaScript |
| Content quality | Making sure content is complete and in an AI-optimized format | Ensuring content is succinct enough to be completely read by AI |
| Content alignment | Ensuring content aligns with target prompts | Updating content to better match AI queries |
🔌 Shameless plug: Scrunch’s Deep AI Audit feature highlights these issues for you automatically. Run any URL through an audit to identify problems related to access controls, content delivery, content quality, or content alignment that may need attention.
1. Access controls
The big question: Is your website configured to allow AI platforms to access webpages?
If an LLM can’t index or retrieve content from your site, for all practical purposes, your site is invisible.
The most common issue is related to robots.txt files. Make sure your robots.txt is telling AI user agents that they can access the right pages. You can dig more into our recommendations here.
Keep in mind: If you’re missing a robots.txt file completely, that can also create ambiguity about how AI platforms should interact with your site.
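To make that concrete, here is a minimal robots.txt sketch that explicitly welcomes a few well-known AI crawlers. The user agent names are real, but treat the exact list as illustrative and check each platform's documentation for current names:

```
# Explicitly allow common AI crawlers to read the whole site
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Default rules for everything else
User-agent: *
Allow: /
```

If there are sections you genuinely don't want retrieved, add Disallow rules for those paths under the relevant user agents.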
It’s also worth checking that you’re not using any tools or configurations that may be unintentionally blocking access.
Consider access controls an always-on priority.
2. Content delivery
The big question: Is your content delivered with the necessary technical hygiene and speed?
Put another way: Is there meaningful content on your page that can be delivered without JavaScript, and does the page load in under five seconds?
LLMs read raw code, not what happens after it runs. Lots of companies use JavaScript to dynamically display content after a page loads, but LLMs can’t execute it to see what actually appears on the screen.
If content on your site loads in stages—or is hidden behind code—AI will only get a snapshot of what’s really there.
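For illustration, here's the difference from a crawler's point of view (the product copy below is invented):

```html
<!-- What an AI crawler sees when content is injected client-side: an empty shell -->
<div id="pricing"></div>
<script>
  // This only runs in a browser; an LLM reading the raw HTML never sees the text below
  document.getElementById("pricing").textContent = "Plans start at $49/month.";
</script>

<!-- The same content, visible to AI, delivered in the initial HTML -->
<div id="pricing">Plans start at $49/month.</div>
```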
Pages with lots of superfluous code and content eat up AI tokens. The ultimate goal is to create a token-light page that’s as easy as possible for LLMs to consume.
Meanwhile, crawlers are known to deprioritize or skip slow-loading pages. That makes quick content delivery key to visibility.
3. Content quality
The big question: Is your content complete and in a format optimized for AI platforms?
This is where code starts to bleed into creative.
For instance, pages need to be short enough for LLMs to read them in full. Page titles and descriptions need to be relevant to the content of the page. And if you have a non-JavaScript version of the page, it needs to be substantially the same as the JavaScript version.
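On the titles-and-descriptions point, for example, the head of the page should describe what's actually on it (the values below are invented):

```html
<head>
  <!-- Title and meta description should match the page's actual content -->
  <title>Acme Widget Pricing: Plans, Tiers, and What's Included</title>
  <meta name="description"
        content="Compare Acme widget plans, per-seat pricing, and included features.">
</head>
```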
We’ll dig deeper into content optimization below. Just remember: LLMs want answers, not clickbait. Clear, relevant, and accessible content wins the day.
4. Content alignment
The big question: Does your content actually answer the questions AI users are asking?
This is more of a content update than a technical one, but the AI search product you’re using (Scrunch or otherwise) should be able to tell you how well page content matches the prompts you care about.
Our advice: Invest in a tool that not only tells you where your brand appears in AI chatbot responses, but also where it should appear and why it doesn’t.
Data science methodology makes it fairly easy to highlight the prompts that a specific page on your site should be able to answer.
There are two ways to update content for AI search—optimize existing content or create net new content:
1. Optimize existing content
The most straightforward way to do this is to analyze the content already being cited for the prompts you care about.
Investigate how it differs from your existing content.
In general, certain content characteristics are highly correlated with better performance, even if the LLMs themselves won’t confirm it.
Pro tip: Ask AI. You can tell your favorite LLM which prompts you’re trying to show up for, run your content through it, and ask for augmentation suggestions.
Our advice is to experiment, measure, and learn in order to understand what works and what doesn’t.
🔌 Another shameless plug: Scrunch takes the guesswork out of content updates with our Optimizer feature. We’ll analyze any page you’re auditing to understand its intent, let you choose a specific persona to target for optimization, suggest which prompts to optimize the page for, and recommend other pages on your website you may want to pull content from.
Once you review and approve (or edit), we’ll generate an optimized version of the page in markdown that you can manually add to your CMS or deliver directly to LLMs via our Agent Experience Platform (AXP).
2. Create net new content
After you’ve picked the proverbial low-hanging fruit of existing content, you can turn to net new content creation.
We recommend starting with greenfield prompts (aka prompts where your competitors don’t show up). These are open opportunities.
Next, move to prompts where competitors are present but your brand is not. It’s smart to prioritize prompts where the competing content is substance-light (i.e., lower-quality content that will be easier to beat).
🔌 Another shameless plug: Scrunch makes it easy to find both. You can filter prompts by the presence (or lack thereof) of your brand and your competitors’ brands.
Don’t forget about content not on your website
Updating content to improve AI search performance can go well beyond your own website (get a quick refresher on securing placement in third-party sources and scaling offsite citation strategy in chapter 3).
AI search is still an emerging space, which means there’s a lot of misinformation about what works and what doesn’t for AEO/GEO.
The No. 1 takeaway for brands is this: Providing clean HTML—sent from your server, from your existing website URLs, and optimized with AI in mind—is the single most evidence-backed way to improve how AI agents consume your content.
What does good HTML look like? Semantic markup, a clear heading hierarchy, accurate titles and metadata, and the page’s real content present in the server response.
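A stripped-down sketch of what that might look like in practice (page content invented for illustration):

```html
<article>
  <!-- Real content in the server response, with a clear heading hierarchy -->
  <h1>How Acme Widgets Reduce Unplanned Downtime</h1>
  <p>Acme widgets automate failover so outages resolve in seconds, not hours.</p>
  <h2>Key capabilities</h2>
  <ul>
    <li>Sub-second automated failover</li>
    <li>Audit-ready event logging</li>
  </ul>
</article>
```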
As far as AI is concerned, an ideal version of the web is one where websites exist as simple, semantic text repositories without any of the JavaScript bells and whistles.
The problem for brands is squaring what AI wants with what humans want when they land on your site: inherently complex, interactive, and aesthetic experiences.
The solution? Give them both what they want without letting one experience interfere with the other.
The simple answer: Provide AI agents with a dedicated site experience that serves differentiated, agent-optimized HTML.
To do this without disrupting the human web experience, you need to detect known AI user agents and route them to a different experience at your CDN or load balancer.
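As a rough sketch of the idea (not Scrunch's implementation), an edge function in the Cloudflare Workers style might branch on the User-Agent header. The bot list and origin hostname below are illustrative placeholders:

```ts
// Hypothetical edge routing: send known AI crawlers to an agent-optimized
// origin and everyone else to the normal site.
const AI_AGENTS = ["GPTBot", "ClaudeBot", "PerplexityBot"]; // illustrative list

export default {
  async fetch(request: Request): Promise<Response> {
    const ua = request.headers.get("user-agent") ?? "";
    const isAiAgent = AI_AGENTS.some((bot) => ua.includes(bot));

    const url = new URL(request.url);
    if (isAiAgent) {
      // Placeholder hostname for the simplified, server-rendered experience
      url.hostname = "agent.example.com";
    }
    return fetch(new Request(url.toString(), request));
  },
};
```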
That's exactly what Scrunch's Agent Experience Platform (AXP) is built for.
AXP follows a three-step process:
1. Intercept AI traffic
AXP auto-detects and reroutes AI traffic visiting your website (get a quick refresher on AI agent monitoring in chapter 2) through platforms like Akamai, Cloudflare, and Vercel.
2. Translate site content
AXP automatically strips away unnecessary code that AI doesn’t value and restructures webpages into server-side rendered HTML based on your configuration.
3. Serve optimized content to AI
AXP delivers compressed, structured, and optimized content to AI traffic at the CDN level without changing the human-facing website at all.
In short, AXP scrunches (hence the name) your site specifically for AI consumption.
The result: Human visitors see your normal site and AI agents get content they can actually read and recommend.
Think of AXP as an AI-friendly layer on top of what already exists on your site. The content is already live; AXP just makes sure AI agents can actually use it.
Is AXP cloaking?
No.
Cloaking means serving different content to search crawlers and human visitors with deceptive intent. The goal is to manipulate rankings and mislead users. Google explicitly prohibits it.
AXP does something fundamentally different. It targets LLM bots only—not Googlebot, not Bingbot—and serves them structured, contextualized content to improve comprehension and accuracy. It’s the same information in a clearer format. No manipulation, no bait-and-switch.
Google's own guidelines make this distinction explicit: Technical optimizations that improve content understandability are acceptable. AXP falls squarely in that bucket.
Will AXP impact my traditional search performance?
No.
AXP is only used for real-time AI retrieval bots, not search indexing bots. Google and Bing web crawlers aren’t routed to AXP; they crawl and index your website normally.
The net result is a win for everyone: AI platforms get structured data that improves answer quality, users get more accurate responses, and your brand gets better representation in AI search results.
Does AXP work?
Yes.
Scrunch has deployed AXP at very large enterprises across B2B SaaS, ecommerce, and entertainment (and across technology platforms like Akamai, AWS, and Cloudflare) with strong results.
For instance, Akamai saw an 85% increase in citations and a 364% increase in brand presence for non-branded prompts after deploying AXP.
And that’s the end of chapter 4.
This guide is a living, breathing document. We’ll be updating it regularly as the AI search space continues to evolve.
Got questions we didn’t answer? Get in touch.
Until next time, thanks for reading and happy optimizing.
Start reaching more customers on AI platforms. Spin up a free account or see Scrunch in action today.