Agent Readiness Score: How to Check and Optimize Your Site for AI Agents
In April 2026, Cloudflare launched isitagentready.com - a free tool that tells you whether your website is ready for AI agents to access, read, and interact with. This article explains what the Agent Readiness Score measures, why it matters now, and how to improve it step by step.
Since mid-2025, users have increasingly delegated tasks to AI agents - searching for information, comparing products, booking tickets, making purchases. Instead of humans manually typing URLs into a browser, an AI agent browses the web, reads data, and acts on their behalf. That raises a concrete question: is your website actually designed to be read and understood by AI?
What is the Agent Readiness Score?
The Agent Readiness Score is a numerical measure of how “AI-agent friendly” a website is. It does not measure content quality or visual design - it measures whether the site follows emerging technical standards that allow AI agents to discover, read, and interact with it.
The tool that produces this score is isitagentready.com, built by Cloudflare and launched on April 17, 2026. Enter a URL, run a scan, and you receive an overall score plus a detailed breakdown across five categories.
The score is not just a vanity metric. It is a technical map showing exactly which gaps your site needs to close to avoid being overlooked by AI agents in the Agentic AI era.
Why Agent Readiness Matters in 2026
In the old internet model, you optimized for two audiences: human users and Google’s crawler. In 2026, a third audience has emerged that is equally important: AI agents.
AI agents do not browse the web the way humans do. They do not appreciate a beautiful interface or skim banner ads. Instead, they:
- Look for metadata, config files, and technical signals to understand what a site does
- Read content as plain Markdown or structured data rather than rendering HTML
- Search for API endpoints, OAuth flows, or MCP servers to perform transactions
- Check robots.txt to understand what they are and are not allowed to do
If your website does not provide these signals, an AI agent will either skip it, misread it, or fail to complete any task on it. This is why the Agent Readiness Score is getting attention fast - Cloudflare’s announcement tweet hit 1.1M views within two days of going live.
The 5 Categories Evaluated
isitagentready.com evaluates websites across five main categories:
1. Discoverability
This category checks whether AI agents can find your website in the first place:
- robots.txt: Does the file exist and does it declare a sitemap?
- Sitemap: Does a sitemap.xml exist to help AI understand the site structure?
- Link headers: Do the HTTP response headers expose useful metadata for crawlers?
This is the easiest category to improve because it only requires configuring static files.
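As a sketch, a minimal robots.txt that covers the first two checks could look like the following. The exact rules isitagentready.com tests for are not documented here, so treat this as a starting point rather than a guaranteed pass:

```
# robots.txt - allow crawling and declare the sitemap
# so crawlers and AI agents can discover the site structure
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The Sitemap directive takes an absolute URL; make sure the referenced sitemap.xml actually exists and stays in sync with your published pages.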
2. Content Accessibility
- Markdown negotiation: Can the site return content as plain Markdown (instead of HTML) when an AI agent requests it? This is a new standard - AI reads Markdown far more accurately and efficiently than it parses HTML.
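The negotiation itself is ordinary HTTP content negotiation on the Accept header. Below is a minimal helper sketch in TypeScript; the function name is an illustration, not part of any framework API, and it deliberately ignores q-values for simplicity:

```typescript
// Hypothetical helper: decide whether a request prefers Markdown,
// based on its Accept header. Ignores q-value weighting for brevity.
export function prefersMarkdown(accept: string | undefined): boolean {
  if (!accept) return false;
  // Split the Accept header into media ranges, drop parameters
  // like ";q=0.9", and look for the text/markdown media type.
  return accept
    .split(",")
    .map((part) => part.split(";")[0].trim().toLowerCase())
    .includes("text/markdown");
}
```

In a Next.js or Astro route handler you would call something like `prefersMarkdown(request.headers.get("accept") ?? undefined)` and, when it returns true, respond with the page's Markdown source and a `Content-Type: text/markdown` header instead of rendered HTML.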
3. Bot Access Control
- AI bot rules in robots.txt: Have you explicitly declared rules for specific AI bots (GPTBot, ClaudeBot, PerplexityBot, etc.)?
- Content Signals: Does the site declare preferences for how AI can use its content (training, search, input)?
- Web Bot Auth: Is there a mechanism for bots to cryptographically identify themselves?
This category lets you control which AI agents can do what on your site - important if you have paywalled content or sensitive data.
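For illustration, a robots.txt fragment that declares per-bot rules and content signals might look like this. The Content-Signal syntax below follows Cloudflare's published Content Signals examples, but the spec is young - verify the current syntax against the tool's own fix instructions before shipping it:

```
# Explicit rules for major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Content Signals: declare how AI may use this content
# (allow search and AI input, opt out of training)
Content-Signal: search=yes, ai-input=yes, ai-train=no
```

Whether you choose Allow or Disallow per bot, the point is to declare something explicit - an empty file earns you nothing in this category.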
4. Protocol Discovery
This is the most technically involved category. It covers the site’s ability to let AI agents interact and transact:
- MCP Server Card: Does the site expose an MCP server so AI agents can connect and call tools?
- Agent Skills: Are the site’s available “skills” - actions agents can perform - declared and discoverable?
- WebMCP: Does the site support the experimental WebMCP protocol for real-time browser-based agent interaction?
- API Catalog: Is there a published list of available APIs?
- OAuth discovery / Protected Resource: Does the site support OAuth flows so agents can act on behalf of authenticated users?
This category matters most for SaaS products, e-commerce platforms, or any site where you want AI agents to genuinely do things - log in, place orders, call APIs.
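Purely as a hypothetical sketch, an MCP server card is a small JSON document describing what agents can connect to. Every field name and the hosting path in this example are assumptions made for illustration - the article does not document the exact schema isitagentready.com checks, so follow the tool's generated instructions for the real format:

```json
{
  "name": "example-store",
  "description": "Order lookup and product search tools exposed to AI agents",
  "endpoint": "https://example.com/mcp",
  "transport": "streamable-http",
  "tools": ["search_products", "get_order_status"]
}
```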
5. Commerce
The most forward-looking category, evaluating support for agentic commerce - purchases and payments made by AI agents:
- x402: A micropayment protocol that lets AI agents autonomously pay small fees to access gated content
- UCP (Universal Commerce Protocol): A general commerce standard for agent transactions
- ACP (Agent Commerce Protocol): A protocol designed specifically for agent-to-agent commerce
Most websites today will score 0/3 here. This is the frontier of web development in 2026.
How to Use isitagentready.com - Step by Step
Step 1: Go to isitagentready.com
Visit https://isitagentready.com - the interface is minimal, just a URL input.
Step 2: Enter your URL and select a site type
Paste the URL you want to check. You can customize the scan by:
- Site type: All / Content / API / Application
- Customize checks: Select specific categories if you don’t need a full audit
Step 3: Read the results
After the scan you receive:
- An overall score
- Pass/fail for each individual check
- An explanation of why something failed and what to fix
Step 4: Use “Copy all instructions” to fix your score
This is the tool’s most useful feature. The “Copy all instructions” button generates a complete set of technical fix instructions that you can paste into a coding agent (Claude Code, Cursor, Windsurf, GitHub Copilot) and let the AI close the gaps automatically. No need to debug each check manually.
Quick Wins - How to Improve Your Score Fast
If you want to move the needle without a major refactor, focus on these first:
- Complete robots.txt: Declare a Sitemap: directive and add explicit rules for major AI bots (GPTBot, ClaudeBot, PerplexityBot, Bytespider). This single file affects multiple checks and takes minutes to update.
- Valid sitemap.xml: Make sure it exists and is referenced from robots.txt.
- Markdown endpoint: If you use Next.js, Astro, or another modern framework, adding a route that returns Markdown when Accept: text/markdown is requested is straightforward. Astro makes this especially easy to implement.
- Link response headers: Add a Link header to your homepage HTTP response to point agents toward your sitemap or API catalog.
The first three items alone are enough to meaningfully improve your Discoverability and Content Accessibility scores without touching backend logic.
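For the Link response header, the value follows the standard web-linking format: a comma-separated list of `<url>; rel="relation"` entries. A small TypeScript sketch for building it - note that the rel names in the usage example are assumptions, so use whichever relations the tool's fix instructions specify:

```typescript
// Hypothetical helper: build an RFC 8288-style Link header value
// from a list of { url, rel } pairs.
export function buildLinkHeader(
  links: Array<{ url: string; rel: string }>
): string {
  // Each entry becomes <url>; rel="relation", joined by commas.
  return links.map(({ url, rel }) => `<${url}>; rel="${rel}"`).join(", ");
}
```

In a Node/Express handler this would be used as, for example, `res.setHeader("Link", buildLinkHeader([{ url: "/sitemap.xml", rel: "sitemap" }]))`.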
Agent Readiness and GEO - Two Sides of the Same Problem
The Agent Readiness Score looks at the technical layer. But in the context of GEO (Generative Engine Optimization), it is the infrastructure layer that GEO depends on.
GEO asks whether your content gets cited by AI. Agent Readiness asks whether AI agents can reach your content in the first place. Both are necessary: outstanding content on a site that is invisible to AI agents will still be missed.
Think of it this way: GEO is the content, Agent Readiness is the infrastructure. You need both.
Frequently Asked Questions
Is isitagentready.com free to use?
Completely free, no account required. Go to the site, enter a URL, and scan immediately.
Will a low Agent Readiness Score hurt my Google rankings?
Not directly - as of now, Agent Readiness Score is not a Google ranking factor. However, as Google and other search engines integrate AI agents deeper into their crawl and evaluation pipelines, these technical standards are likely to become ranking signals. Getting ahead of this now is a reasonable investment.
Do I need deep technical knowledge to improve my score?
Not necessarily. The “Copy all instructions” feature on isitagentready.com generates implementation-ready instructions you can paste directly into Claude Code or Cursor. The coding agent will analyze your codebase and implement the changes for you - no manual debugging required. This works even if you are not a developer.
What is an MCP Server Card and do I need one?
An MCP Server Card is a JSON file that describes the tools and capabilities your site exposes to AI agents via the Model Context Protocol (MCP). If your site is a content blog or personal portfolio, you probably do not need one. But if you have a SaaS product, public API, or any service you want AI agents to interact with programmatically, this becomes an important check to pass.
Should I block AI bots from my site?
It depends on your goals. If you want AI search engines (Perplexity, ChatGPT Search, Google AI Overviews) to find, cite, and surface your content, you should allow AI bots. If you have paywalled or proprietary content, you can block selectively. The important thing is to declare your intent explicitly in robots.txt rather than leaving it blank - explicit declarations earn you points in the Bot Access Control category.
Summary
isitagentready.com is a simple but genuinely useful tool for auditing a technical dimension of your website that most site owners have not thought about yet. In the Agentic AI era, your website does not just need to look good to human visitors - it needs to be readable by AI agents. Spend 10 minutes running the scan, use the “Copy all instructions” feature with Claude Code or Cursor to patch the quick wins, and improve the rest incrementally as these standards mature.
NateCue