A step-by-step guide to making your website visible to AI agents
of all web traffic is now bots - more than human visitors
daily prompts on ChatGPT alone - 31% trigger web searches
growth in AI agent traffic in the last year
AI agents like ChatGPT, Gemini, Perplexity and Claude are becoming how people find businesses. When someone asks "best barber in Austin" or "dentist near me", AI searches the web for answers. If your site isn't AI-readable, you're invisible to a growing share of potential customers.
Only 12% of all domains have Schema.org markup - the #1 factor for AI visibility. Google and Microsoft both confirmed they use structured data for their AI features. Sites with Schema.org and FAQ markup get 44% more AI citations. By adding it, you're ahead of most of your competitors.
Your score is 0-100, based on signals that research shows actually impact AI visibility:
What: A JSON-LD script in your HTML that describes your business in a structured way.
Why: Research shows sites with Schema.org and FAQ markup get 44% more AI citations. This is the single biggest factor.
How to add: Add this to your HTML <head>:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Your Business Name",
  "description": "What you do",
  "telephone": "+1-234-567-890",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Your City"
  }
}
</script>
Use the right @type: Restaurant, BarberShop, Dentist, LegalService, etc. (the full type list is on schema.org). Add a second FAQPage schema for bonus points.
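The FAQPage markup goes in a second script tag alongside the first. A minimal sketch (the question and answer are placeholders; use real questions your customers ask):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Do you take walk-ins?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes, walk-ins are welcome on weekdays before 5pm."
      }
    }
  ]
}
</script>
```

Add one Question object per FAQ entry inside the mainEntity array.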
Don't want to write it yourself? Use our free Schema.org generator - enter your URL and we create the code for you.
What: Detailed, well-structured text content on your page.
Why: Pages with 5,000+ characters, headings, lists, and tables get up to 4x more AI citations than short pages.
How to improve: break text into sections with descriptive headings, use lists and tables where they fit, and expand thin pages toward 5,000+ characters of genuinely useful content.
What: Description and social sharing tags in your HTML head.
Why: AI agents use these for context when structured data is incomplete.
How to add:
<meta name="description" content="Short description of your business">
<meta property="og:title" content="Your Business Name">
<meta property="og:description" content="What you do">
<meta property="og:image" content="https://yoursite.com/image.jpg">
What: A plain text file at yoursite.com/llms.txt that describes your business for AI agents.
Why: An emerging standard that gives AI agents a clean summary of your business. Not widely used by AI bots yet, but growing and easy to add.
How to add:
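The llms.txt format is plain markdown: a top-level heading with your name, a one-line summary as a blockquote, then sections of links. A minimal sketch (all details below are placeholders for your own business):

```markdown
# Your Business Name

> One-sentence summary: what you do, who you serve, and where you are located.

## Services

- [Haircuts and styling](https://yoursite.com/services): prices and booking

## Contact

- [Contact page](https://yoursite.com/contact): phone, address, opening hours
```

Save this as a plain text file at yoursite.com/llms.txt.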
Not technical? Contact us and we'll help you set it up.
What: A file that tells bots which parts of your site they can access.
Why: Some sites accidentally block AI crawlers. If GPTBot or ClaudeBot is blocked, AI can't read your site.
How to fix: Create yoursite.com/robots.txt with:
User-agent: *
Allow: /
Make sure you don't have lines blocking GPTBot, ClaudeBot, or Anthropic-AI.
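If you'd rather check programmatically, Python's built-in urllib.robotparser can tell you whether a given bot is allowed. A minimal sketch; replace the sample robots.txt content with the actual contents of yoursite.com/robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt content -- replace with what yoursite.com/robots.txt returns
robots_txt = """\
User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check the AI crawlers mentioned above against your homepage
for bot in ["GPTBot", "ClaudeBot", "anthropic-ai"]:
    allowed = parser.can_fetch(bot, "https://yoursite.com/")
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

If any bot prints BLOCKED, find and remove the Disallow rule that matches it.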
These are the basics of a well-maintained website:
Small bonus points for extra files:
If you use Cloudflare, it may be blocking AI crawlers by default. Cloudflare's "Managed robots.txt" feature adds rules that block GPTBot, ClaudeBot, and others from accessing your site.
How to check: Visit yoursite.com/robots.txt. If you see "Cloudflare Managed content" with Disallow rules for GPTBot/ClaudeBot, your site is blocked.
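From a terminal, you can do the same check with curl and grep (substitute your own domain):

```shell
# Fetch robots.txt and show any lines mentioning Cloudflare or the AI crawlers
curl -s https://yoursite.com/robots.txt | grep -iE "cloudflare|gptbot|claudebot"
```

No output means none of those names appear in your robots.txt.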
How to fix: disable the managed robots.txt feature in your Cloudflare dashboard, or keep it and add explicit Allow rules for the AI bots (see the note below).
Note: this only controls whether AI bots can read your site to answer user questions. If you want to block AI training on your content but still allow AI search, you can keep Cloudflare's managed robots.txt and add this to your own robots.txt:
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /
This allows AI search bots to read your site while Cloudflare's content signals still mark your content as "no AI training" (ai-train=no).
AI agents prefer pages with detailed, structured content. Pages with 5,000+ characters get significantly more AI citations than short pages.
How to improve: add headings, lists, and tables, and expand thin pages with genuinely useful detail rather than filler.
We can generate all these files for you and tell you exactly where to put them. Free while in beta. No strings attached.
Contact us - hello@checkscore.ai