Is ChatGPT Finding Your Site?

Get your AI visibility score in seconds. We check the four signals that determine whether ChatGPT, Claude, and Perplexity can crawl, understand, and cite your content.

Free. No account needed.

What Your AI Visibility Score Measures

The checker runs four tests on your site and combines them into a score out of 100. Each test maps to a real barrier that stops AI chatbots from discovering or citing your content.

  • 01
    robots.txt compliance — Whether GPTBot, ClaudeBot, PerplexityBot, and Google-Extended are explicitly allowed to crawl your pages. A blanket Disallow: / blocks all of them, even if you only intended to stop spam bots.
  • 02
    Structured data — Whether you have JSON-LD schema markup that tells AI engines what your page is about, who made it, and what category it belongs to. Without it, AI engines have to guess your context from body text alone.
  • 03
    Content quality signals — A working meta description (the text AI engines use as your page's canonical summary), an H1 heading, and enough body text for an AI engine to build a coherent answer from the page.
  • 04
    Mobile readiness — A viewport meta tag and a responsive layout. Like Googlebot, AI crawlers index with mobile-first assumptions, and a site that fails basic mobile checks tends to be deprioritised in crawl queues.
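
For the mobile check (04), the viewport tag in question is a single line in your page's <head>. A minimal sketch, assuming a standard HTML page:

```html
<head>
  <!-- Declares a responsive viewport; the tag the mobile-readiness check looks for -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```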

A score above 80 means you're well-positioned to appear in AI-generated answers. Below 50 means at least one of these barriers is actively blocking you.

The Most Common Reasons Sites Score Low

Most indie founders don't realise they're blocking AI bots — because those bots didn't exist when their site was built. Three patterns account for the majority of low scores:

1. A robots.txt that blocks everything

A Disallow: / under User-agent: * is usually added to stop spam crawlers, but it catches GPTBot and ClaudeBot too. If your site was built before 2022, check this first.

2. No structured data anywhere on the site

Without JSON-LD schema, AI engines have to infer everything from your body text. They'll often mis-categorise the page or skip it in favour of a competitor who has explicit schema telling them exactly what the product does.

3. Missing or generic meta description

The meta description is among the first things AI engines read when deciding whether to cite a page. A generic "Welcome to our site" or a missing tag sends them to your competitor's page instead.

How to Fix Each Factor

Fixing robots.txt

Open your robots.txt file (at yourdomain.com/robots.txt) and add explicit allow rules before any wildcard disallow:

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
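
Once saved, you can sanity-check the rules with Python's standard-library robots.txt parser. A quick sketch; the rules string mirrors the fix above with a trailing wildcard disallow, and the URL is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The allow rules from the fix above, followed by a blanket
# wildcard disallow for everything else.
rules = """\
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Each named AI bot matches its own group (Allow: /),
# not the wildcard, so it can fetch any page.
for bot in ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"):
    print(bot, parser.can_fetch(bot, "https://example.com/any-page"))

# Unlisted bots fall through to User-agent: * and are blocked.
print("SomeSpamBot", parser.can_fetch("SomeSpamBot", "https://example.com/any-page"))
```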

Adding structured data

Add a <script type="application/ld+json"> block to your page <head> with at minimum a WebSite or SoftwareApplication schema. Google's Rich Results Test can validate it instantly.
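
A minimal SoftwareApplication sketch; the name, URL, and description are placeholders to replace with your own:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Example App",
  "applicationCategory": "DeveloperApplication",
  "operatingSystem": "Web",
  "description": "One sentence stating plainly what the product does.",
  "url": "https://example.com/"
}
</script>
```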

Writing a useful meta description

One sentence, under 155 characters, that directly answers "what is this page about?" Avoid vague openers like "Learn more about" or "Find out how." State the thing plainly. AI engines reward specificity.
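
In markup, that's one tag in the <head>; the description text here is illustrative:

```html
<meta name="description"
      content="Check whether GPTBot, ClaudeBot, and PerplexityBot can crawl and cite your site, scored out of 100.">
```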

Common Questions

Does a higher score guarantee ChatGPT will cite me?

No — content quality, topical authority, and how many other sources already cover your subject all factor in. But a low score all but guarantees you won't be cited, because the bots can't see your content in the first place.

How often should I run this check?

Run it after any major site migration, CMS update, or infrastructure change. robots.txt files are easy to accidentally overwrite during deploys.

Do I need different settings for each AI chatbot?

Only for robots.txt — each bot has its own user-agent name. Structured data and content quality improvements benefit all AI engines equally.