Robots.txt Tester
Validate rules and simulate how Googlebot, Bingbot, and AI crawlers (GPTBot, ClaudeBot) interpret your robots.txt.
✓ ALLOWED
Matched rule: Allow: /
Matched block: User-agent: Googlebot
Parsed: 3 blocks, 1 sitemap
- https://example.com/sitemap.xml
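The "Matched rule" above reflects rule precedence: when several rules in the matched block apply to a URL path, the most specific (longest) one wins, and ties go to Allow. A minimal sketch of that decision logic, using simple prefix matching (no `*`/`$` wildcard support, and the rule list here is hypothetical):

```python
def decide(rules: list[tuple[str, str]], path: str) -> str:
    """Pick the winning rule for a path.

    rules: list of ("allow" | "disallow", path_prefix) pairs from one
    user-agent block. Longest matching prefix wins; ties favor "allow".
    """
    best_kind, best_len = "allow", 0  # no matching rule means allowed
    for kind, pattern in rules:
        if path.startswith(pattern) and len(pattern) >= best_len:
            # Strictly longer patterns always win; on an exact-length
            # tie, "allow" takes precedence over "disallow".
            if len(pattern) > best_len or kind == "allow":
                best_kind, best_len = kind, len(pattern)
    return best_kind

# Example block: Allow: /  plus  Disallow: /private
rules = [("allow", "/"), ("disallow", "/private")]
print(decide(rules, "/page"))       # "allow"    (only "/" matches)
print(decide(rules, "/private/x"))  # "disallow" ("/private" is longer)
```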
Blocking AI crawlers (2026)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: anthropic-ai
Disallow: /
Add these rules to opt your content out of LLM training. Note: Google-Extended controls only AI training use; it does not affect Googlebot's search crawling or your search rankings.
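You can verify the rules behave as intended with Python's standard-library `urllib.robotparser` (a minimal sketch; note its matching is a simplification of Google's spec and it does not support `*`/`$` wildcards in paths):

```python
from urllib.robotparser import RobotFileParser

# The AI-crawler blocking rules from above, as a robots.txt body.
RULES = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: anthropic-ai
Disallow: /
"""

def is_allowed(user_agent: str, url: str) -> bool:
    """Simulate a crawler's fetch decision against RULES."""
    parser = RobotFileParser()
    parser.parse(RULES.splitlines())
    return parser.can_fetch(user_agent, url)

print(is_allowed("GPTBot", "https://example.com/page"))     # False: blocked
print(is_allowed("Googlebot", "https://example.com/page"))  # True: no block applies
```

Because there is no `User-agent: *` block, crawlers not listed (Googlebot, Bingbot) fall through to the default and remain allowed.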