
Robots.txt Analyzer

Paste or fetch a robots.txt file to validate syntax, analyze directives, and test URL paths against crawler rules.




How It Works

This tool parses robots.txt content line by line, validates the syntax against the RFC 9309 Robots Exclusion Protocol standard, and identifies common issues.
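
The parsing step can be pictured with a minimal TypeScript sketch (an illustration of the approach, not the tool's actual implementation): comments and blank lines are dropped, each remaining line is split into a directive and a value at the first colon, and consecutive User-agent lines form a group that shares the rules below it.

    type Rule = { directive: string; value: string };
    type Group = { userAgents: string[]; rules: Rule[] };

    function parseRobots(body: string): Group[] {
      const groups: Group[] = [];
      let current: Group | null = null;
      for (const rawLine of body.split(/\r?\n/)) {
        const line = rawLine.replace(/#.*$/, "").trim(); // drop comments and surrounding whitespace
        if (!line) continue;                             // skip blank lines
        const colon = line.indexOf(":");
        if (colon === -1) continue;                      // not "directive: value"; left to the syntax checks
        const directive = line.slice(0, colon).trim().toLowerCase();
        const value = line.slice(colon + 1).trim();
        if (directive === "user-agent") {
          // Consecutive User-agent lines share the rule block that follows them.
          if (!current || current.rules.length > 0) {
            current = { userAgents: [], rules: [] };
            groups.push(current);
          }
          current.userAgents.push(value.toLowerCase());
        } else if (current) {
          current.rules.push({ directive, value });
        }
      }
      return groups;
    }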

What Gets Checked
  • Syntax validation — Each line is checked for a valid directive format (a sketch of these checks follows this list)
  • Directive ordering — User-agent must precede Allow/Disallow rules
  • Wildcard patterns — Validates * and $ pattern usage
  • Sitemap URLs — Checks for valid absolute URLs
  • Common mistakes — Detects typos, duplicate rules, conflicting directives
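
A few of the checks above, sketched in TypeScript under the assumption that only the five directives listed further down count as known; the messages are illustrative, not the tool's exact output.

    const KNOWN_DIRECTIVES = new Set(["user-agent", "disallow", "allow", "sitemap", "crawl-delay"]);

    function lintRobots(body: string): string[] {
      const issues: string[] = [];
      let sawUserAgent = false;
      body.split(/\r?\n/).forEach((rawLine, index) => {
        const line = rawLine.replace(/#.*$/, "").trim();
        if (!line) return;
        const n = index + 1;
        const colon = line.indexOf(":");
        if (colon === -1) {
          issues.push(`Line ${n}: not in "directive: value" form`);
          return;
        }
        const directive = line.slice(0, colon).trim().toLowerCase();
        const value = line.slice(colon + 1).trim();
        if (!KNOWN_DIRECTIVES.has(directive)) issues.push(`Line ${n}: unknown directive "${directive}"`);
        if (directive === "user-agent") sawUserAgent = true;
        if ((directive === "allow" || directive === "disallow") && !sawUserAgent)
          issues.push(`Line ${n}: ${directive} appears before any User-agent line`);
        if (directive === "sitemap" && !/^https?:\/\//i.test(value))
          issues.push(`Line ${n}: Sitemap should be an absolute URL`);
      });
      return issues;
    }
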
URL Path Testing
  • Pattern matching — Follows standard path-matching rules (see the matcher sketch after this list)
  • Wildcard support — Handles * (any sequence) and $ (end anchor)
  • Specificity — Longer matching rules take precedence
  • Agent resolution — Tests against specific or wildcard user-agent
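
A hedged sketch of the matching logic, following the common longest-match interpretation in which Allow wins a length tie; this is one reasonable reading of the rules, not necessarily the tool's exact code.

    // Convert a robots.txt path pattern to a regular expression:
    // "*" matches any sequence, a trailing "$" anchors the end, otherwise it is a prefix match.
    function patternToRegex(pattern: string): RegExp {
      const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, "\\$&"); // escape regex metacharacters except *
      const body = escaped.replace(/\*/g, ".*");
      return body.endsWith("\\$")
        ? new RegExp("^" + body.slice(0, -2) + "$")
        : new RegExp("^" + body);
    }

    function isAllowed(path: string, rules: { directive: string; value: string }[]): boolean {
      let best: { allow: boolean; length: number } | null = null;
      for (const { directive, value } of rules) {
        if (directive !== "allow" && directive !== "disallow") continue;
        if (value === "") continue;                      // empty values match nothing in this sketch
        if (!patternToRegex(value).test(path)) continue;
        const better =
          !best ||
          value.length > best.length ||
          (value.length === best.length && directive === "allow"); // least restrictive wins ties
        if (better) best = { allow: directive === "allow", length: value.length };
      }
      return best ? best.allow : true;                   // no matching rule: the path is allowed
    }

    // Example: a Disallow on /private/ with an Allow exception for one file.
    // isAllowed("/private/press.html", [
    //   { directive: "disallow", value: "/private/" },
    //   { directive: "allow", value: "/private/press.html" },
    // ]); // -> true, because the longer Allow rule takes precedence
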
Valid Directives
  • User-agent — Target crawler name
  • Disallow — Block a URL path
  • Allow — Override a Disallow rule
  • Sitemap — XML sitemap location
  • Crawl-delay — Delay between requests
Best Practices
  • Always include a wildcard (*) user-agent group
  • Reference your sitemap(s) for better crawl coverage
  • Keep the file under 500 KiB (Google's limit)
  • Use Allow to create exceptions within Disallow rules (see the example below)
  • Test rules before deploying to production
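
Putting these together, an illustrative robots.txt (the paths, the ten-second delay, and the example.com sitemap URL are placeholders, not recommendations for any particular site):

    User-agent: *
    Disallow: /admin/
    Disallow: /private/
    Allow: /private/press.html
    Crawl-delay: 10

    Sitemap: https://www.example.com/sitemap.xml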

