
Robots.txt Tester

Test whether a URL is allowed or blocked by robots.txt rules.

Robots.txt Directives

  • Disallow: blocks crawlers from the path
  • Allow: permits crawling; when Allow and Disallow both match, the more specific (longest) rule wins
  • * matches any characters; $ matches end of URL
  • Sitemap: tells crawlers where to find your sitemap
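
A minimal sketch of how these rules can be evaluated, assuming Google-style precedence (the longest matching pattern wins, and Allow wins ties). The function names and the sample rules below are illustrative, not part of the tool:

    import re

    def pattern_to_regex(pattern):
        # "*" matches any run of characters; a trailing "$" anchors the
        # pattern to the end of the URL path.
        regex = re.escape(pattern).replace(r"\*", ".*")
        if regex.endswith(r"\$"):
            regex = regex[:-2] + "$"
        return regex

    def is_allowed(rules, path):
        # rules: list of (directive, pattern), e.g. ("Disallow", "/admin/").
        # Assumed precedence: the most specific (longest) matching pattern
        # wins; on a tie, Allow beats Disallow. No matching rule = allowed.
        best_len, allowed = -1, True
        for directive, pattern in rules:
            if re.match(pattern_to_regex(pattern), path):
                if len(pattern) > best_len or (
                    len(pattern) == best_len and directive == "Allow"
                ):
                    best_len, allowed = len(pattern), directive == "Allow"
        return allowed

    # Hypothetical rules mirroring the directives above.
    rules = [("Disallow", "/private/"), ("Allow", "/private/docs/"),
             ("Disallow", "/*.pdf$")]
    print(is_allowed(rules, "/private/docs/guide"))  # True: longer Allow wins
    print(is_allowed(rules, "/report.pdf"))          # False: wildcard rule matches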

How to Use

  1. Enter the website domain you want to check
  2. Click "Fetch" to retrieve the robots.txt file
  3. Review the robots.txt content and rules
  4. Select a user-agent and enter a path to test
  5. See whether the path is allowed or blocked, and which rule matches
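
You can approximate the same fetch-and-test workflow with Python's standard library. Note that urllib.robotparser follows the original robots.txt spec (first matching rule wins, no * or $ wildcard support), so its verdicts can differ from Google's matcher; example.com stands in for the domain you want to check:

    from urllib.robotparser import RobotFileParser

    # Hypothetical target site; swap in the domain you want to check.
    parser = RobotFileParser("https://example.com/robots.txt")
    parser.read()  # fetches and parses the file

    # Test a user-agent/path pair, as in steps 4-5.
    print(parser.can_fetch("Googlebot", "https://example.com/private/page"))
    print(parser.site_maps())  # Sitemap URLs declared in the file (Python 3.8+)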

Common robots.txt Issues

  • Blocking important pages: Check that product, category, and content pages aren't accidentally disallowed
  • Blocking CSS/JS: Search engines need access to render pages properly
  • Missing sitemap: Add a Sitemap directive to help crawlers find your XML sitemap
  • Overly broad rules: be specific; Disallow: / blocks the entire site
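
A rough way to scan a fetched robots.txt body for these pitfalls; the checks and messages here are an illustrative sketch, not an exhaustive audit:

    def audit_robots(body):
        # Flag a few common robots.txt issues from the list above.
        issues = []
        lines = [l.split("#")[0].strip() for l in body.splitlines()]
        if any(l.lower() == "disallow: /" for l in lines):
            issues.append("Disallow: / found: this blocks the entire site")
        if not any(l.lower().startswith("sitemap:") for l in lines):
            issues.append("No Sitemap directive: add one pointing at your XML sitemap")
        for l in lines:
            low = l.lower()
            if low.startswith("disallow:") and (".css" in low or ".js" in low):
                issues.append("Possible CSS/JS block: " + l)
        return issues

    print(audit_robots("User-agent: *\nDisallow: /\n"))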
