Robots.txt Tester
Paste your robots.txt and test any URL against any crawler to see whether it is allowed.
Robots.txt Tester Features
Verify your crawl rules before deployment.
Shows exactly which rule matched and whether it is Allow or Disallow.
Test against specific crawlers like Googlebot, or the wildcard user-agent *.
Displays all applicable rules for the selected user-agent at a glance.
About Robots.txt Tester
This tool lets you paste any robots.txt file and test how it behaves for a specific crawler and URL path. It parses the rules, finds the most specific (longest) matching rule for that path, and tells you whether access is allowed or disallowed.
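As a rough illustration of that matching, here is a minimal TypeScript sketch; the Rule type and isAllowed() function are hypothetical names, not this tool's actual internals, and wildcard handling is omitted here (see the wildcard question below).

```ts
// Hypothetical rule shape; paths are matched by simple prefix here.
type Rule = { type: "allow" | "disallow"; path: string };

function isAllowed(rules: Rule[], urlPath: string): boolean {
  let best: Rule | null = null;
  for (const rule of rules) {
    if (!urlPath.startsWith(rule.path)) continue;
    // The longest (most specific) matching path wins; Allow wins ties.
    if (
      best === null ||
      rule.path.length > best.path.length ||
      (rule.path.length === best.path.length && rule.type === "allow")
    ) {
      best = rule;
    }
  }
  // No matching rule means the URL is allowed by default.
  return best === null || best.type === "allow";
}
```

For example, with Disallow: /a and Allow: /a/b, the path /a/b/c matches both rules, and the longer Allow rule wins.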
Frequently Asked Questions
Is this tool free to use?
Yes, this tool is completely free.
No account or registration is required.
You can use it as many times as you like.
Is my data private?
All processing happens in your browser.
Your robots.txt content is never sent to any server.
We do not store, log, or share your input.
Does it support wildcards?
Yes. The * wildcard in paths is supported.
The $ end-of-path anchor is also recognized.
Complex patterns are matched by converting them to regular expressions.
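As a hedged sketch of that conversion (patternToRegExp() is an illustrative name; the tool's actual implementation may differ):

```ts
// Escape regex metacharacters, map * to ".*", and honor a trailing $
// as an end-of-string anchor.
function patternToRegExp(pattern: string): RegExp {
  const anchored = pattern.endsWith("$");
  const body = anchored ? pattern.slice(0, -1) : pattern;
  const escaped = body
    .split("*")
    .map((part) => part.replace(/[.+?^${}()|[\]\\]/g, "\\$&"))
    .join(".*");
  return new RegExp("^" + escaped + (anchored ? "$" : ""));
}

patternToRegExp("/private*.html$").test("/private-2024.html");   // true
patternToRegExp("/private*.html$").test("/private-2024.html?x"); // false
```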
What if no rule matches?
If no rule matches, the URL is considered allowed by default.
This is standard robots.txt behavior: anything not explicitly disallowed is allowed.
The tester will clearly state when no matching rule was found.
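A tiny self-contained example of that default, using plain prefix matching for brevity (the rule data is made up for illustration):

```ts
// Only /admin is disallowed; an unmatched path falls through to "allowed".
const disallowedPrefixes = ["/admin"];
const path = "/blog/post";
const blocked = disallowedPrefixes.some((prefix) => path.startsWith(prefix));
console.log(blocked ? "disallowed" : "allowed"); // "allowed": no rule matched
```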
Does it support the Sitemap directive?
The Sitemap directive is parsed from the robots.txt content.
It does not affect allow/disallow rules and is informational only.
Sitemap URLs are shown in the raw rules list.
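Pulling those URLs out is a simple line scan; here is an illustrative sketch (extractSitemaps() is a hypothetical helper, not this tool's API):

```ts
// The Sitemap directive is case-insensitive and can appear anywhere
// in the file; it never affects allow/disallow decisions.
function extractSitemaps(robotsTxt: string): string[] {
  return robotsTxt
    .split(/\r?\n/)
    .map((line) => line.trim())
    .filter((line) => /^sitemap:/i.test(line))
    .map((line) => line.slice("sitemap:".length).trim());
}

extractSitemaps("User-agent: *\nSitemap: https://example.com/sitemap.xml");
// => ["https://example.com/sitemap.xml"]
```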
Does it work on mobile?
Yes, the tool is fully responsive.
It works on phones, tablets, and desktops.
No app download is needed.
What browsers are supported?
All modern browsers are supported.
This includes Chrome, Firefox, Safari, and Edge.
Keep your browser updated for best results.
Can I test multiple URLs at once?
Currently the tool tests one URL at a time.
Change the URL path in the input and click Test again for each URL.
The robots.txt content stays loaded while you test.
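If you want to script a batch check yourself, the same workflow applies: parse the rules once, then loop over paths. A minimal sketch with made-up rules and plain prefix matching:

```ts
// Parse once, test many paths; both lists are illustrative.
const disallowPrefixes = ["/admin", "/tmp"];
const pathsToTest = ["/", "/admin/panel", "/blog"];
for (const p of pathsToTest) {
  const blocked = disallowPrefixes.some((prefix) => p.startsWith(prefix));
  console.log(`${p}: ${blocked ? "disallowed" : "allowed"}`);
}
```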