Robots.txt & Sitemap Tester
Check if a specific path is allowed for a given user-agent. Fetch and display the robots.txt.
About this Robots.txt & Sitemap Tester
Our robots.txt tester checks the rules in a site's robots.txt file for a given user-agent and path, and displays the file's contents. Use it to verify that your robots.txt is working correctly, test which paths are allowed or disallowed for specific user agents, and view sitemap links. It's ideal for debugging robots.txt configuration, ensuring crawlers can reach important pages, and confirming that blocked directories are properly restricted.
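The same allow/disallow check can be reproduced with Python's standard-library urllib.robotparser. Below is a minimal sketch; the robots.txt contents are a hypothetical example, not rules from any real site.

```python
import urllib.robotparser

# Hypothetical robots.txt contents for illustration.
SAMPLE_ROBOTS_TXT = """\
User-agent: *
Allow: /private/public.html
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(SAMPLE_ROBOTS_TXT.splitlines())

# The Allow rule covers one exact page inside an otherwise blocked directory.
print(rp.can_fetch("*", "/private/public.html"))  # True: explicitly allowed
print(rp.can_fetch("*", "/private/secret.html"))  # False: blocked by Disallow
print(rp.can_fetch("*", "/index.html"))           # True: no rule matches
```

In a real check you would point the parser at a live file with `rp.set_url(...)` followed by `rp.read()` instead of parsing an inline string.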
Key Features
Test robots.txt rules for specific paths
Check rules for different user agents
View full robots.txt file contents
Verify allow/disallow rules
Check sitemap links
Test multiple paths and user agents
Validate robots.txt format
Works entirely in your browser
How to Use
Enter the website URL to test
Select or enter a user agent (e.g., Googlebot, * for all)
Enter the path you want to test
Click 'Test' to check robots.txt rules
View whether the path is allowed or disallowed
Review the full robots.txt file
Test different paths and user agents
Verify your robots.txt configuration
Popular Use Cases
Verify robots.txt is working correctly
Test which paths are blocked or allowed
Debug robots.txt configuration issues
Check rules for specific user agents
Verify sitemap links in robots.txt
Ensure important pages are accessible
Test robots.txt after making changes
Verify blocked directories are restricted
Tips & Best Practices
Test with different user agents to see rule differences
Test both allowed and disallowed paths
Verify that important pages are accessible
Check that blocked directories are properly restricted
Use * for all user agents or specific ones like Googlebot
Test paths with and without trailing slashes
Verify sitemap links are accessible
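The trailing-slash tip matters because Disallow rules are prefix matches. A hypothetical sketch with urllib.robotparser:

```python
import urllib.robotparser

# Hypothetical rule: note the trailing slash.
rp = urllib.robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /private/
""".splitlines())

# "Disallow: /private/" matches paths beginning with "/private/",
# so the bare path without the trailing slash is not covered.
print(rp.can_fetch("*", "/private"))       # True: rule does not match
print(rp.can_fetch("*", "/private/"))      # False
print(rp.can_fetch("*", "/private/page"))  # False
```

To block both forms, a site would typically also list `Disallow: /private` (no slash), which matches either variant as a prefix.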
Frequently Asked Questions
What user agents should I test?
Test with common user agents such as Googlebot (Google), Bingbot (Bing), or * (all crawlers). A robots.txt file can define different rules for different crawlers, so the same path may be allowed for one user agent and disallowed for another.
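A hypothetical example of per-agent rules, checked with urllib.robotparser. Note that when a crawler has its own User-agent group, that group applies instead of the * group, so Googlebot here ignores the catch-all rules entirely:

```python
import urllib.robotparser

# Hypothetical file with a Googlebot-specific entry and a catch-all.
rp = urllib.robotparser.RobotFileParser()
rp.parse("""\
User-agent: Googlebot
Disallow: /no-google/

User-agent: *
Disallow: /private/
""".splitlines())

# Googlebot is governed by its own group; other crawlers fall back to *.
print(rp.can_fetch("Googlebot", "/no-google/page"))  # False
print(rp.can_fetch("Googlebot", "/private/page"))    # True: * group not used
print(rp.can_fetch("Bingbot", "/private/page"))      # False
```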
How do I know if a path is allowed?
The tester shows whether a path is allowed or disallowed based on the robots.txt rules. Allowed means crawlers may access it; disallowed means they should not.
Can I test paths that don't exist?
Yes, you can test any path. The tester checks robots.txt rules, not whether the path actually exists on the website.
What if robots.txt doesn't exist?
If no robots.txt file exists, all paths are allowed by default: crawlers treat a missing file as permission to crawl the whole site. The tester will indicate that no robots.txt file was found.
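This allow-everything default can be seen with urllib.robotparser, where an empty rule set stands in for the missing file:

```python
import urllib.robotparser

# An empty rule set stands in for a missing robots.txt:
# with no Disallow rules, every path is allowed.
rp = urllib.robotparser.RobotFileParser()
rp.parse([])

print(rp.can_fetch("*", "/anything"))        # True
print(rp.can_fetch("Googlebot", "/admin/"))  # True
```

When fetching a live file with `rp.read()`, the library likewise treats a 404 response as "allow all" (though a 401 or 403 is treated as "disallow all").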