Robots.txt & Sitemap Tester

Check whether a specific path is allowed for a given user-agent, and fetch and display the site's robots.txt.

ℹ️ About this Robots.txt & Sitemap Tester

Our robots.txt tester helps you verify that your robots.txt file works correctly, test which paths are allowed or disallowed for specific user agents, and view sitemap links. It is useful for debugging robots.txt configuration, ensuring crawlers can access important pages, and verifying that blocked directories are properly restricted.

Key Features

Test robots.txt rules for specific paths

Check rules for different user agents

View full robots.txt file contents

Verify allow/disallow rules

Check sitemap links

Test multiple paths and user agents

Validate robots.txt format

Works entirely in your browser

📖 How to Use

1. Enter the website URL to test

2. Select or enter a user agent (e.g., Googlebot, or * for all)

3. Enter the path you want to test

4. Click 'Test' to check the robots.txt rules

5. View whether the path is allowed or disallowed

6. Review the full robots.txt file

7. Test different paths and user agents

8. Verify your robots.txt configuration
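The same check can be sketched with Python's standard-library urllib.robotparser. The rules, paths, and example.com URL below are illustrative assumptions, not output from this tool; in practice the file would be fetched from the site's /robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; normally fetched from
# https://example.com/robots.txt before parsing.
ROBOTS_TXT = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Is the path allowed or disallowed for this user agent?
print(rp.can_fetch("*", "/private/secret.html"))       # False (Disallow matches)
print(rp.can_fetch("*", "/about"))                     # True  (no rule matches)
print(rp.can_fetch("*", "/private/public-page.html"))  # True  (explicit Allow)

# Sitemap links declared in robots.txt (Python 3.8+)
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```

Note that Python's parser applies rules in file order, so the more specific Allow line is listed before the broader Disallow here; Google instead uses the most specific (longest) matching rule, which gives the same result for these rules.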

💡 Popular Use Cases

1. Verify robots.txt is working correctly

2. Test which paths are blocked or allowed

3. Debug robots.txt configuration issues

4. Check rules for specific user agents

5. Verify sitemap links in robots.txt

6. Ensure important pages are accessible

7. Test robots.txt after making changes

8. Verify blocked directories are restricted

💡 Tips & Best Practices

Test with different user agents to see rule differences

Test both allowed and disallowed paths

Verify that important pages are accessible

Check that blocked directories are properly restricted

Use * for all user agents or specific ones like Googlebot

Test paths with and without trailing slashes

Verify sitemap links are accessible
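The trailing-slash tip matters because Disallow rules are prefix matches. A small illustration with Python's urllib.robotparser, using an assumed sample rule:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /private/"])

# "Disallow: /private/" is a prefix match, so the trailing slash matters:
print(rp.can_fetch("*", "/private"))   # True  (no trailing slash, prefix doesn't match)
print(rp.can_fetch("*", "/private/"))  # False (matches the Disallow prefix)
```

Writing the rule as `Disallow: /private` (no slash) would block both forms, so it is worth testing each variant you care about.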

Frequently Asked Questions

Q: What user agents should I test?

Test with common user agents like Googlebot (Google), Bingbot (Bing), or * (all crawlers). Different crawlers may have different rules if specified in robots.txt.
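How per-crawler groups interact can be seen in a short sketch with Python's urllib.robotparser (the rules below are assumed for illustration): a crawler that matches a named group uses only that group and ignores the * group.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse("""\
User-agent: Googlebot
Disallow: /no-google/

User-agent: *
Disallow: /private/
""".splitlines())

# Googlebot matches its own group, so the * group does not apply to it:
print(rp.can_fetch("Googlebot", "/no-google/page"))  # False (its own rule)
print(rp.can_fetch("Googlebot", "/private/page"))    # True  (* group ignored)
print(rp.can_fetch("Bingbot", "/private/page"))      # False (falls back to *)
```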

Q: How do I know if a path is allowed?

The tester shows whether a path is allowed or disallowed based on the robots.txt rules. Allowed means crawlers may access it; disallowed means they should not.

Q: Can I test paths that don't exist?

Yes, you can test any path. The tester checks robots.txt rules, not whether the path actually exists on the website.

Q: What if robots.txt doesn't exist?

If robots.txt doesn't exist, all paths are allowed by default. The tester will indicate that no robots.txt file was found.
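Python's urllib.robotparser mirrors this default-allow behavior when given no rules; an empty input stands in here for a missing file:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([])  # no robots.txt content at all

# With no rules present, every path is allowed by default:
print(rp.can_fetch("*", "/anything"))  # True
```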