Robots.txt Generator

Create a robots.txt file to control which parts of your site search engine crawlers are allowed to crawl.

1. Add Rules: Specify which bots and URLs to allow or disallow.

2. Add Sitemaps: Include sitemap URLs for better indexing.

3. Get Code: Generate and download your robots.txt file (a sample is shown after this list).
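
As a rough illustration only (example.com and the paths below are placeholders, not values the generator fills in for you), a finished file could look like this:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/help/

    Sitemap: https://www.example.com/sitemap.xml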

User-agent Rules
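
Each group of rules starts with a User-agent line naming a bot (or * for every bot), followed by Disallow and Allow paths. The bot name and paths below are placeholders for illustration:

    User-agent: Googlebot
    Disallow: /drafts/

    User-agent: *
    Disallow: /admin/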

Sitemaps
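
You can list more than one sitemap, and each entry should be an absolute URL. The URLs below are placeholders:

    Sitemap: https://www.example.com/sitemap.xml
    Sitemap: https://www.example.com/sitemap-blog.xml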

Crawl-delay (Optional)

Note: Crawl-delay is respected by some search engines like Bing and Yandex, but not by Google.
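
If you do add it, the directive sets the number of seconds a crawler should wait between requests. The bot name and value below are only an illustration:

    User-agent: Bingbot
    Crawl-delay: 10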