robots.txt Generator
Create a robots.txt file for your website with visual controls.
100% private — runs entirely in your browser
Quick Presets
Rules Builder
Sitemap URLs
Crawl Delay (Optional)
seconds
Not all bots respect Crawl-delay. Google ignores it entirely; manage Googlebot's crawl rate through Search Console instead.
Generated robots.txt
| Directive | Description | Example |
|---|---|---|
| User-agent | Specifies which bot the rules apply to | User-agent: Googlebot |
| Disallow | Blocks a path from crawling | Disallow: /admin/ |
| Allow | Explicitly allows a path (overrides Disallow) | Allow: /admin/login |
| Sitemap | Points to your XML sitemap | Sitemap: https://example.com/sitemap.xml |
| Crawl-delay | Seconds to wait between requests (not supported by all bots) | Crawl-delay: 10 |
| * | Wildcard; matches any bot in User-agent, or any sequence of characters in a path | User-agent: * |
| $ | Anchors the rule to the end of the URL | Disallow: /*.pdf$ |
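To sanity-check a generated file, you can feed it to Python's standard-library `urllib.robotparser`. The snippet below is a minimal sketch using a hypothetical robots.txt for `example.com`; note that Python's parser applies rules in file order (first match wins) and does not implement the `*`/`$` path wildcards, so the more specific `Allow` line is listed before the broader `Disallow`.

```python
from urllib import robotparser

# Hypothetical robots.txt mirroring the directives in the table above.
# Allow precedes Disallow because urllib.robotparser uses first-match order.
ROBOTS_TXT = """\
User-agent: *
Allow: /admin/login
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The Allow rule permits the login page despite the /admin/ block.
print(rp.can_fetch("*", "https://example.com/admin/login"))  # True
# Everything else under /admin/ is blocked.
print(rp.can_fetch("*", "https://example.com/admin/users"))  # False
# Paths with no matching rule are crawlable by default.
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

Real crawlers such as Googlebot resolve Allow/Disallow conflicts by rule specificity rather than file order, so a file that passes this check is conservative but not a byte-for-byte simulation of Google's matcher.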