robots.txt Generator

Create a robots.txt file for your website with visual controls.

100% private — runs entirely in your browser
Quick Presets
Rules Builder
Sitemap URLs
Crawl Delay (optional, in seconds)
Not all bots respect Crawl-delay. Google ignores it entirely; manage Google's crawl rate through Search Console instead.
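If you want to sanity-check a generated file, Python's standard-library `urllib.robotparser` can parse it, including the Crawl-delay value. Note that the stdlib parser uses simple prefix matching with first-match-wins ordering, not Google's longest-match rules, so keep more specific Allow lines above the Disallow they override. The file contents below are a hypothetical example:

```python
from urllib import robotparser

# Illustrative generator output for a hypothetical site.
ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 10
Allow: /admin/login
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.modified()  # stamp a fetch time; can_fetch()/crawl_delay() assume one
rp.parse(ROBOTS_TXT.splitlines())

print(rp.crawl_delay("*"))                                   # 10
print(rp.can_fetch("*", "https://example.com/admin/"))       # False
print(rp.can_fetch("*", "https://example.com/admin/login"))  # True
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```

Remember that this only checks how a compliant parser reads the file; bots that ignore robots.txt will ignore it regardless.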
Generated robots.txt
Directive   | Description                                        | Example
User-agent  | Specifies which bot the rules apply to             | User-agent: Googlebot
Disallow    | Blocks a path from crawling                        | Disallow: /admin/
Allow       | Explicitly allows a path (overrides Disallow)      | Allow: /admin/login
Sitemap     | Points to your XML sitemap                         | Sitemap: https://example.com/sitemap.xml
Crawl-delay | Seconds between requests (not all bots support it) | Crawl-delay: 10
*           | Wildcard; matches any bot or path segment          | User-agent: *
$           | End-of-URL marker                                  | Disallow: /*.pdf$
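Putting the directives together, a generated file for a hypothetical example.com might look like this:

```
# All bots: keep out of /admin/, except the login page,
# and skip PDF files anywhere on the site.
User-agent: *
Disallow: /admin/
Allow: /admin/login
Disallow: /*.pdf$
Crawl-delay: 10

# Extra rule for one specific bot.
User-agent: Googlebot
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

Blank lines separate User-agent groups, and the Sitemap line is independent of any group, so it can appear anywhere in the file.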