Robots.txt Generator

Build a properly formatted robots.txt file for your website. Configure user agents, crawl rules, sitemaps, and more.

Quick Presets

User Agent
Allowed Paths
Disallowed Paths

Common: /admin/, /wp-admin/, /private/, /tmp/, /cgi-bin/

Crawl Delay (optional)

Seconds between requests. Leave empty to skip. Note that Googlebot ignores Crawl-delay; use Search Console crawl settings instead.

Sitemap URLs
Host (optional)

Preferred domain. A non-standard directive recognized by some crawlers (notably Yandex); most others ignore it.
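
For reference, a file that uses every option above might look like the following (the domain and paths are placeholders, not recommendations):

```
User-agent: *
Allow: /public/
Disallow: /admin/
Disallow: /tmp/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
Host: example.com
```

Group-level directives (Allow, Disallow, Crawl-delay) apply to the User-agent line above them; Sitemap and Host stand on their own and apply to the whole file.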

Generated robots.txt
# Your robots.txt will appear here as you configure the options.
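
The generation step itself is simple string assembly. A minimal sketch in Python, where the function and parameter names mirror the form fields above rather than any particular library's API:

```python
def build_robots_txt(user_agent="*", allow=None, disallow=None,
                     crawl_delay=None, sitemaps=None, host=None):
    """Assemble a robots.txt string from the configured options.

    Parameter names are illustrative; they correspond to the
    form fields (user agent, allowed/disallowed paths, crawl
    delay, sitemap URLs, host), not to a standard API.
    """
    lines = [f"User-agent: {user_agent}"]
    for path in (allow or []):
        lines.append(f"Allow: {path}")
    for path in (disallow or []):
        lines.append(f"Disallow: {path}")
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    # Sitemap and Host are file-wide directives, so separate
    # them from the user-agent group with a blank line.
    if sitemaps:
        lines.append("")
        lines.extend(f"Sitemap: {url}" for url in sitemaps)
    if host:
        lines.append(f"Host: {host}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(disallow=["/admin/", "/tmp/"],
                       sitemaps=["https://example.com/sitemap.xml"]))
```

Leaving a field empty simply omits the corresponding directive, matching the "leave empty to skip" behavior described above.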