Robots.txt Generator

Build a robots.txt file to control how search engine crawlers access your site. Configure user-agents, crawl rules (Allow/Disallow directives), and your sitemap location.
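As an illustration, a minimal robots.txt combining these parts might look like the following (the paths and the sitemap URL are placeholders, not recommendations):

```text
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /search

# Bingbot-specific rules; Crawl-delay is the wait (in seconds) between requests
User-agent: Bingbot
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```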

Quick Tips

  • Place robots.txt in your site's root directory (e.g., example.com/robots.txt)
  • Use * as the user-agent to apply rules to all crawlers
  • Disallow: / blocks the entire site from crawling
  • Always include your sitemap URL for better indexing
  • Crawl-delay is honored by Bing and Yandex but not Google
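To verify that a generated file behaves as intended before deploying it, you can parse it with Python's standard-library `urllib.robotparser` and test sample URLs. The rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A sample generated robots.txt (placeholder rules for illustration)
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
# parse() accepts the file's lines, so no HTTP fetch is needed
parser.parse(robots_txt.splitlines())

# Check whether a crawler matching User-agent "*" may fetch specific paths
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))       # True
```

This catches mistakes such as an overly broad Disallow rule before any crawler sees the file.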