Robots.txt Generator

Build a robots.txt file to control how search engine crawlers access your site. Configure user-agents, crawl rules, and sitemap location.

Quick Tips

  • Place robots.txt in your site's root directory (e.g., example.com/robots.txt)
  • Use * as the user-agent to apply a rule group to all crawlers
  • Disallow: / blocks crawlers from your entire site
  • Include a Sitemap directive so crawlers can discover your sitemap
  • Crawl-delay is honored by Bing and Yandex but ignored by Google
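
Put together, a typical generated file might look like the following (the domain and paths are placeholders; substitute your own):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml
```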

Frequently Asked Questions

What is a robots.txt file?

A robots.txt file tells search engine crawlers which pages on your site they can and cannot access.

Will a wrong robots.txt hurt my SEO?

Yes. Incorrectly blocking important pages can prevent search engines from crawling them, so those pages may rank poorly or drop out of the index. Always test your rules before deploying.
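
One way to test rules before deploying is Python's standard-library robots.txt parser. This sketch (the rules and URLs are illustrative) checks whether specific URLs would be blocked:

```python
from urllib import robotparser

# Candidate robots.txt rules, checked in memory before uploading.
rules = """\
User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch(user_agent, url) reports whether a crawler may access the URL.
print(rp.can_fetch("*", "https://example.com/admin/login"))  # under /admin/, blocked
print(rp.can_fetch("*", "https://example.com/about"))        # no rule matches, allowed
```

Note that different crawlers interpret edge cases slightly differently, so this is a sanity check, not a guarantee of how every bot will behave.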

Can I include my sitemap URL?

Yes, our generator includes a field to add your sitemap URL, which is recommended for SEO best practices.

Where should I place my robots.txt file?

The robots.txt file must be placed in the root directory of your website so it is accessible at yourdomain.com/robots.txt. Search engine crawlers will only look for it at that exact location, so placing it in a subdirectory will have no effect.
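
Because crawlers only check the root, the robots.txt location for any page can be derived by keeping the scheme and host and replacing the path. A small sketch (the helper name is our own):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL for the site hosting page_url."""
    parts = urlsplit(page_url)
    # Keep scheme and host; drop the page's path, query, and fragment.
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("https://example.com/blog/post?id=1"))
```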

Does Google respect the Crawl-delay directive?

No, Googlebot ignores the Crawl-delay directive in robots.txt. However, Bing, Yandex, and some other crawlers do honor it. To control Google's crawl rate, use the crawl rate settings in Google Search Console instead.
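
If you do want to slow down crawlers that honor the directive, a dedicated rule group like this (the 10-second delay is just an example value) will apply to Bingbot without affecting other bots:

```
User-agent: Bingbot
Crawl-delay: 10
```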