Build a robots.txt file to control how search engine crawlers access your site. Serve it at the site root (/robots.txt) and configure user-agent groups, Allow/Disallow rules, and the sitemap location.
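As a sketch, the file described above might look like the string below; the paths, the `example.com` domain, and the sitemap URL are placeholder assumptions, not values from the brief. Python's standard-library `urllib.robotparser` can then be used to sanity-check that the rules behave as intended before deploying:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an example site; swap in your own
# domain, disallowed paths, and sitemap URL.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

# Parse the rules directly from the string (no HTTP fetch needed).
parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Verify that a generic crawler ("*") gets the intended answers.
print(parser.can_fetch("*", "https://example.com/"))        # allowed -> True
print(parser.can_fetch("*", "https://example.com/admin/"))  # blocked -> False
```

Checking rules programmatically like this catches ordering and path-matching mistakes that are easy to make when editing robots.txt by hand.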