Generate custom robots.txt files to control how search engines crawl your site.
Specify the user-agent (e.g., *, Googlebot, Bingbot).
Add your sitemap's URL so crawlers can discover it.
Define paths to allow or disallow. Use / to match all paths.
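A file generated from the settings above might look like this (the paths and sitemap URL are illustrative examples, not defaults):

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Here `User-agent: *` applies the rules to all crawlers, `Disallow: /admin/` blocks that path, and `Allow: /` permits everything else.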
Your generated robots.txt will appear here.