FreeAIKit

Robots.txt Generator

Generate a robots.txt file to control how search engines crawl your site. Add rules, sitemaps, and crawl delays. Free, instant.


robots.txt Output

# robots.txt generated by FreeAIKit.app

User-agent: *
Allow: /

Upload this file to the root of your website, at example.com/robots.txt.

FAQ

What is robots.txt?

robots.txt is a text file placed at the root of your website (e.g., example.com/robots.txt) that tells search engine crawlers which pages or directories they can or cannot access. It follows the Robots Exclusion Protocol.
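For example, a minimal robots.txt that keeps all crawlers out of an admin area while allowing everything else (the /admin/ path is a placeholder for illustration):

User-agent: *
Disallow: /admin/
Allow: /

Each group starts with a User-agent line (* matches all crawlers), and the Allow/Disallow paths are matched as prefixes from the site root.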

Does robots.txt affect SEO?

Yes. robots.txt controls which pages get crawled, but it does NOT remove pages from search results (use noindex for that). Blocking important resources (such as CSS or JavaScript files) can hurt SEO, while blocking irrelevant pages improves crawl-budget efficiency.
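To actually remove a page from search results, the page must remain crawlable and carry a noindex directive, either as a meta tag in its HTML head or as an HTTP response header (standard syntax, shown for illustration):

<meta name="robots" content="noindex">

X-Robots-Tag: noindex

If robots.txt blocks the page, crawlers never see the noindex directive, so the URL can still appear in results.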

What is crawl-delay?

Crawl-delay tells bots to wait a specified number of seconds between requests. It helps reduce server load from aggressive crawlers. Note: Google ignores Crawl-delay (use Search Console instead), but Bing and some other crawlers respect it.
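For example, a rule asking a bot to wait 10 seconds between requests (Bingbot is used here only as an example of a crawler that honors the directive):

User-agent: Bingbot
Crawl-delay: 10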

Should I include a sitemap in robots.txt?

Yes! Adding a Sitemap directive in robots.txt helps search engines find your sitemap.xml automatically. This is one of the simplest ways to help crawlers discover all your important pages.
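The Sitemap directive stands on its own (it is not tied to any User-agent group) and takes an absolute URL; example.com is a placeholder:

Sitemap: https://example.com/sitemap.xml

You can list multiple Sitemap lines, one per sitemap file.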
