How to Block All AI Crawlers in robots.txt
Block all AI training crawlers — GPTBot, ClaudeBot, PerplexityBot, Bytespider, Google-Extended — with one robots.txt block.
robots.txt Syntax — Block All AI Crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
What This Does
Blocking these crawlers prevents your content from being collected for future AI training datasets. It does not remove existing knowledge: content already crawled into training datasets before you add this block is not retroactively removed. Note also that robots.txt relies on voluntary compliance; reputable crawlers honor it, but it is not an enforcement mechanism.
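As a quick sanity check, you can confirm the rules behave as intended with Python's standard-library `urllib.robotparser`. This is a minimal sketch: the URL is a placeholder, and the snippet tests only two of the user agents listed above.

```python
from urllib import robotparser

# A subset of the robots.txt block shown above.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# GPTBot is disallowed everywhere on the site.
print(parser.can_fetch("GPTBot", "https://example.com/article"))

# A crawler with no matching rule (and no "User-agent: *" fallback)
# is allowed by default.
print(parser.can_fetch("SomeOtherBot", "https://example.com/article"))
```

Because there is no `User-agent: *` rule in the block, ordinary search crawlers remain unaffected; only the named AI bots are disallowed.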
Validate your robots.txt live to check AI bot coverage and get a corrected file.