Fix: robots.txt Crawl-Delay Too High

The Crawl-delay directive in robots.txt asks crawlers to wait N seconds between requests. Setting it too high (10+ seconds) slows how quickly new content is indexed and caps how many pages the crawlers that honour it can fetch per day. Google does not support Crawl-delay at all, and Search Console's crawl rate limiter tool was deprecated in January 2024; Googlebot now sets its own crawl rate based on how your server responds.

The Problem

Crawl-delay was added to robots.txt to protect servers from aggressive crawling. However, Google ignores the directive entirely: Googlebot manages its crawl rate internally, backing off when server response times rise or errors appear. A high Crawl-delay therefore only throttles the crawlers that honour it (Bing, some smaller bots) while doing nothing about Google's crawl load.
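How a compliant crawler reads the directive can be sketched with Python's standard-library robots.txt parser; the 30-second delay below mirrors the BEFORE example in the fix section:

```python
# Sketch: how a compliant crawler reads Crawl-delay, using Python's
# stdlib robots.txt parser. Google ignores this directive entirely;
# only crawlers that honour it (e.g. Bingbot) apply the delay.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 30",
])

# A crawler that honours the directive waits 30 s between requests:
print(rp.crawl_delay("Bingbot"))  # 30
# At that rate it can fetch at most 86400 / 30 = 2880 URLs per day.
```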

The Fix

Remove Crawl-delay, or set it low for compatibility
# BEFORE — high crawl-delay slows indexing for crawlers that honour it:
# User-agent: *
# Crawl-delay: 30

# AFTER — remove crawl-delay (Google ignores it, others over-throttle):
User-agent: *
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml

# If you need to throttle Bing specifically, use a low value:
User-agent: Bingbot
Crawl-delay: 2

Remove Crawl-delay for User-agent: *. Google ignores the directive, and Search Console's crawl rate limiter tool has been deprecated; if Googlebot is genuinely overloading your server, briefly returning 503 or 429 makes it back off automatically. For Bing specifically, a Crawl-delay of 1–2 seconds is reasonable (Bing Webmaster Tools also offers a crawl control setting). Values above 10 seconds significantly reduce indexing speed for the crawlers that honour the directive.
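The indexing cost of a high delay is simple arithmetic; a small hypothetical helper makes the per-day ceiling explicit:

```python
# Upper bound on URLs per day a single polite crawler can fetch under
# a given Crawl-delay (hypothetical helper, for illustration only).
def max_urls_per_day(crawl_delay_seconds: float) -> int:
    # 86,400 seconds in a day, divided by the wait between requests.
    return int(86_400 / crawl_delay_seconds)

for delay in (1, 2, 10, 30):
    print(f"Crawl-delay {delay:>2}s -> at most {max_urls_per_day(delay)} URLs/day")
# Crawl-delay  1s -> at most 86400 URLs/day
# Crawl-delay  2s -> at most 43200 URLs/day
# Crawl-delay 10s -> at most 8640 URLs/day
# Crawl-delay 30s -> at most 2880 URLs/day
```

Even a modest 10-second delay caps an honouring crawler at under 9,000 URLs per day, which is far below what a mid-sized site needs for timely indexing.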

Validate your robots.txt live — fetch any URL and get a corrected file in one click.

Open robots.txt Validator →

Frequently Asked Questions

Does Google respect Crawl-delay in robots.txt?
No. Google ignores the Crawl-delay directive. Googlebot manages its crawl rate automatically based on server response times and error rates; no robots.txt directive slows it down.
How do I slow down Googlebot without Crawl-delay?
Google Search Console's crawl rate limiter tool was deprecated in January 2024, so there is no longer a manual setting. Googlebot adjusts on its own: if your server responds slowly or returns 500, 503, or 429 status codes, crawling slows down automatically. For persistent over-crawling, Google also provides a reporting form for crawl-rate problems.
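Independently of any console setting, Googlebot backs off when the server signals overload. A minimal sketch (hypothetical handler and flag names, Python stdlib; adapt to your stack) of serving 503 with Retry-After during a load spike:

```python
# Sketch: temporarily slowing crawlers by answering 503 + Retry-After
# while the server is overloaded (hypothetical names, for illustration).
from http.server import BaseHTTPRequestHandler

def crawl_response(overloaded: bool) -> tuple:
    """Pick the status a crawler should see; 5xx/429 makes Googlebot back off."""
    if overloaded:
        return 503, {"Retry-After": "3600"}  # ask crawlers to retry in 1 hour
    return 200, {}

class ThrottlingHandler(BaseHTTPRequestHandler):
    overloaded = False  # flip this from your own load monitoring

    def do_GET(self):
        status, headers = crawl_response(self.overloaded)
        self.send_response(status)
        for name, value in headers.items():
            self.send_header(name, value)
        self.end_headers()
```

Serve 503 only for short spikes: sustained 5xx responses can lead Google to drop URLs from the index.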
What is a reasonable Crawl-delay value for Bing?
For most sites: 1–3 seconds. Bing's default crawl rate is already conservative. A Crawl-delay of 1 is a safe default that limits any server impact without significantly slowing indexing.