Hugo robots.txt: Correct Configuration

Hugo does not generate a robots.txt by default. To enable it, set enableRobotsTXT = true in your site configuration (config.toml, or hugo.toml in Hugo 0.110+). Hugo then renders layouts/robots.txt if that template exists, or falls back to a minimal built-in template.

Enable robots.txt in config.toml

# config.toml
baseURL = "https://yourdomain.com"
enableRobotsTXT = true

Custom template — layouts/robots.txt

User-agent: *
{{ if hugo.IsProduction }}Allow: /{{ else }}Disallow: /{{ end }}
Disallow: /admin/
Disallow: /draft/

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

Sitemap: {{ "sitemap.xml" | absURL }}

The hugo.IsProduction check makes staging builds disallow all crawling while production builds allow it, so staging content is not indexed even if it is accidentally deployed publicly.
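Which branch of that template fires depends on Hugo's build environment. As a sketch of the defaults (the staging environment name here is just an example):

```shell
# A plain build uses environment "production" by default,
# so hugo.IsProduction is true and the template emits Allow: /
hugo

# hugo server defaults to environment "development",
# so hugo.IsProduction is false and the template emits Disallow: /
hugo server

# The environment can also be set explicitly, e.g. for a staging deploy
hugo --environment staging
```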


Frequently Asked Questions

How do I enable robots.txt generation in Hugo?
Set enableRobotsTXT = true in config.toml (or hugo.toml in Hugo 0.110+). Hugo will then generate /robots.txt from layouts/robots.txt if it exists, or use a minimal built-in template that permits all crawling.
Where does Hugo output robots.txt?
Hugo writes robots.txt to the root of the /public/ directory during a build. The file lands at public/robots.txt, which maps to https://yourdomain.com/robots.txt after deployment.
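To confirm the generated file locally (assuming Hugo is installed and enableRobotsTXT is set as above):

```shell
hugo                    # builds the site into ./public
cat public/robots.txt   # inspect the generated file before deploying
```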
How do I add a sitemap reference to Hugo's robots.txt?
In your layouts/robots.txt template, use: Sitemap: {{ "sitemap.xml" | absURL }}. Hugo's absURL function prepends your baseURL, so it generates the correct absolute URL. Hugo generates sitemap.xml automatically.
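With baseURL = "https://yourdomain.com" (the example domain from the config above), that template line renders as:

```
Sitemap: https://yourdomain.com/sitemap.xml
```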