Shopify robots.txt — Correct Configuration Guide
Shopify generates a robots.txt file automatically and does not allow direct editing of the file. Since 2021, you can customise robots.txt by creating a Liquid template: templates/robots.txt.liquid.
Default Shopify robots.txt (Generated)
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /cgi-bin
Disallow: /search?
Disallow: /apple-app-site-association
Disallow: /.well-known/shopify/monorail
Customising via a Liquid template
# templates/robots.txt.liquid
{% for group in robots.default_groups %}
{{ group }}
{% endfor %}
# Add AI crawler directives:
User-agent: GPTBot
Allow: /
User-agent: ClaudeBot
Allow: /
Sitemap: {{ shop.url }}/sitemap.xml
The robots.default_groups loop outputs Shopify's default rules; any custom directives placed after it are appended to the rendered file. This preserves Shopify's required blocks while adding your customisations.
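You can sanity-check the rendered rules offline with Python's standard-library parser. A minimal sketch, assuming a placeholder store URL and a trimmed-down rendered file; note that urllib.robotparser does plain prefix matching, not Google-style * wildcards:

```python
from urllib.robotparser import RobotFileParser

# A few of Shopify's default rules plus the custom GPTBot group.
rendered = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /search?

User-agent: GPTBot
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rendered)

base = "https://example-store.myshopify.com"  # placeholder store URL
print(rp.can_fetch("*", base + "/products/blue-shirt"))  # product pages stay crawlable
print(rp.can_fetch("*", base + "/admin/orders"))         # admin is blocked
print(rp.can_fetch("GPTBot", base + "/cart"))            # GPTBot group allows everything
```

This is a quick regression check that a template change did not accidentally block product pages or unblock admin paths.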
Frequently Asked Questions
Can I edit Shopify's robots.txt directly?
Not through the file manager. Shopify generates robots.txt automatically. To customise it, go to Online Store → Themes → Edit code → Add a new template and choose robots, which creates templates/robots.txt.liquid. This works in any theme whose code you can edit; no app is required.
Should I block /collections/ in Shopify robots.txt?
No. Collection pages are important for SEO: they are the category pages that rank for broad product terms, and blocking them removes your product category pages from Google. Shopify's generated defaults already disallow common sorted-collection URLs, so only add wildcard rules for the specific filter parameters that create duplicate content from faceted navigation.
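Googlebot supports * wildcards in Disallow paths, so parameterised collection URLs can be targeted without blocking the category pages themselves. A sketch only; the sort_by and filter parameter names are illustrative, so check the query strings your theme's faceted navigation actually produces before adding rules like these:

```text
User-agent: *
Disallow: /collections/*?*sort_by=
Disallow: /collections/*?*filter=
```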
Does Shopify automatically add a sitemap to robots.txt?
Yes. Shopify's auto-generated robots.txt includes a Sitemap: line pointing to your store's sitemap.xml. If you override it with a Liquid template, add the line yourself, e.g. Sitemap: {{ shop.url }}/sitemap.xml.
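To confirm the rendered file still carries its sitemap reference, urllib.robotparser exposes it via site_maps() (Python 3.8+). A sketch, with a placeholder URL standing in for the rendered {{ shop.url }}:

```python
from urllib.robotparser import RobotFileParser

# Minimal rendered output: one default group plus the sitemap reference.
rendered = """\
User-agent: *
Disallow: /admin

Sitemap: https://example-store.myshopify.com/sitemap.xml
""".splitlines()

rp = RobotFileParser()
rp.parse(rendered)
print(rp.site_maps())  # ['https://example-store.myshopify.com/sitemap.xml']
```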