Nuxt.js robots.txt — Correct Configuration

Nuxt.js serves robots.txt from the /public/ directory (Nuxt 3) or /static/ directory (Nuxt 2). The @nuxtjs/robots module provides dynamic configuration via nuxt.config.ts.

Method 1: Static file — public/robots.txt (Nuxt 3)

User-agent: *
Allow: /
Disallow: /admin/
Disallow: /api/

Sitemap: https://yourdomain.com/sitemap.xml
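The static file above is plain text with one directive per line. A quick way to sanity-check its contents is a tiny parser; this is an illustrative sketch (parseRobots is a hypothetical helper, not part of Nuxt or @nuxtjs/robots):

```typescript
// Minimal robots.txt parser used to sanity-check the static file above.
// parseRobots is an illustrative helper, not part of Nuxt or @nuxtjs/robots.
interface RobotsRules {
  userAgents: string[]
  allow: string[]
  disallow: string[]
  sitemaps: string[]
}

function parseRobots(text: string): RobotsRules {
  const rules: RobotsRules = { userAgents: [], allow: [], disallow: [], sitemaps: [] }
  for (const line of text.split('\n')) {
    const [key, ...rest] = line.split(':')
    const value = rest.join(':').trim() // re-join so URLs keep their "https:"
    switch (key.trim().toLowerCase()) {
      case 'user-agent': rules.userAgents.push(value); break
      case 'allow': rules.allow.push(value); break
      case 'disallow': rules.disallow.push(value); break
      case 'sitemap': rules.sitemaps.push(value); break
    }
  }
  return rules
}

const robotsTxt = `User-agent: *
Allow: /
Disallow: /admin/
Disallow: /api/

Sitemap: https://yourdomain.com/sitemap.xml`

const parsed = parseRobots(robotsTxt)
console.log(parsed.disallow) // → [ '/admin/', '/api/' ]
```

Directive names are matched case-insensitively here because crawlers treat them that way.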

Method 2: @nuxtjs/robots module (nuxt.config.ts)

// nuxt.config.ts
export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
  // @nuxtjs/robots v3 syntax; v4+ renames options to lowercase keys
  // such as allow, disallow, and sitemap
  robots: {
    rules: {
      UserAgent: '*',
      Allow: '/',
      Disallow: ['/admin/', '/api/'],
      Sitemap: 'https://yourdomain.com/sitemap.xml'
    }
  }
})
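The module renders these options into the same plain-text format as Method 1. A rough sketch of that mapping, assuming the single-group rules shape used above (buildRobotsTxt is a hypothetical helper, not the module's actual internals):

```typescript
// Illustrative sketch of how a rules object maps to robots.txt text.
// buildRobotsTxt is a hypothetical helper, not @nuxtjs/robots internals.
interface Rules {
  UserAgent: string
  Allow?: string | string[]
  Disallow?: string | string[]
  Sitemap?: string
}

function buildRobotsTxt(rules: Rules): string {
  // Normalize single values and arrays to a common array shape
  const toArray = (v?: string | string[]) =>
    v === undefined ? [] : Array.isArray(v) ? v : [v]

  const lines = [`User-agent: ${rules.UserAgent}`]
  for (const path of toArray(rules.Allow)) lines.push(`Allow: ${path}`)
  for (const path of toArray(rules.Disallow)) lines.push(`Disallow: ${path}`)
  if (rules.Sitemap) lines.push('', `Sitemap: ${rules.Sitemap}`)
  return lines.join('\n')
}

const output = buildRobotsTxt({
  UserAgent: '*',
  Allow: '/',
  Disallow: ['/admin/', '/api/'],
  Sitemap: 'https://yourdomain.com/sitemap.xml'
})
console.log(output)
```

The output matches the static file from Method 1, which is why the two methods are interchangeable for simple sites.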


Frequently Asked Questions

Where does robots.txt go in Nuxt 3?
Place it in the /public/ directory as public/robots.txt. Nuxt 3 serves all files in /public/ at the root URL, so public/robots.txt is accessible at https://yourdomain.com/robots.txt.

Does @nuxtjs/robots work with SSR and static generation?
Yes. The module works with both SSR (server-side rendering) and static generation (nuxt generate). For static sites, it generates a static robots.txt file during the build process.

How do I reference a Nuxt sitemap in robots.txt?
If you're using @nuxtjs/sitemap, the default sitemap URL is /sitemap.xml or /sitemap_index.xml. Add Sitemap: https://yourdomain.com/sitemap.xml to your robots.txt or in the module configuration.
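Note that Sitemap directives in robots.txt must use absolute URLs, not relative paths. A quick check for that property (isValidSitemapDirective is a hypothetical helper, not part of any Nuxt module):

```typescript
// Sitemap directives in robots.txt must be absolute http(s) URLs.
// isValidSitemapDirective is an illustrative check, not part of any Nuxt module.
function isValidSitemapDirective(url: string): boolean {
  try {
    const parsed = new URL(url) // throws on relative paths like '/sitemap.xml'
    return parsed.protocol === 'http:' || parsed.protocol === 'https:'
  } catch {
    return false
  }
}

console.log(isValidSitemapDirective('https://yourdomain.com/sitemap.xml')) // → true
console.log(isValidSitemapDirective('/sitemap.xml')) // → false: relative paths are not allowed
```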