Next.js robots.txt — Correct Configuration
Next.js supports three approaches for robots.txt: a static file in /public/, the App Router Metadata API (app/robots.ts), or the next-sitemap package for automatic generation.
Method 1: App Router — app/robots.ts (Recommended)
// app/robots.ts
import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: ['/api/', '/admin/'],
    },
    sitemap: 'https://yourdomain.com/sitemap.xml',
  }
}
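The rules field also accepts an array, which lets you target individual crawlers with different policies. A minimal sketch, assuming you want to block a specific AI crawler (GPTBot is used here as an illustrative bot name; yourdomain.com is a placeholder):

```typescript
// app/robots.ts — per-bot rules variant (sketch; bot name and domain are examples)
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        // Default policy for all crawlers
        userAgent: '*',
        allow: '/',
        disallow: ['/api/', '/admin/'],
      },
      {
        // Block one named crawler entirely
        userAgent: 'GPTBot',
        disallow: '/',
      },
    ],
    sitemap: 'https://yourdomain.com/sitemap.xml',
  }
}
```

Next.js serializes each entry in the array as its own `User-agent` group in the generated /robots.txt.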
Method 2: Static file — public/robots.txt
User-agent: *
Allow: /
Disallow: /api/
Disallow: /admin/
Sitemap: https://yourdomain.com/sitemap.xml
What to Disallow in Next.js
/api/ — API routes return data, not pages, and should not be indexed.
/admin/ or /dashboard/ — authenticated routes with no public content.
Do not disallow /_next/ — it contains the CSS, JS, and image assets Googlebot needs to render your pages (see the FAQ below).
Frequently Asked Questions
Should I block /_next/ in Next.js robots.txt?
Generally no. /_next/static/ contains CSS, JS, and image files that Googlebot needs to render your pages. Blocking it prevents rendering. /_next/ as a whole is not normally a path that would appear in search results, but allow it for asset loading.
How do I generate robots.txt automatically in Next.js?
Use the next-sitemap package: set generateRobotsTxt: true in next-sitemap.config.js. It then generates both sitemap.xml and robots.txt on each build. Alternatively, use the App Router robots.ts file, which Next.js serves at /robots.txt automatically.
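A minimal next-sitemap.config.js sketch showing this option (siteUrl and the disallowed paths are placeholders for your own site):

```javascript
// next-sitemap.config.js — minimal sketch; siteUrl is a placeholder
/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: 'https://yourdomain.com',
  generateRobotsTxt: true, // emit robots.txt alongside sitemap.xml at build time
  robotsTxtOptions: {
    policies: [
      { userAgent: '*', allow: '/' },
      { userAgent: '*', disallow: ['/api/', '/admin/'] },
    ],
  },
}
```

Run next-sitemap in your postbuild script so both files are regenerated after every next build.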
Does Next.js serve robots.txt automatically?
With the App Router and a robots.ts file in /app — yes. With the Pages Router, place a static robots.txt in the /public directory and Next.js serves it at /robots.txt automatically.