chore: expand robots.txt — block AI scrapers and SEO bots

Block AI training crawlers (GPTBot, CCBot, Bytespider, anthropic-ai,
Google-Extended, PerplexityBot, YouBot, cohere-ai) and SEO tool bots
(AhrefsBot, SemrushBot, DotBot, MJ12bot, BLEXBot) site-wide, disallow
/_next/ static chunks for all user agents, and set Crawl-delay: 10 for
well-behaved bots.

Authored by: Jack Levy
Date: 2026-03-15 21:49:05 -04:00
Parent: 989419665e
Commit: 1afe8601ed


@@ -16,3 +16,47 @@ Disallow: /following
Disallow: /collections
Disallow: /alignment
Disallow: /api/
Disallow: /_next/
Crawl-delay: 10

# AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: YouBot
Disallow: /

User-agent: cohere-ai
Disallow: /

# SEO tool crawlers
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: BLEXBot
Disallow: /
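Not part of the commit, but the intended behavior of these rules can be sanity-checked with Python's stdlib robots.txt parser. This is a minimal sketch: `example.com` is a placeholder host, and the inline `ROBOTS_TXT` reproduces only a representative subset of the groups above.

```python
# Sanity-check the new robots.txt rules with the stdlib parser.
# Assumption: example.com stands in for the real site.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /api/
Disallow: /_next/
Crawl-delay: 10

User-agent: GPTBot
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# GPTBot matches its own group and is blocked site-wide.
assert not parser.can_fetch("GPTBot", "https://example.com/any/page")

# Other agents fall through to the wildcard group: ordinary pages are
# allowed, /_next/ chunks are not, and the 10s crawl delay applies.
assert parser.can_fetch("SomeBrowser", "https://example.com/")
assert not parser.can_fetch("SomeBrowser", "https://example.com/_next/chunk.js")
assert parser.crawl_delay("SomeBrowser") == 10
```

Note that Crawl-delay is a de facto extension rather than part of RFC 9309: some well-behaved crawlers (e.g. Bingbot) honor it, while Googlebot ignores it.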