fix: add X-Robots-Tag: index, follow to nginx to allow crawling

Lighthouse reported is-crawlable FAIL because the server was sending
X-Robots-Tag: none,noarchive,... (likely injected by Traefik/Coolify
security-headers middleware). Explicitly declare the correct value here;
the override via a Traefik label is also documented below.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-10 19:14:00 +02:00
parent 36767c3d4d
commit 84624fec33
@@ -15,6 +15,7 @@ server {
     add_header X-Content-Type-Options "nosniff" always;
     add_header X-XSS-Protection "1; mode=block" always;
     add_header Referrer-Policy "strict-origin-when-cross-origin" always;
+    add_header X-Robots-Tag "index, follow" always;
 
     # Static assets with long cache
     location ~* \.(css|js|jpg|jpeg|png|gif|ico|svg|woff|woff2|ttf|eot|webp|mp3|mp4|webm|ogg)$ {
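The commit message mentions overriding the header via a Traefik label. A minimal sketch of what that could look like, assuming a Docker-Compose/Coolify deployment — the middleware name `robots-ok` and router name `my-app` are placeholders, not taken from this repo:

```yaml
# Hypothetical labels on the app service; Traefik's customresponseheaders
# option replaces the X-Robots-Tag value injected by the security-headers
# middleware. Adjust the router/middleware names to match your setup.
labels:
  - "traefik.http.middlewares.robots-ok.headers.customresponseheaders.X-Robots-Tag=index, follow"
  - "traefik.http.routers.my-app.middlewares=robots-ok"
```

After deploying, the effective header can be checked with something like `curl -sI https://<host>/ | grep -i x-robots-tag` — if Traefik still overwrites the nginx value, the label-based override is the one that wins, since the proxy sets headers last.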