fix: add X-Robots-Tag: index, follow to nginx to allow crawling
Lighthouse reported an is-crawlable FAIL because the server was sending
X-Robots-Tag: none,noarchive,... (likely injected by the Traefik/Coolify
security-headers middleware). This change explicitly declares the correct
value in nginx; the equivalent override via a Traefik label is also
documented below.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@@ -15,6 +15,7 @@ server {
     add_header X-Content-Type-Options "nosniff" always;
     add_header X-XSS-Protection "1; mode=block" always;
     add_header Referrer-Policy "strict-origin-when-cross-origin" always;
+    add_header X-Robots-Tag "index, follow" always;
 
     # Static assets with long cache
     location ~* \.(css|js|jpg|jpeg|png|gif|ico|svg|woff|woff2|ttf|eot|webp|mp3|mp4|webm|ogg)$ {
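If the unwanted header really is injected by a Traefik/Coolify security-headers middleware, the same fix can also be applied at the proxy layer instead of (or in addition to) nginx. A minimal sketch using docker-compose labels and Traefik v2's headers middleware; the middleware name `robots-ok` and router name `app` are placeholders, not values from this repo:

```yaml
# Hypothetical docker-compose labels; middleware/router names are placeholders.
labels:
  # Define a headers middleware that sets the desired response header...
  - "traefik.http.middlewares.robots-ok.headers.customResponseHeaders.X-Robots-Tag=index, follow"
  # ...and attach it to the service's router so it overrides the default.
  - "traefik.http.routers.app.middlewares=robots-ok"
```

Note that if both nginx and Traefik emit X-Robots-Tag, crawlers may see duplicate headers, so removing the injected value at the proxy is the cleaner fix.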