# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Overview
Multi-service Docker Compose stack named "falcon" managing production services on the pivoine.art domain. Uses Arty for configuration management, with centralized environment variables and custom scripts.
## Architecture
### Compose Include Pattern
Root `compose.yaml` uses Docker Compose's `include` directive to orchestrate multiple service stacks (see the sketch after the list below):
- **core**: Shared PostgreSQL 16 + Redis 7 infrastructure
- **proxy**: Traefik reverse proxy with Let's Encrypt SSL
- **sexy**: Directus 11 CMS + SvelteKit frontend
- **awsm**: Next.js application with SQLite
- **track**: Umami analytics (PostgreSQL)
- **gotify**: Push notification server
- **scrapy**: Scrapyd web scraping cluster (scrapyd, scrapy, scrapyrt)
- **n8n**: Workflow automation platform (PostgreSQL)
- **stash**: Filestash web-based file manager
- **links**: Linkwarden bookmark manager (PostgreSQL + Meilisearch)
- **vault**: Vaultwarden password manager (SQLite)
- **joplin**: Joplin Server note-taking and sync platform (PostgreSQL)
- **vert**: VERT file format converter (WebAssembly-based, stateless)
- **restic**: Backrest backup system with restic backend
- **sablier**: Dynamic scaling plugin for Traefik
- **vpn**: WireGuard VPN (wg-easy)
All services connect to a single external Docker network (`falcon_network` by default, defined by `$NETWORK_NAME`).
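A minimal sketch of how the root `compose.yaml` can wire this together (stack list per the bullets above; exact paths, options, and where the network is declared may differ):
```yaml
# compose.yaml (root) - illustrative sketch, not the literal file
include:
  - core/compose.yaml    # shared PostgreSQL 16 + Redis 7
  - proxy/compose.yaml   # Traefik reverse proxy + Let's Encrypt
  - sexy/compose.yaml    # Directus CMS + SvelteKit frontend
  - track/compose.yaml   # Umami analytics
  - gotify/compose.yaml  # push notifications
  # ...remaining stacks: awsm, scrapy, n8n, stash, links, vault, joplin, vert, restic, vpn

# the shared network is external and pre-created with `pnpm arty net/create`;
# it may be declared here or inside each stack's compose file
networks:
  default:
    name: ${NETWORK_NAME:-falcon_network}
    external: true
```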
### Environment Management via Arty
Configuration is centralized in `arty.yml`:
- **envs.default**: All environment variables with sensible defaults
- **scripts**: Common Docker Compose and operational commands
- Variables follow the naming pattern `{SERVICE}_COMPOSE_PROJECT_NAME`, `{SERVICE}_TRAEFIK_HOST`, etc.
Sensitive values (passwords, secrets) live in `.env` and override arty.yml defaults.
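A hedged sketch of the `arty.yml` layout this implies (variable names follow the documented pattern; the defaults and script bodies shown here are placeholders, not the real values):
```yaml
# arty.yml - illustrative excerpt
envs:
  default:
    NETWORK_NAME: falcon_network
    CORE_DB_HOST: postgres
    CORE_DB_PORT: "5432"
    SEXY_COMPOSE_PROJECT_NAME: sexy
    SEXY_TRAEFIK_HOST: pivoine.art        # placeholder domain
    # secrets (DB_PASSWORD, *_SECRET, ...) are intentionally absent:
    # they live in .env and override these defaults

scripts:
  up: docker compose up -d                # run as `pnpm arty up`
  down: docker compose down
  net/create: docker network create ${NETWORK_NAME}
```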
### Traefik Routing & Security Architecture
Services expose themselves via Docker labels:
- HTTP → HTTPS redirect on `web` entrypoint (port 80)
- SSL termination on `web-secure` entrypoint (port 443)
- Let's Encrypt certificates stored in `proxy` volume
- Path-based routing: `/api` routes to Directus backend, root to frontend
- Compression middleware applied via labels
- All routers scoped to `$NETWORK_NAME` network
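A hedged sketch of the label pattern a typical service in this stack would carry (router, middleware, and resolver names are illustrative, not taken from the actual compose files):
```yaml
# illustrative Traefik labels on one service
services:
  example:
    labels:
      - traefik.enable=true
      - traefik.docker.network=${NETWORK_NAME}
      - traefik.http.routers.example.rule=Host(`example.pivoine.art`)
      - traefik.http.routers.example.entrypoints=web-secure
      - traefik.http.routers.example.tls.certresolver=letsencrypt   # resolver name assumed
      - traefik.http.services.example.loadbalancer.server.port=3000
      - traefik.http.middlewares.example-compress.compress=true
      - traefik.http.routers.example.middlewares=example-compress
```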
**Security Features:**
- **TLS Security**: Minimum TLS 1.2, strong cipher suites (ECDHE, AES-GCM, ChaCha20), SNI strict mode
- **Security Headers**: HSTS (1-year), X-Frame-Options, X-XSS-Protection, Content-Type-Options, Referrer-Policy, Permissions-Policy
- **Dynamic Configuration**: Security settings in `proxy/dynamic/security.yaml` with auto-reload
- **Rate Limiting**: Available middlewares (100 req/s general, 30 req/s API)
- **HTTP Basic Auth**: Scrapyd protected with username/password authentication
### Database Initialization
`core/postgres/init/01-init-databases.sh` runs on first PostgreSQL startup:
- Creates `directus` database for Sexy CMS
- Creates `umami` database for Track analytics
- Creates `n8n` database for workflow automation
- Creates `linkwarden` database for Links bookmark manager
- Creates `joplin` database for Joplin Server
- Grants privileges to `$DB_USER`
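The script runs because the official postgres image executes anything placed in `/docker-entrypoint-initdb.d` during first initialization; a hedged sketch of the wiring (the real `core/compose.yaml` may differ in detail):
```yaml
# core/compose.yaml - illustrative excerpt
services:
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./postgres/init:/docker-entrypoint-initdb.d:ro   # 01-init-databases.sh runs on first start only
```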
## Common Commands
All commands use `pnpm arty` (leveraging scripts in `arty.yml`):
### Stack Management
```bash
# Start all services
pnpm arty up
# Stop all services
pnpm arty down
# View running containers
pnpm arty ps
# Follow logs for all services
pnpm arty logs
# Restart all services
pnpm arty restart
# Pull latest images
pnpm arty pull
# View rendered configuration (with variables substituted)
pnpm arty config
```
### Network Setup
```bash
# Create external Docker network (required before first up)
pnpm arty net/create
```
### Database Operations (Sexy/Directus)
```bash
# Export Directus database + schema snapshot
pnpm arty db/dump
# Import database dump + apply schema snapshot
pnpm arty db/import
# Export Directus uploads directory
pnpm arty uploads/export
# Import Directus uploads directory
pnpm arty uploads/import
```
### Deployment
```bash
# Sync .env file to remote VPS
pnpm arty env/sync
```
## Service-Specific Details
### Core Services (core/compose.yaml)
- **postgres**: PostgreSQL 16 Alpine, exposed on host port 5432
- Uses scram-sha-256 authentication
- Health check via `pg_isready`
- Init scripts mounted from `./postgres/init/`
- **redis**: Redis 7 Alpine for caching
- Used by Directus for cache and websocket storage
### Sexy (sexy/compose.yaml)
Directus headless CMS + SvelteKit frontend:
- **directus**: Directus 11.12.0
- Admin panel + GraphQL/REST API
- Traefik routes `/api` path to port 8055
- Volumes: `directus_uploads` for media, `directus_bundle` for extensions
- Email via SMTP (IONOS configuration in .env)
- **frontend**: Custom SvelteKit app from ghcr.io/valknarxxx/sexy
- Serves on port 3000
- Shared `directus_bundle` volume for Directus extensions
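A hedged sketch of how that `/api` split is commonly expressed with two routers and a priority (router names, the priority value, and any strip-prefix middleware are assumptions):
```yaml
# sexy/compose.yaml - illustrative routing labels only
services:
  directus:
    labels:
      - traefik.http.routers.sexy-api.rule=Host(`${SEXY_TRAEFIK_HOST}`) && PathPrefix(`/api`)
      - traefik.http.routers.sexy-api.priority=10   # evaluated before the catch-all router
      - traefik.http.services.sexy-api.loadbalancer.server.port=8055
  frontend:
    labels:
      - traefik.http.routers.sexy-web.rule=Host(`${SEXY_TRAEFIK_HOST}`)
      - traefik.http.services.sexy-web.loadbalancer.server.port=3000
```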
### Proxy (proxy/compose.yaml)
Traefik 3.x reverse proxy:
- Global HTTP→HTTPS redirect
- Let's Encrypt via TLS challenge
- Dashboard disabled (`api.dashboard=false`)
- Reads labels from Docker socket (`/var/run/docker.sock`)
- Scoped to `$NETWORK_NAME` network via provider configuration
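A hedged sketch of the corresponding Traefik static configuration (shown as YAML for readability; in practice these settings may be passed as command flags in `proxy/compose.yaml`, and values written here as `${...}` would then come from the environment):
```yaml
# illustrative Traefik static configuration
entryPoints:
  web:
    address: ":80"
    http:
      redirections:
        entryPoint:
          to: web-secure
          scheme: https
  web-secure:
    address: ":443"

certificatesResolvers:
  letsencrypt:                      # resolver name assumed
    acme:
      email: ${ADMIN_EMAIL}
      storage: /letsencrypt/acme.json
      tlsChallenge: {}

providers:
  docker:
    exposedByDefault: false         # assumption; services opt in via labels
    network: ${NETWORK_NAME}
  file:
    directory: /dynamic             # proxy/dynamic/ (security.yaml) mounted here
    watch: true

api:
  dashboard: false
```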
### AWSM (awsm/compose.yaml)
Next.js app with embedded SQLite:
- Serves awesome-app list directory
- Optional GitHub token for API rate limits
- Optional webhook secret for database updates
- Database persisted in `awesome_data` volume
### Scrapy (scrapy/compose.yaml)
Web scraping cluster with three services:
- **scrapyd**: Scrapyd daemon exposed at `scrapy.pivoine.art:6800`
- Web interface for deploying and managing spiders
- Protected by HTTP Basic Auth (credentials in `.env`)
- Data persisted in `scrapyd_data` volume
- **scrapy**: Development container for running Scrapy commands
- Shared `scrapy_code` volume for spider projects
- **scrapyrt**: Scrapy real-time API (ScrapyRT) on port 9080
- Run spiders via HTTP API without scheduling
**Authentication**: Access requires username/password (stored as `SCRAPY_AUTH_USERS` in `.env` using htpasswd format)
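A hedged sketch of how the basic-auth middleware is attached via labels (only the auth-related labels are shown; middleware and router names are illustrative):
```yaml
# scrapy/compose.yaml - illustrative auth labels
services:
  scrapyd:
    labels:
      - traefik.http.middlewares.scrapyd-auth.basicauth.users=${SCRAPY_AUTH_USERS}
      - traefik.http.routers.scrapyd.middlewares=scrapyd-auth
```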
### n8n (n8n/compose.yaml)
Workflow automation platform:
- **n8n**: n8n application exposed at `n8n.pivoine.art:5678`
- Visual workflow builder with 200+ integrations
- PostgreSQL backend for workflow persistence
- Runners enabled for task execution
- Webhook support for external triggers
- Data persisted in `n8n_data` volume
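A hedged sketch of the environment wiring this implies, using n8n's documented settings (the actual compose file may differ; the image tag and webhook URL are assumptions):
```yaml
# n8n/compose.yaml - illustrative excerpt
services:
  n8n:
    image: n8nio/n8n
    environment:
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: ${CORE_DB_HOST}
      DB_POSTGRESDB_PORT: ${CORE_DB_PORT}
      DB_POSTGRESDB_DATABASE: n8n
      DB_POSTGRESDB_USER: ${DB_USER}
      DB_POSTGRESDB_PASSWORD: ${DB_PASSWORD}
      N8N_RUNNERS_ENABLED: "true"            # task runners
      WEBHOOK_URL: https://n8n.pivoine.art/  # base URL for external webhook triggers
    volumes:
      - n8n_data:/home/node/.n8n
```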
### Stash (stash/compose.yaml)
Web-based file manager:
- **filestash**: Filestash app exposed at `stash.pivoine.art:8334`
- Support for multiple storage backends (SFTP, S3, Dropbox, Google Drive, FTP, WebDAV)
- In-browser file viewer and media player
- File sharing capabilities
- Data persisted in `filestash_data` volume
### Links (links/compose.yaml)
Linkwarden bookmark manager with full-text search:
- **linkwarden**: Linkwarden app exposed at `links.pivoine.art:3000`
- Bookmark and link management with collections
- Full-text search via Meilisearch
- Collaborative bookmark sharing
- Screenshot and PDF archiving
- Browser extension support
- PostgreSQL backend for bookmark persistence
- Data persisted in `linkwarden_data` volume
- **linkwarden_meilisearch**: Meilisearch v1.12.8 search engine
- Powers full-text search for bookmarks
- Data persisted in `linkwarden_meili_data` volume
**Required Environment Variables** (add to `.env`):
- `LINKS_NEXTAUTH_SECRET`: NextAuth.js secret for session encryption
- `LINKS_MEILI_MASTER_KEY`: Meilisearch master key for API authentication
### Vault (vault/compose.yaml)
Vaultwarden password manager (Bitwarden-compatible server):
- **vaultwarden**: Vaultwarden app exposed at `vault.pivoine.art:80`
- Self-hosted password manager compatible with Bitwarden clients
- Supports TOTP, WebAuthn/U2F two-factor authentication
- Secure password generation and sharing
- Browser extensions and mobile apps available
- Emergency access and organization support
- Data persisted in `vaultwarden_data` volume (SQLite database)
**Configuration**:
- **DOMAIN**: `https://vault.pivoine.art` (required for proper HTTPS operation)
- **WEBSOCKET_ENABLED**: `true` (enables real-time sync)
- **SIGNUPS_ALLOWED**: `false` (disable open registrations for security)
- **INVITATIONS_ALLOWED**: `true` (allow inviting users)
- **SHOW_PASSWORD_HINT**: `false` (security best practice)
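In compose terms these map onto Vaultwarden's environment variables, roughly as follows (a sketch; the real service definition may differ):
```yaml
# vault/compose.yaml - illustrative excerpt
services:
  vaultwarden:
    image: vaultwarden/server
    environment:
      DOMAIN: https://vault.pivoine.art
      WEBSOCKET_ENABLED: "true"
      SIGNUPS_ALLOWED: "false"
      INVITATIONS_ALLOWED: "true"
      SHOW_PASSWORD_HINT: "false"
      ADMIN_TOKEN: ${ADMIN_TOKEN}    # enables the /admin panel (see notes below)
    volumes:
      - vaultwarden_data:/data
```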
**Important**:
- The first user to register becomes the admin
- Use a strong master password; it cannot be recovered
- Enable 2FA for all accounts
- Access the admin panel at `/admin` (requires `ADMIN_TOKEN` in `.env`)
### Joplin (joplin/compose.yaml)
Joplin Server note-taking and synchronization platform:
- **joplin**: Joplin Server app exposed at `joplin.pivoine.art:22300`
- Self-hosted sync server for Joplin note-taking clients
- End-to-end encryption support for notebooks
- Multi-device synchronization (desktop, mobile, CLI)
- Markdown-based notes with attachments
- PostgreSQL backend for data persistence
- Compatible with official Joplin clients (Windows, macOS, Linux, Android, iOS)
- Data persisted in PostgreSQL `joplin` database
**Configuration**:
- **APP_BASE_URL**: `https://joplin.pivoine.art` (required for client synchronization)
- **APP_PORT**: `22300` (Joplin Server default port)
- **DB_CLIENT**: `pg` (PostgreSQL database driver)
- Uses shared core PostgreSQL instance
**Usage**:
1. Access https://joplin.pivoine.art to create an account
2. In Joplin desktop/mobile app, go to Settings → Synchronization
3. Select "Joplin Server" as sync target
4. Enter server URL: `https://joplin.pivoine.art`
5. Enter email and password created in step 1
### Vert (vert/compose.yaml)
VERT universal file format converter:
- **vert**: VERT app exposed at `vert.pivoine.art:80`
- WebAssembly-based file conversion (client-side processing)
- Supports 250+ file formats (images, audio, documents, video)
- No file size limits
- Privacy-focused: all conversions happen in the browser
- No persistent data storage required
**Configuration**:
- **PUB_HOSTNAME**: `vert.pivoine.art` (for proper URL generation)
- **PUB_ENV**: `production`
- **PUB_DISABLE_ALL_EXTERNAL_REQUESTS**: `true` (privacy mode)
**Usage**:
Access https://vert.pivoine.art and drag and drop files to convert between formats. All processing happens in your browser using WebAssembly; no data is uploaded to the server.
**Note**: VERT is stateless and doesn't require backups as no data is persisted.
### Restic (restic/compose.yaml)
Backrest backup system with restic backend:
- **backrest**: Backrest web UI exposed at `restic.pivoine.art:9898`
- Web-based interface for managing restic backups
- Automated scheduled backups with retention policies
- Support for multiple backup plans and repositories
- Real-time backup status and history
- Restore capabilities via web UI
- Data persisted in `backrest_data`, `backrest_config`, `backrest_cache` volumes
**Repository Configuration**:
- **Name**: `hidrive-backup`
- **URI**: `/repos` (mounted from `/mnt/hidrive/users/valknar/Backup`)
- **Password**: `falcon-backup-2025`
- **Auto-initialize**: Enabled (creates repository if missing)
- **Auto-unlock**: Enabled (automatically unlocks stuck repositories)
- **Maintenance**:
- Prune: Weekly (Sundays at 2 AM) - removes old snapshots per retention policy
- Check: Weekly (Sundays at 3 AM) - verifies repository integrity
**Backup Plans** (13 automated daily backups):
1. **postgres-backup** (2 AM daily)
- Path: `/volumes/core_postgres_data`
- Retention: 7 daily, 4 weekly, 6 monthly, 2 yearly
2. **redis-backup** (3 AM daily)
- Path: `/volumes/core_redis_data`
- Retention: 7 daily, 4 weekly, 3 monthly
3. **directus-uploads-backup** (4 AM daily)
- Path: `/volumes/directus_uploads`
- Retention: 7 daily, 4 weekly, 6 monthly, 2 yearly
4. **directus-bundle-backup** (4 AM daily)
- Path: `/volumes/directus_bundle`
- Retention: 7 daily, 4 weekly, 3 monthly
5. **awesome-backup** (5 AM daily)
- Path: `/volumes/awesome_data`
- Retention: 7 daily, 4 weekly, 6 monthly
6. **gotify-backup** (5 AM daily)
- Path: `/volumes/gotify_data`
- Retention: 7 daily, 4 weekly, 3 monthly
7. **scrapy-backup** (6 AM daily)
- Paths: `/volumes/scrapyd_data`, `/volumes/scrapy_code`
- Retention: 7 daily, 4 weekly, 3 monthly
8. **n8n-backup** (6 AM daily)
- Path: `/volumes/n8n_data`
- Retention: 7 daily, 4 weekly, 6 monthly
9. **filestash-backup** (7 AM daily)
- Path: `/volumes/filestash_data`
- Retention: 7 daily, 4 weekly, 3 monthly
10. **linkwarden-backup** (7 AM daily)
- Paths: `/volumes/linkwarden_data`, `/volumes/linkwarden_meili_data`
- Retention: 7 daily, 4 weekly, 6 monthly
11. **letsencrypt-backup** (8 AM daily)
- Path: `/volumes/letsencrypt_data`
- Retention: 7 daily, 4 weekly, 12 monthly, 3 yearly
12. **vaultwarden-backup** (8 AM daily)
- Path: `/volumes/vaultwarden_data`
- Retention: 7 daily, 4 weekly, 12 monthly, 3 yearly
13. **joplin-backup** (2 AM daily)
- Path: `/volumes/joplin_data`
- Retention: 7 daily, 4 weekly, 6 monthly, 2 yearly
**Volume Mounting**:
All Docker volumes are mounted read-only to `/volumes/` with prefixed names (e.g., `backup_core_postgres_data`) to avoid naming conflicts with other compose stacks.
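A hedged sketch of that mounting pattern (only a couple of volumes are shown; container paths follow Backrest's documented defaults, and the external volume names are assumptions):
```yaml
# restic/compose.yaml - illustrative excerpt
services:
  backrest:
    volumes:
      - backrest_data:/data
      - backrest_config:/config
      - backrest_cache:/cache
      - backup_core_postgres_data:/volumes/core_postgres_data:ro
      - backup_directus_uploads:/volumes/directus_uploads:ro
      - /mnt/hidrive/users/valknar/Backup:/repos   # backup destination (host bind mount)

volumes:
  backup_core_postgres_data:
    external: true
    name: core_postgres_data       # attach the existing volume under a prefixed alias
  backup_directus_uploads:
    external: true
    name: directus_uploads         # actual external name may carry its stack's prefix
```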
**Configuration Management**:
- `config.json` template in repository defines all backup plans
- On first run, copy config into volume: `docker cp restic/config.json restic_app:/config/config.json`
- Config version must be `4` for Backrest 1.10.1 compatibility
- Backrest manages auth automatically (username: `valknar`, password set via web UI on first access)
**Important**: The backup destination path `/mnt/hidrive/users/valknar/Backup` must be accessible from the container. Ensure HiDrive is mounted on the host before starting the backup service.
## Important Environment Variables
Key variables defined in `arty.yml` and overridden in `.env`:
- `NETWORK_NAME`: Docker network name (default: `falcon_network`)
- `ADMIN_EMAIL`: Used for Let's Encrypt and service admin accounts
- `DB_USER`, `DB_PASSWORD`: PostgreSQL credentials
- `CORE_DB_HOST`, `CORE_DB_PORT`: PostgreSQL connection (default: `postgres:5432`)
- `CORE_REDIS_HOST`, `CORE_REDIS_PORT`: Redis connection (default: `redis:6379`)
- `{SERVICE}_TRAEFIK_HOST`: Domain for each service
- `{SERVICE}_TRAEFIK_ENABLED`: Toggle Traefik exposure
- `SEXY_DIRECTUS_SECRET`: Directus security secret
- `TRACK_APP_SECRET`: Umami analytics secret
## Volume Management
Each service uses named volumes prefixed with the project name:
- `core_postgres_data`, `core_redis_data`: Database persistence
- `core_directus_uploads`, `core_directus_bundle`: Directus media/extensions
- `awesome_data`: AWSM SQLite database
- `scrapy_scrapyd_data`, `scrapy_scrapy_code`: Scrapy spider data and code
- `n8n_n8n_data`: n8n workflow data
- `stash_filestash_data`: Filestash configuration and state
- `links_data`, `links_meili_data`: Linkwarden bookmarks and Meilisearch index
- `vault_data`: Vaultwarden password vault (SQLite database)
- `restic_data`, `restic_config`, `restic_cache`, `restic_tmp`: Backrest backup system
- `proxy_letsencrypt_data`: SSL certificates
Volumes can be inspected with:
```bash
docker volume ls | grep falcon
docker volume inspect <volume_name>
```
## Security Configuration
### HTTP Basic Authentication
Scrapyd is protected with HTTP Basic Auth via Traefik middleware:
- Credentials stored in `.env` as `SCRAPY_AUTH_USERS`
- Format: `username:$apr1$hash` (Apache htpasswd format)
- Generate new hash: `openssl passwd -apr1 'your_password'`
- Remember to escape `$` signs with `$$` in `.env` files
**To update credentials:**
```bash
# Generate hash
echo "username:$(openssl passwd -apr1 'new_password')"
# Update .env
SCRAPY_AUTH_USERS=username:$$apr1$$hash$$here
# Sync to VPS
rsync -avzh -e ssh .env root@vps:~/Projects/docker-compose/
# Restart services
ssh -A root@vps "cd ~/Projects/docker-compose && arty restart"
```
### Security Headers & TLS
Global security settings applied via `proxy/dynamic/security.yaml`:
- **TLS**: Minimum TLS 1.2, strong ciphers only, SNI strict mode
- **Headers**: HSTS, X-Frame-Options, CSP, Referrer-Policy, etc.
- **Rate Limiting**: Available middlewares for DDoS protection
Test security:
```bash
# Check headers
curl -I https://scrapy.pivoine.art
# SSL Labs test
# Visit: https://www.ssllabs.com/ssltest/analyze.html?d=scrapy.pivoine.art
```
### Modifying Security Settings
Edit `proxy/dynamic/security.yaml` to customize:
- TLS versions and cipher suites
- Security header values
- Rate limiting thresholds
Traefik automatically reloads changes (no restart needed).
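A hedged sketch of what such a dynamic file can contain, matching the settings described above (middleware names, burst values, and the referrer policy are illustrative):
```yaml
# proxy/dynamic/security.yaml - illustrative sketch
tls:
  options:
    default:
      minVersion: VersionTLS12
      sniStrict: true
      cipherSuites:
        - TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256
        - TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
        - TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305

http:
  middlewares:
    security-headers:
      headers:
        stsSeconds: 31536000          # 1-year HSTS
        frameDeny: true               # X-Frame-Options: DENY
        contentTypeNosniff: true
        browserXssFilter: true
        referrerPolicy: no-referrer-when-downgrade
    rate-limit-general:
      rateLimit:
        average: 100                  # req/s
        burst: 200
    rate-limit-api:
      rateLimit:
        average: 30
        burst: 60
```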
## Troubleshooting
### Services won't start
1. Ensure external network exists: `pnpm arty net/create`
2. Check that services reference the correct `$NETWORK_NAME` in their labels
3. Verify `.env` has required secrets (compare with `arty.yml` envs.default)
### SSL certificates failing
1. Check `ADMIN_EMAIL` is set in `.env`
2. Ensure ports 80/443 are accessible from internet
3. Inspect Traefik logs: `docker logs proxy_app`
### Database connection errors
1. Check PostgreSQL is healthy: `docker ps` (should show healthy status)
2. Verify database exists: `docker exec core_postgres psql -U $DB_USER -l`
3. Check credentials match between `.env` and service configs
### Directus schema migration
- Export schema: `pnpm arty db/dump` (creates `sexy/directus.yaml`)
- Import to new instance: `pnpm arty db/import` (applies schema snapshot)
- Schema file stored in `sexy/directus.yaml` for version control