feat: add Scrapy, n8n, and Filestash stacks to Falcon

Added three new service stacks to the Docker Compose infrastructure:

- **Scrapy stack** (scrapy.pivoine.art):
  - scrapyd: Web scraping daemon with web interface (port 6800)
  - scrapy: Development container for spider commands
  - scrapyrt: Real-time API for running spiders (port 9080)

- **n8n stack** (n8n.pivoine.art):
  - Workflow automation platform with PostgreSQL backend
  - 200+ integrations for automated tasks
  - Runners enabled for task execution
  - Webhook support for external triggers

- **Filestash stack** (stash.pivoine.art):
  - Web-based file manager with multi-backend support
  - Supports SFTP, S3, Dropbox, Google Drive, FTP, WebDAV
  - In-browser file viewing and media playback

Infrastructure updates:
- Updated PostgreSQL init script to create n8n database
- Added environment variables to arty.yml for all three stacks
- Updated compose.yaml include list
- Updated CLAUDE.md and README.md documentation
- Normalized service names in existing stacks (gotify, proxy, umami, vpn)

All services integrated with Traefik for SSL termination and include
Watchtower auto-update labels.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-04 22:36:13 +01:00
parent 816c24f86f
commit 785942da61
12 changed files with 260 additions and 29 deletions


@@ -16,6 +16,9 @@ Root `compose.yaml` uses Docker Compose's `include` directive to orchestrate mul
- **awsm**: Next.js application with SQLite
- **track**: Umami analytics (PostgreSQL)
- **gotify**: Push notification server
- **scrapy**: Scrapyd web scraping cluster (scrapyd, scrapy, scrapyrt)
- **n8n**: Workflow automation platform (PostgreSQL)
- **stash**: Filestash web-based file manager
- **vpn**: WireGuard VPN (wg-easy)
All services connect to a single external Docker network (`falcon_network` by default, defined by `$NETWORK_NAME`).
@@ -41,6 +44,7 @@ Services expose themselves via Docker labels:
`core/postgres/init/01-init-databases.sh` runs on first PostgreSQL startup:
- Creates `directus` database for Sexy CMS
- Creates `umami` database for Track analytics
- Creates `n8n` database for workflow automation
- Grants privileges to `$DB_USER`
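Once the core stack is up, the created databases can be verified from the host; the `postgres` service name and `$DB_USER` variable are assumed from the core stack's configuration:

```shell
# List databases inside the running PostgreSQL container and confirm
# that directus, umami, and n8n were all created by the init script.
docker compose exec postgres psql -U "$DB_USER" -c '\l' | grep -E 'directus|umami|n8n'
```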
## Common Commands
@@ -134,6 +138,33 @@ Next.js app with embedded SQLite:
- Optional webhook secret for database updates
- Database persisted in `awesome_data` volume
### Scrapy (scrapy/compose.yaml)
Web scraping cluster with three services:
- **scrapyd**: Scrapyd daemon exposed at `scrapy.pivoine.art:6800`
- Web interface for deploying and managing spiders
- Data persisted in `scrapyd_data` volume
- **scrapy**: Development container for running Scrapy commands
- Shared `scrapy_code` volume for spider projects
- **scrapyrt**: ScrapyRT real-time API on port 9080
- Run spiders via HTTP API without scheduling
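With the stack running, the ScrapyRT endpoint can be exercised with a plain HTTP request; the spider name below is a hypothetical example, not one shipped with the stack:

```shell
# Run a spider (here the hypothetical "quotes") through ScrapyRT's
# crawl.json endpoint; spider_name and url are its two required
# query parameters, and results are returned as JSON.
curl "http://localhost:9080/crawl.json?spider_name=quotes&url=https://quotes.toscrape.com/"
```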
### n8n (n8n/compose.yaml)
Workflow automation platform:
- **n8n**: n8n application exposed at `n8n.pivoine.art:5678`
- Visual workflow builder with 200+ integrations
- PostgreSQL backend for workflow persistence
- Runners enabled for task execution
- Webhook support for external triggers
- Data persisted in `n8n_data` volume
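A workflow with a Webhook trigger node can then be invoked from outside the stack; the webhook path below is a placeholder that depends on how the workflow is configured:

```shell
# POST a JSON payload to a production webhook; "my-trigger" is a
# hypothetical path set on the workflow's Webhook node.
curl -X POST "https://n8n.pivoine.art/webhook/my-trigger" \
  -H "Content-Type: application/json" \
  -d '{"event": "deploy", "status": "ok"}'
```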
### Stash (stash/compose.yaml)
Web-based file manager:
- **filestash**: Filestash app exposed at `stash.pivoine.art:8334`
- Support for multiple storage backends (SFTP, S3, Dropbox, Google Drive, FTP, WebDAV)
- In-browser file viewer and media player
- File sharing capabilities
- Data persisted in `filestash_data` volume
## Important Environment Variables
Key variables defined in `arty.yml` and overridden in `.env`:
@@ -153,6 +184,9 @@ Each service uses named volumes prefixed with project name:
- `core_postgres_data`, `core_redis_data`: Database persistence
- `core_directus_uploads`, `core_directus_bundle`: Directus media/extensions
- `awesome_data`: AWSM SQLite database
- `scrapy_scrapyd_data`, `scrapy_scrapy_code`: Scrapy spider data and code
- `n8n_n8n_data`: n8n workflow data
- `stash_filestash_data`: Filestash configuration and state
- `proxy_letsencrypt_data`: SSL certificates
Volumes can be inspected with:
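For example, assuming the default project prefixes listed above:

```shell
# List volumes created by the new stacks (same-key filters are OR'd)
docker volume ls --filter name=scrapy_ --filter name=n8n_ --filter name=stash_

# Show driver and mountpoint details for one of them
docker volume inspect n8n_n8n_data
```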


@@ -48,6 +48,9 @@ The **Falcon** is a state-of-the-art containerized starship, powered by Docker's
| **AWSM** | *Intergalactic discovery catalog* | [awesome.pivoine.art](https://awesome.pivoine.art) |
| **TRACK** | *Mission analytics & telemetry* | [umami.pivoine.art](https://umami.pivoine.art) |
| **GOTIFY** | *Subspace communication relay* | [gotify.pivoine.art](https://gotify.pivoine.art) |
| **SCRAPY** | *Web scraping reconnaissance cluster* | [scrapy.pivoine.art](https://scrapy.pivoine.art) |
| **N8N** | *Automated workflow command center* | [n8n.pivoine.art](https://n8n.pivoine.art) |
| **STASH** | *Universal file management portal* | [stash.pivoine.art](https://stash.pivoine.art) |
| **VPN** | *Cloaking device network* | [vpn.pivoine.art](https://vpn.pivoine.art) |
### ⚙️ INFRASTRUCTURE
@@ -61,7 +64,8 @@ The **Falcon** is a state-of-the-art containerized starship, powered by Docker's
├─────────────────────────────────────────────────┤
│ 💾 POSTGRESQL 16 DATA CORE │
│ ├─ Directus Sector Database │
│ ├─ Umami Analytics Vault │
│ └─ n8n Workflow Engine Database │
├─────────────────────────────────────────────────┤
│ ⚡ REDIS CACHE HYPERDRIVE │
│ └─ Warp-speed data acceleration │
@@ -157,6 +161,9 @@ THE FALCON (falcon_network)
│ ├─ Awesome Catalog [awesome.pivoine.art]
│ ├─ Umami Analytics [umami.pivoine.art]
│ ├─ Gotify Messenger [gotify.pivoine.art]
│ ├─ Scrapyd Cluster [scrapy.pivoine.art]
│ ├─ n8n Workflows [n8n.pivoine.art]
│ ├─ Filestash Files [stash.pivoine.art]
│ └─ WireGuard VPN [vpn.pivoine.art]
└─ 💾 STORAGE VOLUMES
@@ -164,6 +171,10 @@ THE FALCON (falcon_network)
├─ directus_uploads → Alien encounter evidence
├─ directus_bundle → Custom modules
├─ awesome_data → Discovery catalog
├─ scrapyd_data → Web scraping archives
├─ scrapy_code → Spider project code
├─ n8n_data → Workflow configurations
├─ filestash_data → File manager state
└─ letsencrypt_data → Shield certificates
```


@@ -53,6 +53,30 @@ envs:
GOTIFY_COMPOSE_PROJECT_NAME: messaging
GOTIFY_IMAGE: gotify/server:latest
GOTIFY_TRAEFIK_HOST: gotify.pivoine.art
# Scrapy
SCRAPY_TRAEFIK_ENABLED: true
SCRAPY_COMPOSE_PROJECT_NAME: scrapy
SCRAPY_SCRAPYD_IMAGE: vimagick/scrapyd
SCRAPY_IMAGE: vimagick/scrapyd
SCRAPY_SCRAPYRT_IMAGE: vimagick/scrapyd
SCRAPY_TRAEFIK_HOST: scrapy.pivoine.art
SCRAPY_SCRAPYD_PORT: 6800
SCRAPY_SCRAPYRT_PORT: 9080
# n8n
N8N_TRAEFIK_ENABLED: true
N8N_COMPOSE_PROJECT_NAME: n8n
N8N_IMAGE: docker.n8n.io/n8nio/n8n
N8N_TRAEFIK_HOST: n8n.pivoine.art
N8N_PORT: 5678
N8N_DB_NAME: n8n
N8N_DB_SCHEMA: public
# Filestash
STASH_TRAEFIK_ENABLED: true
STASH_COMPOSE_PROJECT_NAME: stash
STASH_IMAGE: machines/filestash:latest
STASH_TRAEFIK_HOST: stash.pivoine.art
STASH_PORT: 8334
STASH_CANARY: true
# Proxy
PROXY_COMPOSE_PROJECT_NAME: proxy
PROXY_DOCKER_IMAGE: traefik:latest


@@ -4,6 +4,9 @@ include:
- awsm/compose.yaml
- sexy/compose.yaml
- gotify/compose.yaml
- scrapy/compose.yaml
- n8n/compose.yaml
- stash/compose.yaml
- umami/compose.yaml
- proxy/compose.yaml
- watch/compose.yaml


@@ -12,19 +12,24 @@ psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" --dbname "$POSTGRES_DB" <<-E
-- Main application database
SELECT 'CREATE DATABASE directus'
WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = 'directus')\gexec
-- Umami analytics database
SELECT 'CREATE DATABASE umami'
WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = 'umami')\gexec
-- n8n workflow automation database
SELECT 'CREATE DATABASE n8n'
WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = 'n8n')\gexec
-- Grant privileges to all databases
GRANT ALL PRIVILEGES ON DATABASE directus TO $POSTGRES_USER;
GRANT ALL PRIVILEGES ON DATABASE umami TO $POSTGRES_USER;
GRANT ALL PRIVILEGES ON DATABASE n8n TO $POSTGRES_USER;
-- Log success
SELECT 'Compose databases initialized:' AS status;
SELECT datname FROM pg_database
WHERE datname IN ('directus', 'umami', 'n8n')
ORDER BY datname;
EOSQL
@@ -35,4 +40,5 @@ echo ""
echo "Databases available:"
echo " • directus - Sexy application database"
echo " • umami - Tracking database"
echo " • n8n - Workflow automation database"
echo ""


@@ -1,5 +1,5 @@
services:
gotify:
image: ${GOTIFY_IMAGE}
container_name: ${GOTIFY_COMPOSE_PROJECT_NAME}_app
restart: unless-stopped

n8n/compose.yaml (new file)

@@ -0,0 +1,52 @@
services:
n8n:
image: ${N8N_IMAGE:-docker.n8n.io/n8nio/n8n}
container_name: ${N8N_COMPOSE_PROJECT_NAME}_app
restart: unless-stopped
ports:
- "${N8N_PORT:-5678}:5678"
volumes:
- n8n_data:/home/node/.n8n
environment:
TZ: ${TIMEZONE:-Europe/Berlin}
GENERIC_TIMEZONE: ${TIMEZONE:-Europe/Berlin}
N8N_ENFORCE_SETTINGS_FILE_PERMISSIONS: "true"
N8N_RUNNERS_ENABLED: "true"
DB_TYPE: postgresdb
DB_POSTGRESDB_DATABASE: ${N8N_DB_NAME}
DB_POSTGRESDB_HOST: ${CORE_DB_HOST}
DB_POSTGRESDB_PORT: ${CORE_DB_PORT}
DB_POSTGRESDB_USER: ${DB_USER}
DB_POSTGRESDB_PASSWORD: ${DB_PASSWORD}
DB_POSTGRESDB_SCHEMA: ${N8N_DB_SCHEMA:-public}
N8N_HOST: ${N8N_TRAEFIK_HOST}
N8N_PORT: 5678
N8N_PROTOCOL: https
WEBHOOK_URL: https://${N8N_TRAEFIK_HOST}/
depends_on:
- postgres
networks:
- compose_network
labels:
- 'traefik.enable=${N8N_TRAEFIK_ENABLED}'
- 'traefik.http.middlewares.${N8N_COMPOSE_PROJECT_NAME}-n8n-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web.middlewares=${N8N_COMPOSE_PROJECT_NAME}-n8n-redirect-web-secure'
- 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web.rule=Host(`${N8N_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web.entrypoints=web'
- 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.rule=Host(`${N8N_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure-compress.compress=true'
- 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.middlewares=${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure-compress'
- 'traefik.http.services.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.loadbalancer.server.port=5678'
- 'traefik.docker.network=${NETWORK_NAME}'
- 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'
volumes:
n8n_data:
name: ${N8N_COMPOSE_PROJECT_NAME}_n8n_data
networks:
compose_network:
name: ${NETWORK_NAME}
external: true


@@ -1,5 +1,5 @@
services:
traefik:
image: ${PROXY_DOCKER_IMAGE}
container_name: ${PROXY_COMPOSE_PROJECT_NAME}_app
restart: unless-stopped
@@ -7,52 +7,52 @@ services:
# API & Dashboard
- '--api.dashboard=false'
- '--api.insecure=false'
# Logging
- '--log.level=${PROXY_LOG_LEVEL:-INFO}'
- '--accesslog=true'
# Global
- '--global.sendAnonymousUsage=false'
- '--global.checkNewVersion=true'
# Docker Provider
- '--providers.docker=true'
- '--providers.docker.exposedbydefault=false'
- '--providers.docker.network=${NETWORK_NAME}'
# File Provider for dynamic configuration
# - '--providers.file.directory=/etc/traefik/dynamic'
# - '--providers.file.watch=true'
# Entrypoints
- '--entrypoints.web.address=:${PROXY_PORT_HTTP:-80}'
- '--entrypoints.web-secure.address=:${PROXY_PORT_HTTPS:-443}'
# Global HTTP to HTTPS redirect
- '--entrypoints.web.http.redirections.entryPoint.to=web-secure'
- '--entrypoints.web.http.redirections.entryPoint.scheme=https'
- '--entrypoints.web.http.redirections.entryPoint.permanent=true'
# Let's Encrypt
- '--certificatesresolvers.resolver.acme.tlschallenge=true'
- '--certificatesresolvers.resolver.acme.email=${ADMIN_EMAIL}'
- '--certificatesresolvers.resolver.acme.storage=/letsencrypt/acme.json'
healthcheck:
test: ["CMD", "traefik", "healthcheck", "--ping"]
interval: 30s
timeout: 5s
retries: 3
start_period: 10s
networks:
- compose_network
ports:
- "${PROXY_PORT_HTTP:-80}:80"
- "${PROXY_PORT_HTTPS:-443}:443"
volumes:
- letsencrypt_data:/letsencrypt
- /var/run/docker.sock:/var/run/docker.sock:ro

scrapy/compose.yaml (new file)

@@ -0,0 +1,63 @@
services:
scrapyd:
image: ${SCRAPY_SCRAPYD_IMAGE:-vimagick/scrapyd}
container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyd
restart: unless-stopped
volumes:
- scrapyd_data:/var/lib/scrapyd
- /usr/local/lib/python3.9/dist-packages
environment:
TZ: ${TIMEZONE:-Europe/Berlin}
networks:
- compose_network
labels:
- 'traefik.enable=${SCRAPY_TRAEFIK_ENABLED}'
- 'traefik.http.middlewares.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.middlewares=${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-redirect-web-secure'
- 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.rule=Host(`${SCRAPY_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.entrypoints=web'
- 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.rule=Host(`${SCRAPY_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure-compress.compress=true'
- 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.middlewares=${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure-compress'
- 'traefik.http.services.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.loadbalancer.server.port=6800'
- 'traefik.docker.network=${NETWORK_NAME}'
- 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'
scrapy:
image: ${SCRAPY_IMAGE:-vimagick/scrapyd}
container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapy
command: bash
volumes:
- scrapy_code:/code
working_dir: /code
restart: unless-stopped
environment:
TZ: ${TIMEZONE:-Europe/Berlin}
networks:
- compose_network
scrapyrt:
image: ${SCRAPY_SCRAPYRT_IMAGE:-vimagick/scrapyd}
container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyrt
command: scrapyrt -i 0.0.0.0 -p 9080
restart: unless-stopped
ports:
- "${SCRAPY_SCRAPYRT_PORT:-9080}:9080"
volumes:
- scrapy_code:/code
working_dir: /code
environment:
TZ: ${TIMEZONE:-Europe/Berlin}
networks:
- compose_network
volumes:
scrapyd_data:
name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyd_data
scrapy_code:
name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapy_code
networks:
compose_network:
name: ${NETWORK_NAME}
external: true

stash/compose.yaml (new file)

@@ -0,0 +1,38 @@
services:
filestash:
image: ${STASH_IMAGE:-machines/filestash:latest}
container_name: ${STASH_COMPOSE_PROJECT_NAME}_app
restart: unless-stopped
ports:
- "${STASH_PORT:-8334}:8334"
volumes:
- filestash_data:/app/data/state/
environment:
TZ: ${TIMEZONE:-Europe/Berlin}
APPLICATION_URL: https://${STASH_TRAEFIK_HOST}
CANARY: ${STASH_CANARY:-true}
networks:
- compose_network
labels:
- 'traefik.enable=${STASH_TRAEFIK_ENABLED}'
- 'traefik.http.middlewares.${STASH_COMPOSE_PROJECT_NAME}-filestash-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web.middlewares=${STASH_COMPOSE_PROJECT_NAME}-filestash-redirect-web-secure'
- 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web.rule=Host(`${STASH_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web.entrypoints=web'
- 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.rule=Host(`${STASH_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure-compress.compress=true'
- 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.middlewares=${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure-compress'
- 'traefik.http.services.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.loadbalancer.server.port=8334'
- 'traefik.docker.network=${NETWORK_NAME}'
- 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'
volumes:
filestash_data:
name: ${STASH_COMPOSE_PROJECT_NAME}_filestash_data
networks:
compose_network:
name: ${NETWORK_NAME}
external: true


@@ -1,37 +1,37 @@
services:
umami:
image: ${TRACK_DOCKER_IMAGE}
container_name: ${TRACK_COMPOSE_PROJECT_NAME}_app
restart: unless-stopped
environment:
TZ: ${TIMEZONE:-Europe/Amsterdam}
# Database Configuration
DATABASE_URL: postgresql://${DB_USER}:${DB_PASSWORD}@${CORE_DB_HOST}:${CORE_DB_PORT}/${TRACK_DB_NAME}
DATABASE_TYPE: postgresql
# Application Secret
APP_SECRET: ${TRACK_APP_SECRET}
# Redis Cache Integration
REDIS_URL: redis://${CORE_REDIS_HOST}:${CORE_REDIS_PORT}
CACHE_ENABLED: true
networks:
- compose_network
healthcheck:
test: ["CMD-SHELL", "curl -f http://localhost:3000/api/heartbeat || exit 1"]
interval: 30s
timeout: 10s
retries: 5
start_period: 40s
labels:
# Traefik Configuration
- 'traefik.enable=${TRACK_TRAEFIK_ENABLED:-true}'
# HTTP to HTTPS redirect
- 'traefik.http.middlewares.${TRACK_COMPOSE_PROJECT_NAME}-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${TRACK_COMPOSE_PROJECT_NAME}-web.middlewares=${TRACK_COMPOSE_PROJECT_NAME}-redirect-web-secure'


@@ -1,5 +1,5 @@
services:
vpn:
image: ${VPN_DOCKER_IMAGE}
container_name: ${VPN_COMPOSE_PROJECT_NAME}_app
restart: unless-stopped