feat: add Scrapy, n8n, and Filestash stacks to Falcon
Added three new service stacks to the docker-compose infrastructure:

- **Scrapy stack** (scrapy.pivoine.art):
  - scrapyd: Web scraping daemon with web interface (port 6800)
  - scrapy: Development container for spider commands
  - scrapyrt: Real-time API for running spiders (port 9080)
- **n8n stack** (n8n.pivoine.art):
  - Workflow automation platform with PostgreSQL backend
  - 200+ integrations for automated tasks
  - Runners enabled for task execution
  - Webhook support for external triggers
- **Filestash stack** (stash.pivoine.art):
  - Web-based file manager with multi-backend support
  - Supports SFTP, S3, Dropbox, Google Drive, FTP, WebDAV
  - In-browser file viewing and media playback

Infrastructure updates:

- Updated PostgreSQL init script to create the n8n database
- Added environment variables to arty.yml for all three stacks
- Updated compose.yaml include list
- Updated CLAUDE.md and README.md documentation
- Normalized service names in existing stacks (gotify, proxy, umami, vpn)

All services are integrated with Traefik for SSL termination and carry Watchtower auto-update labels.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
**CLAUDE.md** (34 lines changed)
```diff
@@ -16,6 +16,9 @@ Root `compose.yaml` uses Docker Compose's `include` directive to orchestrate mul
 - **awsm**: Next.js application with SQLite
 - **track**: Umami analytics (PostgreSQL)
 - **gotify**: Push notification server
+- **scrapy**: Scrapyd web scraping cluster (scrapyd, scrapy, scrapyrt)
+- **n8n**: Workflow automation platform (PostgreSQL)
+- **stash**: Filestash web-based file manager
 - **vpn**: WireGuard VPN (wg-easy)
 
 All services connect to a single external Docker network (`falcon_network` by default, defined by `$NETWORK_NAME`).
```
```diff
@@ -41,6 +44,7 @@ Services expose themselves via Docker labels:
 `core/postgres/init/01-init-databases.sh` runs on first PostgreSQL startup:
 - Creates `directus` database for Sexy CMS
 - Creates `umami` database for Track analytics
+- Creates `n8n` database for workflow automation
 - Grants privileges to `$DB_USER`
 
 ## Common Commands
```
```diff
@@ -134,6 +138,33 @@ Next.js app with embedded SQLite:
 - Optional webhook secret for database updates
 - Database persisted in `awesome_data` volume
 
+### Scrapy (scrapy/compose.yaml)
+Web scraping cluster with three services:
+- **scrapyd**: Scrapyd daemon exposed at `scrapy.pivoine.art:6800`
+  - Web interface for deploying and managing spiders
+  - Data persisted in `scrapyd_data` volume
+- **scrapy**: Development container for running Scrapy commands
+  - Shared `scrapy_code` volume for spider projects
+- **scrapyrt**: Scrapyd Real-Time API on port 9080
+  - Run spiders via HTTP API without scheduling
+
+### n8n (n8n/compose.yaml)
+Workflow automation platform:
+- **n8n**: n8n application exposed at `n8n.pivoine.art:5678`
+  - Visual workflow builder with 200+ integrations
+  - PostgreSQL backend for workflow persistence
+  - Runners enabled for task execution
+  - Webhook support for external triggers
+  - Data persisted in `n8n_data` volume
+
+### Stash (stash/compose.yaml)
+Web-based file manager:
+- **filestash**: Filestash app exposed at `stash.pivoine.art:8334`
+  - Support for multiple storage backends (SFTP, S3, Dropbox, Google Drive, FTP, WebDAV)
+  - In-browser file viewer and media player
+  - File sharing capabilities
+  - Data persisted in `filestash_data` volume
+
 ## Important Environment Variables
 
 Key variables defined in `arty.yml` and overridden in `.env`:
```
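The three new stacks above each publish one hostname through Traefik. As a minimal reachability sketch (hostnames from this repo; the curl commands are only printed here, not executed, since they require the live deployment):

```shell
# Build the list of smoke-test commands for the new endpoints.
endpoints=""
for host in scrapy.pivoine.art n8n.pivoine.art stash.pivoine.art; do
  endpoints="${endpoints}curl -I https://${host}/
"
done
printf '%s' "$endpoints"
```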
```diff
@@ -153,6 +184,9 @@ Each service uses named volumes prefixed with project name:
 - `core_postgres_data`, `core_redis_data`: Database persistence
 - `core_directus_uploads`, `core_directus_bundle`: Directus media/extensions
 - `awesome_data`: AWSM SQLite database
+- `scrapy_scrapyd_data`, `scrapy_scrapy_code`: Scrapy spider data and code
+- `n8n_n8n_data`: n8n workflow data
+- `stash_filestash_data`: Filestash configuration and state
 - `proxy_letsencrypt_data`: SSL certificates
 
 Volumes can be inspected with:
```
**README.md** (13 lines changed)
```diff
@@ -48,6 +48,9 @@ The **Falcon** is a state-of-the-art containerized starship, powered by Docker's
 | **AWSM** | *Intergalactic discovery catalog* | [awesome.pivoine.art](https://awesome.pivoine.art) |
 | **TRACK** | *Mission analytics & telemetry* | [umami.pivoine.art](https://umami.pivoine.art) |
 | **GOTIFY** | *Subspace communication relay* | [gotify.pivoine.art](https://gotify.pivoine.art) |
+| **SCRAPY** | *Web scraping reconnaissance cluster* | [scrapy.pivoine.art](https://scrapy.pivoine.art) |
+| **N8N** | *Automated workflow command center* | [n8n.pivoine.art](https://n8n.pivoine.art) |
+| **STASH** | *Universal file management portal* | [stash.pivoine.art](https://stash.pivoine.art) |
 | **VPN** | *Cloaking device network* | [vpn.pivoine.art](https://vpn.pivoine.art) |
 
 ### ⚙️ INFRASTRUCTURE
```
```diff
@@ -61,7 +64,8 @@ The **Falcon** is a state-of-the-art containerized starship, powered by Docker's
 ├─────────────────────────────────────────────────┤
 │  💾 POSTGRESQL 16 DATA CORE                     │
 │     ├─ Directus Sector Database                 │
-│     └─ Umami Analytics Vault                    │
+│     ├─ Umami Analytics Vault                    │
+│     └─ n8n Workflow Engine Database             │
 ├─────────────────────────────────────────────────┤
 │  ⚡ REDIS CACHE HYPERDRIVE                       │
 │     └─ Warp-speed data acceleration             │
```
```diff
@@ -157,6 +161,9 @@ THE FALCON (falcon_network)
 │  ├─ Awesome Catalog   [awesome.pivoine.art]
 │  ├─ Umami Analytics   [umami.pivoine.art]
 │  ├─ Gotify Messenger  [gotify.pivoine.art]
+│  ├─ Scrapyd Cluster   [scrapy.pivoine.art]
+│  ├─ n8n Workflows     [n8n.pivoine.art]
+│  ├─ Filestash Files   [stash.pivoine.art]
 │  └─ WireGuard VPN     [vpn.pivoine.art]
 │
 └─ 💾 STORAGE VOLUMES
```
````diff
@@ -164,6 +171,10 @@ THE FALCON (falcon_network)
    ├─ directus_uploads → Alien encounter evidence
    ├─ directus_bundle  → Custom modules
    ├─ awesome_data     → Discovery catalog
+   ├─ scrapyd_data     → Web scraping archives
+   ├─ scrapy_code      → Spider project code
+   ├─ n8n_data         → Workflow configurations
+   ├─ filestash_data   → File manager state
    └─ letsencrypt_data → Shield certificates
 ```
````
**arty.yml** (24 lines changed)
```diff
@@ -53,6 +53,30 @@ envs:
   GOTIFY_COMPOSE_PROJECT_NAME: messaging
   GOTIFY_IMAGE: gotify/server:latest
   GOTIFY_TRAEFIK_HOST: gotify.pivoine.art
+  # Scrapy
+  SCRAPY_TRAEFIK_ENABLED: true
+  SCRAPY_COMPOSE_PROJECT_NAME: scrapy
+  SCRAPY_SCRAPYD_IMAGE: vimagick/scrapyd
+  SCRAPY_IMAGE: vimagick/scrapyd
+  SCRAPY_SCRAPYRT_IMAGE: vimagick/scrapyd
+  SCRAPY_TRAEFIK_HOST: scrapy.pivoine.art
+  SCRAPY_SCRAPYD_PORT: 6800
+  SCRAPY_SCRAPYRT_PORT: 9080
+  # n8n
+  N8N_TRAEFIK_ENABLED: true
+  N8N_COMPOSE_PROJECT_NAME: n8n
+  N8N_IMAGE: docker.n8n.io/n8nio/n8n
+  N8N_TRAEFIK_HOST: n8n.pivoine.art
+  N8N_PORT: 5678
+  N8N_DB_NAME: n8n
+  N8N_DB_SCHEMA: public
+  # Filestash
+  STASH_TRAEFIK_ENABLED: true
+  STASH_COMPOSE_PROJECT_NAME: stash
+  STASH_IMAGE: machines/filestash:latest
+  STASH_TRAEFIK_HOST: stash.pivoine.art
+  STASH_PORT: 8334
+  STASH_CANARY: true
   # Proxy
   PROXY_COMPOSE_PROJECT_NAME: proxy
   PROXY_DOCKER_IMAGE: traefik:latest
```
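The values above feed the `${VAR:-default}` references in the new compose files: when a variable is set (in `arty.yml` or `.env`) it wins, otherwise the inline default applies. Plain shell uses the same expansion syntax, so the behavior can be sketched directly:

```shell
# Compose-style ${VAR:-default} resolution, demonstrated with N8N_PORT.
unset N8N_PORT
echo "port=${N8N_PORT:-5678}"    # unset -> falls back to the default

N8N_PORT=15678
echo "port=${N8N_PORT:-5678}"    # set -> the environment value wins
```

Note that Docker Compose also supports `${VAR-default}` (default only when unset, not when empty), matching the shell's distinction between `:-` and `-`.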
**compose.yaml**

```diff
@@ -4,6 +4,9 @@ include:
   - awsm/compose.yaml
   - sexy/compose.yaml
   - gotify/compose.yaml
+  - scrapy/compose.yaml
+  - n8n/compose.yaml
+  - stash/compose.yaml
   - umami/compose.yaml
   - proxy/compose.yaml
   - watch/compose.yaml
```
**core/postgres/init/01-init-databases.sh**

```diff
@@ -13,18 +13,23 @@ psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" --dbname "$POSTGRES_DB" <<-EOSQL
 SELECT 'CREATE DATABASE directus'
 WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = 'directus')\gexec
 
--- n8n workflow database
+-- Umami analytics database
 SELECT 'CREATE DATABASE umami'
 WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = 'umami')\gexec
 
+-- n8n workflow automation database
+SELECT 'CREATE DATABASE n8n'
+WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = 'n8n')\gexec
+
 -- Grant privileges to all databases
 GRANT ALL PRIVILEGES ON DATABASE directus TO $POSTGRES_USER;
 GRANT ALL PRIVILEGES ON DATABASE umami TO $POSTGRES_USER;
+GRANT ALL PRIVILEGES ON DATABASE n8n TO $POSTGRES_USER;
 
 -- Log success
 SELECT 'Compose databases initialized:' AS status;
 SELECT datname FROM pg_database
-WHERE datname IN ('directus', 'umami')
+WHERE datname IN ('directus', 'umami', 'n8n')
 ORDER BY datname;
 EOSQL
```
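The hunk above relies on two shell behaviors: the unquoted heredoc delimiter lets `$POSTGRES_USER` expand before psql ever sees the SQL, while backslash sequences such as `\gexec` pass through untouched. A minimal sketch (the `falcon` username is illustrative; the real value comes from the environment):

```shell
# Heredoc expansion as used by the init script: variables expand,
# psql meta-commands like \gexec survive literally.
POSTGRES_USER=falcon   # illustrative value only
sql=$(cat <<EOSQL
SELECT 'CREATE DATABASE n8n'
WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = 'n8n')\gexec
GRANT ALL PRIVILEGES ON DATABASE n8n TO $POSTGRES_USER;
EOSQL
)
printf '%s\n' "$sql"
```

The `SELECT ... \gexec` pattern makes the creation idempotent: the query emits a `CREATE DATABASE` statement only when the database is missing, so rerunning the script is safe.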
```diff
@@ -35,4 +40,5 @@ echo ""
 echo "Databases available:"
 echo "  • directus - Sexy application database"
 echo "  • umami - Tracking database"
+echo "  • n8n - Workflow automation database"
 echo ""
```
**gotify/compose.yaml**

```diff
@@ -1,5 +1,5 @@
 services:
-  gotify_app:
+  gotify:
     image: ${GOTIFY_IMAGE}
     container_name: ${GOTIFY_COMPOSE_PROJECT_NAME}_app
     restart: unless-stopped
```
**n8n/compose.yaml** (new file, 52 lines)
```yaml
services:
  n8n:
    image: ${N8N_IMAGE:-docker.n8n.io/n8nio/n8n}
    container_name: ${N8N_COMPOSE_PROJECT_NAME}_app
    restart: unless-stopped
    ports:
      - "${N8N_PORT:-5678}:5678"
    volumes:
      - n8n_data:/home/node/.n8n
    environment:
      TZ: ${TIMEZONE:-Europe/Berlin}
      GENERIC_TIMEZONE: ${TIMEZONE:-Europe/Berlin}
      N8N_ENFORCE_SETTINGS_FILE_PERMISSIONS: "true"
      N8N_RUNNERS_ENABLED: "true"
      DB_TYPE: postgresdb
      DB_POSTGRESDB_DATABASE: ${N8N_DB_NAME}
      DB_POSTGRESDB_HOST: ${CORE_DB_HOST}
      DB_POSTGRESDB_PORT: ${CORE_DB_PORT}
      DB_POSTGRESDB_USER: ${DB_USER}
      DB_POSTGRESDB_PASSWORD: ${DB_PASSWORD}
      DB_POSTGRESDB_SCHEMA: ${N8N_DB_SCHEMA:-public}
      N8N_HOST: ${N8N_TRAEFIK_HOST}
      N8N_PORT: 5678
      N8N_PROTOCOL: https
      WEBHOOK_URL: https://${N8N_TRAEFIK_HOST}/
    depends_on:
      - postgres
    networks:
      - compose_network
    labels:
      - 'traefik.enable=${N8N_TRAEFIK_ENABLED}'
      - 'traefik.http.middlewares.${N8N_COMPOSE_PROJECT_NAME}-n8n-redirect-web-secure.redirectscheme.scheme=https'
      - 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web.middlewares=${N8N_COMPOSE_PROJECT_NAME}-n8n-redirect-web-secure'
      - 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web.rule=Host(`${N8N_TRAEFIK_HOST}`)'
      - 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web.entrypoints=web'
      - 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.rule=Host(`${N8N_TRAEFIK_HOST}`)'
      - 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.tls.certresolver=resolver'
      - 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.entrypoints=web-secure'
      - 'traefik.http.middlewares.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure-compress.compress=true'
      - 'traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.middlewares=${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure-compress'
      - 'traefik.http.services.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.loadbalancer.server.port=5678'
      - 'traefik.docker.network=${NETWORK_NAME}'
      - 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'

volumes:
  n8n_data:
    name: ${N8N_COMPOSE_PROJECT_NAME}_n8n_data

networks:
  compose_network:
    name: ${NETWORK_NAME}
    external: true
```
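The Traefik labels in this file build router, middleware, and service names from `${N8N_COMPOSE_PROJECT_NAME}` so each stack gets unique keys in Traefik's dynamic configuration. A sketch of how one router rule resolves after substitution (values taken from `arty.yml` above):

```shell
# Expand one of the n8n router labels the way Compose would.
N8N_COMPOSE_PROJECT_NAME=n8n
N8N_TRAEFIK_HOST=n8n.pivoine.art
rule="traefik.http.routers.${N8N_COMPOSE_PROJECT_NAME}-n8n-web-secure.rule=Host(\`${N8N_TRAEFIK_HOST}\`)"
echo "$rule"
```

Because every stack uses the same `<project>-<service>-web` / `-web-secure` naming scheme, no two stacks can collide on a router or middleware name.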
**proxy/compose.yaml**

```diff
@@ -1,5 +1,5 @@
 services:
-  traefik_app:
+  traefik:
     image: ${PROXY_DOCKER_IMAGE}
     container_name: ${PROXY_COMPOSE_PROJECT_NAME}_app
     restart: unless-stopped
```
**scrapy/compose.yaml** (new file, 63 lines)
```yaml
services:
  scrapyd:
    image: ${SCRAPY_SCRAPYD_IMAGE:-vimagick/scrapyd}
    container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyd
    restart: unless-stopped
    volumes:
      - scrapyd_data:/var/lib/scrapyd
      # anonymous volume so pip-installed packages survive container restarts
      - /usr/local/lib/python3.9/dist-packages
    environment:
      TZ: ${TIMEZONE:-Europe/Berlin}
    networks:
      - compose_network
    labels:
      - 'traefik.enable=${SCRAPY_TRAEFIK_ENABLED}'
      - 'traefik.http.middlewares.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-redirect-web-secure.redirectscheme.scheme=https'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.middlewares=${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-redirect-web-secure'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.rule=Host(`${SCRAPY_TRAEFIK_HOST}`)'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.entrypoints=web'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.rule=Host(`${SCRAPY_TRAEFIK_HOST}`)'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.tls.certresolver=resolver'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.entrypoints=web-secure'
      - 'traefik.http.middlewares.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure-compress.compress=true'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.middlewares=${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure-compress'
      - 'traefik.http.services.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.loadbalancer.server.port=6800'
      - 'traefik.docker.network=${NETWORK_NAME}'
      - 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'

  scrapy:
    image: ${SCRAPY_IMAGE:-vimagick/scrapyd}
    container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapy
    command: bash
    volumes:
      # shared named volume (declared below) so spider projects are
      # visible to both the scrapy and scrapyrt containers
      - scrapy_code:/code
    working_dir: /code
    restart: unless-stopped
    environment:
      TZ: ${TIMEZONE:-Europe/Berlin}
    networks:
      - compose_network

  scrapyrt:
    image: ${SCRAPY_SCRAPYRT_IMAGE:-vimagick/scrapyd}
    container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyrt
    command: scrapyrt -i 0.0.0.0 -p 9080
    restart: unless-stopped
    ports:
      - "${SCRAPY_SCRAPYRT_PORT:-9080}:9080"
    volumes:
      - scrapy_code:/code
    working_dir: /code
    environment:
      TZ: ${TIMEZONE:-Europe/Berlin}
    networks:
      - compose_network

volumes:
  scrapyd_data:
    name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyd_data
  scrapy_code:
    name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapy_code

networks:
  compose_network:
    name: ${NETWORK_NAME}
    external: true
```
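With the stack up, ScrapyRT accepts spider runs over plain HTTP on the published port. A hedged sketch of the request shape (the spider name and start URL are hypothetical; the command is printed rather than executed, since it needs the running service):

```shell
# Assemble a ScrapyRT crawl request against the port published above.
SCRAPY_SCRAPYRT_PORT=9080
spider_name="example_spider"      # hypothetical spider name
start_url="https://example.com"   # hypothetical start URL
cmd="curl \"http://localhost:${SCRAPY_SCRAPYRT_PORT}/crawl.json?spider_name=${spider_name}&url=${start_url}\""
echo "$cmd"
```

ScrapyRT's `/crawl.json` endpoint runs the named spider synchronously and returns the scraped items as JSON, which is what makes it useful "without scheduling" as the CLAUDE.md section notes.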
**stash/compose.yaml** (new file, 38 lines)
```yaml
services:
  filestash:
    image: ${STASH_IMAGE:-machines/filestash:latest}
    container_name: ${STASH_COMPOSE_PROJECT_NAME}_app
    restart: unless-stopped
    ports:
      - "${STASH_PORT:-8334}:8334"
    volumes:
      - filestash_data:/app/data/state/
    environment:
      TZ: ${TIMEZONE:-Europe/Berlin}
      APPLICATION_URL: https://${STASH_TRAEFIK_HOST}
      CANARY: ${STASH_CANARY:-true}
    networks:
      - compose_network
    labels:
      - 'traefik.enable=${STASH_TRAEFIK_ENABLED}'
      - 'traefik.http.middlewares.${STASH_COMPOSE_PROJECT_NAME}-filestash-redirect-web-secure.redirectscheme.scheme=https'
      - 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web.middlewares=${STASH_COMPOSE_PROJECT_NAME}-filestash-redirect-web-secure'
      - 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web.rule=Host(`${STASH_TRAEFIK_HOST}`)'
      - 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web.entrypoints=web'
      - 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.rule=Host(`${STASH_TRAEFIK_HOST}`)'
      - 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.tls.certresolver=resolver'
      - 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.entrypoints=web-secure'
      - 'traefik.http.middlewares.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure-compress.compress=true'
      - 'traefik.http.routers.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.middlewares=${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure-compress'
      - 'traefik.http.services.${STASH_COMPOSE_PROJECT_NAME}-filestash-web-secure.loadbalancer.server.port=8334'
      - 'traefik.docker.network=${NETWORK_NAME}'
      - 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'

volumes:
  filestash_data:
    name: ${STASH_COMPOSE_PROJECT_NAME}_filestash_data

networks:
  compose_network:
    name: ${NETWORK_NAME}
    external: true
```
**umami/compose.yaml**

```diff
@@ -1,5 +1,5 @@
 services:
-  umami_app:
+  umami:
     image: ${TRACK_DOCKER_IMAGE}
     container_name: ${TRACK_COMPOSE_PROJECT_NAME}_app
     restart: unless-stopped
```
**vpn/compose.yaml**

```diff
@@ -1,5 +1,5 @@
 services:
-  vpn_app:
+  vpn:
     image: ${VPN_DOCKER_IMAGE}
     container_name: ${VPN_COMPOSE_PROJECT_NAME}_app
     restart: unless-stopped
```