docker-compose/scrapy/compose.yaml
Sebastian Krüger 785942da61 feat: add Scrapy, n8n, and Filestash stacks to Falcon
Added three new service stacks to the docker-compose infrastructure:

- **Scrapy stack** (scrapy.pivoine.art):
  - scrapyd: Web scraping daemon with web interface (port 6800)
  - scrapy: Development container for spider commands
  - scrapyrt: Real-time API for running spiders (port 9080)

- **n8n stack** (n8n.pivoine.art):
  - Workflow automation platform with PostgreSQL backend
  - 200+ integrations for automated tasks
  - Runners enabled for task execution
  - Webhook support for external triggers

- **Filestash stack** (stash.pivoine.art):
  - Web-based file manager with multi-backend support
  - Supports SFTP, S3, Dropbox, Google Drive, FTP, WebDAV
  - In-browser file viewing and media playback
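The n8n wiring described above would look roughly like this. This is a sketch, not the actual compose file from this commit: the image and the `DB_*`, `N8N_RUNNERS_ENABLED`, and `WEBHOOK_URL` settings are standard n8n configuration, while the `postgres` host name is an assumption.

```yaml
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    restart: unless-stopped
    environment:
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres           # shared PostgreSQL service (assumed name)
      DB_POSTGRESDB_DATABASE: n8n            # database created by the init script
      N8N_RUNNERS_ENABLED: 'true'            # task runners, as noted above
      WEBHOOK_URL: https://n8n.pivoine.art/  # external base URL for webhook triggers
    networks:
      - compose_network
```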
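For reference, the scrapyd and scrapyrt services expose standard HTTP APIs. A minimal sketch, assuming the stack is up (`myproject`/`myspider` are placeholder names, not from this commit; `|| true` keeps the commands harmless when nothing is listening):

```shell
# scrapyd is only routed through Traefik (no host port is published);
# scrapyrt publishes port 9080 directly in the compose file.
SCRAPYD=https://scrapy.pivoine.art
SCRAPYRT=http://localhost:9080

# scrapyd: daemon health check
curl -s "$SCRAPYD/daemonstatus.json" || true

# scrapyd: schedule a spider run inside a deployed project
curl -s "$SCRAPYD/schedule.json" -d project=myproject -d spider=myspider || true

# scrapyrt: one-off real-time crawl; scraped items come back in the HTTP response
curl -s "$SCRAPYRT/crawl.json?spider_name=myspider&url=https://example.com" || true
```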

Infrastructure updates:
- Updated PostgreSQL init script to create n8n database
- Added environment variables to arty.yml for all three stacks
- Updated compose.yaml include list
- Updated CLAUDE.md and README.md documentation
- Normalized service names in existing stacks (gotify, proxy, umami, vpn)
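The PostgreSQL init-script change amounts to provisioning the extra database, along these lines (the actual script is not shown in this view; the role name is a placeholder):

```sql
-- Sketch of the n8n database provisioning; "n8n_user" is a hypothetical role.
CREATE DATABASE n8n;
GRANT ALL PRIVILEGES ON DATABASE n8n TO n8n_user;
```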

All services integrated with Traefik for SSL termination and include
Watchtower auto-update labels.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-04 22:36:13 +01:00


```yaml
services:
  scrapyd:
    image: ${SCRAPY_SCRAPYD_IMAGE:-vimagick/scrapyd}
    container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyd
    restart: unless-stopped
    volumes:
      - scrapyd_data:/var/lib/scrapyd
      # anonymous volume: persists packages pip-installed into the container
      - /usr/local/lib/python3.9/dist-packages
    environment:
      TZ: ${TIMEZONE:-Europe/Berlin}
    networks:
      - compose_network
    labels:
      - 'traefik.enable=${SCRAPY_TRAEFIK_ENABLED}'
      - 'traefik.http.middlewares.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-redirect-web-secure.redirectscheme.scheme=https'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.middlewares=${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-redirect-web-secure'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.rule=Host(`${SCRAPY_TRAEFIK_HOST}`)'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web.entrypoints=web'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.rule=Host(`${SCRAPY_TRAEFIK_HOST}`)'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.tls.certresolver=resolver'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.entrypoints=web-secure'
      - 'traefik.http.middlewares.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure-compress.compress=true'
      - 'traefik.http.routers.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.middlewares=${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure-compress'
      - 'traefik.http.services.${SCRAPY_COMPOSE_PROJECT_NAME}-scrapyd-web-secure.loadbalancer.server.port=6800'
      - 'traefik.docker.network=${NETWORK_NAME}'
      - 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'

  scrapy:
    image: ${SCRAPY_IMAGE:-vimagick/scrapyd}
    container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapy
    command: bash
    # keep the interactive shell alive: without a TTY, `bash` exits
    # immediately and `restart: unless-stopped` would loop the container
    tty: true
    stdin_open: true
    volumes:
      - ./code:/code
    working_dir: /code
    restart: unless-stopped
    environment:
      TZ: ${TIMEZONE:-Europe/Berlin}
    networks:
      - compose_network

  scrapyrt:
    image: ${SCRAPY_SCRAPYRT_IMAGE:-vimagick/scrapyd}
    container_name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyrt
    command: scrapyrt -i 0.0.0.0 -p 9080
    restart: unless-stopped
    ports:
      - "${SCRAPY_SCRAPYRT_PORT:-9080}:9080"
    volumes:
      - scrapy_code:/code
    working_dir: /code
    environment:
      TZ: ${TIMEZONE:-Europe/Berlin}
    networks:
      - compose_network

volumes:
  scrapyd_data:
    name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapyd_data
  # declared so the scrapyrt mount above resolves; note this named volume is
  # distinct from the ./code bind mount used by the scrapy dev container
  scrapy_code:
    name: ${SCRAPY_COMPOSE_PROJECT_NAME}_scrapy_code

networks:
  compose_network:
    name: ${NETWORK_NAME}
    external: true
```
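The file expects its `${...}` variables to come from the environment (per the commit, they live in arty.yml). A sketch with example values; the real arty.yml entries are not shown in this view:

```shell
# Example values only; variable names are taken from the compose file above,
# the host name from the commit message.
SCRAPY_COMPOSE_PROJECT_NAME=scrapy
SCRAPY_SCRAPYD_IMAGE=vimagick/scrapyd
SCRAPY_IMAGE=vimagick/scrapyd
SCRAPY_SCRAPYRT_IMAGE=vimagick/scrapyd
SCRAPY_SCRAPYRT_PORT=9080
SCRAPY_TRAEFIK_ENABLED=true
SCRAPY_TRAEFIK_HOST=scrapy.pivoine.art
TIMEZONE=Europe/Berlin
NETWORK_NAME=compose_network
WATCHTOWER_LABEL_ENABLE=true
```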