feat: add Scrapy, n8n, and Filestash stacks to Falcon
Added three new service stacks to the docker-compose infrastructure:

- **Scrapy stack** (scrapy.pivoine.art):
  - scrapyd: Web scraping daemon with web interface (port 6800)
  - scrapy: Development container for spider commands
  - scrapyrt: Real-time API for running spiders (port 9080)
- **n8n stack** (n8n.pivoine.art):
  - Workflow automation platform with PostgreSQL backend
  - 200+ integrations for automated tasks
  - Runners enabled for task execution
  - Webhook support for external triggers
- **Filestash stack** (stash.pivoine.art):
  - Web-based file manager with multi-backend support
  - Supports SFTP, S3, Dropbox, Google Drive, FTP, WebDAV
  - In-browser file viewing and media playback

Infrastructure updates:

- Updated PostgreSQL init script to create n8n database
- Added environment variables to arty.yml for all three stacks
- Updated compose.yaml include list
- Updated CLAUDE.md and README.md documentation
- Normalized service names in existing stacks (gotify, proxy, umami, vpn)

All services integrated with Traefik for SSL termination and include Watchtower auto-update labels.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
CLAUDE.md (34 changes)
@@ -16,6 +16,9 @@ Root `compose.yaml` uses Docker Compose's `include` directive to orchestrate mul
- **awsm**: Next.js application with SQLite
- **track**: Umami analytics (PostgreSQL)
- **gotify**: Push notification server
- **scrapy**: Scrapyd web scraping cluster (scrapyd, scrapy, scrapyrt)
- **n8n**: Workflow automation platform (PostgreSQL)
- **stash**: Filestash web-based file manager
- **vpn**: WireGuard VPN (wg-easy)

All services connect to a single external Docker network (`falcon_network` by default, defined by `$NETWORK_NAME`).
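The `include` directive and shared external network described above might be wired up roughly like this (a sketch; the exact per-stack file paths are assumptions based on the section names in this document):

```yaml
# Root compose.yaml (sketch): each stack is pulled in via Compose's `include` directive
include:
  - core/compose.yaml
  - scrapy/compose.yaml
  - n8n/compose.yaml
  - stash/compose.yaml

# Every stack then joins the shared, externally created network
networks:
  default:
    name: ${NETWORK_NAME:-falcon_network}
    external: true
```

Because the network is marked `external`, it must exist before `docker compose up`, e.g. created once with `docker network create falcon_network`.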
@@ -41,6 +44,7 @@ Services expose themselves via Docker labels:
`core/postgres/init/01-init-databases.sh` runs on first PostgreSQL startup:
- Creates `directus` database for Sexy CMS
- Creates `umami` database for Track analytics
- Creates `n8n` database for workflow automation
- Grants privileges to `$DB_USER`

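A minimal sketch of what such an init script typically looks like (the real `01-init-databases.sh` may differ; the `psql` heredoc loop is the standard pattern for scripts in the postgres image's `/docker-entrypoint-initdb.d`):

```bash
#!/bin/bash
set -e
# Sketch: executed once, on first startup, via /docker-entrypoint-initdb.d
for db in directus umami n8n; do
  psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" <<EOSQL
CREATE DATABASE $db;
GRANT ALL PRIVILEGES ON DATABASE $db TO $DB_USER;
EOSQL
done
```

Note that such scripts run only when the data volume is empty; adding the `n8n` database to an existing deployment requires creating it manually.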
## Common Commands
@@ -134,6 +138,33 @@ Next.js app with embedded SQLite:
- Optional webhook secret for database updates
- Database persisted in `awesome_data` volume

### Scrapy (scrapy/compose.yaml)
Web scraping cluster with three services:
- **scrapyd**: Scrapyd daemon exposed at `scrapy.pivoine.art:6800`
  - Web interface for deploying and managing spiders
  - Data persisted in `scrapyd_data` volume
- **scrapy**: Development container for running Scrapy commands
  - Shared `scrapy_code` volume for spider projects
- **scrapyrt**: ScrapyRT real-time API on port 9080
  - Run spiders via HTTP API without scheduling

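ScrapyRT's real-time endpoint is `/crawl.json`, which takes the spider name and target URL as query parameters; a sketch of building such a request (the spider name and target here are hypothetical):

```shell
# Hypothetical spider name and target; ScrapyRT's /crawl.json takes both as query params
spider="example"
target="https://example.com"
request_url="http://scrapy.pivoine.art:9080/crawl.json?spider_name=${spider}&url=${target}"
echo "$request_url"
# Once the stack is up, the spider can be run with: curl "$request_url"
```

The response is a JSON document containing the items the spider scraped from that single URL, with no Scrapyd scheduling involved.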
### n8n (n8n/compose.yaml)
Workflow automation platform:
- **n8n**: n8n application exposed at `n8n.pivoine.art:5678`
  - Visual workflow builder with 200+ integrations
  - PostgreSQL backend for workflow persistence
  - Runners enabled for task execution
  - Webhook support for external triggers
  - Data persisted in `n8n_data` volume

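The PostgreSQL backend, runners, and webhook support described above map onto n8n's standard environment variables; a compose-level sketch (the database host, credentials, and volume mount point are assumptions about this deployment):

```yaml
# Sketch of the n8n service environment (n8n/compose.yaml); values are illustrative
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    environment:
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres        # assumed service name of the core database
      DB_POSTGRESDB_DATABASE: n8n         # database created by the init script above
      DB_POSTGRESDB_USER: ${DB_USER}
      DB_POSTGRESDB_PASSWORD: ${DB_PASSWORD}
      N8N_RUNNERS_ENABLED: "true"
      WEBHOOK_URL: https://n8n.pivoine.art/
    volumes:
      - n8n_data:/home/node/.n8n
```

`WEBHOOK_URL` matters behind a reverse proxy: without it, n8n generates webhook URLs from its internal host and port instead of the public domain.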
### Stash (stash/compose.yaml)
Web-based file manager:
- **filestash**: Filestash app exposed at `stash.pivoine.art:8334`
  - Support for multiple storage backends (SFTP, S3, Dropbox, Google Drive, FTP, WebDAV)
  - In-browser file viewer and media player
  - File sharing capabilities
  - Data persisted in `filestash_data` volume

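The commit notes that every service carries Traefik and Watchtower labels; for the filestash service that could look roughly like this (the router name and certificate resolver name are assumptions about the proxy stack's configuration):

```yaml
# Sketch of label-based exposure (stash/compose.yaml); router and resolver names assumed
services:
  filestash:
    image: machines/filestash
    labels:
      - traefik.enable=true
      - traefik.http.routers.filestash.rule=Host(`stash.pivoine.art`)
      - traefik.http.routers.filestash.tls.certresolver=letsencrypt
      - traefik.http.services.filestash.loadbalancer.server.port=8334
      - com.centurylinklabs.watchtower.enable=true
```

The `loadbalancer.server.port` label tells Traefik which container port to route to, so no host port needs to be published; the Watchtower label opts the container in to automatic image updates.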
## Important Environment Variables

Key variables defined in `arty.yml` and overridden in `.env`:
@@ -153,6 +184,9 @@ Each service uses named volumes prefixed with project name:
- `core_postgres_data`, `core_redis_data`: Database persistence
- `core_directus_uploads`, `core_directus_bundle`: Directus media/extensions
- `awesome_data`: AWSM SQLite database
- `scrapy_scrapyd_data`, `scrapy_scrapy_code`: Scrapy spider data and code
- `n8n_n8n_data`: n8n workflow data
- `stash_filestash_data`: Filestash configuration and state
- `proxy_letsencrypt_data`: SSL certificates

Volumes can be inspected with:
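Typical commands for listing and inspecting named volumes like these (the volume name is taken from the list above):

```shell
docker volume ls --filter name=scrapy
docker volume inspect scrapy_scrapyd_data
```

`docker volume inspect` reports the volume's driver and its mountpoint on the host.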