Compare commits


10 Commits

Author SHA1 Message Date
83ca9d4fb5 chore: update all dependencies to latest versions
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 46s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 16:43:56 +01:00
225b9d41f5 chore: clean up repo and fix docker compose configuration
- Remove outdated docs (COMPOSE.md, DOCKER.md, QUICKSTART.md, REBUILD_GUIDE.md)
- Remove build.sh, compose.production.yml, gamification-schema.sql, directus.yaml
- Simplify compose.yml for local dev (remove env var indirection)
- Add directus.yml schema snapshot and schema.sql from VPS
- Add schema:export and schema:import scripts to package.json
- Ignore .env files (vars set via compose environment)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 16:36:49 +01:00
ad83fb553a feat: switch to dynamic env for Umami to allow runtime configuration
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 4m41s
2026-02-21 11:47:32 +01:00
2be36a679d fix: resolve buttplug-wasm build error by using Vec and slices
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m26s
2026-02-21 11:33:12 +01:00
75d4b4227c chore: update buttplug dependencies to commit fad6c9d
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m10s
2026-02-21 11:29:36 +01:00
2277e4f686 fix: resolve buttplug-wasm build error by using as_ref() for JS arrays
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m7s
2026-02-21 11:13:53 +01:00
13c6977e59 feat: add PUBLIC_UMAMI_SCRIPT variable and integrate into layout
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m13s
2026-02-21 11:05:30 +01:00
c85fa7798e chore: remove Letterspace newsletter integration and all LETTERSPACE_* variables 2026-02-21 10:56:43 +01:00
ce30eca574 chore: new imprint address 2026-02-21 10:37:19 +01:00
6724afa939 fix: use correct Directus preset parameter for asset transforms
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 6m4s
Changed ?transform= to ?key= so Directus storage asset presets
(mini, thumbnail, preview, medium, banner) are actually applied.
Previously all images were served at full resolution.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 08:42:38 +01:00
31 changed files with 5606 additions and 4613 deletions

.gitignore (vendored) — 3 lines changed

@@ -3,5 +3,8 @@ dist/
target/
pkg/
.env
.env.*
.claude/


@@ -1,424 +0,0 @@
# Docker Compose Guide
This guide explains the Docker Compose setup for sexy.pivoine.art with local development and production configurations.
## Architecture Overview
The application uses a **multi-file compose setup** with two configurations:
1. **`compose.yml`** - Base configuration for local development
2. **`compose.production.yml`** - Production overrides with Traefik integration
### Service Architecture
```
┌─────────────────────────────────────────────────────────────┐
│ 🌐 Traefik Reverse Proxy (Production Only) │
│ ├─ HTTPS Termination │
│ ├─ Automatic Let's Encrypt │
│ └─ Routes traffic to frontend & Directus API │
├─────────────────────────────────────────────────────────────┤
│ 💄 Frontend (SvelteKit) │
│ ├─ Port 3000 (internal) │
│ ├─ Serves on https://sexy.pivoine.art │
│ └─ Proxies /api to Directus │
├─────────────────────────────────────────────────────────────┤
│ 🎭 Directus CMS │
│ ├─ Port 8055 (internal) │
│ ├─ Serves on https://sexy.pivoine.art/api │
│ ├─ Custom bundle extensions mounted │
│ └─ Uploads volume │
├─────────────────────────────────────────────────────────────┤
│ 🗄️ PostgreSQL (Local) / External (Production) │
│ └─ Database for Directus │
├─────────────────────────────────────────────────────────────┤
│ 💾 Redis (Local) / External (Production) │
│ └─ Cache & session storage │
└─────────────────────────────────────────────────────────────┘
```
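The local stack above could be sketched as a compose file like this (service images, ports, and mounts are taken from this guide; the actual `compose.yml` may differ):

```yaml
# Hedged sketch of the local compose.yml described above, not the real file.
services:
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: sexy
    volumes:
      - postgres-data:/var/lib/postgresql/data
  redis:
    image: redis:7-alpine
    command: redis-server --appendonly yes   # AOF persistence
    volumes:
      - redis-data:/data
  directus:
    image: directus/directus:11
    ports:
      - "8055:8055"
    depends_on: [postgres, redis]
    volumes:
      - directus-uploads:/directus/uploads
      - ./packages/bundle/dist:/directus/extensions

volumes:
  postgres-data:
  redis-data:
  directus-uploads:
```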
## Local Development Setup
### Prerequisites
- Docker 20.10+
- Docker Compose 2.0+
### Quick Start
1. **Create environment file:**
```bash
cp .env.example .env
# Edit .env with your local settings (defaults work fine)
```
2. **Start all services:**
```bash
docker-compose up -d
```
3. **Access services:**
- Frontend: http://localhost:3000 (if enabled)
- Directus: http://localhost:8055
- Directus Admin: http://localhost:8055/admin
4. **View logs:**
```bash
docker-compose logs -f
```
5. **Stop services:**
```bash
docker-compose down
```
### Local Services
#### PostgreSQL
- **Image:** `postgres:16-alpine`
- **Port:** 5432 (internal only)
- **Volume:** `postgres-data`
- **Database:** `sexy`
#### Redis
- **Image:** `redis:7-alpine`
- **Port:** 6379 (internal only)
- **Volume:** `redis-data`
- **Persistence:** AOF enabled
#### Directus
- **Image:** `directus/directus:11`
- **Port:** 8055 (exposed)
- **Volumes:**
- `directus-uploads` - File uploads
- `./packages/bundle/dist` - Custom extensions
- **Features:**
- Auto-reload extensions
- WebSockets enabled
- CORS enabled for localhost
### Local Development Workflow
```bash
# Start infrastructure (Postgres, Redis, Directus)
docker-compose up -d
# Develop frontend locally with hot reload
cd packages/frontend
pnpm dev
# Build Directus bundle
pnpm --filter @sexy.pivoine.art/bundle build
# Restart Directus to load new bundle
docker-compose restart directus
```
## Production Deployment
### Prerequisites
- External PostgreSQL database
- External Redis instance
- Traefik reverse proxy configured
- External network: `compose_network`
### Setup
The production compose file now uses the `include` directive to automatically extend `compose.yml`, making deployment simpler.
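The include-based file probably has roughly this shape (a sketch only; the real file may differ, and `include` requires Docker Compose v2.20+):

```yaml
# Hypothetical compose.production.yml using the include directive.
include:
  - compose.yml

services:
  directus:
    networks:
      - compose_network
    # production overrides: external DB/Redis hosts, Traefik labels, ...

networks:
  compose_network:
    external: true
```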
1. **Create production environment file:**
```bash
cp .env.production.example .env.production
```
2. **Edit `.env.production` with your values:**
```bash
# Database (external)
CORE_DB_HOST=your-postgres-host
SEXY_DB_NAME=sexy_production
DB_USER=sexy
DB_PASSWORD=your-secure-password
# Redis (external)
CORE_REDIS_HOST=your-redis-host
# Directus
SEXY_DIRECTUS_SECRET=your-32-char-random-secret
ADMIN_PASSWORD=your-secure-admin-password
# Traefik
SEXY_TRAEFIK_HOST=sexy.pivoine.art
# Frontend
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
# Email (SMTP)
EMAIL_SMTP_HOST=smtp.your-provider.com
EMAIL_SMTP_USER=your-email@domain.com
EMAIL_SMTP_PASSWORD=your-smtp-password
```
3. **Deploy:**
```bash
# Simple deployment - compose.production.yml includes compose.yml automatically
docker-compose -f compose.production.yml --env-file .env.production up -d
# Or use the traditional multi-file approach (same result)
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production up -d
```
### Production Services
#### Directus
- **Image:** `directus/directus:11` (configurable)
- **Network:** `compose_network` (external)
- **Volumes:**
- `/var/www/sexy.pivoine.art/uploads` - Persistent uploads
- `/var/www/sexy.pivoine.art/packages/bundle/dist` - Extensions
- **Traefik routing:**
- Domain: `sexy.pivoine.art/api`
- Strips `/api` prefix before forwarding
- HTTPS with auto-certificates
#### Frontend
- **Image:** `ghcr.io/valknarxxx/sexy:latest` (from GHCR)
- **Network:** `compose_network` (external)
- **Volume:** `/var/www/sexy.pivoine.art` - Application code
- **Traefik routing:**
- Domain: `sexy.pivoine.art`
- HTTPS with auto-certificates
### Traefik Integration
Both services are configured with Traefik labels for automatic routing:
**Frontend:**
- HTTP → HTTPS redirect
- Routes `sexy.pivoine.art` to port 3000
- Gzip compression enabled
**Directus API:**
- HTTP → HTTPS redirect
- Routes `sexy.pivoine.art/api` to port 8055
- Strips `/api` prefix
- Gzip compression enabled
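The routing above maps to Traefik labels of roughly this form (router and middleware names here are invented for illustration; check the actual compose files):

```yaml
# Illustrative Traefik labels for the two services.
services:
  frontend:
    labels:
      - traefik.enable=true
      - traefik.http.routers.sexy.rule=Host(`sexy.pivoine.art`)
      - traefik.http.services.sexy.loadbalancer.server.port=3000
      - traefik.http.middlewares.sexy-compress.compress=true
      - traefik.http.routers.sexy.middlewares=sexy-compress
  directus:
    labels:
      - traefik.enable=true
      - traefik.http.routers.sexy-api.rule=Host(`sexy.pivoine.art`) && PathPrefix(`/api`)
      - traefik.http.middlewares.sexy-api-strip.stripprefix.prefixes=/api
      - traefik.http.routers.sexy-api.middlewares=sexy-api-strip
      - traefik.http.services.sexy-api.loadbalancer.server.port=8055
```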
### Production Commands
```bash
# Deploy/update (simplified - uses include)
docker-compose -f compose.production.yml --env-file .env.production up -d
# View logs
docker-compose -f compose.production.yml logs -f
# Restart specific service
docker-compose -f compose.production.yml restart frontend
# Stop all services
docker-compose -f compose.production.yml down
# Update images
docker-compose -f compose.production.yml pull
docker-compose -f compose.production.yml up -d
```
## Environment Variables
### Local Development (`.env`)
| Variable | Default | Description |
|----------|---------|-------------|
| `DB_DATABASE` | `sexy` | Database name |
| `DB_USER` | `sexy` | Database user |
| `DB_PASSWORD` | `sexy` | Database password |
| `DIRECTUS_SECRET` | - | Secret for Directus (min 32 chars) |
| `ADMIN_EMAIL` | `admin@sexy.pivoine.art` | Admin email |
| `ADMIN_PASSWORD` | `admin` | Admin password |
| `CORS_ORIGIN` | `http://localhost:3000` | CORS allowed origins |
See `.env.example` for full list.
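A hypothetical local `.env` matching the defaults above (these are the documented dev defaults, not production values):

```shell
DB_DATABASE=sexy
DB_USER=sexy
DB_PASSWORD=sexy
# Must be at least 32 characters:
DIRECTUS_SECRET=replace-with-a-random-string-of-32-chars
ADMIN_EMAIL=admin@sexy.pivoine.art
ADMIN_PASSWORD=admin
CORS_ORIGIN=http://localhost:3000
```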
### Production (`.env.production`)
| Variable | Description | Required |
|----------|-------------|----------|
| `CORE_DB_HOST` | External PostgreSQL host | ✅ |
| `SEXY_DB_NAME` | Database name | ✅ |
| `DB_PASSWORD` | Database password | ✅ |
| `CORE_REDIS_HOST` | External Redis host | ✅ |
| `SEXY_DIRECTUS_SECRET` | Directus secret key | ✅ |
| `SEXY_TRAEFIK_HOST` | Domain name | ✅ |
| `EMAIL_SMTP_HOST` | SMTP server | ✅ |
| `EMAIL_SMTP_PASSWORD` | SMTP password | ✅ |
| `SEXY_FRONTEND_PUBLIC_API_URL` | Frontend API URL | ✅ |
| `SEXY_FRONTEND_PUBLIC_URL` | Frontend public URL | ✅ |
See `.env.production.example` for full list.
**Note:** All frontend-specific variables are prefixed with `SEXY_FRONTEND_` for clarity.
## Volumes
### Local Development
- `postgres-data` - PostgreSQL database
- `redis-data` - Redis persistence
- `directus-uploads` - Uploaded files
### Production
- `/var/www/sexy.pivoine.art/uploads` - Directus uploads
- `/var/www/sexy.pivoine.art` - Application code (frontend)
## Networks
### Local: `sexy-network`
- Bridge network
- Internal communication only
- Directus exposed on 8055
### Production: `compose_network`
- External network (pre-existing)
- Connects to Traefik
- No exposed ports (Traefik handles routing)
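The two network declarations described above can be sketched as (names from this guide; actual files may differ):

```yaml
# Local compose.yml: a private bridge network.
networks:
  sexy-network:
    driver: bridge
---
# Production compose.production.yml: join Traefik's pre-existing network.
networks:
  compose_network:
    external: true
```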
## Health Checks
All services include health checks:
**PostgreSQL:**
- Command: `pg_isready`
- Interval: 10s
**Redis:**
- Command: `redis-cli ping`
- Interval: 10s
**Directus:**
- Endpoint: `/server/health`
- Interval: 30s
**Frontend:**
- HTTP GET: `localhost:3000`
- Interval: 30s
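In compose syntax, those checks would look something like this sketch (commands and intervals from the list above; the exact test commands in the real files may differ):

```yaml
services:
  postgres:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U sexy"]
      interval: 10s
  redis:
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
  directus:
    healthcheck:
      test: ["CMD-SHELL", "wget -qO- http://localhost:8055/server/health || exit 1"]
      interval: 30s
```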
## Troubleshooting
### Local Development
**Problem:** Directus won't start
```bash
# Check logs
docker-compose logs directus
# Common issues:
# 1. Database not ready - wait for postgres to be healthy
# 2. Wrong secret - check DIRECTUS_SECRET is at least 32 chars
```
**Problem:** Can't connect to database
```bash
# Check if postgres is running
docker-compose ps postgres
# Verify health
docker-compose exec postgres pg_isready -U sexy
```
**Problem:** Extensions not loading
```bash
# Rebuild bundle
pnpm --filter @sexy.pivoine.art/bundle build
# Verify volume mount
docker-compose exec directus ls -la /directus/extensions/
# Restart Directus
docker-compose restart directus
```
### Production
**Problem:** Services not accessible via domain
```bash
# Check Traefik labels
docker inspect sexy_frontend | grep traefik
# Verify compose_network exists
docker network ls | grep compose_network
# Check Traefik is running
docker ps | grep traefik
```
**Problem:** Can't connect to external database
```bash
# Test connection from Directus container
docker-compose exec directus sh
apk add postgresql-client
psql -h $CORE_DB_HOST -U $DB_USER -d $SEXY_DB_NAME
```
**Problem:** Frontend can't reach Directus API
```bash
# Check Directus is accessible
curl https://sexy.pivoine.art/api/server/health
# Verify CORS settings
# PUBLIC_API_URL should match the public Directus URL
```
## Migration from Old Setup
If migrating from `docker-compose.production.yml`:
1. **Rename environment variables** according to `.env.production.example`
2. **Update command** to use both compose files
3. **Verify Traefik labels** match your setup
4. **Test** with `docker-compose config` to see merged configuration
```bash
# Validate configuration
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production config
# Deploy
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production up -d
```
## Best Practices
### Local Development
1. Use default credentials (they're fine for local)
2. Keep `EXTENSIONS_AUTO_RELOAD=true` for quick iteration
3. Run frontend via `pnpm dev` for hot reload
4. Restart Directus after bundle changes
### Production
1. Use strong passwords for database and admin
2. Set `EXTENSIONS_AUTO_RELOAD=false` for stability
3. Use GHCR images for frontend
4. Enable Gzip compression via Traefik
5. Monitor logs regularly
6. Keep backups of uploads and database
## See Also
- [DOCKER.md](DOCKER.md) - Docker image documentation
- [QUICKSTART.md](QUICKSTART.md) - Quick start guide
- [CLAUDE.md](CLAUDE.md) - Development guide

DOCKER.md — 378 lines removed

@@ -1,378 +0,0 @@
# Docker Deployment Guide
This guide covers building and deploying sexy.pivoine.art using Docker.
## Overview
The Dockerfile uses a multi-stage build process:
1. **Base stage**: Sets up Node.js and pnpm
2. **Builder stage**: Installs Rust, compiles WASM, builds all packages
3. **Runner stage**: Minimal production image with only runtime dependencies
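A skeleton of that three-stage structure might look like this (illustrative only; stage contents, paths, and commands are assumptions, not the actual Dockerfile):

```dockerfile
FROM node:20.19.1-slim AS base
RUN corepack enable

FROM base AS builder
# Rust toolchain for the WASM package
RUN apt-get update && apt-get install -y curl build-essential \
 && curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
WORKDIR /app
COPY . .
RUN pnpm install && pnpm -r build

FROM base AS runner
WORKDIR /home/node/app
COPY --from=builder /app/packages/frontend/build ./packages/frontend/build
EXPOSE 3000
CMD ["node", "packages/frontend/build"]
```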
## Prerequisites
- Docker 20.10+ with BuildKit support
- Docker Compose 2.0+ (optional, for orchestration)
## Building the Image
### Basic Build
```bash
docker build -t sexy.pivoine.art:latest .
```
### Build with Build Arguments
```bash
docker build \
  --build-arg NODE_ENV=production \
  -t sexy.pivoine.art:latest \
  .
```
### Multi-platform Build (for ARM64 and AMD64)
```bash
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t sexy.pivoine.art:latest \
  --push \
  .
```
## Running the Container
### Run with Environment Variables
```bash
docker run -d \
  --name sexy-pivoine-frontend \
  -p 3000:3000 \
  -e PUBLIC_API_URL=https://api.pivoine.art \
  -e PUBLIC_URL=https://sexy.pivoine.art \
  -e PUBLIC_UMAMI_ID=your-umami-id \
  -e LETTERSPACE_API_URL=https://api.letterspace.com/v1 \
  -e LETTERSPACE_API_KEY=your-api-key \
  -e LETTERSPACE_LIST_ID=your-list-id \
  sexy.pivoine.art:latest
```
### Run with Environment File
```bash
# Create .env.production from template
cp .env.production.example .env.production
# Edit .env.production with your values
nano .env.production
# Run container
docker run -d \
  --name sexy-pivoine-frontend \
  -p 3000:3000 \
  --env-file .env.production \
  sexy.pivoine.art:latest
```
## Docker Compose Deployment
### Using docker-compose.production.yml
```bash
# 1. Create environment file
cp .env.production.example .env.production
# 2. Edit environment variables
nano .env.production
# 3. Build and start
docker-compose -f docker-compose.production.yml up -d --build
# 4. View logs
docker-compose -f docker-compose.production.yml logs -f frontend
# 5. Stop services
docker-compose -f docker-compose.production.yml down
```
### Scale the Application
```bash
docker-compose -f docker-compose.production.yml up -d --scale frontend=3
```
## Environment Variables Reference
### Required Variables
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_API_URL` | Directus API backend URL | `https://api.pivoine.art` |
| `PUBLIC_URL` | Frontend application URL | `https://sexy.pivoine.art` |
### Optional Variables
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_UMAMI_ID` | Umami analytics tracking ID | `abc123def-456` |
| `LETTERSPACE_API_URL` | Letterspace API endpoint | `https://api.letterspace.com/v1` |
| `LETTERSPACE_API_KEY` | Letterspace authentication key | `sk_live_...` |
| `LETTERSPACE_LIST_ID` | Mailing list identifier | `list_abc123` |
| `PORT` | Application port (inside container) | `3000` |
| `HOST` | Host binding | `0.0.0.0` |
| `NODE_ENV` | Node environment | `production` |
## Health Checks
The container includes a built-in health check that pings the HTTP server every 30 seconds:
```bash
# Check container health
docker inspect --format='{{.State.Health.Status}}' sexy-pivoine-frontend
# View health check logs
docker inspect --format='{{json .State.Health}}' sexy-pivoine-frontend | jq
```
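The Dockerfile's health check plausibly has roughly this shape (the exact command and timings may differ from the real image):

```dockerfile
HEALTHCHECK --interval=30s --timeout=5s --start-period=10s --retries=3 \
  CMD wget -qO- http://localhost:3000/ || exit 1
```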
## Logs and Debugging
### View Container Logs
```bash
# Follow logs
docker logs -f sexy-pivoine-frontend
# Last 100 lines
docker logs --tail 100 sexy-pivoine-frontend
# With timestamps
docker logs -f --timestamps sexy-pivoine-frontend
```
### Execute Commands in Running Container
```bash
# Open shell
docker exec -it sexy-pivoine-frontend sh
# Check Node.js version
docker exec sexy-pivoine-frontend node --version
# Check environment variables
docker exec sexy-pivoine-frontend env
```
### Debug Build Issues
```bash
# Build with no cache
docker build --no-cache -t sexy.pivoine.art:latest .
# Build specific stage for debugging
docker build --target builder -t sexy.pivoine.art:builder .
# Inspect builder stage
docker run -it --rm sexy.pivoine.art:builder sh
```
## Production Best Practices
### 1. Use Specific Tags
```bash
# Tag with version
docker build -t sexy.pivoine.art:1.0.0 .
docker tag sexy.pivoine.art:1.0.0 sexy.pivoine.art:latest
```
### 2. Image Scanning
```bash
# Scan for vulnerabilities (requires Docker Scout or Trivy)
docker scout cves sexy.pivoine.art:latest
# Or with Trivy
trivy image sexy.pivoine.art:latest
```
### 3. Resource Limits
```bash
docker run -d \
  --name sexy-pivoine-frontend \
  -p 3000:3000 \
  --memory="2g" \
  --cpus="2" \
  --env-file .env.production \
  sexy.pivoine.art:latest
```
### 4. Restart Policies
```bash
docker run -d \
  --name sexy-pivoine-frontend \
  --restart=unless-stopped \
  -p 3000:3000 \
  --env-file .env.production \
  sexy.pivoine.art:latest
```
### 5. Use Docker Secrets (Docker Swarm)
```bash
# Create secrets
echo "your-api-key" | docker secret create letterspace_api_key -
# Deploy with secrets
docker service create \
  --name sexy-pivoine-frontend \
  --secret letterspace_api_key \
  -p 3000:3000 \
  sexy.pivoine.art:latest
```
## Optimization Tips
### Reduce Build Time
1. **Use BuildKit cache mounts** (already enabled in Dockerfile)
2. **Leverage layer caching** - structure Dockerfile to cache dependencies
3. **Use `.dockerignore`** - exclude unnecessary files from build context
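A `.dockerignore` for this layout might contain (hypothetical contents, based on the build artifacts this guide mentions):

```
.git
node_modules
**/node_modules
target
pkg
dist
.env
.env.*
```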
### Reduce Image Size
Current optimizations:
- Multi-stage build (builder artifacts not in final image)
- Production-only dependencies (`pnpm install --prod`)
- Minimal base image (`node:20.19.1-slim`)
- Only necessary build artifacts copied to runner
Image size breakdown:
```bash
docker images sexy.pivoine.art:latest
```
## CI/CD Integration
### GitHub Actions (Automated)
This repository includes automated GitHub Actions workflows for building, scanning, and managing Docker images.
**Pre-configured workflows:**
- **Build & Push** (`.github/workflows/docker-build-push.yml`)
- Automatically builds and pushes to `ghcr.io/valknarxxx/sexy`
- Triggers on push to main/develop, version tags, and PRs
- Multi-platform builds (AMD64 + ARM64)
- Smart tagging: latest, branch names, semver, commit SHAs
- **Security Scan** (`.github/workflows/docker-scan.yml`)
- Daily vulnerability scans with Trivy
- Reports to GitHub Security tab
- Scans on every release
- **Cleanup** (`.github/workflows/cleanup-images.yml`)
- Weekly cleanup of old untagged images
- Keeps last 10 versions
**Using pre-built images:**
```bash
# Pull latest from GitHub Container Registry
docker pull ghcr.io/valknarxxx/sexy:latest
# Pull specific version
docker pull ghcr.io/valknarxxx/sexy:v1.0.0
# Run the image
docker run -d -p 3000:3000 --env-file .env.production ghcr.io/valknarxxx/sexy:latest
```
**Triggering builds:**
```bash
# Push to main → builds 'latest' tag
git push origin main
# Create version tag → builds semver tags
git tag v1.0.0 && git push origin v1.0.0
# Pull request → builds but doesn't push
```
See `.github/workflows/README.md` for detailed workflow documentation.
## Troubleshooting
### Build Fails at Rust Installation
**Problem**: Rust installation fails or times out
**Solution**:
- Check internet connectivity
- Use a Rust mirror if in restricted network
- Increase build timeout
### WASM Build Fails
**Problem**: `wasm-bindgen-cli` version mismatch
**Solution**:
```dockerfile
# In Dockerfile, pin wasm-bindgen-cli version
RUN cargo install wasm-bindgen-cli --version 0.2.103
```
### Container Exits Immediately
**Problem**: Container starts then exits
**Solution**: Check logs and verify:
```bash
docker logs sexy-pivoine-frontend
# Verify build output exists
docker run -it --rm sexy.pivoine.art:latest ls -la packages/frontend/build
```
### Port Already in Use
**Problem**: Port 3000 already bound
**Solution**:
```bash
# Use different host port
docker run -d -p 8080:3000 sexy.pivoine.art:latest
```
## Maintenance
### Clean Up
```bash
# Remove stopped containers
docker container prune
# Remove unused images
docker image prune -a
# Remove build cache
docker builder prune
# Complete cleanup (use with caution)
docker system prune -a --volumes
```
### Update Base Image
Regularly update the base Node.js image:
```bash
# Pull latest Node 20 LTS
docker pull node:20.19.1-slim
# Rebuild
docker build --pull -t sexy.pivoine.art:latest .
```


@@ -124,9 +124,7 @@ ENV NODE_ENV=production \
 ENV PUBLIC_API_URL="" \
     PUBLIC_URL="" \
     PUBLIC_UMAMI_ID="" \
-    LETTERSPACE_API_URL="" \
-    LETTERSPACE_API_KEY="" \
-    LETTERSPACE_LIST_ID=""
+    PUBLIC_UMAMI_SCRIPT=""
# Expose application port
EXPOSE 3000


@@ -1,334 +0,0 @@
# Quick Start Guide
Get sexy.pivoine.art running in under 5 minutes using pre-built Docker images.
## Prerequisites
- Docker 20.10+
- Docker Compose 2.0+ (optional)
## Option 1: Docker Run (Fastest)
### Step 1: Pull the Image
```bash
docker pull ghcr.io/valknarxxx/sexy:latest
```
### Step 2: Create Environment File
```bash
cat > .env.production << EOF
PUBLIC_API_URL=https://api.your-domain.com
PUBLIC_URL=https://your-domain.com
PUBLIC_UMAMI_ID=
LETTERSPACE_API_URL=
LETTERSPACE_API_KEY=
LETTERSPACE_LIST_ID=
EOF
```
### Step 3: Run the Container
```bash
docker run -d \
  --name sexy-pivoine \
  -p 3000:3000 \
  --env-file .env.production \
  --restart unless-stopped \
  ghcr.io/valknarxxx/sexy:latest
```
### Step 4: Verify
```bash
# Check if running
docker ps | grep sexy-pivoine
# Check logs
docker logs -f sexy-pivoine
# Test the application
curl http://localhost:3000
```
Your application is now running at `http://localhost:3000` 🎉
## Option 2: Docker Compose (Recommended)
### Step 1: Download docker-compose.production.yml
```bash
curl -O https://raw.githubusercontent.com/valknarxxx/sexy/main/docker-compose.production.yml
```
Or if you have the repository:
```bash
cd /path/to/sexy.pivoine.art
```
### Step 2: Create Environment File
```bash
cp .env.production.example .env.production
nano .env.production # Edit with your values
```
### Step 3: Start Services
```bash
docker-compose -f docker-compose.production.yml up -d
```
### Step 4: Monitor
```bash
# View logs
docker-compose -f docker-compose.production.yml logs -f
# Check status
docker-compose -f docker-compose.production.yml ps
```
Your application is now running at `http://localhost:3000` 🎉
## Accessing Private Images
If the image is in a private registry:
### Step 1: Create GitHub Personal Access Token
1. Go to https://github.com/settings/tokens
2. Click "Generate new token (classic)"
3. Select scope: `read:packages`
4. Generate and copy the token
### Step 2: Login to GitHub Container Registry
```bash
echo YOUR_GITHUB_TOKEN | docker login ghcr.io -u YOUR_GITHUB_USERNAME --password-stdin
```
### Step 3: Pull and Run
Now you can pull private images:
```bash
docker pull ghcr.io/valknarxxx/sexy:latest
```
## Environment Variables
### Required
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_API_URL` | Directus API endpoint | `https://api.pivoine.art` |
| `PUBLIC_URL` | Frontend URL | `https://sexy.pivoine.art` |
### Optional
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_UMAMI_ID` | Analytics tracking ID | `abc-123-def` |
| `LETTERSPACE_API_URL` | Newsletter API | `https://api.letterspace.com/v1` |
| `LETTERSPACE_API_KEY` | Newsletter API key | `sk_live_...` |
| `LETTERSPACE_LIST_ID` | Mailing list ID | `list_abc123` |
## Common Commands
### View Logs
```bash
# Follow logs (Docker Run)
docker logs -f sexy-pivoine
# Follow logs (Docker Compose)
docker-compose -f docker-compose.production.yml logs -f
```
### Restart Container
```bash
# Docker Run
docker restart sexy-pivoine
# Docker Compose
docker-compose -f docker-compose.production.yml restart
```
### Stop Container
```bash
# Docker Run
docker stop sexy-pivoine
# Docker Compose
docker-compose -f docker-compose.production.yml down
```
### Update to Latest Version
```bash
# Docker Run
docker pull ghcr.io/valknarxxx/sexy:latest
docker stop sexy-pivoine
docker rm sexy-pivoine
# Then re-run the docker run command from Step 3
# Docker Compose
docker-compose -f docker-compose.production.yml pull
docker-compose -f docker-compose.production.yml up -d
```
### Shell Access
```bash
# Docker Run
docker exec -it sexy-pivoine sh
# Docker Compose
docker-compose -f docker-compose.production.yml exec frontend sh
```
## Available Image Tags
| Tag | Description | Use Case |
|-----|-------------|----------|
| `latest` | Latest stable build from main | Production |
| `v1.0.0` | Specific version | Production (pinned) |
| `develop` | Latest from develop branch | Staging |
| `main-abc123` | Specific commit | Testing |
**Best Practice:** Use version tags in production for predictable deployments.
## Production Deployment
### 1. Use Version Tags
```bash
# Instead of :latest
docker pull ghcr.io/valknarxxx/sexy:v1.0.0
```
### 2. Add Resource Limits
```bash
docker run -d \
  --name sexy-pivoine \
  -p 3000:3000 \
  --env-file .env.production \
  --memory="2g" \
  --cpus="2" \
  --restart unless-stopped \
  ghcr.io/valknarxxx/sexy:v1.0.0
```
### 3. Use a Reverse Proxy
Example with nginx:
```nginx
server {
    listen 80;
    server_name sexy.pivoine.art;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```
### 4. Enable HTTPS
Use Certbot or similar:
```bash
certbot --nginx -d sexy.pivoine.art
```
## Health Check
The container includes a built-in health check:
```bash
# Check container health
docker inspect --format='{{.State.Health.Status}}' sexy-pivoine
```
Possible statuses:
- `starting` - Container just started
- `healthy` - Application is responding
- `unhealthy` - Application is not responding
## Troubleshooting
### Container Exits Immediately
```bash
# Check logs
docker logs sexy-pivoine
# Common issues:
# - Missing environment variables
# - Port 3000 already in use
# - Invalid environment variable values
```
### Cannot Pull Image
```bash
# For private images, ensure you're logged in
docker login ghcr.io
# Check if image exists
docker pull ghcr.io/valknarxxx/sexy:latest
```
### Port Already in Use
```bash
# Use a different port
docker run -d -p 8080:3000 ghcr.io/valknarxxx/sexy:latest
# Or find what's using port 3000
lsof -i :3000
```
### Application Not Accessible
```bash
# Check if container is running
docker ps | grep sexy-pivoine
# Check logs
docker logs sexy-pivoine
# Verify port mapping
docker port sexy-pivoine
# Test from inside container
docker exec sexy-pivoine wget -O- http://localhost:3000
```
## Next Steps
- **Production setup:** See [DOCKER.md](DOCKER.md)
- **Development:** See [CLAUDE.md](CLAUDE.md)
- **CI/CD:** See [.github/workflows/README.md](.github/workflows/README.md)
## Support
- **Issues:** https://github.com/valknarxxx/sexy/issues
- **Discussions:** https://github.com/valknarxxx/sexy/discussions
- **Security:** Report privately via GitHub Security tab
## License
See [LICENSE](LICENSE) file for details.


@@ -38,7 +38,6 @@ Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom
- 📱 **Responsive Design** that looks sexy on any device
- 🌍 **Internationalization** — pleasure speaks all languages
- 📊 **Analytics Integration** (Umami) — know your admirers
-- 📧 **Newsletter Integration** (Letterspace) — stay connected
<div align="center">
@@ -212,10 +211,8 @@ pnpm --filter @sexy.pivoine.art/frontend start
### 💜 Optional (The Extras)
-- `PUBLIC_UMAMI_ID` — Analytics tracking
-- `LETTERSPACE_API_URL` — Newsletter endpoint
-- `LETTERSPACE_API_KEY` — Newsletter key
-- `LETTERSPACE_LIST_ID` — Mailing list
+- `PUBLIC_UMAMI_ID` — Analytics tracking ID
+- `PUBLIC_UMAMI_SCRIPT` — Umami script URL
See [.env.production.example](.env.production.example) for the full configuration.


@@ -1,265 +0,0 @@
# 🔄 Rebuild Guide - When You Need to Rebuild the Image
## Why Rebuild?
SvelteKit's `PUBLIC_*` environment variables are **baked into the JavaScript** at build time. You need to rebuild when:
1. ✅ Changing `PUBLIC_API_URL`
2. ✅ Changing `PUBLIC_URL`
3. ✅ Changing `PUBLIC_UMAMI_ID`
4. ✅ Changing any `LETTERSPACE_*` variables
5. ❌ NOT needed for Directus env vars (those are runtime)
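Why the rebuild is needed can be illustrated with this toy sketch (not the real build — it just mimics how a bundler inlines `PUBLIC_*` values into the output, so changing the variable later has no effect):

```shell
# The "build" inlines the current value into the emitted JavaScript:
PUBLIC_API_URL="https://sexy.pivoine.art/api"
printf 'const apiUrl = "%s";\n' "$PUBLIC_API_URL" > bundle.js

# Changing the env var afterwards does not touch the built file:
PUBLIC_API_URL="https://other.example"
grep -o 'sexy.pivoine.art/api' bundle.js
```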
## Quick Rebuild Process
### 1. Update Frontend Environment Variables
Edit the frontend `.env` file:
```bash
nano packages/frontend/.env
```
Set your production values:
```bash
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
PUBLIC_UMAMI_ID=your-umami-id
LETTERSPACE_API_URL=https://api.letterspace.com/v1
LETTERSPACE_API_KEY=your-key
LETTERSPACE_LIST_ID=your-list-id
```
### 2. Rebuild the Image
```bash
# From the project root
docker build -t ghcr.io/valknarxxx/sexy:latest -t sexy.pivoine.art:latest .
```
**Expected Time:** 30-45 minutes (first build), 10-15 minutes (cached rebuild)
### 3. Restart Services
```bash
# If using docker-compose
cd /home/valknar/Projects/docker-compose/sexy
docker compose down
docker compose up -d
# Or directly
docker stop sexy_frontend
docker rm sexy_frontend
docker compose up -d frontend
```
## Monitoring the Build
### Check Build Progress
```bash
# Watch build output
docker build -t ghcr.io/valknarxxx/sexy:latest .
# Build stages:
# 1. Base (~30s) - Node.js setup
# 2. Builder (~25-40min) - Rust + WASM + packages
# - Rust installation: ~2-3 min
# - wasm-bindgen-cli: ~10-15 min
# - WASM build: ~5-10 min
# - Package builds: ~5-10 min
# 3. Runner (~2min) - Final image assembly
```
### Verify Environment Variables in Built Image
```bash
# Check what PUBLIC_API_URL is baked in
docker run --rm ghcr.io/valknarxxx/sexy:latest sh -c \
  "grep -r 'PUBLIC_API_URL' /home/node/app/packages/frontend/build/ | head -3"
# Should show: https://sexy.pivoine.art/api
```
## Push to GitHub Container Registry
After successful build:
```bash
# Login to GHCR (first time only)
echo $GITHUB_TOKEN | docker login ghcr.io -u valknarxxx --password-stdin
# Push the image
docker push ghcr.io/valknarxxx/sexy:latest
```
## Alternative: Build Arguments (Future Enhancement)
To avoid rebuilding for every env change, consider adding build arguments:
```dockerfile
# In Dockerfile, before building frontend:
ARG PUBLIC_API_URL=https://sexy.pivoine.art/api
ARG PUBLIC_URL=https://sexy.pivoine.art
ARG PUBLIC_UMAMI_ID=
# Create .env.production dynamically
RUN echo "PUBLIC_API_URL=${PUBLIC_API_URL}" > packages/frontend/.env.production && \
    echo "PUBLIC_URL=${PUBLIC_URL}" >> packages/frontend/.env.production && \
    echo "PUBLIC_UMAMI_ID=${PUBLIC_UMAMI_ID}" >> packages/frontend/.env.production
```
Then build with:
```bash
docker build \
  --build-arg PUBLIC_API_URL=https://sexy.pivoine.art/api \
  --build-arg PUBLIC_URL=https://sexy.pivoine.art \
  -t ghcr.io/valknarxxx/sexy:latest .
```
## Troubleshooting
### Build Fails at Rust Installation
```bash
# Check network connectivity
ping -c 3 sh.rustup.rs
# Build with verbose output
docker build --progress=plain -t ghcr.io/valknarxxx/sexy:latest .
```
### Build Fails at WASM
```bash
# Check if wasm-bindgen-cli matches package.json version
docker run --rm rust:latest cargo install wasm-bindgen-cli --version 0.2.103
```
### Frontend Still Shows Wrong URL
```bash
# Verify .env file is correct
cat packages/frontend/.env
# Check if old image is cached
docker images | grep sexy
docker rmi ghcr.io/valknarxxx/sexy:old-tag
# Force rebuild without cache
docker build --no-cache -t ghcr.io/valknarxxx/sexy:latest .
```
### Container Starts But Can't Connect to API
1. Check Traefik routing:
```bash
docker logs traefik | grep sexy
```
2. Check if Directus is accessible:
```bash
curl -I https://sexy.pivoine.art/api/server/health
```
3. Check frontend logs:
```bash
docker logs sexy_frontend
```
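The three checks above can be rolled into one probe helper that reports each dependency as up or down without aborting on the first failure. A sketch (the `probe` name is made up; the container names and URL are the ones from this guide):
```shell
# probe LABEL CMD... — runs CMD, prints "LABEL: up" or "LABEL: down",
# and never aborts the surrounding script on failure.
probe() {
  label="$1"; shift
  if "$@" >/dev/null 2>&1; then
    echo "$label: up"
  else
    echo "$label: down"
  fi
}

# Usage against the services in this guide (requires docker/curl on the host):
#   probe "traefik-routing" sh -c 'docker logs traefik 2>&1 | grep -q sexy'
#   probe "directus-health" curl -fsI https://sexy.pivoine.art/api/server/health
#   probe "frontend-logs"   docker logs sexy_frontend
probe "demo" true
```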
## Development vs Production
### Development (Local)
- Use `pnpm dev` for hot reload
- No rebuild needed for code changes
- Env vars from `.env` or shell
### Production (Docker)
- Rebuild required for PUBLIC_* changes
- Changes baked into JavaScript
- Env vars from `packages/frontend/.env`
## Optimization Tips
### Speed Up Rebuilds
1. **Use BuildKit cache:**
```bash
export DOCKER_BUILDKIT=1
docker build --build-arg BUILDKIT_INLINE_CACHE=1 -t ghcr.io/valknarxxx/sexy:latest .
```
2. **Multi-stage caching:**
- Dockerfile already optimized with multi-stage build
- Dependencies cached separately from code
3. **Parallel builds:**
```bash
# BuildKit (enabled above) runs independent build stages in parallel;
# cargo and pnpm already use all available cores inside the builder stage
DOCKER_BUILDKIT=1 docker build -t ghcr.io/valknarxxx/sexy:latest .
```
### Reduce Image Size
Current optimizations:
- ✅ Multi-stage build
- ✅ Production dependencies only
- ✅ Minimal base image
- ✅ No dev tools in final image
Expected sizes:
- Base: ~100MB
- Builder: ~2-3GB (not shipped)
- Runner: ~300-500MB (final)
## Automation
### GitHub Actions (Already Set Up)
The `.github/workflows/docker-build-push.yml` automatically:
1. Builds on push to main
2. Creates version tags
3. Pushes to GHCR
4. Caches layers for faster builds
**Trigger a rebuild:**
```bash
git tag v1.0.1
git push origin v1.0.1
```
### Local Build Script
Use the provided `build.sh`:
```bash
./build.sh -t v1.0.0 -p
```
## When NOT to Rebuild
You DON'T need to rebuild for:
- ❌ Directus configuration changes
- ❌ Database credentials
- ❌ Redis settings
- ❌ SMTP settings
- ❌ Session cookie settings
- ❌ Traefik labels
These are runtime environment variables and can be changed in docker-compose.
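The build-time vs runtime distinction behind this list can be demonstrated in a few lines of shell: a "build" step expands a variable once into the artifact, while a runtime read picks up whatever the environment holds now (purely illustrative):
```shell
# Build time: the value is expanded once and written into the artifact.
PUBLIC_API_URL="https://sexy.pivoine.art/api"
artifact="$(mktemp)"
echo "echo baked:${PUBLIC_API_URL}" > "$artifact"   # value frozen here

# Changing the environment afterwards does not affect the artifact...
PUBLIC_API_URL="https://other.example/api"
sh "$artifact"                      # still prints the baked URL

# ...whereas a runtime read picks up the current environment.
echo "runtime:${PUBLIC_API_URL}"
```
This is exactly why `PUBLIC_*` values (frozen into the JavaScript bundle) need a rebuild, while the runtime variables above only need a restart.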
## Summary
| Change | Rebuild Needed | How to Apply |
|--------|----------------|--------------|
| `PUBLIC_API_URL` | ✅ Yes | Rebuild image |
| `PUBLIC_URL` | ✅ Yes | Rebuild image |
| `PUBLIC_UMAMI_ID` | ✅ Yes | Rebuild image |
| `LETTERSPACE_*` | ✅ Yes | Rebuild image |
| `SEXY_DIRECTUS_*` | ❌ No | Restart container |
| `DB_*` | ❌ No | Restart container |
| `EMAIL_*` | ❌ No | Restart container |
| Traefik labels | ❌ No | Restart container |
---
**Remember:** The key difference is **build-time** (compiled into JS) vs **runtime** (read from environment).

---
**build.sh** — deleted by the cleanup commit (130 lines):
#!/bin/bash
# Build script for sexy.pivoine.art Docker image
set -e # Exit on error
# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
# Default values
IMAGE_NAME="sexy.pivoine.art"
TAG="latest"
PUSH=false
PLATFORM=""
# Parse arguments
while [[ $# -gt 0 ]]; do
case $1 in
-t|--tag)
TAG="$2"
shift 2
;;
-n|--name)
IMAGE_NAME="$2"
shift 2
;;
-p|--push)
PUSH=true
shift
;;
--platform)
PLATFORM="$2"
shift 2
;;
-h|--help)
echo "Usage: $0 [OPTIONS]"
echo ""
echo "Options:"
echo " -t, --tag TAG Set image tag (default: latest)"
echo " -n, --name NAME Set image name (default: sexy.pivoine.art)"
echo " -p, --push Push image after build"
echo " --platform PLATFORM Build for specific platform (e.g., linux/amd64,linux/arm64)"
echo " -h, --help Show this help message"
echo ""
echo "Examples:"
echo " $0 # Build with defaults"
echo " $0 -t v1.0.0 # Build with version tag"
echo " $0 --platform linux/amd64,linux/arm64 -p # Multi-platform build and push"
exit 0
;;
*)
echo -e "${RED}Unknown option: $1${NC}"
exit 1
;;
esac
done
FULL_IMAGE="${IMAGE_NAME}:${TAG}"
echo -e "${GREEN}=== Building Docker Image ===${NC}"
echo "Image: ${FULL_IMAGE}"
echo "Platform: ${PLATFORM:-default}"
echo ""
# Check if Docker is running
if ! docker info > /dev/null 2>&1; then
echo -e "${RED}Error: Docker is not running${NC}"
exit 1
fi
# Build command
BUILD_CMD="docker build"
if [ -n "$PLATFORM" ]; then
# Multi-platform build requires buildx
echo -e "${YELLOW}Using buildx for multi-platform build${NC}"
BUILD_CMD="docker buildx build --platform ${PLATFORM}"
if [ "$PUSH" = true ]; then
BUILD_CMD="${BUILD_CMD} --push"
fi
else
# Regular build
if [ "$PUSH" = true ]; then
echo -e "${YELLOW}Note: --push only works with multi-platform builds. Use 'docker push' after build.${NC}"
fi
fi
# Execute build
echo -e "${GREEN}Building...${NC}"
$BUILD_CMD -t "${FULL_IMAGE}" .
if [ $? -eq 0 ]; then
echo -e "${GREEN}✓ Build successful!${NC}"
echo "Image: ${FULL_IMAGE}"
# Show image size
if [ -z "$PLATFORM" ]; then
SIZE=$(docker images "${FULL_IMAGE}" --format "{{.Size}}")
echo "Size: ${SIZE}"
fi
# Push if requested and not multi-platform
if [ "$PUSH" = true ] && [ -z "$PLATFORM" ]; then
echo -e "${GREEN}Pushing image...${NC}"
docker push "${FULL_IMAGE}"
if [ $? -eq 0 ]; then
echo -e "${GREEN}✓ Push successful!${NC}"
else
echo -e "${RED}✗ Push failed${NC}"
exit 1
fi
fi
echo ""
echo -e "${GREEN}Next steps:${NC}"
echo "1. Run locally:"
echo " docker run -d -p 3000:3000 --env-file .env.production ${FULL_IMAGE}"
echo ""
echo "2. Run with docker-compose:"
echo " docker-compose -f docker-compose.production.yml up -d"
echo ""
echo "3. View logs:"
echo " docker logs -f <container-name>"
else
echo -e "${RED}✗ Build failed${NC}"
exit 1
fi

**compose.production.yml** — deleted by the cleanup commit (130 lines):
include:
- compose.yml
# Production compose file - extends base compose.yml
# Usage: docker-compose -f compose.production.yml up -d
networks:
compose_network:
external: true
name: compose_network
services:
# Disable local postgres for production (use external DB)
postgres:
deploy:
replicas: 0
# Disable local redis for production (use external Redis)
redis:
deploy:
replicas: 0
# Override Directus for production
directus:
networks:
- compose_network
ports: [] # Remove exposed ports, use Traefik instead
# Override volumes for production paths
volumes:
- ${SEXY_DIRECTUS_UPLOADS:-./uploads}:/directus/uploads
- ${SEXY_DIRECTUS_BUNDLE:-./packages/bundle/dist}:/directus/extensions/sexy.pivoine.art
# Override environment for production settings
environment:
# Database (external)
DB_HOST: ${CORE_DB_HOST}
DB_PORT: ${CORE_DB_PORT:-5432}
DB_DATABASE: ${SEXY_DB_NAME}
DB_USER: ${DB_USER}
DB_PASSWORD: ${DB_PASSWORD}
# General
SECRET: ${SEXY_DIRECTUS_SECRET}
ADMIN_EMAIL: ${ADMIN_EMAIL}
ADMIN_PASSWORD: ${ADMIN_PASSWORD}
PUBLIC_URL: ${SEXY_PUBLIC_URL}
# Cache (external Redis)
REDIS: redis://${CORE_REDIS_HOST}:${CORE_REDIS_PORT:-6379}
# CORS
CORS_ORIGIN: ${SEXY_CORS_ORIGIN}
# Security (production settings)
SESSION_COOKIE_SECURE: ${SEXY_SESSION_COOKIE_SECURE:-true}
SESSION_COOKIE_SAME_SITE: ${SEXY_SESSION_COOKIE_SAME_SITE:-strict}
SESSION_COOKIE_DOMAIN: ${SEXY_SESSION_COOKIE_DOMAIN}
# Extensions
EXTENSIONS_AUTO_RELOAD: ${SEXY_EXTENSIONS_AUTO_RELOAD:-false}
# Email (production SMTP)
EMAIL_TRANSPORT: ${EMAIL_TRANSPORT:-smtp}
EMAIL_FROM: ${EMAIL_FROM}
EMAIL_SMTP_HOST: ${EMAIL_SMTP_HOST}
EMAIL_SMTP_PORT: ${EMAIL_SMTP_PORT:-587}
EMAIL_SMTP_USER: ${EMAIL_SMTP_USER}
EMAIL_SMTP_PASSWORD: ${EMAIL_SMTP_PASSWORD}
# User URLs
USER_REGISTER_URL_ALLOW_LIST: ${SEXY_USER_REGISTER_URL_ALLOW_LIST}
PASSWORD_RESET_URL_ALLOW_LIST: ${SEXY_PASSWORD_RESET_URL_ALLOW_LIST}
# Remove local dependencies
depends_on: []
labels:
# Traefik labels for reverse proxy
- 'traefik.enable=${SEXY_TRAEFIK_ENABLED:-true}'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-redirect-web-secure'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.rule=Host(`${SEXY_TRAEFIK_HOST}`) && PathPrefix(`/api`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.entrypoints=web'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.rule=Host(`${SEXY_TRAEFIK_HOST}`) && PathPrefix(`/api`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure-compress.compress=true'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-strip.stripprefix.prefixes=/api'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-strip,${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure-compress'
- 'traefik.http.services.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.loadbalancer.server.port=8055'
- 'traefik.docker.network=compose_network'
# Override Frontend for production
frontend:
networks:
- compose_network
ports: [] # Remove exposed ports, use Traefik instead
# Override environment for production
environment:
NODE_ENV: production
PUBLIC_API_URL: ${SEXY_FRONTEND_PUBLIC_API_URL}
PUBLIC_URL: ${SEXY_FRONTEND_PUBLIC_URL}
PUBLIC_UMAMI_ID: ${SEXY_FRONTEND_PUBLIC_UMAMI_ID:-}
LETTERSPACE_API_URL: ${SEXY_FRONTEND_LETTERSPACE_API_URL:-}
LETTERSPACE_API_KEY: ${SEXY_FRONTEND_LETTERSPACE_API_KEY:-}
LETTERSPACE_LIST_ID: ${SEXY_FRONTEND_LETTERSPACE_LIST_ID:-}
# Override volume for production path
volumes:
- ${SEXY_FRONTEND_PATH:-/var/www/sexy.pivoine.art}:/home/node/app
# Remove local dependency
depends_on: []
labels:
# Traefik labels for reverse proxy
- 'traefik.enable=${SEXY_TRAEFIK_ENABLED:-true}'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-redirect-web-secure'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.rule=Host(`${SEXY_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.entrypoints=web'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.rule=Host(`${SEXY_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure-compress.compress=true'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure-compress'
- 'traefik.http.services.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.loadbalancer.server.port=3000'
- 'traefik.docker.network=compose_network'

**compose.yml** — changed (112 → 71 lines; env-var indirection removed). In the hunks below, removed lines appear directly above their replacements:
name: sexy
services:
# PostgreSQL Database (local only)
postgres:
image: postgres:16-alpine
container_name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_postgres
container_name: sexy_postgres
restart: unless-stopped
networks:
- sexy-network
volumes:
- postgres-data:/var/lib/postgresql/data
- postgres_data:/var/lib/postgresql/data
environment:
POSTGRES_DB: ${DB_DATABASE:-sexy}
POSTGRES_USER: ${DB_USER:-sexy}
POSTGRES_PASSWORD: ${DB_PASSWORD:-sexy}
POSTGRES_DB: sexy
POSTGRES_USER: sexy
POSTGRES_PASSWORD: sexy
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${DB_USER:-sexy}"]
test: ["CMD-SHELL", "pg_isready -U sexy"]
interval: 10s
timeout: 5s
retries: 5
# Redis Cache (local only)
redis:
image: redis:7-alpine
container_name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_redis
container_name: sexy_redis
restart: unless-stopped
networks:
- sexy-network
volumes:
- redis-data:/data
- redis_data:/data
command: redis-server --appendonly yes
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 5
# Directus CMS
directus:
image: ${SEXY_DIRECTUS_IMAGE:-directus/directus:11}
container_name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_api
image: directus/directus:11
container_name: sexy_directus
restart: unless-stopped
networks:
- sexy-network
ports:
- "8055:8055"
volumes:
- directus-uploads:/directus/uploads
- ${SEXY_DIRECTUS_BUNDLE:-./packages/bundle}:/directus/extensions/sexy.pivoine.art
- directus_uploads:/directus/uploads
- ./packages/bundle:/directus/extensions/sexy.pivoine.art
environment:
# Database
DB_CLIENT: pg
DB_HOST: ${CORE_DB_HOST:-postgres}
DB_PORT: ${CORE_DB_PORT:-5432}
DB_DATABASE: ${SEXY_DB_NAME:-sexy}
DB_USER: ${DB_USER:-sexy}
DB_PASSWORD: ${DB_PASSWORD:-sexy}
# General
SECRET: ${SEXY_DIRECTUS_SECRET:-replace-with-random-secret-min-32-chars}
ADMIN_EMAIL: ${ADMIN_EMAIL:-admin@sexy.pivoine.art}
ADMIN_PASSWORD: ${ADMIN_PASSWORD:-admin}
PUBLIC_URL: ${SEXY_PUBLIC_URL:-http://localhost:8055}
# Cache
CACHE_ENABLED: ${SEXY_CACHE_ENABLED:-true}
CACHE_AUTO_PURGE: ${SEXY_CACHE_AUTO_PURGE:-true}
DB_HOST: sexy_postgres
DB_PORT: 5432
DB_DATABASE: sexy
DB_USER: sexy
DB_PASSWORD: sexy
ADMIN_EMAIL: admin@sexy
ADMIN_PASSWORD: admin
PUBLIC_URL: http://localhost:3000/api
CACHE_ENABLED: true
CACHE_AUTO_PURGE: true
CACHE_STORE: redis
REDIS: redis://${CORE_REDIS_HOST:-redis}:${CORE_REDIS_PORT:-6379}
# CORS
CORS_ENABLED: ${SEXY_CORS_ENABLED:-true}
CORS_ORIGIN: ${SEXY_CORS_ORIGIN:-http://localhost:3000}
# Security
SESSION_COOKIE_SECURE: ${SEXY_SESSION_COOKIE_SECURE:-false}
SESSION_COOKIE_SAME_SITE: ${SEXY_SESSION_COOKIE_SAME_SITE:-lax}
SESSION_COOKIE_DOMAIN: ${SEXY_SESSION_COOKIE_DOMAIN:-localhost}
# Extensions
EXTENSIONS_PATH: ${SEXY_EXTENSIONS_PATH:-/directus/extensions}
EXTENSIONS_AUTO_RELOAD: ${SEXY_EXTENSIONS_AUTO_RELOAD:-true}
# WebSockets
WEBSOCKETS_ENABLED: ${SEXY_WEBSOCKETS_ENABLED:-true}
# Email (optional for local dev)
EMAIL_TRANSPORT: ${EMAIL_TRANSPORT:-sendmail}
EMAIL_FROM: ${EMAIL_FROM:-noreply@sexy.pivoine.art}
EMAIL_SMTP_HOST: ${EMAIL_SMTP_HOST:-}
EMAIL_SMTP_PORT: ${EMAIL_SMTP_PORT:-587}
EMAIL_SMTP_USER: ${EMAIL_SMTP_USER:-}
EMAIL_SMTP_PASSWORD: ${EMAIL_SMTP_PASSWORD:-}
# User Registration & Password Reset URLs
USER_REGISTER_URL_ALLOW_LIST: ${SEXY_USER_REGISTER_URL_ALLOW_LIST:-http://localhost:3000}
PASSWORD_RESET_URL_ALLOW_LIST: ${SEXY_PASSWORD_RESET_URL_ALLOW_LIST:-http://localhost:3000}
# Content Security Policy
CONTENT_SECURITY_POLICY_DIRECTIVES__FRAME_SRC: ${SEXY_CONTENT_SECURITY_POLICY_DIRECTIVES__FRAME_SRC:-}
# Timezone
TZ: ${TIMEZONE:-Europe/Amsterdam}
REDIS: redis://sexy_redis:6379
CORS_ENABLED: true
CORS_ORIGIN: http://localhost:3000
SESSION_COOKIE_SECURE: false
SESSION_COOKIE_SAME_SITE: lax
SESSION_COOKIE_DOMAIN: localhost
EXTENSIONS_PATH: /directus/extensions
EXTENSIONS_AUTO_RELOAD: true
WEBSOCKETS_ENABLED: true
USER_REGISTER_URL_ALLOW_LIST: http://localhost:3000
PASSWORD_RESET_URL_ALLOW_LIST: http://localhost:3000
TZ: Europe/Amsterdam
depends_on:
postgres:
condition: service_healthy
redis:
condition: service_healthy
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8055/server/health"]
interval: 30s
# ... unchanged lines omitted ...
retries: 3
start_period: 40s
# Frontend (local development - optional, usually run via pnpm dev)
frontend:
image: ${SEXY_FRONTEND_IMAGE:-ghcr.io/valknarxxx/sexy:latest}
container_name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_frontend
restart: unless-stopped
user: node
working_dir: /home/node/app/packages/frontend
networks:
- sexy-network
ports:
- "3000:3000"
environment:
# Node
NODE_ENV: ${NODE_ENV:-development}
PORT: 3000
HOST: 0.0.0.0
# Public environment variables
PUBLIC_API_URL: ${SEXY_FRONTEND_PUBLIC_API_URL:-http://localhost:8055}
PUBLIC_URL: ${SEXY_FRONTEND_PUBLIC_URL:-http://localhost:3000}
PUBLIC_UMAMI_ID: ${SEXY_FRONTEND_PUBLIC_UMAMI_ID:-}
# Letterspace newsletter integration
LETTERSPACE_API_URL: ${SEXY_FRONTEND_LETTERSPACE_API_URL:-}
LETTERSPACE_API_KEY: ${SEXY_FRONTEND_LETTERSPACE_API_KEY:-}
LETTERSPACE_LIST_ID: ${SEXY_FRONTEND_LETTERSPACE_LIST_ID:-}
# Timezone
TZ: ${TIMEZONE:-Europe/Amsterdam}
volumes:
- ${SEXY_FRONTEND_PATH:-./}:/home/node/app
command: ["node", "build/index.js"]
depends_on:
- directus
healthcheck:
test: ["CMD", "node", "-e", "require('http').get('http://localhost:3000/', (r) => {process.exit(r.statusCode === 200 ? 0 : 1)})"]
interval: 30s
timeout: 3s
retries: 3
start_period: 40s
# Uncomment to run frontend in development mode with live reload
# build:
# context: .
# dockerfile: Dockerfile
# volumes:
# - ./packages/frontend:/home/node/app/packages/frontend
# - /home/node/app/packages/frontend/node_modules
# environment:
# NODE_ENV: development
networks:
sexy-network:
driver: bridge
name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_network
volumes:
directus-uploads:
directus_uploads:
driver: local
postgres-data:
postgres_data:
driver: local
redis-data:
redis_data:
driver: local

**directus.yml** — schema snapshot diff (the `sexy_recordings` collection, its fields, and relations were removed):
versioning: false
schema:
name: sexy_videos_directus_users
- collection: sexy_recordings
meta:
accountability: all
archive_app_filter: true
archive_field: status
archive_value: archived
collapse: open
collection: sexy_recordings
color: null
display_template: null
group: null
hidden: false
icon: fiber_manual_record
item_duplication_fields: null
note: null
preview_url: null
singleton: false
sort: null
sort_field: null
translations:
- language: en-US
plural: Recordings
singular: Recording
translation: Sexy Recordings
unarchive_value: draft
versioning: false
schema:
name: sexy_recordings
fields:
- collection: directus_users
field: website
# ... unchanged fields omitted ...
max_length: 255
numeric_precision: null
numeric_scale: null
is_nullable: true
is_nullable: false
is_unique: true
is_indexed: true
is_primary_key: false
# ... unchanged fields omitted ...
has_auto_increment: false
foreign_key_table: directus_users
foreign_key_column: id
- collection: sexy_recordings
field: id
type: uuid
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: id
group: null
hidden: true
interface: input
note: null
options: null
readonly: true
required: false
sort: 1
special:
- uuid
translations: null
validation: null
validation_message: null
width: full
schema:
name: id
table: sexy_recordings
data_type: uuid
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: true
is_indexed: false
is_primary_key: true
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: status
type: string
meta:
collection: sexy_recordings
conditions: null
display: labels
display_options:
choices:
- background: var(--theme--primary-background)
color: var(--theme--primary)
foreground: var(--theme--primary)
text: $t:published
value: published
- background: var(--theme--background-normal)
color: var(--theme--foreground)
foreground: var(--theme--foreground)
text: $t:draft
value: draft
- background: var(--theme--warning-background)
color: var(--theme--warning)
foreground: var(--theme--warning)
text: $t:archived
value: archived
showAsDot: true
field: status
group: null
hidden: false
interface: select-dropdown
note: null
options:
choices:
- color: var(--theme--primary)
text: $t:published
value: published
- color: var(--theme--foreground)
text: $t:draft
value: draft
- color: var(--theme--warning)
text: $t:archived
value: archived
readonly: false
required: false
sort: 2
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: status
table: sexy_recordings
data_type: character varying
default_value: draft
max_length: 255
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: user_created
type: uuid
meta:
collection: sexy_recordings
conditions: null
display: user
display_options: null
field: user_created
group: null
hidden: true
interface: select-dropdown-m2o
note: null
options:
template: '{{avatar}} {{first_name}} {{last_name}}'
readonly: true
required: false
sort: 3
special:
- user-created
translations: null
validation: null
validation_message: null
width: half
schema:
name: user_created
table: sexy_recordings
data_type: uuid
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: directus_users
foreign_key_column: id
- collection: sexy_recordings
field: date_created
type: timestamp
meta:
collection: sexy_recordings
conditions: null
display: datetime
display_options:
relative: true
field: date_created
group: null
hidden: true
interface: datetime
note: null
options: null
readonly: true
required: false
sort: 4
special:
- date-created
translations: null
validation: null
validation_message: null
width: half
schema:
name: date_created
table: sexy_recordings
data_type: timestamp with time zone
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: date_updated
type: timestamp
meta:
collection: sexy_recordings
conditions: null
display: datetime
display_options:
relative: true
field: date_updated
group: null
hidden: true
interface: datetime
note: null
options: null
readonly: true
required: false
sort: 5
special:
- date-updated
translations: null
validation: null
validation_message: null
width: half
schema:
name: date_updated
table: sexy_recordings
data_type: timestamp with time zone
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: title
type: string
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: title
group: null
hidden: false
interface: input
note: null
options: null
readonly: false
required: true
sort: 6
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: title
table: sexy_recordings
data_type: character varying
default_value: null
max_length: 255
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: description
type: text
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: description
group: null
hidden: false
interface: input-multiline
note: null
options:
trim: true
readonly: false
required: false
sort: 7
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: description
table: sexy_recordings
data_type: text
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: slug
type: string
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: slug
group: null
hidden: false
interface: input
note: null
options:
slug: true
trim: true
readonly: false
required: true
sort: 8
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: slug
table: sexy_recordings
data_type: character varying
default_value: null
max_length: 255
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: true
is_indexed: true
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: duration
type: float
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: duration
group: null
hidden: false
interface: input
note: Duration in milliseconds
options: null
readonly: false
required: true
sort: 9
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: duration
table: sexy_recordings
data_type: double precision
default_value: null
max_length: null
numeric_precision: 53
numeric_scale: null
is_nullable: false
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: events
type: json
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: events
group: null
hidden: false
interface: input-code
note: Array of recorded events with timestamps
options:
language: json
readonly: false
required: true
sort: 10
special:
- cast-json
translations: null
validation: null
validation_message: null
width: full
schema:
name: events
table: sexy_recordings
data_type: json
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: device_info
type: json
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: device_info
group: null
hidden: false
interface: input-code
note: Array of device metadata
options:
language: json
readonly: false
required: true
sort: 11
special:
- cast-json
translations: null
validation: null
validation_message: null
width: full
schema:
name: device_info
table: sexy_recordings
data_type: json
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: tags
type: json
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: tags
group: null
hidden: false
interface: tags
note: null
options: null
readonly: false
required: false
sort: 12
special:
- cast-json
translations: null
validation: null
validation_message: null
width: full
schema:
name: tags
table: sexy_recordings
data_type: json
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: linked_video
type: uuid
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: linked_video
group: null
hidden: false
interface: select-dropdown-m2o
note: null
options:
enableLink: true
readonly: false
required: false
sort: 13
special:
- m2o
translations: null
validation: null
validation_message: null
width: full
schema:
name: linked_video
table: sexy_recordings
data_type: uuid
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: sexy_videos
foreign_key_column: id
- collection: sexy_recordings
field: featured
type: boolean
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: featured
group: null
hidden: false
interface: boolean
note: null
options:
label: Featured
readonly: false
required: false
sort: 14
special:
- cast-boolean
translations: null
validation: null
validation_message: null
width: full
schema:
name: featured
table: sexy_recordings
data_type: boolean
default_value: false
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: public
type: boolean
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: public
group: null
hidden: false
interface: boolean
note: null
options:
label: Public
readonly: false
required: false
sort: 15
special:
- cast-boolean
translations: null
validation: null
validation_message: null
width: full
schema:
name: public
table: sexy_recordings
data_type: boolean
default_value: false
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
relations:
- collection: directus_users
field: banner
# ... unchanged relations omitted ...
constraint_name: sexy_videos_directus_users_sexy_videos_id_foreign
on_update: NO ACTION
on_delete: SET NULL
- collection: sexy_recordings
field: user_created
related_collection: directus_users
meta:
junction_field: null
many_collection: sexy_recordings
many_field: user_created
one_allowed_collections: null
one_collection: directus_users
one_collection_field: null
one_deselect_action: nullify
one_field: null
sort_field: null
schema:
table: sexy_recordings
column: user_created
foreign_key_table: directus_users
foreign_key_column: id
constraint_name: sexy_recordings_user_created_foreign
on_update: NO ACTION
on_delete: NO ACTION
- collection: sexy_recordings
field: linked_video
related_collection: sexy_videos
meta:
junction_field: null
many_collection: sexy_recordings
many_field: linked_video
one_allowed_collections: null
one_collection: sexy_videos
one_collection_field: null
one_deselect_action: nullify
one_field: null
sort_field: null
schema:
table: sexy_recordings
column: linked_video
foreign_key_table: sexy_videos
foreign_key_column: id
constraint_name: sexy_recordings_linked_video_foreign
on_update: NO ACTION
on_delete: SET NULL

**gamification-schema.sql** — deleted by the cleanup commit (177 lines):
-- Gamification System Schema for Sexy Recordings Platform
-- Created: 2025-10-28
-- Description: Recording-focused gamification with time-weighted scoring
-- ====================
-- Table: sexy_recording_plays
-- ====================
-- Tracks when users play recordings (similar to video plays)
CREATE TABLE IF NOT EXISTS sexy_recording_plays (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
recording_id UUID NOT NULL REFERENCES sexy_recordings(id) ON DELETE CASCADE,
duration_played INTEGER, -- Duration played in milliseconds
completed BOOLEAN DEFAULT FALSE, -- True if >= 90% watched
date_created TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
date_updated TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_recording_plays_user ON sexy_recording_plays(user_id);
CREATE INDEX IF NOT EXISTS idx_recording_plays_recording ON sexy_recording_plays(recording_id);
CREATE INDEX IF NOT EXISTS idx_recording_plays_date ON sexy_recording_plays(date_created);
COMMENT ON TABLE sexy_recording_plays IS 'Tracks user playback of recordings for analytics and gamification';
COMMENT ON COLUMN sexy_recording_plays.completed IS 'True if user watched at least 90% of the recording';
-- ====================
-- Table: sexy_user_points
-- ====================
-- Tracks individual point-earning actions with timestamps for time-weighted scoring
CREATE TABLE IF NOT EXISTS sexy_user_points (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
action VARCHAR(50) NOT NULL, -- e.g., "RECORDING_CREATE", "RECORDING_PLAY", "COMMENT_CREATE"
points INTEGER NOT NULL, -- Raw points earned
recording_id UUID REFERENCES sexy_recordings(id) ON DELETE SET NULL, -- Optional reference
date_created TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_user_points_user ON sexy_user_points(user_id);
CREATE INDEX IF NOT EXISTS idx_user_points_date ON sexy_user_points(date_created);
CREATE INDEX IF NOT EXISTS idx_user_points_action ON sexy_user_points(action);
COMMENT ON TABLE sexy_user_points IS 'Individual point-earning actions for gamification system';
COMMENT ON COLUMN sexy_user_points.action IS 'Type of action: RECORDING_CREATE, RECORDING_PLAY, RECORDING_COMPLETE, COMMENT_CREATE, RECORDING_FEATURED';
COMMENT ON COLUMN sexy_user_points.points IS 'Raw points before time-weighted decay calculation';
-- ====================
-- Table: sexy_achievements
-- ====================
-- Predefined achievement definitions
CREATE TABLE IF NOT EXISTS sexy_achievements (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
code VARCHAR(50) UNIQUE NOT NULL, -- Unique identifier (e.g., "first_recording", "recording_100")
name VARCHAR(255) NOT NULL, -- Display name
description TEXT, -- Achievement description
icon VARCHAR(255), -- Icon identifier or emoji
category VARCHAR(50) NOT NULL, -- e.g., "recordings", "playback", "social", "special"
required_count INTEGER, -- Number of actions needed to unlock
points_reward INTEGER DEFAULT 0, -- Bonus points awarded upon unlock
sort INTEGER DEFAULT 0, -- Display order
status VARCHAR(20) DEFAULT 'published' -- published, draft, archived
);
CREATE INDEX IF NOT EXISTS idx_achievements_category ON sexy_achievements(category);
CREATE INDEX IF NOT EXISTS idx_achievements_code ON sexy_achievements(code);
COMMENT ON TABLE sexy_achievements IS 'Predefined achievement definitions for gamification';
COMMENT ON COLUMN sexy_achievements.code IS 'Unique code used in backend logic (e.g., first_recording, play_100)';
COMMENT ON COLUMN sexy_achievements.category IS 'Achievement category: recordings, playback, social, special';
-- ====================
-- Table: sexy_user_achievements
-- ====================
-- Junction table tracking unlocked achievements per user
CREATE TABLE IF NOT EXISTS sexy_user_achievements (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
achievement_id UUID NOT NULL REFERENCES sexy_achievements(id) ON DELETE CASCADE,
progress INTEGER DEFAULT 0, -- Current progress toward unlocking
date_unlocked TIMESTAMP WITH TIME ZONE, -- NULL if not yet unlocked
UNIQUE(user_id, achievement_id)
);
CREATE INDEX IF NOT EXISTS idx_user_achievements_user ON sexy_user_achievements(user_id);
CREATE INDEX IF NOT EXISTS idx_user_achievements_achievement ON sexy_user_achievements(achievement_id);
CREATE INDEX IF NOT EXISTS idx_user_achievements_unlocked ON sexy_user_achievements(date_unlocked) WHERE date_unlocked IS NOT NULL;
COMMENT ON TABLE sexy_user_achievements IS 'Tracks which achievements users have unlocked';
COMMENT ON COLUMN sexy_user_achievements.progress IS 'Current progress (e.g., 7/10 recordings created)';
COMMENT ON COLUMN sexy_user_achievements.date_unlocked IS 'NULL if achievement not yet unlocked';
-- ====================
-- Table: sexy_user_stats
-- ====================
-- Cached aggregate statistics for efficient leaderboard queries
CREATE TABLE IF NOT EXISTS sexy_user_stats (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID UNIQUE NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
total_raw_points INTEGER DEFAULT 0, -- Sum of all points (no decay)
total_weighted_points NUMERIC(10,2) DEFAULT 0, -- Time-weighted score for rankings
recordings_count INTEGER DEFAULT 0, -- Number of published recordings
playbacks_count INTEGER DEFAULT 0, -- Number of recordings played
comments_count INTEGER DEFAULT 0, -- Number of comments on recordings
achievements_count INTEGER DEFAULT 0, -- Number of unlocked achievements
last_updated TIMESTAMP WITH TIME ZONE DEFAULT NOW() -- Cache timestamp
);
CREATE INDEX IF NOT EXISTS idx_user_stats_weighted ON sexy_user_stats(total_weighted_points DESC);
CREATE INDEX IF NOT EXISTS idx_user_stats_user ON sexy_user_stats(user_id);
COMMENT ON TABLE sexy_user_stats IS 'Cached user statistics for fast leaderboard queries';
COMMENT ON COLUMN sexy_user_stats.total_raw_points IS 'Sum of all points without time decay';
COMMENT ON COLUMN sexy_user_stats.total_weighted_points IS 'Time-weighted score using exponential decay (λ=0.005)';
COMMENT ON COLUMN sexy_user_stats.last_updated IS 'Timestamp for cache invalidation';
-- ====================
-- Insert Initial Achievements
-- ====================
-- 🎬 Recordings (Creation)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_recording', 'First Recording', 'Create your first recording', '🎬', 'recordings', 1, 50, 1),
('recording_10', 'Recording Enthusiast', 'Create 10 recordings', '📹', 'recordings', 10, 100, 2),
('recording_50', 'Prolific Creator', 'Create 50 recordings', '🎥', 'recordings', 50, 500, 3),
('recording_100', 'Recording Master', 'Create 100 recordings', '🏆', 'recordings', 100, 1000, 4),
('featured_recording', 'Featured Creator', 'Get a recording featured', '', 'recordings', 1, 200, 5)
ON CONFLICT (code) DO NOTHING;
-- ▶️ Playback (Consumption)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_play', 'First Play', 'Play your first recording', '▶️', 'playback', 1, 25, 10),
('play_100', 'Active Player', 'Play 100 recordings', '🎮', 'playback', 100, 250, 11),
('play_500', 'Playback Enthusiast', 'Play 500 recordings', '🔥', 'playback', 500, 1000, 12),
('completionist_10', 'Completionist', 'Complete 10 recordings to 90%+', '', 'playback', 10, 100, 13),
('completionist_100', 'Super Completionist', 'Complete 100 recordings', '💯', 'playback', 100, 500, 14)
ON CONFLICT (code) DO NOTHING;
-- 💬 Social (Community)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_comment', 'First Comment', 'Leave your first comment', '💬', 'social', 1, 25, 20),
('comment_50', 'Conversationalist', 'Leave 50 comments', '💭', 'social', 50, 200, 21),
('comment_250', 'Community Voice', 'Leave 250 comments', '📣', 'social', 250, 750, 22)
ON CONFLICT (code) DO NOTHING;
-- ⭐ Special (Milestones)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('early_adopter', 'Early Adopter', 'Join in the first month', '🚀', 'special', 1, 500, 30),
('one_year', 'One Year Anniversary', 'Be a member for 1 year', '🎂', 'special', 1, 1000, 31),
('balanced_creator', 'Balanced Creator', '50 recordings + 100 plays', '⚖️', 'special', 1, 500, 32),
('top_10_rank', 'Top 10 Leaderboard', 'Reach top 10 on leaderboard', '🏅', 'special', 1, 2000, 33)
ON CONFLICT (code) DO NOTHING;
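The unlock logic implied by `required_count`, `points_reward`, and `sexy_user_achievements.progress` might look like the following sketch; the schema fields are real, but the function itself is hypothetical:

```typescript
interface Achievement {
  code: string;
  required_count: number;
  points_reward: number;
}

// Returns the bonus points to award, or null if the achievement
// is not (or is no longer eligible to be) unlocked.
function checkUnlock(
  a: Achievement,
  progress: number,
  alreadyUnlocked: boolean,
): number | null {
  // UNIQUE(user_id, achievement_id) already guards double rows;
  // this guards double point awards.
  if (alreadyUnlocked) return null;
  if (progress < a.required_count) return null;
  return a.points_reward;
}
```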
-- ====================
-- Verification Queries
-- ====================
-- Count tables created
SELECT
'sexy_recording_plays' as table_name,
COUNT(*) as row_count
FROM sexy_recording_plays
UNION ALL
SELECT 'sexy_user_points', COUNT(*) FROM sexy_user_points
UNION ALL
SELECT 'sexy_achievements', COUNT(*) FROM sexy_achievements
UNION ALL
SELECT 'sexy_user_achievements', COUNT(*) FROM sexy_user_achievements
UNION ALL
SELECT 'sexy_user_stats', COUNT(*) FROM sexy_user_stats;
-- Show created achievements
SELECT
category,
COUNT(*) as achievement_count
FROM sexy_achievements
GROUP BY category
ORDER BY category;
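The time-weighted score referenced in the column comments (exponential decay, λ = 0.005) can be sketched as below. Treating λ as a per-day rate is an assumption here, since the schema only records the constant:

```typescript
const LAMBDA = 0.005; // decay constant from the total_weighted_points comment

// Weight raw points by age: weight = exp(-λ · ageDays).
// With a per-day λ, a week-old action keeps ~96.6% of its value.
function weightedPoints(rawPoints: number, ageDays: number): number {
  return rawPoints * Math.exp(-LAMBDA * ageDays);
}

// Summing over a user's sexy_user_points rows would yield
// sexy_user_stats.total_weighted_points.
function totalWeighted(entries: { points: number; ageDays: number }[]): number {
  return entries.reduce((sum, e) => sum + weightedPoints(e.points, e.ageDays), 0);
}
```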


@@ -7,13 +7,16 @@
"test": "echo \"Error: no test specified\" && exit 1",
"build:bundle": "git pull && pnpm install && pnpm --filter @sexy.pivoine.art/bundle build",
"build:frontend": "git pull && pnpm install && pnpm --filter @sexy.pivoine.art/frontend build",
"dev:data": "cd ../compose/data && docker compose up -d",
"dev:directus": "cd ../compose/sexy && docker compose --env-file=.env.local up -d directus",
"dev": "pnpm dev:data && pnpm dev:directus && pnpm --filter @sexy.pivoine.art/frontend dev"
"dev": "pnpm build:bundle && docker compose up -d && pnpm --filter @sexy.pivoine.art/frontend dev",
"schema:export": "docker compose exec directus node /directus/cli.js schema snapshot --yes /tmp/snapshot.yml && docker compose cp directus:/tmp/snapshot.yml ./directus.yml && docker compose exec db pg_dump -U sexy --schema-only sexy > schema.sql",
"schema:import": "docker compose exec -T postgres psql -U sexy sexy < schema.sql && docker compose cp ./directus.yml directus:/tmp/snapshot.yml && docker compose exec directus node /directus/cli.js schema apply --yes /tmp/snapshot.yml"
},
"keywords": [],
"author": "",
"license": "ISC",
"author": {
"name": "Valknar",
"email": "valknar@pivoine.art"
},
"license": "MIT",
"packageManager": "pnpm@10.19.0",
"pnpm": {
"onlyBuiltDependencies": [


@@ -45,7 +45,7 @@
"add": "directus-extension add"
},
"devDependencies": {
"@directus/extensions-sdk": "16.0.2"
"@directus/extensions-sdk": "17.0.9"
},
"dependencies": {
"@sindresorhus/slugify": "^3.0.0",


@@ -177,7 +177,7 @@ checksum = "46c5e41b57b8bba42a04676d81cb89e9ee8e859a1a66f80a5a72e1cb76b34d43"
[[package]]
name = "buttplug_core"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"async-stream",
"cfg-if",
@@ -203,7 +203,7 @@ dependencies = [
[[package]]
name = "buttplug_server"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"aes",
"async-trait",
@@ -243,8 +243,8 @@ dependencies = [
[[package]]
name = "buttplug_server_device_config"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
version = "10.0.1"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"buttplug_core",
"dashmap",
@@ -913,9 +913,9 @@ checksum = "4a5f13b858c8d314ee3e8f639011f7ccefe71f97f96e50151fb991f267928e2c"
[[package]]
name = "js-sys"
version = "0.3.80"
version = "0.3.87"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "852f13bec5eba4ba9afbeb93fd7c13fe56147f055939ae21c43a29a0ecb2702e"
checksum = "93f0862381daaec758576dcc22eb7bbf4d7efd67328553f3b45a412a51a3fb21"
dependencies = [
"once_cell",
"wasm-bindgen",
@@ -1860,9 +1860,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab10a69fbd0a177f5f649ad4d8d3305499c42bab9aef2f7ff592d0ec8f833819"
checksum = "1de241cdc66a9d91bd84f097039eb140cdc6eec47e0cdbaf9d932a1dd6c35866"
dependencies = [
"cfg-if",
"once_cell",
@@ -1873,27 +1873,14 @@ dependencies = [
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-backend"
version = "0.2.103"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0bb702423545a6007bbc368fde243ba47ca275e549c8a28617f56f6ba53b1d1c"
dependencies = [
"bumpalo",
"log",
"proc-macro2",
"quote",
"syn",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-futures"
version = "0.4.53"
version = "0.4.60"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a0b221ff421256839509adbb55998214a70d829d3a28c69b4a6672e9d2a42f67"
checksum = "a42e96ea38f49b191e08a1bab66c7ffdba24b06f9995b39a9dd60222e5b6f1da"
dependencies = [
"cfg-if",
"futures-util",
"js-sys",
"once_cell",
"wasm-bindgen",
@@ -1902,9 +1889,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fc65f4f411d91494355917b605e1480033152658d71f722a90647f56a70c88a0"
checksum = "e12fdf6649048f2e3de6d7d5ff3ced779cdedee0e0baffd7dff5cdfa3abc8a52"
dependencies = [
"quote",
"wasm-bindgen-macro-support",
@@ -1912,22 +1899,22 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro-support"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ffc003a991398a8ee604a401e194b6b3a39677b3173d6e74495eb51b82e99a32"
checksum = "0e63d1795c565ac3462334c1e396fd46dbf481c40f51f5072c310717bc4fb309"
dependencies = [
"bumpalo",
"proc-macro2",
"quote",
"syn",
"wasm-bindgen-backend",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-shared"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "293c37f4efa430ca14db3721dfbe48d8c33308096bd44d80ebaa775ab71ba1cf"
checksum = "e9f9cdac23a5ce71f6bf9f8824898a501e511892791ea2a0c6b8568c68b9cb53"
dependencies = [
"unicode-ident",
]
@@ -1948,9 +1935,9 @@ dependencies = [
[[package]]
name = "web-sys"
version = "0.3.80"
version = "0.3.87"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fbe734895e869dc429d78c4b433f8d17d95f8d05317440b4fad5ab2d33e596dc"
checksum = "f2c7c5718134e770ee62af3b6b4a84518ec10101aad610c024b64d6ff29bb1ff"
dependencies = [
"js-sys",
"wasm-bindgen",


@@ -16,15 +16,15 @@ name = "buttplug_wasm"
path = "src/lib.rs"
[dependencies]
buttplug_core = { git = "https://github.com/valknarthing/buttplug.git", default-features = false, features = ["wasm"] }
buttplug_server = { git = "https://github.com/valknarthing/buttplug.git", default-features = false, features = ["wasm"] }
buttplug_server_device_config = { git = "https://github.com/valknarthing/buttplug.git" }
js-sys = "0.3.80"
buttplug_core = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d", default-features = false, features = ["wasm"] }
buttplug_server = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d", default-features = false, features = ["wasm"] }
buttplug_server_device_config = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d" }
js-sys = "0.3.87"
tracing-wasm = "0.2.1"
log-panics = { version = "2.1.0", features = ["with-backtrace"] }
console_error_panic_hook = "0.1.7"
wasmtimer = "0.4.3"
wasm-bindgen = { version = "0.2.103", features = ["serde-serialize"] }
wasm-bindgen = { version = "0.2.110", features = ["serde-serialize"] }
tokio = { version = "1.47.1", features = ["sync", "macros", "io-util"] }
tokio-stream = "0.1.17"
tracing = "0.1.41"
@@ -33,12 +33,12 @@ tracing-subscriber = { version = "0.3.20", features = ["json"] }
futures = "0.3.31"
futures-util = "0.3.31"
async-trait = "0.1.89"
wasm-bindgen-futures = "0.4.53"
wasm-bindgen-futures = "0.4.60"
getrandom = { version = "0.3", features = ["wasm_js"] }
parking_lot = { version = "0.11.1", features = ["wasm-bindgen"]}
[dependencies.web-sys]
version = "0.3.80"
version = "0.3.87"
# path = "../../wasm-bindgen/crates/web-sys"
#git = "https://github.com/rustwasm/wasm-bindgen"
features = [


@@ -13,13 +13,13 @@
"build:wasm": "wasm-pack build --out-dir wasm --out-name index --target bundler --release"
},
"dependencies": {
"eventemitter3": "^5.0.1",
"typescript": "^5.9.2",
"vite": "^7.1.4",
"eventemitter3": "^5.0.4",
"typescript": "^5.9.3",
"vite": "^7.3.1",
"vite-plugin-wasm": "3.5.0",
"ws": "^8.18.3"
"ws": "^8.19.0"
},
"devDependencies": {
"wasm-pack": "^0.13.1"
"wasm-pack": "^0.14.0"
}
}


@@ -7,7 +7,6 @@ use buttplug_server::device::hardware::communication::{
HardwareCommunicationManagerEvent,
};
use futures::future;
use js_sys::Array;
use tokio::sync::mpsc::Sender;
use wasm_bindgen::prelude::*;
use wasm_bindgen_futures::{spawn_local, JsFuture};
@@ -63,8 +62,8 @@ impl HardwareCommunicationManager for WebBluetoothCommunicationManager {
// way for anyone to add device configurations through FFI yet anyways.
let config_manager = create_test_dcm(false);
let options = web_sys::RequestDeviceOptions::new();
let filters = Array::new();
let optional_services = Array::new();
let mut filters = Vec::new();
let mut optional_services = Vec::new();
for vals in config_manager.base_communication_specifiers().iter() {
for config in vals.1.iter() {
if let ProtocolCommunicationSpecifier::BluetoothLE(btle) = &config {
@@ -77,16 +76,16 @@ impl HardwareCommunicationManager for WebBluetoothCommunicationManager {
} else {
filter.set_name(&name);
}
filters.push(&filter.into());
filters.push(filter);
}
for (service, _) in btle.services() {
optional_services.push(&service.to_string().into());
optional_services.push(js_sys::JsString::from(service.to_string()));
}
}
}
}
options.set_filters(&filters.into());
options.set_optional_services(&optional_services.into());
options.set_filters(&filters);
options.set_optional_services(&optional_services);
let nav = web_sys::window().unwrap().navigator();
//nav.bluetooth().get_availability();
//JsFuture::from(nav.bluetooth().request_device()).await;


@@ -1,6 +0,0 @@
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
PUBLIC_UMAMI_ID=
LETTERSPACE_API_URL=
LETTERSPACE_API_KEY=
LETTERSPACE_LIST_ID=


@@ -11,39 +11,39 @@
"start": "node ./build"
},
"devDependencies": {
"@iconify-json/ri": "^1.2.5",
"@iconify/tailwind4": "^1.0.6",
"@internationalized/date": "^3.8.2",
"@lucide/svelte": "^0.544.0",
"@sveltejs/adapter-node": "^5.3.1",
"@sveltejs/adapter-static": "^3.0.9",
"@sveltejs/kit": "^2.37.0",
"@sveltejs/vite-plugin-svelte": "^6.1.4",
"@tailwindcss/forms": "^0.5.9",
"@tailwindcss/typography": "^0.5.15",
"@tailwindcss/vite": "^4.0.0",
"@tsconfig/svelte": "^5.0.5",
"bits-ui": "2.11.0",
"@iconify-json/ri": "^1.2.10",
"@iconify/tailwind4": "^1.2.1",
"@internationalized/date": "^3.11.0",
"@lucide/svelte": "^0.577.0",
"@sveltejs/adapter-node": "^5.5.4",
"@sveltejs/adapter-static": "^3.0.10",
"@sveltejs/kit": "^2.53.4",
"@sveltejs/vite-plugin-svelte": "^6.2.4",
"@tailwindcss/forms": "^0.5.11",
"@tailwindcss/typography": "^0.5.19",
"@tailwindcss/vite": "^4.2.1",
"@tsconfig/svelte": "^5.0.8",
"bits-ui": "2.16.2",
"clsx": "^2.1.1",
"glob": "^11.0.3",
"glob": "^13.0.6",
"mode-watcher": "^1.1.0",
"prettier-plugin-svelte": "^3.4.0",
"super-sitemap": "^1.0.5",
"svelte": "^5.38.6",
"svelte-sonner": "^1.0.5",
"tailwind-merge": "^3.3.1",
"tailwind-variants": "^1.0.0",
"tailwindcss": "^4.0.0",
"tw-animate-css": "^1.3.8",
"typescript": "^5.9.2",
"vite": "^7.1.4",
"prettier-plugin-svelte": "^3.5.1",
"super-sitemap": "^1.0.7",
"svelte": "^5.53.7",
"svelte-sonner": "^1.0.8",
"tailwind-merge": "^3.5.0",
"tailwind-variants": "^3.2.2",
"tailwindcss": "^4.2.1",
"tw-animate-css": "^1.4.0",
"typescript": "^5.9.3",
"vite": "^7.3.1",
"vite-plugin-wasm": "3.5.0"
},
"dependencies": {
"@directus/sdk": "^20.0.3",
"@directus/sdk": "^21.1.0",
"@sexy.pivoine.art/buttplug": "workspace:*",
"javascript-time-ago": "^2.5.11",
"media-chrome": "^4.13.1",
"javascript-time-ago": "^2.6.4",
"media-chrome": "^4.18.0",
"svelte-i18n": "^4.0.1"
}
}


@@ -1,119 +0,0 @@
<script lang="ts">
import { _ } from "svelte-i18n";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
} from "$lib/components/ui/dialog";
import { Button } from "$lib/components/ui/button";
import { Separator } from "$lib/components/ui/separator";
import type { Snippet } from "svelte";
import Label from "../ui/label/label.svelte";
import Input from "../ui/input/input.svelte";
import { toast } from "svelte-sonner";
interface Props {
open: boolean;
email: string;
children?: Snippet;
}
let isLoading = $state(false);
async function handleSubscription(e: Event) {
e.preventDefault();
try {
isLoading = true;
await fetch("/newsletter", {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({ email }),
});
toast.success(
$_("newsletter_signup.toast_subscribe", { values: { email } }),
);
} finally {
isLoading = false;
open = false;
}
}
let { open = $bindable(), email = $bindable() }: Props = $props();
</script>
<Dialog bind:open>
<DialogContent class="sm:max-w-md">
<DialogHeader class="space-y-4">
<div class="flex items-center justify-between">
<div class="flex items-center gap-3">
<div
class="w-10 h-10 rounded-full bg-gradient-to-br from-primary to-purple-600 flex items-center justify-center shrink-0 grow-0"
>
<span class="icon-[ri--newspaper-line]"></span>
</div>
<div class="">
<DialogTitle
class="text-left text-xl font-semibold text-primary-foreground"
>{$_('newsletter_signup.title')}</DialogTitle
>
<DialogDescription class="text-left text-sm">
{$_('newsletter_signup.description')}
</DialogDescription>
</div>
</div>
</div>
</DialogHeader>
<Separator class="my-4" />
<form onsubmit={handleSubscription}>
<!-- Email -->
<div class="space-y-2 flex gap-4 items-center">
<Label for="email" class="m-0">{$_('newsletter_signup.email')}</Label>
<Input
id="email"
type="email"
placeholder={$_('newsletter_signup.email_placeholder')}
bind:value={email}
required
class="bg-background/50 border-primary/20 focus:border-primary"
/>
</div>
<Separator class="my-8" />
<!-- Close Button -->
<div class="flex justify-end gap-4">
<Button
variant="ghost"
size="sm"
onclick={() => (open = false)}
class="text-muted-foreground hover:text-foreground cursor-pointer"
>
<span class="icon-[ri--close-large-line]"></span>
{$_('newsletter_signup.close')}
</Button>
<Button
variant="default"
size="sm"
type="submit"
class="cursor-pointer"
disabled={isLoading}
>
{#if isLoading}
<div
class="w-4 h-4 border-2 border-white/30 border-t-white rounded-full animate-spin mr-2"
></div>
{$_('newsletter_signup.subscribing')}
{:else}
<span class="icon-[ri--check-line]"></span>
{$_('newsletter_signup.subscribe')}
{/if}
</Button>
</div>
</form>
</DialogContent>
</Dialog>


@@ -1,26 +0,0 @@
<script>
import { _ } from "svelte-i18n";
import { Button } from "../ui/button";
import { Card, CardContent } from "../ui/card";
import NewsletterSignupPopup from "./newsletter-signup-popup.svelte";
let isPopupOpen = $state(false);
let { email = "" } = $props();
</script>
<!-- Newsletter Signup -->
<Card class="p-0 not-last:bg-gradient-to-br from-primary/10 to-accent/10">
<CardContent class="p-6 text-center">
<h3 class="font-semibold mb-2">{$_('newsletter_signup.title')}</h3>
<p class="text-sm text-muted-foreground mb-4">
{$_('newsletter_signup.description')}
</p>
<Button
onclick={() => (isPopupOpen = true)}
target="_blank"
class="cursor-pointer w-full bg-gradient-to-r from-primary to-accent hover:from-primary/90 hover:to-accent/90"
>{$_('newsletter_signup.cta')}</Button
>
<NewsletterSignupPopup bind:open={isPopupOpen} {email} />
</CardContent>
</Card>


@@ -21,7 +21,7 @@ export const getAssetUrl = (
if (!id) {
return null;
}
return `${directusApiUrl}/assets/${id}${transform ? "?transform=" + transform : ""}`;
return `${directusApiUrl}/assets/${id}${transform ? "?key=" + transform : ""}`;
};
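The change above switches the asset URL from an ad-hoc `?transform=` query string to Directus's named-preset `?key=` parameter. A usage sketch (the `thumbnail` preset name is illustrative and would have to exist in the Directus project's storage settings):

```typescript
// Build a Directus asset URL, optionally selecting a named transform preset.
function assetUrl(apiUrl: string, id: string | null, key?: string): string | null {
  if (!id) return null;
  return `${apiUrl}/assets/${id}${key ? "?key=" + key : ""}`;
}
```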
export const isModel = (user: CurrentUser) => {


@@ -613,8 +613,8 @@ export default {
address: {
company: "SexyArt",
name: "Sebastian Krüger",
street: "Neue Weinsteige 21",
city: "70180 Stuttgart",
street: "Berlingerstraße 48",
city: "78333 Stockach",
country: "Germany",
},
phone: {
@@ -870,18 +870,6 @@ export default {
exit: "Exit",
exit_url: "https://pivoine.art",
},
newsletter_signup: {
title: "Stay Updated",
description:
"Get the latest articles and insights delivered to your inbox.",
email: "Email",
email_placeholder: "your@email.com",
cta: "Subscribe to Newsletter",
close: "Close",
subscribe: "Subscribe",
subscribing: "Subscribing",
toast_subscribe: "Your email has been added to the newsletter list!",
},
sharing_popup_button: {
share: "Share",
},


@@ -123,7 +123,7 @@ class Logger {
PUBLIC_API_URL: process.env.PUBLIC_API_URL,
PUBLIC_URL: process.env.PUBLIC_URL,
PUBLIC_UMAMI_ID: process.env.PUBLIC_UMAMI_ID ? '***set***' : 'not set',
LETTERSPACE_API_URL: process.env.LETTERSPACE_API_URL || 'not set',
PUBLIC_UMAMI_SCRIPT: process.env.PUBLIC_UMAMI_SCRIPT || 'not set',
PORT: process.env.PORT || '3000',
HOST: process.env.HOST || '0.0.0.0',
};


@@ -7,7 +7,7 @@ import Footer from "$lib/components/footer/footer.svelte";
import { Toaster } from "$lib/components/ui/sonner";
import Header from "$lib/components/header/header.svelte";
import AgeVerificationDialog from "$lib/components/age-verification-dialog/age-verification-dialog.svelte";
import { PUBLIC_UMAMI_ID } from "$env/static/public";
import { env } from "$env/dynamic/public";
onMount(async () => {
await waitLocale();
@@ -17,11 +17,11 @@ let { children, data } = $props();
</script>
<svelte:head>
{#if import.meta.env.PROD && PUBLIC_UMAMI_ID}
{#if import.meta.env.PROD && env.PUBLIC_UMAMI_ID && env.PUBLIC_UMAMI_SCRIPT}
<script
defer
src="https://umami.pivoine.art/script.js"
data-website-id={PUBLIC_UMAMI_ID}
src={env.PUBLIC_UMAMI_SCRIPT}
data-website-id={env.PUBLIC_UMAMI_ID}
></script>
{/if}
</svelte:head>


@@ -9,7 +9,6 @@ import { getAssetUrl } from "$lib/directus";
import SharingPopup from "$lib/components/sharing-popup/sharing-popup.svelte";
import Meta from "$lib/components/meta/meta.svelte";
import PeonyBackground from "$lib/components/background/peony-background.svelte";
import NewsletterSignup from "$lib/components/newsletter-signup/newsletter-signup-widget.svelte";
import SharingPopupButton from "$lib/components/sharing-popup/sharing-popup-button.svelte";
const { data } = $props();
@@ -215,8 +214,6 @@ const timeAgo = new TimeAgo("en");
</Card>
-->
<!-- <NewsletterSignup email={data.authStatus.user?.email}/> -->
<!-- Back to Magazine -->
<Button
variant="outline"


@@ -1,22 +0,0 @@
import {
LETTERSPACE_API_KEY,
LETTERSPACE_API_URL,
LETTERSPACE_LIST_ID,
} from "$env/static/private";
import { json } from "@sveltejs/kit";
export async function POST({ request, fetch }) {
const { email } = await request.json();
const lists = [LETTERSPACE_LIST_ID];
await fetch(`${LETTERSPACE_API_URL}/subscribers`, {
method: "POST",
headers: {
"Content-Type": "application/json",
"X-API-Key": LETTERSPACE_API_KEY,
},
body: JSON.stringify({ email, lists }),
});
return json({ email }, { status: 201 });
}


@@ -16,7 +16,6 @@ import { formatVideoDuration, getUserInitials } from "$lib/utils";
import { invalidateAll } from "$app/navigation";
import { toast } from "svelte-sonner";
import { createCommentForVideo, likeVideo, unlikeVideo, recordVideoPlay, updateVideoPlay } from "$lib/services";
import NewsletterSignup from "$lib/components/newsletter-signup/newsletter-signup-widget.svelte";
import SharingPopupButton from "$lib/components/sharing-popup/sharing-popup-button.svelte";
const { data } = $props();
@@ -539,8 +538,6 @@ let showPlayer = $state(false);
</CardContent>
</Card> -->
<!-- <NewsletterSignup /> -->
<!-- Back to Videos -->
<Button
variant="outline"

pnpm-lock.yaml (generated, 4452 changed lines): diff suppressed because it is too large.

schema.sql (new file, 2667 lines): diff suppressed because it is too large.