Compare commits

..

17 Commits

Author SHA1 Message Date
493ddd7e78 chore: remove packages/bundle (Directus extension)
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 1m26s
All custom logic (endpoints, hooks, gamification) has been ported to
packages/backend. The Directus bundle is no longer needed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:11:22 +01:00
33dd076a50 ci: add backend image build to docker workflow
Adds a second build-and-push step for the backend image (valknar/sexy-backend)
using Dockerfile.backend. Both images share the same tag strategy but use
separate build caches. The summary step is updated to show both images.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:10:06 +01:00
9d7afbe1b5 feat: replace Directus with custom Node.js GraphQL backend
Removes Directus 11 and replaces it with a lean, purpose-built backend:
- packages/backend/: Fastify v5 + GraphQL Yoga v5 + Pothos (code-first)
  with Drizzle ORM, Redis sessions (session_token cookie), argon2 auth,
  Nodemailer, fluent-ffmpeg, and full gamification system ported from bundle
- Frontend: @directus/sdk replaced by graphql-request v7; services.ts fully
  rewritten with identical signatures; directus.ts now re-exports from api.ts
- Cookie renamed directus_session_token → session_token
- Dev proxy target updated 8055 → 4000
- compose.yml: Directus service removed, backend service added (port 4000)
- Dockerfile.backend: new multi-stage image with ffmpeg
- Dockerfile: bundle build step and ffmpeg removed from frontend image
- data-migration.ts: one-time script to migrate all Directus/sexy_ tables

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:07:18 +01:00
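A rough sketch of what talking to the new backend looks like after this commit. The port (4000) and cookie name (session_token) come from the commit message; the /graphql endpoint path and the probe query are assumptions.

```shell
# Sketch only: the new backend listens on port 4000 and authenticates
# via the renamed session_token cookie (was directus_session_token).
# The /graphql path and the introspection query are assumptions.
endpoint="http://localhost:4000/graphql"
cookie="session_token=<your-session-token>"
query='{"query":"{ __typename }"}'
# With the backend running, an authenticated request would look like:
#   curl -s "$endpoint" -H 'content-type: application/json' \
#        -b "$cookie" -d "$query"
echo "POST $endpoint"
```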
de16b64255 fix: use env object from $env/dynamic/public instead of named imports
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m13s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 17:19:26 +01:00
1e69d0b158 fix: create env placeholder file inline in Dockerfile for dynamic public env
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 4m6s
.dockerignore excludes .env files, so the previous COPY failed silently.
$env/dynamic/public requires variable names to be declared at build time
to generate named exports; empty placeholders satisfy this, while actual
values still come from process.env at runtime.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 17:09:42 +01:00
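The placeholder trick this commit describes can be sketched as follows. The variable names are taken from the Dockerfile diff shown further down this page; treat the exact set as an assumption.

```shell
# Sketch of the build-time placeholder approach.
# SvelteKit needs the PUBLIC_* names declared at build time to generate
# $env/dynamic/public exports; the values stay empty here and are read
# from process.env when the adapter-node server starts.
tmp=$(mktemp -d)
mkdir -p "$tmp/packages/frontend"
printf 'PUBLIC_API_URL=\nPUBLIC_URL=\nPUBLIC_UMAMI_ID=\nPUBLIC_UMAMI_SCRIPT=\n' \
  > "$tmp/packages/frontend/.env"
cat "$tmp/packages/frontend/.env"
```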
865787fb45 feat: switch PUBLIC_API_URL and PUBLIC_URL to dynamic env for runtime configurability
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 3m3s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 17:01:37 +01:00
3915dbc115 fix: .env inclusion
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m31s
2026-03-04 16:47:54 +01:00
83ca9d4fb5 chore: update all dependencies to latest versions
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 46s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 16:43:56 +01:00
225b9d41f5 chore: clean up repo and fix docker compose configuration
- Remove outdated docs (COMPOSE.md, DOCKER.md, QUICKSTART.md, REBUILD_GUIDE.md)
- Remove build.sh, compose.production.yml, gamification-schema.sql, directus.yaml
- Simplify compose.yml for local dev (remove env var indirection)
- Add directus.yml schema snapshot and schema.sql from VPS
- Add schema:export and schema:import scripts to package.json
- Ignore .env files (vars set via compose environment)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 16:36:49 +01:00
ad83fb553a feat: switch to dynamic env for Umami to allow runtime configuration
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 4m41s
2026-02-21 11:47:32 +01:00
2be36a679d fix: resolve buttplug-wasm build error by using Vec and slices
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m26s
2026-02-21 11:33:12 +01:00
75d4b4227c chore: update buttplug dependencies to commit fad6c9d
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m10s
2026-02-21 11:29:36 +01:00
2277e4f686 fix: resolve buttplug-wasm build error by using as_ref() for JS arrays
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m7s
2026-02-21 11:13:53 +01:00
13c6977e59 feat: add PUBLIC_UMAMI_SCRIPT variable and integrate into layout
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m13s
2026-02-21 11:05:30 +01:00
c85fa7798e chore: remove Letterspace newsletter integration and all LETTERSPACE_* variables 2026-02-21 10:56:43 +01:00
ce30eca574 chore: new imprint address 2026-02-21 10:37:19 +01:00
6724afa939 fix: use correct Directus preset parameter for asset transforms
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 6m4s
Changed ?transform= to ?key= so Directus storage asset presets
(mini, thumbnail, preview, medium, banner) are actually applied.
Previously all images were served at full resolution.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 08:42:38 +01:00
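The fix can be illustrated with a before/after asset URL. The asset UUID below is a placeholder; the preset names come from the commit message.

```shell
# Hypothetical asset id; Directus applies stored presets via ?key=.
asset_id="0b4f-example-uuid"   # placeholder, not a real file id
wrong="https://sexy.pivoine.art/api/assets/${asset_id}?transform=thumbnail"
right="https://sexy.pivoine.art/api/assets/${asset_id}?key=thumbnail"
echo "before: $wrong"   # preset ignored, full-resolution image served
echo "after:  $right"   # 'thumbnail' preset actually applied
```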
80 changed files with 9833 additions and 6864 deletions


@@ -20,6 +20,7 @@ on:
env:
REGISTRY: dev.pivoine.art
IMAGE_NAME: valknar/sexy
BACKEND_IMAGE_NAME: valknar/sexy-backend
jobs:
build-and-push:
@@ -67,10 +68,11 @@ jobs:
org.opencontainers.image.vendor=valknar
org.opencontainers.image.source=https://dev.pivoine.art/${{ gitea.repository }}
- name: Build and push Docker image
- name: Build and push frontend Docker image
uses: docker/build-push-action@v5
with:
context: .
dockerfile: Dockerfile
platforms: linux/amd64
push: ${{ gitea.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
@@ -81,32 +83,77 @@ jobs:
NODE_ENV=production
CI=true
- name: Extract metadata for backend image
id: meta-backend
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ env.BACKEND_IMAGE_NAME }}
tags: |
type=raw,value=latest,enable={{is_default_branch}}
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
type=sha,prefix={{branch}}-
type=raw,value=${{ gitea.event.inputs.tag }},enable=${{ gitea.event_name == 'workflow_dispatch' }}
labels: |
org.opencontainers.image.title=sexy.pivoine.art backend
org.opencontainers.image.description=GraphQL backend for sexy.pivoine.art (Fastify + Drizzle + Pothos)
org.opencontainers.image.vendor=valknar
org.opencontainers.image.source=https://dev.pivoine.art/${{ gitea.repository }}
- name: Build and push backend Docker image
uses: docker/build-push-action@v5
with:
context: .
dockerfile: Dockerfile.backend
platforms: linux/amd64
push: ${{ gitea.event_name != 'pull_request' }}
tags: ${{ steps.meta-backend.outputs.tags }}
labels: ${{ steps.meta-backend.outputs.labels }}
cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.BACKEND_IMAGE_NAME }}:buildcache
cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.BACKEND_IMAGE_NAME }}:buildcache,mode=max
build-args: |
NODE_ENV=production
CI=true
- name: Generate image digest
if: gitea.event_name != 'pull_request'
run: |
echo "### Docker Image Published :rocket:" >> $GITEA_STEP_SUMMARY
echo "### Docker Images Published :rocket:" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Registry:** \`${{ env.REGISTRY }}\`" >> $GITEA_STEP_SUMMARY
echo "**Image:** \`${{ env.IMAGE_NAME }}\`" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Tags:**" >> $GITEA_STEP_SUMMARY
echo "**Frontend (\`${{ env.IMAGE_NAME }}\`):**" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "${{ steps.meta.outputs.tags }}" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Pull command:**" >> $GITEA_STEP_SUMMARY
echo "**Backend (\`${{ env.BACKEND_IMAGE_NAME }}\`):**" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "${{ steps.meta-backend.outputs.tags }}" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Pull commands:**" >> $GITEA_STEP_SUMMARY
echo "\`\`\`bash" >> $GITEA_STEP_SUMMARY
echo "docker pull ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:latest" >> $GITEA_STEP_SUMMARY
echo "docker pull ${{ env.REGISTRY }}/${{ env.BACKEND_IMAGE_NAME }}:latest" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
- name: PR Comment - Image built but not pushed
- name: PR Comment - Images built but not pushed
if: gitea.event_name == 'pull_request'
run: |
echo "### Docker Image Built Successfully :white_check_mark:" >> $GITEA_STEP_SUMMARY
echo "### Docker Images Built Successfully :white_check_mark:" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "Image was built successfully but **not pushed** (PR builds are not published)." >> $GITEA_STEP_SUMMARY
echo "Images were built successfully but **not pushed** (PR builds are not published)." >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Would be tagged as:**" >> $GITEA_STEP_SUMMARY
echo "**Frontend would be tagged as:**" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "${{ steps.meta.outputs.tags }}" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Backend would be tagged as:**" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "${{ steps.meta-backend.outputs.tags }}" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
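As a rough illustration of the shared tag strategy, the `type=sha,prefix={{branch}}-` rule above produces a `<branch>-<shortsha>` tag. Branch and SHA values here are made up.

```shell
# Illustrative only: the sha-based tag the metadata rules above would
# produce for a made-up branch and short SHA.
registry="dev.pivoine.art"
backend_image="valknar/sexy-backend"
branch="main"
short_sha="493ddd7"
echo "${registry}/${backend_image}:${branch}-${short_sha}"
```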

.gitignore (3 changed lines)

@@ -3,5 +3,4 @@ dist/
target/
pkg/
.env.*
.claude/

COMPOSE.md (deleted)

@@ -1,424 +0,0 @@
# Docker Compose Guide
This guide explains the Docker Compose setup for sexy.pivoine.art with local development and production configurations.
## Architecture Overview
The application uses a **multi-file compose setup** with two configurations:
1. **`compose.yml`** - Base configuration for local development
2. **`compose.production.yml`** - Production overrides with Traefik integration
### Service Architecture
```
┌─────────────────────────────────────────────────────────────┐
│ 🌐 Traefik Reverse Proxy (Production Only) │
│ ├─ HTTPS Termination │
│ ├─ Automatic Let's Encrypt │
│ └─ Routes traffic to frontend & Directus API │
├─────────────────────────────────────────────────────────────┤
│ 💄 Frontend (SvelteKit) │
│ ├─ Port 3000 (internal) │
│ ├─ Serves on https://sexy.pivoine.art │
│ └─ Proxies /api to Directus │
├─────────────────────────────────────────────────────────────┤
│ 🎭 Directus CMS │
│ ├─ Port 8055 (internal) │
│ ├─ Serves on https://sexy.pivoine.art/api │
│ ├─ Custom bundle extensions mounted │
│ └─ Uploads volume │
├─────────────────────────────────────────────────────────────┤
│ 🗄️ PostgreSQL (Local) / External (Production) │
│ └─ Database for Directus │
├─────────────────────────────────────────────────────────────┤
│ 💾 Redis (Local) / External (Production) │
│ └─ Cache & session storage │
└─────────────────────────────────────────────────────────────┘
```
## Local Development Setup
### Prerequisites
- Docker 20.10+
- Docker Compose 2.0+
### Quick Start
1. **Create environment file:**
```bash
cp .env.example .env
# Edit .env with your local settings (defaults work fine)
```
2. **Start all services:**
```bash
docker-compose up -d
```
3. **Access services:**
- Frontend: http://localhost:3000 (if enabled)
- Directus: http://localhost:8055
- Directus Admin: http://localhost:8055/admin
4. **View logs:**
```bash
docker-compose logs -f
```
5. **Stop services:**
```bash
docker-compose down
```
### Local Services
#### PostgreSQL
- **Image:** `postgres:16-alpine`
- **Port:** 5432 (internal only)
- **Volume:** `postgres-data`
- **Database:** `sexy`
#### Redis
- **Image:** `redis:7-alpine`
- **Port:** 6379 (internal only)
- **Volume:** `redis-data`
- **Persistence:** AOF enabled
#### Directus
- **Image:** `directus/directus:11`
- **Port:** 8055 (exposed)
- **Volumes:**
- `directus-uploads` - File uploads
- `./packages/bundle/dist` - Custom extensions
- **Features:**
- Auto-reload extensions
- WebSockets enabled
- CORS enabled for localhost
### Local Development Workflow
```bash
# Start infrastructure (Postgres, Redis, Directus)
docker-compose up -d
# Develop frontend locally with hot reload
cd packages/frontend
pnpm dev
# Build Directus bundle
pnpm --filter @sexy.pivoine.art/bundle build
# Restart Directus to load new bundle
docker-compose restart directus
```
## Production Deployment
### Prerequisites
- External PostgreSQL database
- External Redis instance
- Traefik reverse proxy configured
- External network: `compose_network`
### Setup
The production compose file now uses the `include` directive to automatically extend `compose.yml`, making deployment simpler.
1. **Create production environment file:**
```bash
cp .env.production.example .env.production
```
2. **Edit `.env.production` with your values:**
```bash
# Database (external)
CORE_DB_HOST=your-postgres-host
SEXY_DB_NAME=sexy_production
DB_USER=sexy
DB_PASSWORD=your-secure-password
# Redis (external)
CORE_REDIS_HOST=your-redis-host
# Directus
SEXY_DIRECTUS_SECRET=your-32-char-random-secret
ADMIN_PASSWORD=your-secure-admin-password
# Traefik
SEXY_TRAEFIK_HOST=sexy.pivoine.art
# Frontend
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
# Email (SMTP)
EMAIL_SMTP_HOST=smtp.your-provider.com
EMAIL_SMTP_USER=your-email@domain.com
EMAIL_SMTP_PASSWORD=your-smtp-password
```
3. **Deploy:**
```bash
# Simple deployment - compose.production.yml includes compose.yml automatically
docker-compose -f compose.production.yml --env-file .env.production up -d
# Or use the traditional multi-file approach (same result)
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production up -d
```
### Production Services
#### Directus
- **Image:** `directus/directus:11` (configurable)
- **Network:** `compose_network` (external)
- **Volumes:**
- `/var/www/sexy.pivoine.art/uploads` - Persistent uploads
- `/var/www/sexy.pivoine.art/packages/bundle/dist` - Extensions
- **Traefik routing:**
- Domain: `sexy.pivoine.art/api`
- Strips `/api` prefix before forwarding
- HTTPS with auto-certificates
#### Frontend
- **Image:** `ghcr.io/valknarxxx/sexy:latest` (from GHCR)
- **Network:** `compose_network` (external)
- **Volume:** `/var/www/sexy.pivoine.art` - Application code
- **Traefik routing:**
- Domain: `sexy.pivoine.art`
- HTTPS with auto-certificates
### Traefik Integration
Both services are configured with Traefik labels for automatic routing:
**Frontend:**
- HTTP → HTTPS redirect
- Routes `sexy.pivoine.art` to port 3000
- Gzip compression enabled
**Directus API:**
- HTTP → HTTPS redirect
- Routes `sexy.pivoine.art/api` to port 8055
- Strips `/api` prefix
- Gzip compression enabled
### Production Commands
```bash
# Deploy/update (simplified - uses include)
docker-compose -f compose.production.yml --env-file .env.production up -d
# View logs
docker-compose -f compose.production.yml logs -f
# Restart specific service
docker-compose -f compose.production.yml restart frontend
# Stop all services
docker-compose -f compose.production.yml down
# Update images
docker-compose -f compose.production.yml pull
docker-compose -f compose.production.yml up -d
```
## Environment Variables
### Local Development (`.env`)
| Variable | Default | Description |
|----------|---------|-------------|
| `DB_DATABASE` | `sexy` | Database name |
| `DB_USER` | `sexy` | Database user |
| `DB_PASSWORD` | `sexy` | Database password |
| `DIRECTUS_SECRET` | - | Secret for Directus (min 32 chars) |
| `ADMIN_EMAIL` | `admin@sexy.pivoine.art` | Admin email |
| `ADMIN_PASSWORD` | `admin` | Admin password |
| `CORS_ORIGIN` | `http://localhost:3000` | CORS allowed origins |
See `.env.example` for full list.
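Pulling the defaults from the table above into a minimal local `.env` looks like this. The `DIRECTUS_SECRET` value is a placeholder (it must be at least 32 characters).

```shell
# Minimal local .env sketch using the defaults from the table above.
# DIRECTUS_SECRET is a placeholder, not a recommended value.
tmp=$(mktemp -d)
cat > "$tmp/.env" <<'EOF'
DB_DATABASE=sexy
DB_USER=sexy
DB_PASSWORD=sexy
DIRECTUS_SECRET=replace-me-with-a-32-char-random-secret
ADMIN_EMAIL=admin@sexy.pivoine.art
ADMIN_PASSWORD=admin
CORS_ORIGIN=http://localhost:3000
EOF
wc -l < "$tmp/.env"
```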
### Production (`.env.production`)
| Variable | Description | Required |
|----------|-------------|----------|
| `CORE_DB_HOST` | External PostgreSQL host | ✅ |
| `SEXY_DB_NAME` | Database name | ✅ |
| `DB_PASSWORD` | Database password | ✅ |
| `CORE_REDIS_HOST` | External Redis host | ✅ |
| `SEXY_DIRECTUS_SECRET` | Directus secret key | ✅ |
| `SEXY_TRAEFIK_HOST` | Domain name | ✅ |
| `EMAIL_SMTP_HOST` | SMTP server | ✅ |
| `EMAIL_SMTP_PASSWORD` | SMTP password | ✅ |
| `SEXY_FRONTEND_PUBLIC_API_URL` | Frontend API URL | ✅ |
| `SEXY_FRONTEND_PUBLIC_URL` | Frontend public URL | ✅ |
See `.env.production.example` for full list.
**Note:** All frontend-specific variables are prefixed with `SEXY_FRONTEND_` for clarity.
## Volumes
### Local Development
- `postgres-data` - PostgreSQL database
- `redis-data` - Redis persistence
- `directus-uploads` - Uploaded files
### Production
- `/var/www/sexy.pivoine.art/uploads` - Directus uploads
- `/var/www/sexy.pivoine.art` - Application code (frontend)
## Networks
### Local: `sexy-network`
- Bridge network
- Internal communication only
- Directus exposed on 8055
### Production: `compose_network`
- External network (pre-existing)
- Connects to Traefik
- No exposed ports (Traefik handles routing)
## Health Checks
All services include health checks:
**PostgreSQL:**
- Command: `pg_isready`
- Interval: 10s
**Redis:**
- Command: `redis-cli ping`
- Interval: 10s
**Directus:**
- Endpoint: `/server/health`
- Interval: 30s
**Frontend:**
- HTTP GET: `localhost:3000`
- Interval: 30s
## Troubleshooting
### Local Development
**Problem:** Directus won't start
```bash
# Check logs
docker-compose logs directus
# Common issues:
# 1. Database not ready - wait for postgres to be healthy
# 2. Wrong secret - check DIRECTUS_SECRET is at least 32 chars
```
**Problem:** Can't connect to database
```bash
# Check if postgres is running
docker-compose ps postgres
# Verify health
docker-compose exec postgres pg_isready -U sexy
```
**Problem:** Extensions not loading
```bash
# Rebuild bundle
pnpm --filter @sexy.pivoine.art/bundle build
# Verify volume mount
docker-compose exec directus ls -la /directus/extensions/
# Restart Directus
docker-compose restart directus
```
### Production
**Problem:** Services not accessible via domain
```bash
# Check Traefik labels
docker inspect sexy_frontend | grep traefik
# Verify compose_network exists
docker network ls | grep compose_network
# Check Traefik is running
docker ps | grep traefik
```
**Problem:** Can't connect to external database
```bash
# Test connection from Directus container
docker-compose exec directus sh
apk add postgresql-client
psql -h $CORE_DB_HOST -U $DB_USER -d $SEXY_DB_NAME
```
**Problem:** Frontend can't reach Directus API
```bash
# Check Directus is accessible
curl https://sexy.pivoine.art/api/server/health
# Verify CORS settings
# PUBLIC_API_URL should match the public Directus URL
```
## Migration from Old Setup
If migrating from `docker-compose.production.yml`:
1. **Rename environment variables** according to `.env.production.example`
2. **Update command** to use both compose files
3. **Verify Traefik labels** match your setup
4. **Test** with `docker-compose config` to see merged configuration
```bash
# Validate configuration
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production config
# Deploy
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production up -d
```
## Best Practices
### Local Development
1. Use default credentials (they're fine for local)
2. Keep `EXTENSIONS_AUTO_RELOAD=true` for quick iteration
3. Run frontend via `pnpm dev` for hot reload
4. Restart Directus after bundle changes
### Production
1. Use strong passwords for database and admin
2. Set `EXTENSIONS_AUTO_RELOAD=false` for stability
3. Use GHCR images for frontend
4. Enable Gzip compression via Traefik
5. Monitor logs regularly
6. Keep backups of uploads and database
## See Also
- [DOCKER.md](DOCKER.md) - Docker image documentation
- [QUICKSTART.md](QUICKSTART.md) - Quick start guide
- [CLAUDE.md](CLAUDE.md) - Development guide

DOCKER.md (deleted, 378 lines)

@@ -1,378 +0,0 @@
# Docker Deployment Guide
This guide covers building and deploying sexy.pivoine.art using Docker.
## Overview
The Dockerfile uses a multi-stage build process:
1. **Base stage**: Sets up Node.js and pnpm
2. **Builder stage**: Installs Rust, compiles WASM, builds all packages
3. **Runner stage**: Minimal production image with only runtime dependencies
## Prerequisites
- Docker 20.10+ with BuildKit support
- Docker Compose 2.0+ (optional, for orchestration)
## Building the Image
### Basic Build
```bash
docker build -t sexy.pivoine.art:latest .
```
### Build with Build Arguments
```bash
docker build \
--build-arg NODE_ENV=production \
-t sexy.pivoine.art:latest \
.
```
### Multi-platform Build (for ARM64 and AMD64)
```bash
docker buildx build \
--platform linux/amd64,linux/arm64 \
-t sexy.pivoine.art:latest \
--push \
.
```
## Running the Container
### Run with Environment Variables
```bash
docker run -d \
--name sexy-pivoine-frontend \
-p 3000:3000 \
-e PUBLIC_API_URL=https://api.pivoine.art \
-e PUBLIC_URL=https://sexy.pivoine.art \
-e PUBLIC_UMAMI_ID=your-umami-id \
-e LETTERSPACE_API_URL=https://api.letterspace.com/v1 \
-e LETTERSPACE_API_KEY=your-api-key \
-e LETTERSPACE_LIST_ID=your-list-id \
sexy.pivoine.art:latest
```
### Run with Environment File
```bash
# Create .env.production from template
cp .env.production.example .env.production
# Edit .env.production with your values
nano .env.production
# Run container
docker run -d \
--name sexy-pivoine-frontend \
-p 3000:3000 \
--env-file .env.production \
sexy.pivoine.art:latest
```
## Docker Compose Deployment
### Using docker-compose.production.yml
```bash
# 1. Create environment file
cp .env.production.example .env.production
# 2. Edit environment variables
nano .env.production
# 3. Build and start
docker-compose -f docker-compose.production.yml up -d --build
# 4. View logs
docker-compose -f docker-compose.production.yml logs -f frontend
# 5. Stop services
docker-compose -f docker-compose.production.yml down
```
### Scale the Application
```bash
docker-compose -f docker-compose.production.yml up -d --scale frontend=3
```
## Environment Variables Reference
### Required Variables
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_API_URL` | Directus API backend URL | `https://api.pivoine.art` |
| `PUBLIC_URL` | Frontend application URL | `https://sexy.pivoine.art` |
### Optional Variables
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_UMAMI_ID` | Umami analytics tracking ID | `abc123def-456` |
| `LETTERSPACE_API_URL` | Letterspace API endpoint | `https://api.letterspace.com/v1` |
| `LETTERSPACE_API_KEY` | Letterspace authentication key | `sk_live_...` |
| `LETTERSPACE_LIST_ID` | Mailing list identifier | `list_abc123` |
| `PORT` | Application port (inside container) | `3000` |
| `HOST` | Host binding | `0.0.0.0` |
| `NODE_ENV` | Node environment | `production` |
## Health Checks
The container includes a built-in health check that pings the HTTP server every 30 seconds:
```bash
# Check container health
docker inspect --format='{{.State.Health.Status}}' sexy-pivoine-frontend
# View health check logs
docker inspect --format='{{json .State.Health}}' sexy-pivoine-frontend | jq
```
## Logs and Debugging
### View Container Logs
```bash
# Follow logs
docker logs -f sexy-pivoine-frontend
# Last 100 lines
docker logs --tail 100 sexy-pivoine-frontend
# With timestamps
docker logs -f --timestamps sexy-pivoine-frontend
```
### Execute Commands in Running Container
```bash
# Open shell
docker exec -it sexy-pivoine-frontend sh
# Check Node.js version
docker exec sexy-pivoine-frontend node --version
# Check environment variables
docker exec sexy-pivoine-frontend env
```
### Debug Build Issues
```bash
# Build with no cache
docker build --no-cache -t sexy.pivoine.art:latest .
# Build specific stage for debugging
docker build --target builder -t sexy.pivoine.art:builder .
# Inspect builder stage
docker run -it --rm sexy.pivoine.art:builder sh
```
## Production Best Practices
### 1. Use Specific Tags
```bash
# Tag with version
docker build -t sexy.pivoine.art:1.0.0 .
docker tag sexy.pivoine.art:1.0.0 sexy.pivoine.art:latest
```
### 2. Image Scanning
```bash
# Scan for vulnerabilities (requires Docker Scout or Trivy)
docker scout cves sexy.pivoine.art:latest
# Or with Trivy
trivy image sexy.pivoine.art:latest
```
### 3. Resource Limits
```bash
docker run -d \
--name sexy-pivoine-frontend \
-p 3000:3000 \
--memory="2g" \
--cpus="2" \
--env-file .env.production \
sexy.pivoine.art:latest
```
### 4. Restart Policies
```bash
docker run -d \
--name sexy-pivoine-frontend \
--restart=unless-stopped \
-p 3000:3000 \
--env-file .env.production \
sexy.pivoine.art:latest
```
### 5. Use Docker Secrets (Docker Swarm)
```bash
# Create secrets
echo "your-api-key" | docker secret create letterspace_api_key -
# Deploy with secrets
docker service create \
--name sexy-pivoine-frontend \
--secret letterspace_api_key \
-p 3000:3000 \
sexy.pivoine.art:latest
```
## Optimization Tips
### Reduce Build Time
1. **Use BuildKit cache mounts** (already enabled in Dockerfile)
2. **Leverage layer caching** - structure Dockerfile to cache dependencies
3. **Use `.dockerignore`** - exclude unnecessary files from build context
### Reduce Image Size
Current optimizations:
- Multi-stage build (builder artifacts not in final image)
- Production-only dependencies (`pnpm install --prod`)
- Minimal base image (`node:20.19.1-slim`)
- Only necessary build artifacts copied to runner
Image size breakdown:
```bash
docker images sexy.pivoine.art:latest
```
## CI/CD Integration
### GitHub Actions (Automated)
This repository includes automated GitHub Actions workflows for building, scanning, and managing Docker images.
**Pre-configured workflows:**
- **Build & Push** (`.github/workflows/docker-build-push.yml`)
- Automatically builds and pushes to `ghcr.io/valknarxxx/sexy`
- Triggers on push to main/develop, version tags, and PRs
- Multi-platform builds (AMD64 + ARM64)
- Smart tagging: latest, branch names, semver, commit SHAs
- **Security Scan** (`.github/workflows/docker-scan.yml`)
- Daily vulnerability scans with Trivy
- Reports to GitHub Security tab
- Scans on every release
- **Cleanup** (`.github/workflows/cleanup-images.yml`)
- Weekly cleanup of old untagged images
- Keeps last 10 versions
**Using pre-built images:**
```bash
# Pull latest from GitHub Container Registry
docker pull ghcr.io/valknarxxx/sexy:latest
# Pull specific version
docker pull ghcr.io/valknarxxx/sexy:v1.0.0
# Run the image
docker run -d -p 3000:3000 --env-file .env.production ghcr.io/valknarxxx/sexy:latest
```
**Triggering builds:**
```bash
# Push to main → builds 'latest' tag
git push origin main
# Create version tag → builds semver tags
git tag v1.0.0 && git push origin v1.0.0
# Pull request → builds but doesn't push
```
See `.github/workflows/README.md` for detailed workflow documentation.
## Troubleshooting
### Build Fails at Rust Installation
**Problem**: Rust installation fails or times out
**Solution**:
- Check internet connectivity
- Use a Rust mirror if in restricted network
- Increase build timeout
### WASM Build Fails
**Problem**: `wasm-bindgen-cli` version mismatch
**Solution**:
```dockerfile
# In Dockerfile, pin wasm-bindgen-cli version
RUN cargo install wasm-bindgen-cli --version 0.2.103
```
### Container Exits Immediately
**Problem**: Container starts then exits
**Solution**: Check logs and verify:
```bash
docker logs sexy-pivoine-frontend
# Verify build output exists
docker run -it --rm sexy.pivoine.art:latest ls -la packages/frontend/build
```
### Port Already in Use
**Problem**: Port 3000 already bound
**Solution**:
```bash
# Use different host port
docker run -d -p 8080:3000 sexy.pivoine.art:latest
```
## Maintenance
### Clean Up
```bash
# Remove stopped containers
docker container prune
# Remove unused images
docker image prune -a
# Remove build cache
docker builder prune
# Complete cleanup (use with caution)
docker system prune -a --volumes
```
### Update Base Image
Regularly update the base Node.js image:
```bash
# Pull latest Node 20 LTS
docker pull node:20.19.1-slim
# Rebuild
docker build --pull -t sexy.pivoine.art:latest .
```

Dockerfile

@@ -14,9 +14,10 @@ WORKDIR /app
# Copy workspace configuration
COPY pnpm-workspace.yaml package.json pnpm-lock.yaml ./
# Copy .env to .env.production for proper svelte compiling
RUN mkdir -p ./packages/frontend
COPY packages/frontend/.env ./packages/frontend/.env.production
# Create env file with placeholder values so SvelteKit knows variable names at build time
# Actual values are injected at runtime via process.env (adapter-node)
RUN mkdir -p ./packages/frontend && \
printf 'PUBLIC_API_URL=\nPUBLIC_URL=\nPUBLIC_UMAMI_ID=\nPUBLIC_UMAMI_SCRIPT=\n' > ./packages/frontend/.env
# ============================================================================
# Builder stage - compile application with Rust/WASM support
@@ -63,9 +64,6 @@ RUN pnpm --filter @sexy.pivoine.art/buttplug build
# 3. Build frontend
RUN pnpm --filter @sexy.pivoine.art/frontend build
# 4. Build Directus bundle
RUN pnpm --filter @sexy.pivoine.art/bundle build
# Prune dev dependencies for production
RUN pnpm install -rP
@@ -77,7 +75,6 @@ FROM node:22.11.0-slim AS runner
# Install dumb-init for proper signal handling
RUN apt-get update && apt-get install -y \
dumb-init \
ffmpeg \
&& rm -rf /var/lib/apt/lists/*
# Create non-root user
@@ -95,18 +92,13 @@ COPY --from=builder --chown=node:node /app/pnpm-lock.yaml ./pnpm-lock.yaml
COPY --from=builder --chown=node:node /app/pnpm-workspace.yaml ./pnpm-workspace.yaml
# Create package directories
RUN mkdir -p packages/frontend packages/bundle packages/buttplug
RUN mkdir -p packages/frontend packages/buttplug
# Copy frontend artifacts
COPY --from=builder --chown=node:node /app/packages/frontend/build ./packages/frontend/build
COPY --from=builder --chown=node:node /app/packages/frontend/node_modules ./packages/frontend/node_modules
COPY --from=builder --chown=node:node /app/packages/frontend/package.json ./packages/frontend/package.json
# Copy bundle artifacts
COPY --from=builder --chown=node:node /app/packages/bundle/dist ./packages/bundle/dist
COPY --from=builder --chown=node:node /app/packages/bundle/node_modules ./packages/bundle/node_modules
COPY --from=builder --chown=node:node /app/packages/bundle/package.json ./packages/bundle/package.json
# Copy buttplug artifacts
COPY --from=builder --chown=node:node /app/packages/buttplug/dist ./packages/buttplug/dist
COPY --from=builder --chown=node:node /app/packages/buttplug/node_modules ./packages/buttplug/node_modules
@@ -124,9 +116,7 @@ ENV NODE_ENV=production \
ENV PUBLIC_API_URL="" \
PUBLIC_URL="" \
PUBLIC_UMAMI_ID="" \
LETTERSPACE_API_URL="" \
LETTERSPACE_API_KEY="" \
LETTERSPACE_LIST_ID=""
PUBLIC_UMAMI_SCRIPT=""
# Expose application port
EXPOSE 3000

Dockerfile.backend (new file, 59 lines)

@@ -0,0 +1,59 @@
# syntax=docker/dockerfile:1
# ============================================================================
# Builder stage
# ============================================================================
FROM node:22.11.0-slim AS builder
RUN npm install -g corepack@latest && corepack enable
WORKDIR /app
COPY pnpm-workspace.yaml package.json pnpm-lock.yaml ./
COPY packages/backend/package.json ./packages/backend/package.json
RUN pnpm install --frozen-lockfile --filter @sexy.pivoine.art/backend
COPY packages/backend ./packages/backend
RUN pnpm --filter @sexy.pivoine.art/backend build
RUN pnpm install -rP --filter @sexy.pivoine.art/backend
# ============================================================================
# Runner stage
# ============================================================================
FROM node:22.11.0-slim AS runner
RUN apt-get update && apt-get install -y \
dumb-init \
ffmpeg \
wget \
&& rm -rf /var/lib/apt/lists/*
RUN userdel -r node && \
groupadd -r -g 1000 node && \
useradd -r -u 1000 -g node -m -d /home/node -s /bin/bash node
WORKDIR /home/node/app
RUN mkdir -p packages/backend
COPY --from=builder --chown=node:node /app/packages/backend/dist ./packages/backend/dist
COPY --from=builder --chown=node:node /app/packages/backend/node_modules ./packages/backend/node_modules
COPY --from=builder --chown=node:node /app/packages/backend/package.json ./packages/backend/package.json
RUN mkdir -p /data/uploads && chown node:node /data/uploads
USER node
ENV NODE_ENV=production \
PORT=4000
EXPOSE 4000
HEALTHCHECK --interval=30s --timeout=5s --start-period=20s --retries=3 \
CMD wget --no-verbose --tries=1 --spider http://localhost:4000/health
ENTRYPOINT ["dumb-init", "--"]
CMD ["node", "packages/backend/dist/index.js"]


@@ -1,334 +0,0 @@
# Quick Start Guide
Get sexy.pivoine.art running in under 5 minutes using pre-built Docker images.
## Prerequisites
- Docker 20.10+
- Docker Compose 2.0+ (optional)
## Option 1: Docker Run (Fastest)
### Step 1: Pull the Image
```bash
docker pull ghcr.io/valknarxxx/sexy:latest
```
### Step 2: Create Environment File
```bash
cat > .env.production << EOF
PUBLIC_API_URL=https://api.your-domain.com
PUBLIC_URL=https://your-domain.com
PUBLIC_UMAMI_ID=
LETTERSPACE_API_URL=
LETTERSPACE_API_KEY=
LETTERSPACE_LIST_ID=
EOF
```
### Step 3: Run the Container
```bash
docker run -d \
--name sexy-pivoine \
-p 3000:3000 \
--env-file .env.production \
--restart unless-stopped \
ghcr.io/valknarxxx/sexy:latest
```
### Step 4: Verify
```bash
# Check if running
docker ps | grep sexy-pivoine
# Check logs
docker logs -f sexy-pivoine
# Test the application
curl http://localhost:3000
```
Your application is now running at `http://localhost:3000` 🎉
## Option 2: Docker Compose (Recommended)
### Step 1: Download docker-compose.production.yml
```bash
curl -O https://raw.githubusercontent.com/valknarxxx/sexy/main/docker-compose.production.yml
```
Or if you have the repository:
```bash
cd /path/to/sexy.pivoine.art
```
### Step 2: Create Environment File
```bash
cp .env.production.example .env.production
nano .env.production # Edit with your values
```
### Step 3: Start Services
```bash
docker-compose -f docker-compose.production.yml up -d
```
### Step 4: Monitor
```bash
# View logs
docker-compose -f docker-compose.production.yml logs -f
# Check status
docker-compose -f docker-compose.production.yml ps
```
Your application is now running at `http://localhost:3000` 🎉
## Accessing Private Images
If the image is in a private registry:
### Step 1: Create GitHub Personal Access Token
1. Go to https://github.com/settings/tokens
2. Click "Generate new token (classic)"
3. Select scope: `read:packages`
4. Generate and copy the token
### Step 2: Login to GitHub Container Registry
```bash
echo YOUR_GITHUB_TOKEN | docker login ghcr.io -u YOUR_GITHUB_USERNAME --password-stdin
```
### Step 3: Pull and Run
Now you can pull private images:
```bash
docker pull ghcr.io/valknarxxx/sexy:latest
```
## Environment Variables
### Required
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_API_URL` | Directus API endpoint | `https://api.pivoine.art` |
| `PUBLIC_URL` | Frontend URL | `https://sexy.pivoine.art` |
### Optional
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_UMAMI_ID` | Analytics tracking ID | `abc-123-def` |
| `LETTERSPACE_API_URL` | Newsletter API | `https://api.letterspace.com/v1` |
| `LETTERSPACE_API_KEY` | Newsletter API key | `sk_live_...` |
| `LETTERSPACE_LIST_ID` | Mailing list ID | `list_abc123` |
## Common Commands
### View Logs
```bash
# Follow logs (Docker Run)
docker logs -f sexy-pivoine
# Follow logs (Docker Compose)
docker-compose -f docker-compose.production.yml logs -f
```
### Restart Container
```bash
# Docker Run
docker restart sexy-pivoine
# Docker Compose
docker-compose -f docker-compose.production.yml restart
```
### Stop Container
```bash
# Docker Run
docker stop sexy-pivoine
# Docker Compose
docker-compose -f docker-compose.production.yml down
```
### Update to Latest Version
```bash
# Docker Run
docker pull ghcr.io/valknarxxx/sexy:latest
docker stop sexy-pivoine
docker rm sexy-pivoine
# Then re-run the docker run command from Step 3
# Docker Compose
docker-compose -f docker-compose.production.yml pull
docker-compose -f docker-compose.production.yml up -d
```
### Shell Access
```bash
# Docker Run
docker exec -it sexy-pivoine sh
# Docker Compose
docker-compose -f docker-compose.production.yml exec frontend sh
```
## Available Image Tags
| Tag | Description | Use Case |
|-----|-------------|----------|
| `latest` | Latest stable build from main | Production |
| `v1.0.0` | Specific version | Production (pinned) |
| `develop` | Latest from develop branch | Staging |
| `main-abc123` | Specific commit | Testing |
**Best Practice:** Use version tags in production for predictable deployments.
## Production Deployment
### 1. Use Version Tags
```bash
# Instead of :latest
docker pull ghcr.io/valknarxxx/sexy:v1.0.0
```
### 2. Add Resource Limits
```bash
docker run -d \
--name sexy-pivoine \
-p 3000:3000 \
--env-file .env.production \
--memory="2g" \
--cpus="2" \
--restart unless-stopped \
ghcr.io/valknarxxx/sexy:v1.0.0
```
### 3. Use a Reverse Proxy
Example with nginx:
```nginx
server {
listen 80;
server_name sexy.pivoine.art;
location / {
proxy_pass http://localhost:3000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
}
```
### 4. Enable HTTPS
Use Certbot or similar:
```bash
certbot --nginx -d sexy.pivoine.art
```
## Health Check
The container includes a built-in health check:
```bash
# Check container health
docker inspect --format='{{.State.Health.Status}}' sexy-pivoine
```
Possible statuses:
- `starting` - Container just started
- `healthy` - Application is responding
- `unhealthy` - Application is not responding
## Troubleshooting
### Container Exits Immediately
```bash
# Check logs
docker logs sexy-pivoine
# Common issues:
# - Missing environment variables
# - Port 3000 already in use
# - Invalid environment variable values
```
### Cannot Pull Image
```bash
# For private images, ensure you're logged in
docker login ghcr.io
# Check if image exists
docker pull ghcr.io/valknarxxx/sexy:latest
```
### Port Already in Use
```bash
# Use a different port
docker run -d -p 8080:3000 ghcr.io/valknarxxx/sexy:latest
# Or find what's using port 3000
lsof -i :3000
```
### Application Not Accessible
```bash
# Check if container is running
docker ps | grep sexy-pivoine
# Check logs
docker logs sexy-pivoine
# Verify port mapping
docker port sexy-pivoine
# Test from inside container
docker exec sexy-pivoine wget -O- http://localhost:3000
```
## Next Steps
- **Production setup:** See [DOCKER.md](DOCKER.md)
- **Development:** See [CLAUDE.md](CLAUDE.md)
- **CI/CD:** See [.github/workflows/README.md](.github/workflows/README.md)
## Support
- **Issues:** https://github.com/valknarxxx/sexy/issues
- **Discussions:** https://github.com/valknarxxx/sexy/discussions
- **Security:** Report privately via GitHub Security tab
## License
See [LICENSE](LICENSE) file for details.


@@ -38,7 +38,6 @@ Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom
- 📱 **Responsive Design** that looks sexy on any device
- 🌍 **Internationalization** — pleasure speaks all languages
- 📊 **Analytics Integration** (Umami) — know your admirers
- 📧 **Newsletter Integration** (Letterspace) — stay connected
<div align="center">
@@ -212,10 +211,8 @@ pnpm --filter @sexy.pivoine.art/frontend start
### 💜 Optional (The Extras)
- `PUBLIC_UMAMI_ID` — Analytics tracking
- `LETTERSPACE_API_URL` — Newsletter endpoint
- `LETTERSPACE_API_KEY` — Newsletter key
- `LETTERSPACE_LIST_ID` — Mailing list
- `PUBLIC_UMAMI_ID` — Analytics tracking ID
- `PUBLIC_UMAMI_SCRIPT` — Umami script URL
See [.env.production.example](.env.production.example) for the full configuration.


@@ -1,265 +0,0 @@
# 🔄 Rebuild Guide - When You Need to Rebuild the Image
## Why Rebuild?
SvelteKit's `PUBLIC_*` environment variables are **baked into the JavaScript** at build time. You need to rebuild when:
1. ✅ Changing `PUBLIC_API_URL`
2. ✅ Changing `PUBLIC_URL`
3. ✅ Changing `PUBLIC_UMAMI_ID`
4. ✅ Changing any `LETTERSPACE_*` variables
5. ❌ NOT needed for Directus env vars (those are runtime)
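The build-time bake-in can be pictured with a minimal shell sketch (a simplified model of what the bundler does; the values are illustrative, not SvelteKit's actual mechanism):

```bash
# Simplified model of build-time inlining (illustrative only).
SOURCE='fetch(`${PUBLIC_API_URL}/server/health`)'   # app source code
PUBLIC_API_URL='https://sexy.pivoine.art/api'       # value present at build time

# "Building" replaces the placeholder with a string literal in the shipped JS:
BUNDLED=${SOURCE//'${PUBLIC_API_URL}'/$PUBLIC_API_URL}
echo "$BUNDLED"

# Runtime variables, by contrast, are read when the process starts,
# so they can change between restarts without a rebuild:
echo "${DB_HOST:-postgres}"
```

Changing `PUBLIC_API_URL` after `BUNDLED` has been produced has no effect on it, which is exactly why the image must be rebuilt.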
## Quick Rebuild Process
### 1. Update Frontend Environment Variables
Edit the frontend `.env` file:
```bash
nano packages/frontend/.env
```
Set your production values:
```bash
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
PUBLIC_UMAMI_ID=your-umami-id
LETTERSPACE_API_URL=https://api.letterspace.com/v1
LETTERSPACE_API_KEY=your-key
LETTERSPACE_LIST_ID=your-list-id
```
### 2. Rebuild the Image
```bash
# From the project root
docker build -t ghcr.io/valknarxxx/sexy:latest -t sexy.pivoine.art:latest .
```
**Expected Time:** 30-45 minutes (first build), 10-15 minutes (cached rebuild)
### 3. Restart Services
```bash
# If using docker-compose
cd /home/valknar/Projects/docker-compose/sexy
docker compose down
docker compose up -d
# Or directly
docker stop sexy_frontend
docker rm sexy_frontend
docker compose up -d frontend
```
## Monitoring the Build
### Check Build Progress
```bash
# Watch build output
docker build -t ghcr.io/valknarxxx/sexy:latest .
# Build stages:
# 1. Base (~30s) - Node.js setup
# 2. Builder (~25-40min) - Rust + WASM + packages
# - Rust installation: ~2-3 min
# - wasm-bindgen-cli: ~10-15 min
# - WASM build: ~5-10 min
# - Package builds: ~5-10 min
# 3. Runner (~2min) - Final image assembly
```
### Verify Environment Variables in Built Image
```bash
# Check what PUBLIC_API_URL is baked in
docker run --rm ghcr.io/valknarxxx/sexy:latest sh -c \
"grep -r 'PUBLIC_API_URL' /home/node/app/packages/frontend/build/ | head -3"
# Should show: https://sexy.pivoine.art/api
```
## Push to GitHub Container Registry
After successful build:
```bash
# Login to GHCR (first time only)
echo $GITHUB_TOKEN | docker login ghcr.io -u valknarxxx --password-stdin
# Push the image
docker push ghcr.io/valknarxxx/sexy:latest
```
## Alternative: Build Arguments (Future Enhancement)
To avoid rebuilding for every env change, consider adding build arguments:
```dockerfile
# In Dockerfile, before building frontend:
ARG PUBLIC_API_URL=https://sexy.pivoine.art/api
ARG PUBLIC_URL=https://sexy.pivoine.art
ARG PUBLIC_UMAMI_ID=
# Create .env.production dynamically
RUN echo "PUBLIC_API_URL=${PUBLIC_API_URL}" > packages/frontend/.env.production && \
echo "PUBLIC_URL=${PUBLIC_URL}" >> packages/frontend/.env.production && \
echo "PUBLIC_UMAMI_ID=${PUBLIC_UMAMI_ID}" >> packages/frontend/.env.production
```
Then build with:
```bash
docker build \
--build-arg PUBLIC_API_URL=https://sexy.pivoine.art/api \
--build-arg PUBLIC_URL=https://sexy.pivoine.art \
-t ghcr.io/valknarxxx/sexy:latest .
```
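Assuming the build-argument approach above, the generated file can be sanity-checked outside Docker with a plain-shell reproduction (paths and values are illustrative):

```bash
# Reproduce the RUN step locally to inspect the generated file (demo path).
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
mkdir -p /tmp/frontend-demo
echo "PUBLIC_API_URL=${PUBLIC_API_URL}" > /tmp/frontend-demo/.env.production
echo "PUBLIC_URL=${PUBLIC_URL}" >> /tmp/frontend-demo/.env.production
cat /tmp/frontend-demo/.env.production
```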
## Troubleshooting
### Build Fails at Rust Installation
```bash
# Check network connectivity
ping -c 3 sh.rustup.rs
# Build with verbose output
docker build --progress=plain -t ghcr.io/valknarxxx/sexy:latest .
```
### Build Fails at WASM
```bash
# Check if wasm-bindgen-cli matches package.json version
docker run --rm rust:latest cargo install wasm-bindgen-cli --version 0.2.103
```
### Frontend Still Shows Wrong URL
```bash
# Verify .env file is correct
cat packages/frontend/.env
# Check if old image is cached
docker images | grep sexy
docker rmi ghcr.io/valknarxxx/sexy:old-tag
# Force rebuild without cache
docker build --no-cache -t ghcr.io/valknarxxx/sexy:latest .
```
### Container Starts But Can't Connect to API
1. Check Traefik routing:
```bash
docker logs traefik | grep sexy
```
2. Check if Directus is accessible:
```bash
curl -I https://sexy.pivoine.art/api/server/health
```
3. Check frontend logs:
```bash
docker logs sexy_frontend
```
## Development vs Production
### Development (Local)
- Use `pnpm dev` for hot reload
- No rebuild needed for code changes
- Env vars from `.env` or shell
### Production (Docker)
- Rebuild required for PUBLIC_* changes
- Changes baked into JavaScript
- Env vars from `packages/frontend/.env`
## Optimization Tips
### Speed Up Rebuilds
1. **Use BuildKit cache:**
```bash
export DOCKER_BUILDKIT=1
docker build --build-arg BUILDKIT_INLINE_CACHE=1 -t ghcr.io/valknarxxx/sexy:latest .
```
2. **Multi-stage caching:**
- Dockerfile already optimized with multi-stage build
- Dependencies cached separately from code
3. **Parallel builds:**
```bash
# Build with more CPU cores
docker build --cpus 4 -t ghcr.io/valknarxxx/sexy:latest .
```
### Reduce Image Size
Current optimizations:
- ✅ Multi-stage build
- ✅ Production dependencies only
- ✅ Minimal base image
- ✅ No dev tools in final image
Expected sizes:
- Base: ~100MB
- Builder: ~2-3GB (not shipped)
- Runner: ~300-500MB (final)
## Automation
### GitHub Actions (Already Set Up)
The `.github/workflows/docker-build-push.yml` automatically:
1. Builds on push to main
2. Creates version tags
3. Pushes to GHCR
4. Caches layers for faster builds
**Trigger a rebuild:**
```bash
git tag v1.0.1
git push origin v1.0.1
```
### Local Build Script
Use the provided `build.sh`:
```bash
./build.sh -t v1.0.0 -p
```
## When NOT to Rebuild
You DON'T need to rebuild for:
- ❌ Directus configuration changes
- ❌ Database credentials
- ❌ Redis settings
- ❌ SMTP settings
- ❌ Session cookie settings
- ❌ Traefik labels
These are runtime environment variables and can be changed in docker-compose.
## Summary
| Change | Rebuild Needed | How to Apply |
|--------|----------------|--------------|
| `PUBLIC_API_URL` | ✅ Yes | Rebuild image |
| `PUBLIC_URL` | ✅ Yes | Rebuild image |
| `PUBLIC_UMAMI_ID` | ✅ Yes | Rebuild image |
| `LETTERSPACE_*` | ✅ Yes | Rebuild image |
| `SEXY_DIRECTUS_*` | ❌ No | Restart container |
| `DB_*` | ❌ No | Restart container |
| `EMAIL_*` | ❌ No | Restart container |
| Traefik labels | ❌ No | Restart container |
---
**Remember:** The key difference is **build-time** (compiled into JS) vs **runtime** (read from environment).

build.sh

@@ -1,130 +0,0 @@
#!/bin/bash
# Build script for sexy.pivoine.art Docker image
set -e # Exit on error
# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
# Default values
IMAGE_NAME="sexy.pivoine.art"
TAG="latest"
PUSH=false
PLATFORM=""
# Parse arguments
while [[ $# -gt 0 ]]; do
case $1 in
-t|--tag)
TAG="$2"
shift 2
;;
-n|--name)
IMAGE_NAME="$2"
shift 2
;;
-p|--push)
PUSH=true
shift
;;
--platform)
PLATFORM="$2"
shift 2
;;
-h|--help)
echo "Usage: $0 [OPTIONS]"
echo ""
echo "Options:"
echo " -t, --tag TAG Set image tag (default: latest)"
echo " -n, --name NAME Set image name (default: sexy.pivoine.art)"
echo " -p, --push Push image after build"
echo " --platform PLATFORM Build for specific platform (e.g., linux/amd64,linux/arm64)"
echo " -h, --help Show this help message"
echo ""
echo "Examples:"
echo " $0 # Build with defaults"
echo " $0 -t v1.0.0 # Build with version tag"
echo " $0 --platform linux/amd64,linux/arm64 -p # Multi-platform build and push"
exit 0
;;
*)
echo -e "${RED}Unknown option: $1${NC}"
exit 1
;;
esac
done
FULL_IMAGE="${IMAGE_NAME}:${TAG}"
echo -e "${GREEN}=== Building Docker Image ===${NC}"
echo "Image: ${FULL_IMAGE}"
echo "Platform: ${PLATFORM:-default}"
echo ""
# Check if Docker is running
if ! docker info > /dev/null 2>&1; then
echo -e "${RED}Error: Docker is not running${NC}"
exit 1
fi
# Build command
BUILD_CMD="docker build"
if [ -n "$PLATFORM" ]; then
# Multi-platform build requires buildx
echo -e "${YELLOW}Using buildx for multi-platform build${NC}"
BUILD_CMD="docker buildx build --platform ${PLATFORM}"
if [ "$PUSH" = true ]; then
BUILD_CMD="${BUILD_CMD} --push"
fi
else
# Regular build
if [ "$PUSH" = true ]; then
echo -e "${YELLOW}Note: --push only works with multi-platform builds. Use 'docker push' after build.${NC}"
fi
fi
# Execute build
echo -e "${GREEN}Building...${NC}"
$BUILD_CMD -t "${FULL_IMAGE}" .
if [ $? -eq 0 ]; then
echo -e "${GREEN}✓ Build successful!${NC}"
echo "Image: ${FULL_IMAGE}"
# Show image size
if [ -z "$PLATFORM" ]; then
SIZE=$(docker images "${FULL_IMAGE}" --format "{{.Size}}")
echo "Size: ${SIZE}"
fi
# Push if requested and not multi-platform
if [ "$PUSH" = true ] && [ -z "$PLATFORM" ]; then
echo -e "${GREEN}Pushing image...${NC}"
docker push "${FULL_IMAGE}"
if [ $? -eq 0 ]; then
echo -e "${GREEN}✓ Push successful!${NC}"
else
echo -e "${RED}✗ Push failed${NC}"
exit 1
fi
fi
echo ""
echo -e "${GREEN}Next steps:${NC}"
echo "1. Run locally:"
echo " docker run -d -p 3000:3000 --env-file .env.production ${FULL_IMAGE}"
echo ""
echo "2. Run with docker-compose:"
echo " docker-compose -f docker-compose.production.yml up -d"
echo ""
echo "3. View logs:"
echo " docker logs -f <container-name>"
else
echo -e "${RED}✗ Build failed${NC}"
exit 1
fi


@@ -1,130 +0,0 @@
include:
- compose.yml
# Production compose file - extends base compose.yml
# Usage: docker-compose -f compose.production.yml up -d
networks:
compose_network:
external: true
name: compose_network
services:
# Disable local postgres for production (use external DB)
postgres:
deploy:
replicas: 0
# Disable local redis for production (use external Redis)
redis:
deploy:
replicas: 0
# Override Directus for production
directus:
networks:
- compose_network
ports: [] # Remove exposed ports, use Traefik instead
# Override volumes for production paths
volumes:
- ${SEXY_DIRECTUS_UPLOADS:-./uploads}:/directus/uploads
- ${SEXY_DIRECTUS_BUNDLE:-./packages/bundle/dist}:/directus/extensions/sexy.pivoine.art
# Override environment for production settings
environment:
# Database (external)
DB_HOST: ${CORE_DB_HOST}
DB_PORT: ${CORE_DB_PORT:-5432}
DB_DATABASE: ${SEXY_DB_NAME}
DB_USER: ${DB_USER}
DB_PASSWORD: ${DB_PASSWORD}
# General
SECRET: ${SEXY_DIRECTUS_SECRET}
ADMIN_EMAIL: ${ADMIN_EMAIL}
ADMIN_PASSWORD: ${ADMIN_PASSWORD}
PUBLIC_URL: ${SEXY_PUBLIC_URL}
# Cache (external Redis)
REDIS: redis://${CORE_REDIS_HOST}:${CORE_REDIS_PORT:-6379}
# CORS
CORS_ORIGIN: ${SEXY_CORS_ORIGIN}
# Security (production settings)
SESSION_COOKIE_SECURE: ${SEXY_SESSION_COOKIE_SECURE:-true}
SESSION_COOKIE_SAME_SITE: ${SEXY_SESSION_COOKIE_SAME_SITE:-strict}
SESSION_COOKIE_DOMAIN: ${SEXY_SESSION_COOKIE_DOMAIN}
# Extensions
EXTENSIONS_AUTO_RELOAD: ${SEXY_EXTENSIONS_AUTO_RELOAD:-false}
# Email (production SMTP)
EMAIL_TRANSPORT: ${EMAIL_TRANSPORT:-smtp}
EMAIL_FROM: ${EMAIL_FROM}
EMAIL_SMTP_HOST: ${EMAIL_SMTP_HOST}
EMAIL_SMTP_PORT: ${EMAIL_SMTP_PORT:-587}
EMAIL_SMTP_USER: ${EMAIL_SMTP_USER}
EMAIL_SMTP_PASSWORD: ${EMAIL_SMTP_PASSWORD}
# User URLs
USER_REGISTER_URL_ALLOW_LIST: ${SEXY_USER_REGISTER_URL_ALLOW_LIST}
PASSWORD_RESET_URL_ALLOW_LIST: ${SEXY_PASSWORD_RESET_URL_ALLOW_LIST}
# Remove local dependencies
depends_on: []
labels:
# Traefik labels for reverse proxy
- 'traefik.enable=${SEXY_TRAEFIK_ENABLED:-true}'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-redirect-web-secure'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.rule=Host(`${SEXY_TRAEFIK_HOST}`) && PathPrefix(`/api`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.entrypoints=web'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.rule=Host(`${SEXY_TRAEFIK_HOST}`) && PathPrefix(`/api`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure-compress.compress=true'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-strip.stripprefix.prefixes=/api'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-strip,${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure-compress'
- 'traefik.http.services.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.loadbalancer.server.port=8055'
- 'traefik.docker.network=compose_network'
# Override Frontend for production
frontend:
networks:
- compose_network
ports: [] # Remove exposed ports, use Traefik instead
# Override environment for production
environment:
NODE_ENV: production
PUBLIC_API_URL: ${SEXY_FRONTEND_PUBLIC_API_URL}
PUBLIC_URL: ${SEXY_FRONTEND_PUBLIC_URL}
PUBLIC_UMAMI_ID: ${SEXY_FRONTEND_PUBLIC_UMAMI_ID:-}
LETTERSPACE_API_URL: ${SEXY_FRONTEND_LETTERSPACE_API_URL:-}
LETTERSPACE_API_KEY: ${SEXY_FRONTEND_LETTERSPACE_API_KEY:-}
LETTERSPACE_LIST_ID: ${SEXY_FRONTEND_LETTERSPACE_LIST_ID:-}
# Override volume for production path
volumes:
- ${SEXY_FRONTEND_PATH:-/var/www/sexy.pivoine.art}:/home/node/app
# Remove local dependency
depends_on: []
labels:
# Traefik labels for reverse proxy
- 'traefik.enable=${SEXY_TRAEFIK_ENABLED:-true}'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-redirect-web-secure'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.rule=Host(`${SEXY_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.entrypoints=web'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.rule=Host(`${SEXY_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure-compress.compress=true'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure-compress'
- 'traefik.http.services.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.loadbalancer.server.port=3000'
- 'traefik.docker.network=compose_network'


@@ -1,183 +1,87 @@
name: sexy
services:
# PostgreSQL Database (local only)
postgres:
image: postgres:16-alpine
container_name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_postgres
container_name: sexy_postgres
restart: unless-stopped
networks:
- sexy-network
volumes:
- postgres-data:/var/lib/postgresql/data
- postgres_data:/var/lib/postgresql/data
environment:
POSTGRES_DB: ${DB_DATABASE:-sexy}
POSTGRES_USER: ${DB_USER:-sexy}
POSTGRES_PASSWORD: ${DB_PASSWORD:-sexy}
POSTGRES_DB: sexy
POSTGRES_USER: sexy
POSTGRES_PASSWORD: sexy
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${DB_USER:-sexy}"]
test: ["CMD-SHELL", "pg_isready -U sexy"]
interval: 10s
timeout: 5s
retries: 5
# Redis Cache (local only)
redis:
image: redis:7-alpine
container_name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_redis
container_name: sexy_redis
restart: unless-stopped
networks:
- sexy-network
volumes:
- redis-data:/data
- redis_data:/data
command: redis-server --appendonly yes
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 5
# Directus CMS
directus:
image: ${SEXY_DIRECTUS_IMAGE:-directus/directus:11}
container_name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_api
backend:
build:
context: .
dockerfile: Dockerfile.backend
container_name: sexy_backend
restart: unless-stopped
networks:
- sexy-network
ports:
- "8055:8055"
- "4000:4000"
volumes:
- directus-uploads:/directus/uploads
- ${SEXY_DIRECTUS_BUNDLE:-./packages/bundle}:/directus/extensions/sexy.pivoine.art
- uploads_data:/data/uploads
environment:
# Database
DB_CLIENT: pg
DB_HOST: ${CORE_DB_HOST:-postgres}
DB_PORT: ${CORE_DB_PORT:-5432}
DB_DATABASE: ${SEXY_DB_NAME:-sexy}
DB_USER: ${DB_USER:-sexy}
DB_PASSWORD: ${DB_PASSWORD:-sexy}
# General
SECRET: ${SEXY_DIRECTUS_SECRET:-replace-with-random-secret-min-32-chars}
ADMIN_EMAIL: ${ADMIN_EMAIL:-admin@sexy.pivoine.art}
ADMIN_PASSWORD: ${ADMIN_PASSWORD:-admin}
PUBLIC_URL: ${SEXY_PUBLIC_URL:-http://localhost:8055}
# Cache
CACHE_ENABLED: ${SEXY_CACHE_ENABLED:-true}
CACHE_AUTO_PURGE: ${SEXY_CACHE_AUTO_PURGE:-true}
CACHE_STORE: redis
REDIS: redis://${CORE_REDIS_HOST:-redis}:${CORE_REDIS_PORT:-6379}
# CORS
CORS_ENABLED: ${SEXY_CORS_ENABLED:-true}
CORS_ORIGIN: ${SEXY_CORS_ORIGIN:-http://localhost:3000}
# Security
SESSION_COOKIE_SECURE: ${SEXY_SESSION_COOKIE_SECURE:-false}
SESSION_COOKIE_SAME_SITE: ${SEXY_SESSION_COOKIE_SAME_SITE:-lax}
SESSION_COOKIE_DOMAIN: ${SEXY_SESSION_COOKIE_DOMAIN:-localhost}
# Extensions
EXTENSIONS_PATH: ${SEXY_EXTENSIONS_PATH:-/directus/extensions}
EXTENSIONS_AUTO_RELOAD: ${SEXY_EXTENSIONS_AUTO_RELOAD:-true}
# WebSockets
WEBSOCKETS_ENABLED: ${SEXY_WEBSOCKETS_ENABLED:-true}
# Email (optional for local dev)
EMAIL_TRANSPORT: ${EMAIL_TRANSPORT:-sendmail}
EMAIL_FROM: ${EMAIL_FROM:-noreply@sexy.pivoine.art}
EMAIL_SMTP_HOST: ${EMAIL_SMTP_HOST:-}
EMAIL_SMTP_PORT: ${EMAIL_SMTP_PORT:-587}
EMAIL_SMTP_USER: ${EMAIL_SMTP_USER:-}
EMAIL_SMTP_PASSWORD: ${EMAIL_SMTP_PASSWORD:-}
# User Registration & Password Reset URLs
USER_REGISTER_URL_ALLOW_LIST: ${SEXY_USER_REGISTER_URL_ALLOW_LIST:-http://localhost:3000}
PASSWORD_RESET_URL_ALLOW_LIST: ${SEXY_PASSWORD_RESET_URL_ALLOW_LIST:-http://localhost:3000}
# Content Security Policy
CONTENT_SECURITY_POLICY_DIRECTIVES__FRAME_SRC: ${SEXY_CONTENT_SECURITY_POLICY_DIRECTIVES__FRAME_SRC:-}
# Timezone
TZ: ${TIMEZONE:-Europe/Amsterdam}
DATABASE_URL: postgresql://sexy:sexy@sexy_postgres:5432/sexy
REDIS_URL: redis://sexy_redis:6379
UPLOAD_DIR: /data/uploads
CORS_ORIGIN: http://localhost:3000
PORT: 4000
NODE_ENV: production
COOKIE_SECRET: change-me-in-production
SMTP_HOST: localhost
SMTP_PORT: 587
EMAIL_FROM: noreply@sexy.pivoine.art
PUBLIC_URL: http://localhost:3000
depends_on:
postgres:
condition: service_healthy
redis:
condition: service_healthy
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8055/server/health"]
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:4000/health"]
interval: 30s
timeout: 10s
retries: 3
start_period: 40s
# Frontend (local development - optional, usually run via pnpm dev)
start_period: 20s
frontend:
image: ${SEXY_FRONTEND_IMAGE:-ghcr.io/valknarxxx/sexy:latest}
container_name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_frontend
build:
context: .
dockerfile: Dockerfile
container_name: sexy_frontend
restart: unless-stopped
user: node
working_dir: /home/node/app/packages/frontend
networks:
- sexy-network
ports:
- "3000:3000"
environment:
# Node
NODE_ENV: ${NODE_ENV:-development}
NODE_ENV: production
PORT: 3000
HOST: 0.0.0.0
# Public environment variables
PUBLIC_API_URL: ${SEXY_FRONTEND_PUBLIC_API_URL:-http://localhost:8055}
PUBLIC_URL: ${SEXY_FRONTEND_PUBLIC_URL:-http://localhost:3000}
PUBLIC_UMAMI_ID: ${SEXY_FRONTEND_PUBLIC_UMAMI_ID:-}
# Letterspace newsletter integration
LETTERSPACE_API_URL: ${SEXY_FRONTEND_LETTERSPACE_API_URL:-}
LETTERSPACE_API_KEY: ${SEXY_FRONTEND_LETTERSPACE_API_KEY:-}
LETTERSPACE_LIST_ID: ${SEXY_FRONTEND_LETTERSPACE_LIST_ID:-}
# Timezone
TZ: ${TIMEZONE:-Europe/Amsterdam}
volumes:
- ${SEXY_FRONTEND_PATH:-./}:/home/node/app
command: ["node", "build/index.js"]
PUBLIC_API_URL: http://sexy_backend:4000
PUBLIC_URL: http://localhost:3000
depends_on:
- directus
healthcheck:
test: ["CMD", "node", "-e", "require('http').get('http://localhost:3000/', (r) => {process.exit(r.statusCode === 200 ? 0 : 1)})"]
interval: 30s
timeout: 3s
retries: 3
start_period: 40s
# Uncomment to run frontend in development mode with live reload
# build:
# context: .
# dockerfile: Dockerfile
# volumes:
# - ./packages/frontend:/home/node/app/packages/frontend
# - /home/node/app/packages/frontend/node_modules
# environment:
# NODE_ENV: development
networks:
sexy-network:
driver: bridge
name: ${SEXY_COMPOSE_PROJECT_NAME:-sexy}_network
backend:
condition: service_healthy
volumes:
directus-uploads:
uploads_data:
driver: local
postgres-data:
postgres_data:
driver: local
redis-data:
redis_data:
driver: local


@@ -102,34 +102,6 @@ collections:
versioning: false
schema:
name: sexy_videos_directus_users
- collection: sexy_recordings
meta:
accountability: all
archive_app_filter: true
archive_field: status
archive_value: archived
collapse: open
collection: sexy_recordings
color: null
display_template: null
group: null
hidden: false
icon: fiber_manual_record
item_duplication_fields: null
note: null
preview_url: null
singleton: false
sort: null
sort_field: null
translations:
- language: en-US
plural: Recordings
singular: Recording
translation: Sexy Recordings
unarchive_value: draft
versioning: false
schema:
name: sexy_recordings
fields:
- collection: directus_users
field: website
@@ -206,7 +178,7 @@ fields:
max_length: 255
numeric_precision: null
numeric_scale: null
is_nullable: true
is_nullable: false
is_unique: true
is_indexed: true
is_primary_key: false
@@ -1908,639 +1880,6 @@ fields:
has_auto_increment: false
foreign_key_table: directus_users
foreign_key_column: id
- collection: sexy_recordings
field: id
type: uuid
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: id
group: null
hidden: true
interface: input
note: null
options: null
readonly: true
required: false
sort: 1
special:
- uuid
translations: null
validation: null
validation_message: null
width: full
schema:
name: id
table: sexy_recordings
data_type: uuid
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: true
is_indexed: false
is_primary_key: true
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: status
type: string
meta:
collection: sexy_recordings
conditions: null
display: labels
display_options:
choices:
- background: var(--theme--primary-background)
color: var(--theme--primary)
foreground: var(--theme--primary)
text: $t:published
value: published
- background: var(--theme--background-normal)
color: var(--theme--foreground)
foreground: var(--theme--foreground)
text: $t:draft
value: draft
- background: var(--theme--warning-background)
color: var(--theme--warning)
foreground: var(--theme--warning)
text: $t:archived
value: archived
showAsDot: true
field: status
group: null
hidden: false
interface: select-dropdown
note: null
options:
choices:
- color: var(--theme--primary)
text: $t:published
value: published
- color: var(--theme--foreground)
text: $t:draft
value: draft
- color: var(--theme--warning)
text: $t:archived
value: archived
readonly: false
required: false
sort: 2
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: status
table: sexy_recordings
data_type: character varying
default_value: draft
max_length: 255
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: user_created
type: uuid
meta:
collection: sexy_recordings
conditions: null
display: user
display_options: null
field: user_created
group: null
hidden: true
interface: select-dropdown-m2o
note: null
options:
template: '{{avatar}} {{first_name}} {{last_name}}'
readonly: true
required: false
sort: 3
special:
- user-created
translations: null
validation: null
validation_message: null
width: half
schema:
name: user_created
table: sexy_recordings
data_type: uuid
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: directus_users
foreign_key_column: id
- collection: sexy_recordings
field: date_created
type: timestamp
meta:
collection: sexy_recordings
conditions: null
display: datetime
display_options:
relative: true
field: date_created
group: null
hidden: true
interface: datetime
note: null
options: null
readonly: true
required: false
sort: 4
special:
- date-created
translations: null
validation: null
validation_message: null
width: half
schema:
name: date_created
table: sexy_recordings
data_type: timestamp with time zone
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: date_updated
type: timestamp
meta:
collection: sexy_recordings
conditions: null
display: datetime
display_options:
relative: true
field: date_updated
group: null
hidden: true
interface: datetime
note: null
options: null
readonly: true
required: false
sort: 5
special:
- date-updated
translations: null
validation: null
validation_message: null
width: half
schema:
name: date_updated
table: sexy_recordings
data_type: timestamp with time zone
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: title
type: string
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: title
group: null
hidden: false
interface: input
note: null
options: null
readonly: false
required: true
sort: 6
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: title
table: sexy_recordings
data_type: character varying
default_value: null
max_length: 255
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: description
type: text
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: description
group: null
hidden: false
interface: input-multiline
note: null
options:
trim: true
readonly: false
required: false
sort: 7
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: description
table: sexy_recordings
data_type: text
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: slug
type: string
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: slug
group: null
hidden: false
interface: input
note: null
options:
slug: true
trim: true
readonly: false
required: true
sort: 8
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: slug
table: sexy_recordings
data_type: character varying
default_value: null
max_length: 255
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: true
is_indexed: true
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: duration
type: float
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: duration
group: null
hidden: false
interface: input
note: Duration in milliseconds
options: null
readonly: false
required: true
sort: 9
special: null
translations: null
validation: null
validation_message: null
width: full
schema:
name: duration
table: sexy_recordings
data_type: double precision
default_value: null
max_length: null
numeric_precision: 53
numeric_scale: null
is_nullable: false
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: events
type: json
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: events
group: null
hidden: false
interface: input-code
note: Array of recorded events with timestamps
options:
language: json
readonly: false
required: true
sort: 10
special:
- cast-json
translations: null
validation: null
validation_message: null
width: full
schema:
name: events
table: sexy_recordings
data_type: json
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: device_info
type: json
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: device_info
group: null
hidden: false
interface: input-code
note: Array of device metadata
options:
language: json
readonly: false
required: true
sort: 11
special:
- cast-json
translations: null
validation: null
validation_message: null
width: full
schema:
name: device_info
table: sexy_recordings
data_type: json
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: false
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: tags
type: json
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: tags
group: null
hidden: false
interface: tags
note: null
options: null
readonly: false
required: false
sort: 12
special:
- cast-json
translations: null
validation: null
validation_message: null
width: full
schema:
name: tags
table: sexy_recordings
data_type: json
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: linked_video
type: uuid
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: linked_video
group: null
hidden: false
interface: select-dropdown-m2o
note: null
options:
enableLink: true
readonly: false
required: false
sort: 13
special:
- m2o
translations: null
validation: null
validation_message: null
width: full
schema:
name: linked_video
table: sexy_recordings
data_type: uuid
default_value: null
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: sexy_videos
foreign_key_column: id
- collection: sexy_recordings
field: featured
type: boolean
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: featured
group: null
hidden: false
interface: boolean
note: null
options:
label: Featured
readonly: false
required: false
sort: 14
special:
- cast-boolean
translations: null
validation: null
validation_message: null
width: full
schema:
name: featured
table: sexy_recordings
data_type: boolean
default_value: false
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
- collection: sexy_recordings
field: public
type: boolean
meta:
collection: sexy_recordings
conditions: null
display: null
display_options: null
field: public
group: null
hidden: false
interface: boolean
note: null
options:
label: Public
readonly: false
required: false
sort: 15
special:
- cast-boolean
translations: null
validation: null
validation_message: null
width: full
schema:
name: public
table: sexy_recordings
data_type: boolean
default_value: false
max_length: null
numeric_precision: null
numeric_scale: null
is_nullable: true
is_unique: false
is_indexed: false
is_primary_key: false
is_generated: false
generation_expression: null
has_auto_increment: false
foreign_key_table: null
foreign_key_column: null
relations:
- collection: directus_users
field: banner
@@ -2773,45 +2112,3 @@ relations:
constraint_name: sexy_videos_directus_users_sexy_videos_id_foreign
on_update: NO ACTION
on_delete: SET NULL
- collection: sexy_recordings
field: user_created
related_collection: directus_users
meta:
junction_field: null
many_collection: sexy_recordings
many_field: user_created
one_allowed_collections: null
one_collection: directus_users
one_collection_field: null
one_deselect_action: nullify
one_field: null
sort_field: null
schema:
table: sexy_recordings
column: user_created
foreign_key_table: directus_users
foreign_key_column: id
constraint_name: sexy_recordings_user_created_foreign
on_update: NO ACTION
on_delete: NO ACTION
- collection: sexy_recordings
field: linked_video
related_collection: sexy_videos
meta:
junction_field: null
many_collection: sexy_recordings
many_field: linked_video
one_allowed_collections: null
one_collection: sexy_videos
one_collection_field: null
one_deselect_action: nullify
one_field: null
sort_field: null
schema:
table: sexy_recordings
column: linked_video
foreign_key_table: sexy_videos
foreign_key_column: id
constraint_name: sexy_recordings_linked_video_foreign
on_update: NO ACTION
on_delete: SET NULL

View File

@@ -1,177 +0,0 @@
-- Gamification System Schema for Sexy Recordings Platform
-- Created: 2025-10-28
-- Description: Recording-focused gamification with time-weighted scoring
-- ====================
-- Table: sexy_recording_plays
-- ====================
-- Tracks when users play recordings (similar to video plays)
CREATE TABLE IF NOT EXISTS sexy_recording_plays (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
recording_id UUID NOT NULL REFERENCES sexy_recordings(id) ON DELETE CASCADE,
duration_played INTEGER, -- Duration played in milliseconds
completed BOOLEAN DEFAULT FALSE, -- True if >= 90% watched
date_created TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
date_updated TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_recording_plays_user ON sexy_recording_plays(user_id);
CREATE INDEX IF NOT EXISTS idx_recording_plays_recording ON sexy_recording_plays(recording_id);
CREATE INDEX IF NOT EXISTS idx_recording_plays_date ON sexy_recording_plays(date_created);
COMMENT ON TABLE sexy_recording_plays IS 'Tracks user playback of recordings for analytics and gamification';
COMMENT ON COLUMN sexy_recording_plays.completed IS 'True if user watched at least 90% of the recording';
-- ====================
-- Table: sexy_user_points
-- ====================
-- Tracks individual point-earning actions with timestamps for time-weighted scoring
CREATE TABLE IF NOT EXISTS sexy_user_points (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
action VARCHAR(50) NOT NULL, -- e.g., "RECORDING_CREATE", "RECORDING_PLAY", "COMMENT_CREATE"
points INTEGER NOT NULL, -- Raw points earned
recording_id UUID REFERENCES sexy_recordings(id) ON DELETE SET NULL, -- Optional reference
date_created TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_user_points_user ON sexy_user_points(user_id);
CREATE INDEX IF NOT EXISTS idx_user_points_date ON sexy_user_points(date_created);
CREATE INDEX IF NOT EXISTS idx_user_points_action ON sexy_user_points(action);
COMMENT ON TABLE sexy_user_points IS 'Individual point-earning actions for gamification system';
COMMENT ON COLUMN sexy_user_points.action IS 'Type of action: RECORDING_CREATE, RECORDING_PLAY, RECORDING_COMPLETE, COMMENT_CREATE, RECORDING_FEATURED';
COMMENT ON COLUMN sexy_user_points.points IS 'Raw points before time-weighted decay calculation';
-- ====================
-- Table: sexy_achievements
-- ====================
-- Predefined achievement definitions
CREATE TABLE IF NOT EXISTS sexy_achievements (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
code VARCHAR(50) UNIQUE NOT NULL, -- Unique identifier (e.g., "first_recording", "recording_100")
name VARCHAR(255) NOT NULL, -- Display name
description TEXT, -- Achievement description
icon VARCHAR(255), -- Icon identifier or emoji
category VARCHAR(50) NOT NULL, -- e.g., "recordings", "playback", "social", "special"
required_count INTEGER, -- Number of actions needed to unlock
points_reward INTEGER DEFAULT 0, -- Bonus points awarded upon unlock
sort INTEGER DEFAULT 0, -- Display order
status VARCHAR(20) DEFAULT 'published' -- published, draft, archived
);
CREATE INDEX IF NOT EXISTS idx_achievements_category ON sexy_achievements(category);
CREATE INDEX IF NOT EXISTS idx_achievements_code ON sexy_achievements(code);
COMMENT ON TABLE sexy_achievements IS 'Predefined achievement definitions for gamification';
COMMENT ON COLUMN sexy_achievements.code IS 'Unique code used in backend logic (e.g., first_recording, play_100)';
COMMENT ON COLUMN sexy_achievements.category IS 'Achievement category: recordings, playback, social, special';
-- ====================
-- Table: sexy_user_achievements
-- ====================
-- Junction table tracking unlocked achievements per user
CREATE TABLE IF NOT EXISTS sexy_user_achievements (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
achievement_id UUID NOT NULL REFERENCES sexy_achievements(id) ON DELETE CASCADE,
progress INTEGER DEFAULT 0, -- Current progress toward unlocking
date_unlocked TIMESTAMP WITH TIME ZONE, -- NULL if not yet unlocked
UNIQUE(user_id, achievement_id)
);
CREATE INDEX IF NOT EXISTS idx_user_achievements_user ON sexy_user_achievements(user_id);
CREATE INDEX IF NOT EXISTS idx_user_achievements_achievement ON sexy_user_achievements(achievement_id);
CREATE INDEX IF NOT EXISTS idx_user_achievements_unlocked ON sexy_user_achievements(date_unlocked) WHERE date_unlocked IS NOT NULL;
COMMENT ON TABLE sexy_user_achievements IS 'Tracks which achievements users have unlocked';
COMMENT ON COLUMN sexy_user_achievements.progress IS 'Current progress (e.g., 7/10 recordings created)';
COMMENT ON COLUMN sexy_user_achievements.date_unlocked IS 'NULL if achievement not yet unlocked';
-- ====================
-- Table: sexy_user_stats
-- ====================
-- Cached aggregate statistics for efficient leaderboard queries
CREATE TABLE IF NOT EXISTS sexy_user_stats (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID UNIQUE NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
total_raw_points INTEGER DEFAULT 0, -- Sum of all points (no decay)
total_weighted_points NUMERIC(10,2) DEFAULT 0, -- Time-weighted score for rankings
recordings_count INTEGER DEFAULT 0, -- Number of published recordings
playbacks_count INTEGER DEFAULT 0, -- Number of recordings played
comments_count INTEGER DEFAULT 0, -- Number of comments on recordings
achievements_count INTEGER DEFAULT 0, -- Number of unlocked achievements
last_updated TIMESTAMP WITH TIME ZONE DEFAULT NOW() -- Cache timestamp
);
CREATE INDEX IF NOT EXISTS idx_user_stats_weighted ON sexy_user_stats(total_weighted_points DESC);
CREATE INDEX IF NOT EXISTS idx_user_stats_user ON sexy_user_stats(user_id);
COMMENT ON TABLE sexy_user_stats IS 'Cached user statistics for fast leaderboard queries';
COMMENT ON COLUMN sexy_user_stats.total_raw_points IS 'Sum of all points without time decay';
COMMENT ON COLUMN sexy_user_stats.total_weighted_points IS 'Time-weighted score using exponential decay (λ=0.005)';
COMMENT ON COLUMN sexy_user_stats.last_updated IS 'Timestamp for cache invalidation';
-- ====================
-- Insert Initial Achievements
-- ====================
-- 🎬 Recordings (Creation)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_recording', 'First Recording', 'Create your first recording', '🎬', 'recordings', 1, 50, 1),
('recording_10', 'Recording Enthusiast', 'Create 10 recordings', '📹', 'recordings', 10, 100, 2),
('recording_50', 'Prolific Creator', 'Create 50 recordings', '🎥', 'recordings', 50, 500, 3),
('recording_100', 'Recording Master', 'Create 100 recordings', '🏆', 'recordings', 100, 1000, 4),
('featured_recording', 'Featured Creator', 'Get a recording featured', '', 'recordings', 1, 200, 5)
ON CONFLICT (code) DO NOTHING;
-- ▶️ Playback (Consumption)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_play', 'First Play', 'Play your first recording', '▶️', 'playback', 1, 25, 10),
('play_100', 'Active Player', 'Play 100 recordings', '🎮', 'playback', 100, 250, 11),
('play_500', 'Playback Enthusiast', 'Play 500 recordings', '🔥', 'playback', 500, 1000, 12),
('completionist_10', 'Completionist', 'Complete 10 recordings to 90%+', '', 'playback', 10, 100, 13),
('completionist_100', 'Super Completionist', 'Complete 100 recordings', '💯', 'playback', 100, 500, 14)
ON CONFLICT (code) DO NOTHING;
-- 💬 Social (Community)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_comment', 'First Comment', 'Leave your first comment', '💬', 'social', 1, 25, 20),
('comment_50', 'Conversationalist', 'Leave 50 comments', '💭', 'social', 50, 200, 21),
('comment_250', 'Community Voice', 'Leave 250 comments', '📣', 'social', 250, 750, 22)
ON CONFLICT (code) DO NOTHING;
-- ⭐ Special (Milestones)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('early_adopter', 'Early Adopter', 'Join in the first month', '🚀', 'special', 1, 500, 30),
('one_year', 'One Year Anniversary', 'Be a member for 1 year', '🎂', 'special', 1, 1000, 31),
('balanced_creator', 'Balanced Creator', '50 recordings + 100 plays', '⚖️', 'special', 1, 500, 32),
('top_10_rank', 'Top 10 Leaderboard', 'Reach top 10 on leaderboard', '🏅', 'special', 1, 2000, 33)
ON CONFLICT (code) DO NOTHING;
-- ====================
-- Verification Queries
-- ====================
-- Count tables created
SELECT
'sexy_recording_plays' as table_name,
COUNT(*) as row_count
FROM sexy_recording_plays
UNION ALL
SELECT 'sexy_user_points', COUNT(*) FROM sexy_user_points
UNION ALL
SELECT 'sexy_achievements', COUNT(*) FROM sexy_achievements
UNION ALL
SELECT 'sexy_user_achievements', COUNT(*) FROM sexy_user_achievements
UNION ALL
SELECT 'sexy_user_stats', COUNT(*) FROM sexy_user_stats;
-- Show created achievements
SELECT
category,
COUNT(*) as achievement_count
FROM sexy_achievements
GROUP BY category
ORDER BY category;
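The dropped schema's `total_weighted_points` column is documented as "time-weighted score using exponential decay (λ=0.005)". A minimal sketch of that calculation, assuming λ is applied per day of point age (the schema does not state the time unit, so that is an assumption; a point's weight would then halve roughly every ln 2 / 0.005 ≈ 139 days):

```typescript
// Sketch of the time-weighted scoring referenced by sexy_user_stats.
// ASSUMPTION: lambda = 0.005 decays per day of age; the real backend
// may use a different unit or formula.
const LAMBDA = 0.005;

interface PointEntry {
  points: number;    // raw points, as in sexy_user_points.points
  dateCreated: Date; // sexy_user_points.date_created
}

// weighted = sum over i of points_i * exp(-lambda * age_days_i)
function weightedScore(entries: PointEntry[], now: Date = new Date()): number {
  const MS_PER_DAY = 86_400_000;
  return entries.reduce((sum, e) => {
    const ageDays = Math.max(0, (now.getTime() - e.dateCreated.getTime()) / MS_PER_DAY);
    return sum + e.points * Math.exp(-LAMBDA * ageDays);
  }, 0);
}
```

Under this reading, a fresh point keeps its full value while a 200-day-old point retains about 37% (e⁻¹) of it, which matches the stated intent of ranking recent activity above old activity.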

View File

@@ -5,15 +5,18 @@
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
- "build:bundle": "git pull && pnpm install && pnpm --filter @sexy.pivoine.art/bundle build",
"build:frontend": "git pull && pnpm install && pnpm --filter @sexy.pivoine.art/frontend build",
- "dev:data": "cd ../compose/data && docker compose up -d",
- "dev:directus": "cd ../compose/sexy && docker compose --env-file=.env.local up -d directus",
- "dev": "pnpm dev:data && pnpm dev:directus && pnpm --filter @sexy.pivoine.art/frontend dev"
+ "build:backend": "git pull && pnpm install && pnpm --filter @sexy.pivoine.art/backend build",
+ "dev:data": "docker compose up -d postgres redis",
+ "dev:backend": "pnpm --filter @sexy.pivoine.art/backend dev",
+ "dev": "pnpm dev:data && pnpm dev:backend & pnpm --filter @sexy.pivoine.art/frontend dev"
},
"keywords": [],
- "author": "",
- "license": "ISC",
+ "author": {
+ "name": "Valknar",
+ "email": "valknar@pivoine.art"
+ },
+ "license": "MIT",
"packageManager": "pnpm@10.19.0",
"pnpm": {
"onlyBuiltDependencies": [

View File

@@ -0,0 +1,10 @@
import { defineConfig } from "drizzle-kit";
export default defineConfig({
schema: "./src/db/schema/index.ts",
out: "./src/migrations",
dialect: "postgresql",
dbCredentials: {
url: process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy",
},
});

View File

@@ -0,0 +1,51 @@
{
"name": "@sexy.pivoine.art/backend",
"version": "1.0.0",
"type": "module",
"private": true,
"scripts": {
"dev": "tsx watch src/index.ts",
"build": "tsc",
"start": "node dist/index.js",
"db:generate": "drizzle-kit generate",
"db:migrate": "drizzle-kit migrate",
"db:studio": "drizzle-kit studio",
"migrate": "tsx src/scripts/data-migration.ts"
},
"dependencies": {
"@fastify/cookie": "^11.0.2",
"@fastify/cors": "^10.0.2",
"@fastify/multipart": "^9.0.3",
"@fastify/static": "^8.1.1",
"@pothos/core": "^4.4.0",
"@pothos/plugin-errors": "^4.2.0",
"argon2": "^0.43.0",
"drizzle-orm": "^0.44.1",
"fastify": "^5.4.0",
"fluent-ffmpeg": "^2.1.3",
"graphql": "^16.11.0",
"graphql-scalars": "^1.24.2",
"graphql-ws": "^6.0.4",
"graphql-yoga": "^5.13.4",
"ioredis": "^5.6.1",
"nanoid": "^5.1.5",
"nodemailer": "^7.0.3",
"pg": "^8.16.0",
"slugify": "^1.6.6",
"uuid": "^11.1.0"
},
"pnpm": {
"onlyBuiltDependencies": [
"argon2"
]
},
"devDependencies": {
"@types/fluent-ffmpeg": "^2.1.27",
"@types/nodemailer": "^6.4.17",
"@types/pg": "^8.15.4",
"@types/uuid": "^10.0.0",
"drizzle-kit": "^0.31.1",
"tsx": "^4.19.4",
"typescript": "^5.9.3"
}
}

View File

@@ -0,0 +1,11 @@
import { drizzle } from "drizzle-orm/node-postgres";
import { Pool } from "pg";
import * as schema from "./schema/index.js";
const pool = new Pool({
connectionString: process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy",
max: 20,
});
export const db = drizzle(pool, { schema });
export type DB = typeof db;

View File

@@ -0,0 +1,37 @@
import {
pgTable,
text,
timestamp,
boolean,
index,
uniqueIndex,
} from "drizzle-orm/pg-core";
import { users } from "./users.js";
import { files } from "./files.js";
export const articles = pgTable(
"articles",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
slug: text("slug").notNull(),
title: text("title").notNull(),
excerpt: text("excerpt"),
content: text("content"),
image: text("image").references(() => files.id, { onDelete: "set null" }),
tags: text("tags").array().default([]),
publish_date: timestamp("publish_date").notNull().defaultNow(),
author: text("author").references(() => users.id, { onDelete: "set null" }),
category: text("category"),
featured: boolean("featured").default(false),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
uniqueIndex("articles_slug_idx").on(t.slug),
index("articles_publish_date_idx").on(t.publish_date),
index("articles_featured_idx").on(t.featured),
],
);
export type Article = typeof articles.$inferSelect;
export type NewArticle = typeof articles.$inferInsert;

View File

@@ -0,0 +1,30 @@
import {
pgTable,
text,
timestamp,
index,
integer,
} from "drizzle-orm/pg-core";
import { users } from "./users.js";
export const comments = pgTable(
"comments",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
collection: text("collection").notNull(), // 'videos' | 'recordings'
item_id: text("item_id").notNull(),
comment: text("comment").notNull(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
index("comments_collection_item_idx").on(t.collection, t.item_id),
index("comments_user_idx").on(t.user_id),
],
);
export type Comment = typeof comments.$inferSelect;
export type NewComment = typeof comments.$inferInsert;

View File

@@ -0,0 +1,27 @@
import {
pgTable,
text,
timestamp,
bigint,
integer,
index,
} from "drizzle-orm/pg-core";
export const files = pgTable(
"files",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
title: text("title"),
description: text("description"),
filename: text("filename").notNull(),
mime_type: text("mime_type"),
filesize: bigint("filesize", { mode: "number" }),
duration: integer("duration"),
uploaded_by: text("uploaded_by"),
date_created: timestamp("date_created").notNull().defaultNow(),
},
(t) => [index("files_uploaded_by_idx").on(t.uploaded_by)],
);
export type File = typeof files.$inferSelect;
export type NewFile = typeof files.$inferInsert;

View File

@@ -0,0 +1,94 @@
import {
pgTable,
text,
timestamp,
integer,
real,
index,
pgEnum,
uniqueIndex,
} from "drizzle-orm/pg-core";
import { users } from "./users.js";
import { recordings } from "./recordings.js";
export const achievementStatusEnum = pgEnum("achievement_status", [
"draft",
"published",
]);
export const achievements = pgTable(
"achievements",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
code: text("code").notNull(),
name: text("name").notNull(),
description: text("description"),
icon: text("icon"),
category: text("category"),
required_count: integer("required_count").notNull().default(1),
points_reward: integer("points_reward").notNull().default(0),
status: achievementStatusEnum("status").notNull().default("published"),
sort: integer("sort").default(0),
},
(t) => [uniqueIndex("achievements_code_idx").on(t.code)],
);
export const user_achievements = pgTable(
"user_achievements",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
achievement_id: text("achievement_id")
.notNull()
.references(() => achievements.id, { onDelete: "cascade" }),
progress: integer("progress").default(0),
date_unlocked: timestamp("date_unlocked"),
},
(t) => [
index("user_achievements_user_idx").on(t.user_id),
uniqueIndex("user_achievements_unique_idx").on(t.user_id, t.achievement_id),
],
);
export const user_points = pgTable(
"user_points",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
action: text("action").notNull(),
points: integer("points").notNull(),
recording_id: text("recording_id").references(() => recordings.id, {
onDelete: "set null",
}),
date_created: timestamp("date_created").notNull().defaultNow(),
},
(t) => [
index("user_points_user_idx").on(t.user_id),
index("user_points_date_idx").on(t.date_created),
],
);
export const user_stats = pgTable(
"user_stats",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
total_raw_points: integer("total_raw_points").default(0),
total_weighted_points: real("total_weighted_points").default(0),
recordings_count: integer("recordings_count").default(0),
playbacks_count: integer("playbacks_count").default(0),
comments_count: integer("comments_count").default(0),
achievements_count: integer("achievements_count").default(0),
last_updated: timestamp("last_updated").defaultNow(),
},
(t) => [uniqueIndex("user_stats_user_idx").on(t.user_id)],
);
export type Achievement = typeof achievements.$inferSelect;
export type UserStats = typeof user_stats.$inferSelect;
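The `achievements` / `user_achievements` tables above pair a `required_count` with a per-user `progress` counter and a nullable `date_unlocked`. A hedged sketch of how unlocking could work against that model (the ported backend's actual logic is not shown in this diff; `applyProgress` and its return shape are illustrative names only):

```typescript
// Illustrative unlock step for the achievements data model above.
// ASSUMPTION: unlocking awards points_reward exactly once, when
// progress first reaches required_count.
interface AchievementRow {
  required_count: number; // achievements.required_count
  points_reward: number;  // achievements.points_reward
}

interface ProgressRow {
  progress: number;           // user_achievements.progress
  date_unlocked: Date | null; // null until unlocked
}

function applyProgress(
  achievement: AchievementRow,
  row: ProgressRow,
  increment = 1,
): { awarded: number; row: ProgressRow } {
  if (row.date_unlocked) return { awarded: 0, row }; // already unlocked
  const progress = row.progress + increment;
  if (progress >= achievement.required_count) {
    return {
      awarded: achievement.points_reward,
      row: { progress, date_unlocked: new Date() },
    };
  }
  return { awarded: 0, row: { progress, date_unlocked: null } };
}
```

The `uniqueIndex` on `(user_id, achievement_id)` is what makes this upsert-style flow safe: each user has at most one progress row per achievement.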

View File

@@ -0,0 +1,7 @@
export * from "./files.js";
export * from "./users.js";
export * from "./videos.js";
export * from "./articles.js";
export * from "./recordings.js";
export * from "./comments.js";
export * from "./gamification.js";

View File

@@ -0,0 +1,73 @@
import {
pgTable,
text,
timestamp,
boolean,
integer,
pgEnum,
index,
uniqueIndex,
jsonb,
} from "drizzle-orm/pg-core";
import { users } from "./users.js";
import { videos } from "./videos.js";
export const recordingStatusEnum = pgEnum("recording_status", [
"draft",
"published",
"archived",
]);
export const recordings = pgTable(
"recordings",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
title: text("title").notNull(),
description: text("description"),
slug: text("slug").notNull(),
duration: integer("duration").notNull(),
events: jsonb("events").$type<object[]>().default([]),
device_info: jsonb("device_info").$type<object[]>().default([]),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
status: recordingStatusEnum("status").notNull().default("draft"),
tags: text("tags").array().default([]),
linked_video: text("linked_video").references(() => videos.id, {
onDelete: "set null",
}),
featured: boolean("featured").default(false),
public: boolean("public").default(false),
original_recording_id: text("original_recording_id"),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
uniqueIndex("recordings_slug_idx").on(t.slug),
index("recordings_user_idx").on(t.user_id),
index("recordings_status_idx").on(t.status),
index("recordings_public_idx").on(t.public),
],
);
export const recording_plays = pgTable(
"recording_plays",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
recording_id: text("recording_id")
.notNull()
.references(() => recordings.id, { onDelete: "cascade" }),
user_id: text("user_id").references(() => users.id, { onDelete: "set null" }),
duration_played: integer("duration_played").default(0),
completed: boolean("completed").default(false),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
index("recording_plays_recording_idx").on(t.recording_id),
index("recording_plays_user_idx").on(t.user_id),
],
);
export type Recording = typeof recordings.$inferSelect;
export type NewRecording = typeof recordings.$inferInsert;
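The `completed` flag on `recording_plays` carries over the semantics documented in the dropped SQL schema ("True if >= 90% watched"). A small sketch of normalizing a play event before insert; the 90% threshold comes from that old comment and is assumed, not confirmed, for the new backend:

```typescript
// Normalize a play event for recording_plays.
// ASSUMPTION: the >= 90% completion threshold from the old SQL schema
// still applies; durations are in the same unit (milliseconds).
const COMPLETION_THRESHOLD = 0.9;

function buildPlay(recordingDuration: number, durationPlayed: number) {
  // Clamp so a seek past the end cannot report more than 100% watched.
  const played = Math.min(Math.max(durationPlayed, 0), recordingDuration);
  return {
    duration_played: played,
    completed:
      recordingDuration > 0 && played / recordingDuration >= COMPLETION_THRESHOLD,
  };
}
```

Clamping also guards the zero-duration edge case, which would otherwise divide by zero.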

View File

@@ -0,0 +1,60 @@
import {
pgTable,
text,
timestamp,
pgEnum,
boolean,
index,
uniqueIndex,
integer,
} from "drizzle-orm/pg-core";
import { files } from "./files.js";
export const roleEnum = pgEnum("user_role", ["model", "viewer", "admin"]);
export const users = pgTable(
"users",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
email: text("email").notNull(),
password_hash: text("password_hash").notNull(),
first_name: text("first_name"),
last_name: text("last_name"),
artist_name: text("artist_name"),
slug: text("slug"),
description: text("description"),
tags: text("tags").array().default([]),
role: roleEnum("role").notNull().default("viewer"),
avatar: text("avatar").references(() => files.id, { onDelete: "set null" }),
banner: text("banner").references(() => files.id, { onDelete: "set null" }),
email_verified: boolean("email_verified").notNull().default(false),
email_verify_token: text("email_verify_token"),
password_reset_token: text("password_reset_token"),
password_reset_expiry: timestamp("password_reset_expiry"),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
uniqueIndex("users_email_idx").on(t.email),
uniqueIndex("users_slug_idx").on(t.slug),
index("users_role_idx").on(t.role),
],
);
export const user_photos = pgTable(
"user_photos",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
file_id: text("file_id")
.notNull()
.references(() => files.id, { onDelete: "cascade" }),
sort: integer("sort").default(0),
},
(t) => [index("user_photos_user_idx").on(t.user_id)],
);
export type User = typeof users.$inferSelect;
export type NewUser = typeof users.$inferInsert;
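Per the commit message, `users.password_hash` is populated with argon2. For a self-contained illustration of the hash/verify flow that runs without third-party packages, this sketch substitutes Node's built-in scrypt; the `salt:hash` storage format is likewise an assumption, not the backend's actual format:

```typescript
import { scryptSync, randomBytes, timingSafeEqual } from "node:crypto";

// STAND-IN: the real backend uses argon2; scrypt from node:crypto is
// used here only so the sketch is runnable without extra dependencies.
// ASSUMPTION: hashes are stored as "salt:hash" (hex).
function hashPassword(password: string): string {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(password, salt, 64).toString("hex");
  return `${salt}:${hash}`;
}

function verifyPassword(password: string, stored: string): boolean {
  const [salt, hash] = stored.split(":");
  const candidate = scryptSync(password, salt, 64);
  // timingSafeEqual avoids leaking match position via comparison time.
  return timingSafeEqual(candidate, Buffer.from(hash, "hex"));
}
```

With argon2 the equivalent calls would be `argon2.hash(password)` and `argon2.verify(stored, password)`, which encode the salt and parameters into the hash string themselves.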

View File

@@ -0,0 +1,90 @@
import {
pgTable,
text,
timestamp,
boolean,
integer,
index,
uniqueIndex,
primaryKey,
} from "drizzle-orm/pg-core";
import { users } from "./users.js";
import { files } from "./files.js";
export const videos = pgTable(
"videos",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
slug: text("slug").notNull(),
title: text("title").notNull(),
description: text("description"),
image: text("image").references(() => files.id, { onDelete: "set null" }),
movie: text("movie").references(() => files.id, { onDelete: "set null" }),
tags: text("tags").array().default([]),
upload_date: timestamp("upload_date").notNull().defaultNow(),
premium: boolean("premium").default(false),
featured: boolean("featured").default(false),
likes_count: integer("likes_count").default(0),
plays_count: integer("plays_count").default(0),
},
(t) => [
uniqueIndex("videos_slug_idx").on(t.slug),
index("videos_upload_date_idx").on(t.upload_date),
index("videos_featured_idx").on(t.featured),
],
);
export const video_models = pgTable(
"video_models",
{
video_id: text("video_id")
.notNull()
.references(() => videos.id, { onDelete: "cascade" }),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
},
(t) => [primaryKey({ columns: [t.video_id, t.user_id] })],
);
export const video_likes = pgTable(
"video_likes",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
video_id: text("video_id")
.notNull()
.references(() => videos.id, { onDelete: "cascade" }),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
date_created: timestamp("date_created").notNull().defaultNow(),
},
(t) => [
index("video_likes_video_idx").on(t.video_id),
index("video_likes_user_idx").on(t.user_id),
],
);
export const video_plays = pgTable(
"video_plays",
{
id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
video_id: text("video_id")
.notNull()
.references(() => videos.id, { onDelete: "cascade" }),
user_id: text("user_id").references(() => users.id, { onDelete: "set null" }),
session_id: text("session_id"),
duration_watched: integer("duration_watched"),
completed: boolean("completed").default(false),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
index("video_plays_video_idx").on(t.video_id),
index("video_plays_user_idx").on(t.user_id),
index("video_plays_date_idx").on(t.date_created),
],
);
export type Video = typeof videos.$inferSelect;
export type NewVideo = typeof videos.$inferInsert;

View File

@@ -0,0 +1,30 @@
import SchemaBuilder from "@pothos/core";
import ErrorsPlugin from "@pothos/plugin-errors";
import type { DB } from "../db/connection.js";
import type { SessionUser } from "../lib/auth.js";
import type Redis from "ioredis";
import { GraphQLDateTime, GraphQLJSON } from "graphql-scalars";
export type Context = {
db: DB;
redis: InstanceType<typeof Redis>;
currentUser: SessionUser | null;
request: Request;
reply: unknown;
};
export const builder = new SchemaBuilder<{
Context: Context;
Scalars: {
DateTime: { Input: Date; Output: Date };
JSON: { Input: unknown; Output: unknown };
};
}>({
plugins: [ErrorsPlugin],
});
builder.addScalarType("DateTime", GraphQLDateTime, {});
builder.addScalarType("JSON", GraphQLJSON, {});
builder.queryType({});
builder.mutationType({});

View File

@@ -0,0 +1,29 @@
import type { YogaInitialContext } from "graphql-yoga";
import type { Context } from "./builder.js";
import { getSession } from "../lib/auth.js";
import { db } from "../db/connection.js";
import { redis } from "../lib/auth.js";
export async function buildContext(
ctx: YogaInitialContext & { request: Request; reply: unknown; db: typeof db; redis: typeof redis },
): Promise<Context> {
const request = ctx.request;
const cookieHeader = request.headers.get("cookie") || "";
// Parse session_token from cookies
const cookies = Object.fromEntries(
cookieHeader.split(";").map((c) => {
const [k, ...v] = c.trim().split("=");
return [k.trim(), v.join("=")];
}),
);
const token = cookies["session_token"];
const currentUser = token ? await getSession(token) : null;
return {
db: ctx.db || db,
redis: ctx.redis || redis,
currentUser,
request,
reply: ctx.reply,
};
}
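The inline cookie parsing above (repeated in the logout resolver) splits on the first `=` only, so session tokens containing `=` survive intact. A standalone sketch of the same logic — `parseCookies` is a hypothetical helper name, not part of the codebase:

```typescript
// Hypothetical helper mirroring the inline parsing in buildContext:
// split the header on ";", treat only the first "=" per pair as the
// separator so cookie values may themselves contain "=".
function parseCookies(cookieHeader: string): Record<string, string> {
  if (!cookieHeader) return {};
  return Object.fromEntries(
    cookieHeader.split(";").map((c) => {
      const [k, ...v] = c.trim().split("=");
      return [k.trim(), v.join("=")];
    }),
  );
}

const cookies = parseCookies("session_token=abc=123; theme=dark");
console.log(cookies["session_token"]); // "abc=123"
```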

View File

@@ -0,0 +1,14 @@
import "./builder.js";
import "./types/index.js";
import "./resolvers/auth.js";
import "./resolvers/users.js";
import "./resolvers/videos.js";
import "./resolvers/models.js";
import "./resolvers/articles.js";
import "./resolvers/recordings.js";
import "./resolvers/comments.js";
import "./resolvers/gamification.js";
import "./resolvers/stats.js";
import { builder } from "./builder.js";
export const schema = builder.toSchema();

View File

@@ -0,0 +1,83 @@
import { builder } from "../builder.js";
import { ArticleType } from "../types/index.js";
import { articles, users } from "../../db/schema/index.js";
import { eq, and, lte, desc } from "drizzle-orm";
builder.queryField("articles", (t) =>
t.field({
type: [ArticleType],
args: {
featured: t.arg.boolean(),
limit: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
let query = ctx.db
.select()
.from(articles)
.where(lte(articles.publish_date, new Date()))
.orderBy(desc(articles.publish_date));
if (args.limit) {
query = (query as any).limit(args.limit);
}
const articleList = await query;
return Promise.all(
articleList.map(async (article: any) => {
let author = null;
if (article.author) {
const authorUser = await ctx.db
.select({
first_name: users.first_name,
last_name: users.last_name,
avatar: users.avatar,
description: users.description,
})
.from(users)
.where(eq(users.id, article.author))
.limit(1);
author = authorUser[0] || null;
}
return { ...article, author };
}),
);
},
}),
);
builder.queryField("article", (t) =>
t.field({
type: ArticleType,
nullable: true,
args: {
slug: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const article = await ctx.db
.select()
.from(articles)
.where(and(eq(articles.slug, args.slug), lte(articles.publish_date, new Date())))
.limit(1);
if (!article[0]) return null;
let author = null;
if (article[0].author) {
const authorUser = await ctx.db
.select({
first_name: users.first_name,
last_name: users.last_name,
avatar: users.avatar,
description: users.description,
})
.from(users)
.where(eq(users.id, article[0].author))
.limit(1);
author = authorUser[0] || null;
}
return { ...article[0], author };
},
}),
);

View File

@@ -0,0 +1,226 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder.js";
import { CurrentUserType } from "../types/index.js";
import { users } from "../../db/schema/index.js";
import { eq } from "drizzle-orm";
import { hash, verify as verifyArgon } from "../../lib/argon.js";
import { setSession, deleteSession } from "../../lib/auth.js";
import { sendVerification, sendPasswordReset } from "../../lib/email.js";
import { slugify } from "../../lib/slugify.js";
import { nanoid } from "nanoid";
builder.mutationField("login", (t) =>
t.field({
type: CurrentUserType,
args: {
email: t.arg.string({ required: true }),
password: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.email, args.email.toLowerCase()))
.limit(1);
if (!user[0]) throw new GraphQLError("Invalid credentials");
const valid = await verifyArgon(user[0].password_hash, args.password);
if (!valid) throw new GraphQLError("Invalid credentials");
const token = nanoid(32);
const sessionUser = {
id: user[0].id,
email: user[0].email,
role: user[0].role,
first_name: user[0].first_name,
last_name: user[0].last_name,
artist_name: user[0].artist_name,
slug: user[0].slug,
avatar: user[0].avatar,
};
await setSession(token, sessionUser);
// Set session cookie
const isProduction = process.env.NODE_ENV === "production";
const cookieValue = `session_token=${token}; HttpOnly; Path=/; SameSite=Lax; Max-Age=86400${isProduction ? "; Secure" : ""}`;
(ctx.reply as any).header?.("Set-Cookie", cookieValue);
// For graphql-yoga response
if ((ctx as any).serverResponse) {
(ctx as any).serverResponse.setHeader("Set-Cookie", cookieValue);
}
return user[0];
},
}),
);
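The `Set-Cookie` value assembled in the login resolver can be sketched as a small pure function — `buildSessionCookie` is a hypothetical name for illustration, not an export of the backend:

```typescript
// Hypothetical sketch of the session cookie built in the login resolver:
// HttpOnly + SameSite=Lax with a 24h Max-Age; Secure only in production.
function buildSessionCookie(token: string, isProduction: boolean): string {
  return (
    `session_token=${token}; HttpOnly; Path=/; SameSite=Lax; Max-Age=86400` +
    (isProduction ? "; Secure" : "")
  );
}

console.log(buildSessionCookie("abc", true));
// "session_token=abc; HttpOnly; Path=/; SameSite=Lax; Max-Age=86400; Secure"
```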
builder.mutationField("logout", (t) =>
t.field({
type: "Boolean",
resolve: async (_root, _args, ctx) => {
const cookieHeader = ctx.request.headers.get("cookie") || "";
const cookies = Object.fromEntries(
cookieHeader.split(";").map((c) => {
const [k, ...v] = c.trim().split("=");
return [k.trim(), v.join("=")];
}),
);
const token = cookies["session_token"];
if (token) {
await deleteSession(token);
}
// Clear cookie
const cookieValue = "session_token=; HttpOnly; Path=/; Max-Age=0";
(ctx.reply as any).header?.("Set-Cookie", cookieValue);
return true;
},
}),
);
builder.mutationField("register", (t) =>
t.field({
type: "Boolean",
args: {
email: t.arg.string({ required: true }),
password: t.arg.string({ required: true }),
firstName: t.arg.string({ required: true }),
lastName: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const existing = await ctx.db
.select({ id: users.id })
.from(users)
.where(eq(users.email, args.email.toLowerCase()))
.limit(1);
if (existing.length > 0) throw new GraphQLError("Email already registered");
const passwordHash = await hash(args.password);
const artistName = `${args.firstName} ${args.lastName}`;
const baseSlug = slugify(artistName);
const verifyToken = nanoid(32);
// Ensure unique slug
let slug = baseSlug;
let attempt = 0;
while (true) {
const existing = await ctx.db
.select({ id: users.id })
.from(users)
.where(eq(users.slug, slug))
.limit(1);
if (existing.length === 0) break;
attempt++;
slug = `${baseSlug}-${attempt}`;
}
await ctx.db.insert(users).values({
email: args.email.toLowerCase(),
password_hash: passwordHash,
first_name: args.firstName,
last_name: args.lastName,
artist_name: artistName,
slug,
role: "viewer",
email_verify_token: verifyToken,
email_verified: false,
});
await sendVerification(args.email, verifyToken);
return true;
},
}),
);
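The unique-slug loop inside `register` can be exercised in isolation; here is a sketch with an in-memory set standing in for the `users` table lookup (`uniqueSlug` and `taken` are illustrative names, not part of the codebase):

```typescript
// Standalone sketch of the unique-slug loop from the register resolver:
// try the base slug, then append -1, -2, ... until an unused slug is found.
function uniqueSlug(baseSlug: string, taken: Set<string>): string {
  let slug = baseSlug;
  let attempt = 0;
  while (taken.has(slug)) {
    attempt++;
    slug = `${baseSlug}-${attempt}`;
  }
  return slug;
}

const taken = new Set(["jane-doe", "jane-doe-1"]);
console.log(uniqueSlug("jane-doe", taken)); // "jane-doe-2"
console.log(uniqueSlug("john-doe", taken)); // "john-doe"
```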
builder.mutationField("verifyEmail", (t) =>
t.field({
type: "Boolean",
args: {
token: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.email_verify_token, args.token))
.limit(1);
if (!user[0]) throw new GraphQLError("Invalid verification token");
await ctx.db
.update(users)
.set({ email_verified: true, email_verify_token: null })
.where(eq(users.id, user[0].id));
return true;
},
}),
);
builder.mutationField("requestPasswordReset", (t) =>
t.field({
type: "Boolean",
args: {
email: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.email, args.email.toLowerCase()))
.limit(1);
// Always return true to prevent email enumeration
if (!user[0]) return true;
const token = nanoid(32);
const expiry = new Date(Date.now() + 60 * 60 * 1000); // 1 hour
await ctx.db
.update(users)
.set({ password_reset_token: token, password_reset_expiry: expiry })
.where(eq(users.id, user[0].id));
await sendPasswordReset(args.email, token);
return true;
},
}),
);
builder.mutationField("resetPassword", (t) =>
t.field({
type: "Boolean",
args: {
token: t.arg.string({ required: true }),
newPassword: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.password_reset_token, args.token))
.limit(1);
if (!user[0]) throw new GraphQLError("Invalid or expired reset token");
if (user[0].password_reset_expiry && user[0].password_reset_expiry < new Date()) {
throw new GraphQLError("Reset token expired");
}
const passwordHash = await hash(args.newPassword);
await ctx.db
.update(users)
.set({
password_hash: passwordHash,
password_reset_token: null,
password_reset_expiry: null,
})
.where(eq(users.id, user[0].id));
return true;
},
}),
);

View File

@@ -0,0 +1,68 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder.js";
import { CommentType } from "../types/index.js";
import { comments, users } from "../../db/schema/index.js";
import { eq, and, desc } from "drizzle-orm";
import { awardPoints, checkAchievements } from "../../lib/gamification.js";
builder.queryField("commentsForVideo", (t) =>
t.field({
type: [CommentType],
args: {
videoId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const commentList = await ctx.db
.select()
.from(comments)
.where(and(eq(comments.collection, "videos"), eq(comments.item_id, args.videoId)))
.orderBy(desc(comments.date_created));
return Promise.all(
commentList.map(async (c: any) => {
const user = await ctx.db
.select({ id: users.id, first_name: users.first_name, last_name: users.last_name, avatar: users.avatar })
.from(users)
.where(eq(users.id, c.user_id))
.limit(1);
return { ...c, user: user[0] || null };
}),
);
},
}),
);
builder.mutationField("createCommentForVideo", (t) =>
t.field({
type: CommentType,
args: {
videoId: t.arg.string({ required: true }),
comment: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const newComment = await ctx.db
.insert(comments)
.values({
collection: "videos",
item_id: args.videoId,
comment: args.comment,
user_id: ctx.currentUser.id,
})
.returning();
// Gamification
await awardPoints(ctx.db, ctx.currentUser.id, "COMMENT_CREATE");
await checkAchievements(ctx.db, ctx.currentUser.id, "social");
const user = await ctx.db
.select({ id: users.id, first_name: users.first_name, last_name: users.last_name, avatar: users.avatar })
.from(users)
.where(eq(users.id, ctx.currentUser.id))
.limit(1);
return { ...newComment[0], user: user[0] || null };
},
}),
);

View File

@@ -0,0 +1,115 @@
import { builder } from "../builder.js";
import { LeaderboardEntryType, UserGamificationType, AchievementType } from "../types/index.js";
import { user_stats, users, user_achievements, achievements, user_points } from "../../db/schema/index.js";
import { eq, desc, gt, count, isNotNull } from "drizzle-orm";
builder.queryField("leaderboard", (t) =>
t.field({
type: [LeaderboardEntryType],
args: {
limit: t.arg.int(),
offset: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
const limit = Math.min(args.limit || 100, 500);
const offset = args.offset || 0;
const entries = await ctx.db
.select({
user_id: user_stats.user_id,
display_name: users.artist_name,
avatar: users.avatar,
total_weighted_points: user_stats.total_weighted_points,
total_raw_points: user_stats.total_raw_points,
recordings_count: user_stats.recordings_count,
playbacks_count: user_stats.playbacks_count,
achievements_count: user_stats.achievements_count,
})
.from(user_stats)
.leftJoin(users, eq(user_stats.user_id, users.id))
.orderBy(desc(user_stats.total_weighted_points))
.limit(limit)
.offset(offset);
return entries.map((e: any, i: number) => ({ ...e, rank: offset + i + 1 }));
},
}),
);
builder.queryField("userGamification", (t) =>
t.field({
type: UserGamificationType,
nullable: true,
args: {
userId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const stats = await ctx.db
.select()
.from(user_stats)
.where(eq(user_stats.user_id, args.userId))
.limit(1);
let rank = 1;
if (stats[0]) {
const rankResult = await ctx.db
.select({ count: count() })
.from(user_stats)
.where(gt(user_stats.total_weighted_points, stats[0].total_weighted_points || 0));
rank = (rankResult[0]?.count || 0) + 1;
}
const achievementRows = await ctx.db
.select({
id: achievements.id,
code: achievements.code,
name: achievements.name,
description: achievements.description,
icon: achievements.icon,
category: achievements.category,
date_unlocked: user_achievements.date_unlocked,
progress: user_achievements.progress,
required_count: achievements.required_count,
})
.from(user_achievements)
.leftJoin(achievements, eq(user_achievements.achievement_id, achievements.id))
.where(eq(user_achievements.user_id, args.userId))
.orderBy(desc(user_achievements.date_unlocked));
// Drizzle's .where() cannot be chained twice on a non-dynamic query,
// so keep unlocked achievements by filtering in JS instead.
const userAchievements = achievementRows.filter((a: any) => a.date_unlocked !== null);
const recentPoints = await ctx.db
.select({
action: user_points.action,
points: user_points.points,
date_created: user_points.date_created,
recording_id: user_points.recording_id,
})
.from(user_points)
.where(eq(user_points.user_id, args.userId))
.orderBy(desc(user_points.date_created))
.limit(10);
return {
stats: stats[0] ? { ...stats[0], rank } : null,
achievements: userAchievements.map((a: any) => ({
...a,
date_unlocked: a.date_unlocked!,
})),
recent_points: recentPoints,
};
},
}),
);
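The rank computed in `userGamification` is "number of users with strictly more weighted points, plus one", so tied users share a rank. A plain-array sketch of the same formula (`rankOf` is a hypothetical name for illustration):

```typescript
// Sketch of the rank formula from userGamification:
// rank = count of users with strictly more points + 1 (ties share a rank).
function rankOf(myPoints: number, allPoints: number[]): number {
  return allPoints.filter((p) => p > myPoints).length + 1;
}

console.log(rankOf(50, [100, 75, 50, 50, 10])); // 3 (two users strictly ahead)
console.log(rankOf(100, [100, 75, 50])); // 1
```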
builder.queryField("achievements", (t) =>
t.field({
type: [AchievementType],
resolve: async (_root, _args, ctx) => {
return ctx.db
.select()
.from(achievements)
.where(eq(achievements.status, "published"))
.orderBy(achievements.sort);
},
}),
);

View File

@@ -0,0 +1,63 @@
import { builder } from "../builder.js";
import { ModelType } from "../types/index.js";
import { users, user_photos, files } from "../../db/schema/index.js";
import { eq, and, desc } from "drizzle-orm";
async function enrichModel(db: any, user: any) {
// Fetch photos
const photoRows = await db
.select({ id: files.id, filename: files.filename })
.from(user_photos)
.leftJoin(files, eq(user_photos.file_id, files.id))
.where(eq(user_photos.user_id, user.id))
.orderBy(user_photos.sort);
return {
...user,
photos: photoRows.map((p: any) => ({ id: p.id, filename: p.filename })),
};
}
builder.queryField("models", (t) =>
t.field({
type: [ModelType],
args: {
featured: t.arg.boolean(),
limit: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
let query = ctx.db
.select()
.from(users)
.where(eq(users.role, "model"))
.orderBy(desc(users.date_created));
if (args.limit) {
query = (query as any).limit(args.limit);
}
const modelList = await query;
return Promise.all(modelList.map((m: any) => enrichModel(ctx.db, m)));
},
}),
);
builder.queryField("model", (t) =>
t.field({
type: ModelType,
nullable: true,
args: {
slug: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const model = await ctx.db
.select()
.from(users)
.where(and(eq(users.slug, args.slug), eq(users.role, "model")))
.limit(1);
if (!model[0]) return null;
return enrichModel(ctx.db, model[0]);
},
}),
);

View File

@@ -0,0 +1,333 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder.js";
import { RecordingType } from "../types/index.js";
import { recordings, recording_plays } from "../../db/schema/index.js";
import { eq, and, desc } from "drizzle-orm";
import { slugify } from "../../lib/slugify.js";
import { awardPoints, checkAchievements } from "../../lib/gamification.js";
builder.queryField("recordings", (t) =>
t.field({
type: [RecordingType],
args: {
status: t.arg.string(),
tags: t.arg.string(),
linkedVideoId: t.arg.string(),
limit: t.arg.int(),
page: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const conditions = [eq(recordings.user_id, ctx.currentUser.id)];
if (args.status) conditions.push(eq(recordings.status, args.status as any));
if (args.linkedVideoId) conditions.push(eq(recordings.linked_video, args.linkedVideoId));
const limit = args.limit || 50;
const page = args.page || 1;
const offset = (page - 1) * limit;
return ctx.db
.select()
.from(recordings)
.where(and(...conditions))
.orderBy(desc(recordings.date_created))
.limit(limit)
.offset(offset);
},
}),
);
builder.queryField("recording", (t) =>
t.field({
type: RecordingType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const recording = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.id))
.limit(1);
if (!recording[0]) return null;
if (recording[0].user_id !== ctx.currentUser.id && !recording[0].public) {
throw new GraphQLError("Forbidden");
}
return recording[0];
},
}),
);
builder.queryField("communityRecordings", (t) =>
t.field({
type: [RecordingType],
args: {
limit: t.arg.int(),
offset: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
return ctx.db
.select()
.from(recordings)
.where(and(eq(recordings.status, "published"), eq(recordings.public, true)))
.orderBy(desc(recordings.date_created))
.limit(args.limit || 50)
.offset(args.offset || 0);
},
}),
);
builder.mutationField("createRecording", (t) =>
t.field({
type: RecordingType,
args: {
title: t.arg.string({ required: true }),
description: t.arg.string(),
duration: t.arg.int({ required: true }),
events: t.arg({ type: "JSON", required: true }),
deviceInfo: t.arg({ type: "JSON", required: true }),
tags: t.arg.stringList(),
status: t.arg.string(),
linkedVideoId: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const slug = slugify(args.title);
const newRecording = await ctx.db
.insert(recordings)
.values({
title: args.title,
description: args.description || null,
slug,
duration: args.duration,
events: (args.events as object[]) || [],
device_info: (args.deviceInfo as object[]) || [],
user_id: ctx.currentUser.id,
tags: args.tags || [],
linked_video: args.linkedVideoId || null,
status: (args.status as any) || "draft",
public: false,
})
.returning();
const recording = newRecording[0];
// Gamification: award points if published
if (recording.status === "published") {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_CREATE", recording.id);
await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
}
return recording;
},
}),
);
builder.mutationField("updateRecording", (t) =>
t.field({
type: RecordingType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
title: t.arg.string(),
description: t.arg.string(),
tags: t.arg.stringList(),
status: t.arg.string(),
public: t.arg.boolean(),
linkedVideoId: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const existing = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.id))
.limit(1);
if (!existing[0]) throw new GraphQLError("Recording not found");
if (existing[0].user_id !== ctx.currentUser.id) throw new GraphQLError("Forbidden");
const updates: Record<string, unknown> = { date_updated: new Date() };
if (args.title !== null && args.title !== undefined) {
updates.title = args.title;
updates.slug = slugify(args.title);
}
if (args.description !== null && args.description !== undefined) updates.description = args.description;
if (args.tags !== null && args.tags !== undefined) updates.tags = args.tags;
if (args.status !== null && args.status !== undefined) updates.status = args.status;
if (args.public !== null && args.public !== undefined) updates.public = args.public;
if (args.linkedVideoId !== null && args.linkedVideoId !== undefined) updates.linked_video = args.linkedVideoId;
const updated = await ctx.db
.update(recordings)
.set(updates as any)
.where(eq(recordings.id, args.id))
.returning();
const recording = updated[0];
// Gamification: if newly published
if (args.status === "published" && existing[0].status !== "published") {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_CREATE", recording.id);
await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
}
if (args.status === "published" && recording.featured && !existing[0].featured) {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_FEATURED", recording.id);
await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
}
return recording;
},
}),
);
builder.mutationField("deleteRecording", (t) =>
t.field({
type: "Boolean",
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const existing = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.id))
.limit(1);
if (!existing[0]) throw new GraphQLError("Recording not found");
if (existing[0].user_id !== ctx.currentUser.id) throw new GraphQLError("Forbidden");
await ctx.db
.update(recordings)
.set({ status: "archived", date_updated: new Date() })
.where(eq(recordings.id, args.id));
return true;
},
}),
);
builder.mutationField("duplicateRecording", (t) =>
t.field({
type: RecordingType,
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const original = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.id))
.limit(1);
if (!original[0]) throw new GraphQLError("Recording not found");
if (original[0].status !== "published" || !original[0].public) {
throw new GraphQLError("Recording is not publicly shared");
}
const slug = `${slugify(original[0].title)}-copy-${Date.now()}`;
const duplicated = await ctx.db
.insert(recordings)
.values({
title: `${original[0].title} (Copy)`,
description: original[0].description,
slug,
duration: original[0].duration,
events: original[0].events || [],
device_info: original[0].device_info || [],
user_id: ctx.currentUser.id,
tags: original[0].tags || [],
status: "draft",
public: false,
original_recording_id: original[0].id,
})
.returning();
return duplicated[0];
},
}),
);
builder.mutationField("recordRecordingPlay", (t) =>
t.field({
type: "JSON",
args: {
recordingId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const recording = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.recordingId))
.limit(1);
if (!recording[0]) throw new GraphQLError("Recording not found");
const play = await ctx.db
.insert(recording_plays)
.values({
recording_id: args.recordingId,
user_id: ctx.currentUser?.id || null,
duration_played: 0,
completed: false,
})
.returning({ id: recording_plays.id });
// Gamification
if (ctx.currentUser && recording[0].user_id !== ctx.currentUser.id) {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_PLAY", args.recordingId);
await checkAchievements(ctx.db, ctx.currentUser.id, "playback");
}
return { success: true, play_id: play[0].id };
},
}),
);
builder.mutationField("updateRecordingPlay", (t) =>
t.field({
type: "Boolean",
args: {
playId: t.arg.string({ required: true }),
durationPlayed: t.arg.int({ required: true }),
completed: t.arg.boolean({ required: true }),
},
resolve: async (_root, args, ctx) => {
const existing = await ctx.db
.select()
.from(recording_plays)
.where(eq(recording_plays.id, args.playId))
.limit(1);
if (!existing[0]) throw new GraphQLError("Play record not found");
const wasCompleted = existing[0].completed;
await ctx.db
.update(recording_plays)
.set({ duration_played: args.durationPlayed, completed: args.completed, date_updated: new Date() })
.where(eq(recording_plays.id, args.playId));
if (args.completed && !wasCompleted && ctx.currentUser) {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_COMPLETE", existing[0].recording_id);
await checkAchievements(ctx.db, ctx.currentUser.id, "playback");
}
return true;
},
}),
);

View File

@@ -0,0 +1,29 @@
import { builder } from "../builder.js";
import { StatsType } from "../types/index.js";
import { users, videos } from "../../db/schema/index.js";
import { eq, count } from "drizzle-orm";
builder.queryField("stats", (t) =>
t.field({
type: StatsType,
resolve: async (_root, _args, ctx) => {
const modelsCount = await ctx.db
.select({ count: count() })
.from(users)
.where(eq(users.role, "model"));
const viewersCount = await ctx.db
.select({ count: count() })
.from(users)
.where(eq(users.role, "viewer"));
const videosCount = await ctx.db
.select({ count: count() })
.from(videos);
return {
models_count: modelsCount[0]?.count || 0,
viewers_count: viewersCount[0]?.count || 0,
videos_count: videosCount[0]?.count || 0,
};
},
}),
);

View File

@@ -0,0 +1,72 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder.js";
import { CurrentUserType, UserType } from "../types/index.js";
import { users } from "../../db/schema/index.js";
import { eq } from "drizzle-orm";
builder.queryField("me", (t) =>
t.field({
type: CurrentUserType,
nullable: true,
resolve: async (_root, _args, ctx) => {
if (!ctx.currentUser) return null;
const user = await ctx.db
.select()
.from(users)
.where(eq(users.id, ctx.currentUser.id))
.limit(1);
return user[0] || null;
},
}),
);
builder.queryField("userProfile", (t) =>
t.field({
type: UserType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.id, args.id))
.limit(1);
return user[0] || null;
},
}),
);
builder.mutationField("updateProfile", (t) =>
t.field({
type: CurrentUserType,
nullable: true,
args: {
firstName: t.arg.string(),
lastName: t.arg.string(),
artistName: t.arg.string(),
description: t.arg.string(),
tags: t.arg.stringList(),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const updates: Record<string, unknown> = { date_updated: new Date() };
if (args.firstName !== undefined && args.firstName !== null) updates.first_name = args.firstName;
if (args.lastName !== undefined && args.lastName !== null) updates.last_name = args.lastName;
if (args.artistName !== undefined && args.artistName !== null) updates.artist_name = args.artistName;
if (args.description !== undefined && args.description !== null) updates.description = args.description;
if (args.tags !== undefined && args.tags !== null) updates.tags = args.tags;
await ctx.db.update(users).set(updates as any).where(eq(users.id, ctx.currentUser.id));
const updated = await ctx.db
.select()
.from(users)
.where(eq(users.id, ctx.currentUser.id))
.limit(1);
return updated[0] || null;
},
}),
);

View File

@@ -0,0 +1,320 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder.js";
import { VideoType, VideoLikeResponseType, VideoPlayResponseType, VideoLikeStatusType } from "../types/index.js";
import { videos, video_models, video_likes, video_plays, users, files } from "../../db/schema/index.js";
import { eq, and, lte, desc, inArray, count } from "drizzle-orm";
async function enrichVideo(db: any, video: any) {
// Fetch models
const modelRows = await db
.select({
id: users.id,
artist_name: users.artist_name,
slug: users.slug,
avatar: users.avatar,
})
.from(video_models)
.leftJoin(users, eq(video_models.user_id, users.id))
.where(eq(video_models.video_id, video.id));
// Fetch movie file
let movieFile = null;
if (video.movie) {
const mf = await db.select().from(files).where(eq(files.id, video.movie)).limit(1);
movieFile = mf[0] || null;
}
// Count likes
const likesCount = await db.select({ count: count() }).from(video_likes).where(eq(video_likes.video_id, video.id));
const playsCount = await db.select({ count: count() }).from(video_plays).where(eq(video_plays.video_id, video.id));
return {
...video,
models: modelRows,
movie_file: movieFile,
likes_count: likesCount[0]?.count || 0,
plays_count: playsCount[0]?.count || 0,
};
}
builder.queryField("videos", (t) =>
t.field({
type: [VideoType],
args: {
modelId: t.arg.string(),
featured: t.arg.boolean(),
limit: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
const conditions = [lte(videos.upload_date, new Date())];
if (args.modelId) {
const videoIds = await ctx.db
.select({ video_id: video_models.video_id })
.from(video_models)
.where(eq(video_models.user_id, args.modelId));
if (videoIds.length === 0) return [];
conditions.push(inArray(videos.id, videoIds.map((v: any) => v.video_id)));
}
// Apply the featured filter alongside (not instead of) the model filter
if (args.featured !== null && args.featured !== undefined) {
conditions.push(eq(videos.featured, args.featured));
}
let query = ctx.db
.select()
.from(videos)
.where(and(...conditions))
.orderBy(desc(videos.upload_date));
if (args.limit) {
query = (query as any).limit(args.limit);
}
const videoList = await query;
return Promise.all(videoList.map((v: any) => enrichVideo(ctx.db, v)));
},
}),
);
builder.queryField("video", (t) =>
t.field({
type: VideoType,
nullable: true,
args: {
slug: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const video = await ctx.db
.select()
.from(videos)
.where(and(eq(videos.slug, args.slug), lte(videos.upload_date, new Date())))
.limit(1);
if (!video[0]) return null;
return enrichVideo(ctx.db, video[0]);
},
}),
);
builder.queryField("videoLikeStatus", (t) =>
t.field({
type: VideoLikeStatusType,
args: {
videoId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) return { liked: false };
const existing = await ctx.db
.select()
.from(video_likes)
.where(and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)))
.limit(1);
return { liked: existing.length > 0 };
},
}),
);
builder.mutationField("likeVideo", (t) =>
t.field({
type: VideoLikeResponseType,
args: {
videoId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const existing = await ctx.db
.select()
.from(video_likes)
.where(and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)))
.limit(1);
if (existing.length > 0) throw new GraphQLError("Already liked");
await ctx.db.insert(video_likes).values({
video_id: args.videoId,
user_id: ctx.currentUser.id,
});
// Recount from video_likes and persist the authoritative value, instead of
// a racy read-then-increment that can drift from the real count.
const likesCount = await ctx.db.select({ count: count() }).from(video_likes).where(eq(video_likes.video_id, args.videoId));
await ctx.db
.update(videos)
.set({ likes_count: likesCount[0]?.count || 1 })
.where(eq(videos.id, args.videoId));
return { liked: true, likes_count: likesCount[0]?.count || 1 };
},
}),
);
builder.mutationField("unlikeVideo", (t) =>
t.field({
type: VideoLikeResponseType,
args: {
videoId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const existing = await ctx.db
.select()
.from(video_likes)
.where(and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)))
.limit(1);
if (existing.length === 0) throw new GraphQLError("Not liked");
await ctx.db
.delete(video_likes)
.where(and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)));
// Recount from video_likes and persist the authoritative value.
const likesCount = await ctx.db.select({ count: count() }).from(video_likes).where(eq(video_likes.video_id, args.videoId));
await ctx.db
.update(videos)
.set({ likes_count: likesCount[0]?.count || 0 })
.where(eq(videos.id, args.videoId));
return { liked: false, likes_count: likesCount[0]?.count || 0 };
},
}),
);
builder.mutationField("recordVideoPlay", (t) =>
t.field({
type: VideoPlayResponseType,
args: {
videoId: t.arg.string({ required: true }),
sessionId: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
const play = await ctx.db.insert(video_plays).values({
video_id: args.videoId,
user_id: ctx.currentUser?.id || null,
session_id: args.sessionId || null,
}).returning({ id: video_plays.id });
const playsCount = await ctx.db.select({ count: count() }).from(video_plays).where(eq(video_plays.video_id, args.videoId));
await ctx.db
.update(videos)
.set({ plays_count: playsCount[0]?.count || 0 })
.where(eq(videos.id, args.videoId));
return {
success: true,
play_id: play[0].id,
plays_count: playsCount[0]?.count || 0,
};
},
}),
);
builder.mutationField("updateVideoPlay", (t) =>
t.field({
type: "Boolean",
args: {
videoId: t.arg.string({ required: true }),
playId: t.arg.string({ required: true }),
durationWatched: t.arg.int({ required: true }),
completed: t.arg.boolean({ required: true }),
},
resolve: async (_root, args, ctx) => {
await ctx.db
.update(video_plays)
.set({ duration_watched: args.durationWatched, completed: args.completed, date_updated: new Date() })
.where(eq(video_plays.id, args.playId));
return true;
},
}),
);
builder.queryField("analytics", (t) =>
t.field({
type: "JSON",
nullable: true,
resolve: async (_root, _args, ctx) => {
if (!ctx.currentUser || ctx.currentUser.role !== "model") {
throw new GraphQLError("Unauthorized");
}
const userId = ctx.currentUser.id;
// Get all videos by this user (via video_models)
const modelVideoIds = await ctx.db
.select({ video_id: video_models.video_id })
.from(video_models)
.where(eq(video_models.user_id, userId));
if (modelVideoIds.length === 0) {
return { total_videos: 0, total_likes: 0, total_plays: 0, plays_by_date: {}, likes_by_date: {}, videos: [] };
}
const videoIds = modelVideoIds.map((v: any) => v.video_id);
const videoList = await ctx.db.select().from(videos).where(inArray(videos.id, videoIds));
const plays = await ctx.db.select().from(video_plays).where(inArray(video_plays.video_id, videoIds));
const likes = await ctx.db.select().from(video_likes).where(inArray(video_likes.video_id, videoIds));
const totalLikes = videoList.reduce((sum, v) => sum + (v.likes_count || 0), 0);
const totalPlays = videoList.reduce((sum, v) => sum + (v.plays_count || 0), 0);
const playsByDate = plays.reduce((acc: any, play) => {
const date = new Date(play.date_created).toISOString().split("T")[0];
if (!acc[date]) acc[date] = 0;
acc[date]++;
return acc;
}, {});
const likesByDate = likes.reduce((acc: any, like) => {
const date = new Date(like.date_created).toISOString().split("T")[0];
if (!acc[date]) acc[date] = 0;
acc[date]++;
return acc;
}, {});
const videoAnalytics = videoList.map((video) => {
const vPlays = plays.filter((p) => p.video_id === video.id);
const completedPlays = vPlays.filter((p) => p.completed).length;
const avgWatchTime = vPlays.length > 0
? vPlays.reduce((sum, p) => sum + (p.duration_watched || 0), 0) / vPlays.length
: 0;
return {
id: video.id,
title: video.title,
slug: video.slug,
upload_date: video.upload_date,
likes: video.likes_count || 0,
plays: video.plays_count || 0,
completed_plays: completedPlays,
completion_rate: video.plays_count ? (completedPlays / video.plays_count) * 100 : 0,
avg_watch_time: Math.round(avgWatchTime),
};
});
return {
total_videos: videoList.length,
total_likes: totalLikes,
total_plays: totalPlays,
plays_by_date: playsByDate,
likes_by_date: likesByDate,
videos: videoAnalytics,
};
},
}),
);
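The reduce-based roll-ups in the analytics resolver are plain data transforms and can be checked in isolation. A minimal sketch with plain objects (the `Play` type and both helper names are illustrative, not part of the backend):

```typescript
// Illustrative stand-ins for the roll-ups in the analytics resolver above.
type Play = { date_created: string; duration_watched: number; completed: boolean };

// Mirrors playsByDate / likesByDate: count rows per ISO day.
function groupByDate(rows: { date_created: string }[]): Record<string, number> {
  return rows.reduce<Record<string, number>>((acc, row) => {
    const date = new Date(row.date_created).toISOString().split("T")[0];
    acc[date] = (acc[date] || 0) + 1;
    return acc;
  }, {});
}

// Mirrors the per-video block: completion rate and average watch time.
function videoRollup(plays: Play[], playsCount: number) {
  const completedPlays = plays.filter((p) => p.completed).length;
  const avgWatchTime = plays.length > 0
    ? plays.reduce((sum, p) => sum + (p.duration_watched || 0), 0) / plays.length
    : 0;
  return {
    completed_plays: completedPlays,
    completion_rate: playsCount ? (completedPlays / playsCount) * 100 : 0,
    avg_watch_time: Math.round(avgWatchTime),
  };
}
```

Note that `completion_rate` divides by the denormalized `plays_count`, so it can drift slightly from the raw play rows until the counter is refreshed.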


@@ -0,0 +1,545 @@
import { builder } from "../builder.js";
// File type
export const FileType = builder.objectRef<{
id: string;
title: string | null;
description: string | null;
filename: string;
mime_type: string | null;
filesize: number | null;
duration: number | null;
uploaded_by: string | null;
date_created: Date;
}>("File").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
filename: t.exposeString("filename"),
mime_type: t.exposeString("mime_type", { nullable: true }),
filesize: t.exposeFloat("filesize", { nullable: true }),
duration: t.exposeInt("duration", { nullable: true }),
uploaded_by: t.exposeString("uploaded_by", { nullable: true }),
date_created: t.expose("date_created", { type: "DateTime" }),
}),
});
// User type
export const UserType = builder.objectRef<{
id: string;
email: string;
first_name: string | null;
last_name: string | null;
artist_name: string | null;
slug: string | null;
description: string | null;
tags: string[] | null;
role: "model" | "viewer" | "admin";
avatar: string | null;
banner: string | null;
email_verified: boolean;
date_created: Date;
}>("User").implement({
fields: (t) => ({
id: t.exposeString("id"),
email: t.exposeString("email"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
role: t.exposeString("role"),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
email_verified: t.exposeBoolean("email_verified"),
date_created: t.expose("date_created", { type: "DateTime" }),
}),
});
// CurrentUser type (same shape, used for auth context)
export const CurrentUserType = builder.objectRef<{
id: string;
email: string;
first_name: string | null;
last_name: string | null;
artist_name: string | null;
slug: string | null;
description: string | null;
tags: string[] | null;
role: "model" | "viewer" | "admin";
avatar: string | null;
banner: string | null;
email_verified: boolean;
date_created: Date;
}>("CurrentUser").implement({
fields: (t) => ({
id: t.exposeString("id"),
email: t.exposeString("email"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
role: t.exposeString("role"),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
email_verified: t.exposeBoolean("email_verified"),
date_created: t.expose("date_created", { type: "DateTime" }),
}),
});
// Video type
export const VideoType = builder.objectRef<{
id: string;
slug: string;
title: string;
description: string | null;
image: string | null;
movie: string | null;
tags: string[] | null;
upload_date: Date;
premium: boolean | null;
featured: boolean | null;
likes_count: number | null;
plays_count: number | null;
models?: { id: string; artist_name: string | null; slug: string | null; avatar: string | null }[];
movie_file?: { id: string; filename: string; mime_type: string | null; duration: number | null } | null;
}>("Video").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug"),
title: t.exposeString("title"),
description: t.exposeString("description", { nullable: true }),
image: t.exposeString("image", { nullable: true }),
movie: t.exposeString("movie", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
upload_date: t.expose("upload_date", { type: "DateTime" }),
premium: t.exposeBoolean("premium", { nullable: true }),
featured: t.exposeBoolean("featured", { nullable: true }),
likes_count: t.exposeInt("likes_count", { nullable: true }),
plays_count: t.exposeInt("plays_count", { nullable: true }),
models: t.expose("models", { type: [VideoModelType], nullable: true }),
movie_file: t.expose("movie_file", { type: VideoFileType, nullable: true }),
}),
});
export const VideoModelType = builder.objectRef<{
id: string;
artist_name: string | null;
slug: string | null;
avatar: string | null;
}>("VideoModel").implement({
fields: (t) => ({
id: t.exposeString("id"),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
}),
});
export const VideoFileType = builder.objectRef<{
id: string;
filename: string;
mime_type: string | null;
duration: number | null;
}>("VideoFile").implement({
fields: (t) => ({
id: t.exposeString("id"),
filename: t.exposeString("filename"),
mime_type: t.exposeString("mime_type", { nullable: true }),
duration: t.exposeInt("duration", { nullable: true }),
}),
});
// Model type (model profile, enriched user)
export const ModelType = builder.objectRef<{
id: string;
slug: string | null;
artist_name: string | null;
description: string | null;
avatar: string | null;
banner: string | null;
tags: string[] | null;
date_created: Date;
photos?: { id: string; filename: string }[];
}>("Model").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
date_created: t.expose("date_created", { type: "DateTime" }),
photos: t.expose("photos", { type: [ModelPhotoType], nullable: true }),
}),
});
export const ModelPhotoType = builder.objectRef<{
id: string;
filename: string;
}>("ModelPhoto").implement({
fields: (t) => ({
id: t.exposeString("id"),
filename: t.exposeString("filename"),
}),
});
// Article type
export const ArticleType = builder.objectRef<{
id: string;
slug: string;
title: string;
excerpt: string | null;
content: string | null;
image: string | null;
tags: string[] | null;
publish_date: Date;
category: string | null;
featured: boolean | null;
author?: { first_name: string | null; last_name: string | null; avatar: string | null; description: string | null } | null;
}>("Article").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug"),
title: t.exposeString("title"),
excerpt: t.exposeString("excerpt", { nullable: true }),
content: t.exposeString("content", { nullable: true }),
image: t.exposeString("image", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
publish_date: t.expose("publish_date", { type: "DateTime" }),
category: t.exposeString("category", { nullable: true }),
featured: t.exposeBoolean("featured", { nullable: true }),
author: t.expose("author", { type: ArticleAuthorType, nullable: true }),
}),
});
export const ArticleAuthorType = builder.objectRef<{
first_name: string | null;
last_name: string | null;
avatar: string | null;
description: string | null;
}>("ArticleAuthor").implement({
fields: (t) => ({
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
}),
});
// Recording type
export const RecordingType = builder.objectRef<{
id: string;
title: string;
description: string | null;
slug: string;
duration: number;
events: object[] | null;
device_info: object[] | null;
user_id: string;
status: string;
tags: string[] | null;
linked_video: string | null;
featured: boolean | null;
public: boolean | null;
date_created: Date;
date_updated: Date | null;
}>("Recording").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title"),
description: t.exposeString("description", { nullable: true }),
slug: t.exposeString("slug"),
duration: t.exposeInt("duration"),
events: t.expose("events", { type: "JSON", nullable: true }),
device_info: t.expose("device_info", { type: "JSON", nullable: true }),
user_id: t.exposeString("user_id"),
status: t.exposeString("status"),
tags: t.exposeStringList("tags", { nullable: true }),
linked_video: t.exposeString("linked_video", { nullable: true }),
featured: t.exposeBoolean("featured", { nullable: true }),
public: t.exposeBoolean("public", { nullable: true }),
date_created: t.expose("date_created", { type: "DateTime" }),
date_updated: t.expose("date_updated", { type: "DateTime", nullable: true }),
}),
});
// Comment type
export const CommentType = builder.objectRef<{
id: number;
collection: string;
item_id: string;
comment: string;
user_id: string;
date_created: Date;
user?: { id: string; first_name: string | null; last_name: string | null; avatar: string | null } | null;
}>("Comment").implement({
fields: (t) => ({
id: t.exposeInt("id"),
collection: t.exposeString("collection"),
item_id: t.exposeString("item_id"),
comment: t.exposeString("comment"),
user_id: t.exposeString("user_id"),
date_created: t.expose("date_created", { type: "DateTime" }),
user: t.expose("user", { type: CommentUserType, nullable: true }),
}),
});
export const CommentUserType = builder.objectRef<{
id: string;
first_name: string | null;
last_name: string | null;
avatar: string | null;
}>("CommentUser").implement({
fields: (t) => ({
id: t.exposeString("id"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
}),
});
// Stats type
export const StatsType = builder.objectRef<{
videos_count: number;
models_count: number;
viewers_count: number;
}>("Stats").implement({
fields: (t) => ({
videos_count: t.exposeInt("videos_count"),
models_count: t.exposeInt("models_count"),
viewers_count: t.exposeInt("viewers_count"),
}),
});
// Gamification types
export const LeaderboardEntryType = builder.objectRef<{
user_id: string;
display_name: string | null;
avatar: string | null;
total_weighted_points: number | null;
total_raw_points: number | null;
recordings_count: number | null;
playbacks_count: number | null;
achievements_count: number | null;
rank: number;
}>("LeaderboardEntry").implement({
fields: (t) => ({
user_id: t.exposeString("user_id"),
display_name: t.exposeString("display_name", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
total_weighted_points: t.exposeFloat("total_weighted_points", { nullable: true }),
total_raw_points: t.exposeInt("total_raw_points", { nullable: true }),
recordings_count: t.exposeInt("recordings_count", { nullable: true }),
playbacks_count: t.exposeInt("playbacks_count", { nullable: true }),
achievements_count: t.exposeInt("achievements_count", { nullable: true }),
rank: t.exposeInt("rank"),
}),
});
export const AchievementType = builder.objectRef<{
id: string;
code: string;
name: string;
description: string | null;
icon: string | null;
category: string | null;
required_count: number;
points_reward: number;
}>("Achievement").implement({
fields: (t) => ({
id: t.exposeString("id"),
code: t.exposeString("code"),
name: t.exposeString("name"),
description: t.exposeString("description", { nullable: true }),
icon: t.exposeString("icon", { nullable: true }),
category: t.exposeString("category", { nullable: true }),
required_count: t.exposeInt("required_count"),
points_reward: t.exposeInt("points_reward"),
}),
});
export const UserGamificationType = builder.objectRef<{
stats: {
user_id: string;
total_raw_points: number | null;
total_weighted_points: number | null;
recordings_count: number | null;
playbacks_count: number | null;
comments_count: number | null;
achievements_count: number | null;
rank: number;
} | null;
achievements: {
id: string;
code: string;
name: string;
description: string | null;
icon: string | null;
category: string | null;
date_unlocked: Date;
progress: number | null;
required_count: number;
}[];
recent_points: {
action: string;
points: number;
date_created: Date;
recording_id: string | null;
}[];
}>("UserGamification").implement({
fields: (t) => ({
stats: t.expose("stats", { type: UserStatsType, nullable: true }),
achievements: t.expose("achievements", { type: [UserAchievementType] }),
recent_points: t.expose("recent_points", { type: [RecentPointType] }),
}),
});
export const UserStatsType = builder.objectRef<{
user_id: string;
total_raw_points: number | null;
total_weighted_points: number | null;
recordings_count: number | null;
playbacks_count: number | null;
comments_count: number | null;
achievements_count: number | null;
rank: number;
}>("UserStats").implement({
fields: (t) => ({
user_id: t.exposeString("user_id"),
total_raw_points: t.exposeInt("total_raw_points", { nullable: true }),
total_weighted_points: t.exposeFloat("total_weighted_points", { nullable: true }),
recordings_count: t.exposeInt("recordings_count", { nullable: true }),
playbacks_count: t.exposeInt("playbacks_count", { nullable: true }),
comments_count: t.exposeInt("comments_count", { nullable: true }),
achievements_count: t.exposeInt("achievements_count", { nullable: true }),
rank: t.exposeInt("rank"),
}),
});
export const UserAchievementType = builder.objectRef<{
id: string;
code: string;
name: string;
description: string | null;
icon: string | null;
category: string | null;
date_unlocked: Date;
progress: number | null;
required_count: number;
}>("UserAchievement").implement({
fields: (t) => ({
id: t.exposeString("id"),
code: t.exposeString("code"),
name: t.exposeString("name"),
description: t.exposeString("description", { nullable: true }),
icon: t.exposeString("icon", { nullable: true }),
category: t.exposeString("category", { nullable: true }),
date_unlocked: t.expose("date_unlocked", { type: "DateTime" }),
progress: t.exposeInt("progress", { nullable: true }),
required_count: t.exposeInt("required_count"),
}),
});
export const RecentPointType = builder.objectRef<{
action: string;
points: number;
date_created: Date;
recording_id: string | null;
}>("RecentPoint").implement({
fields: (t) => ({
action: t.exposeString("action"),
points: t.exposeInt("points"),
date_created: t.expose("date_created", { type: "DateTime" }),
recording_id: t.exposeString("recording_id", { nullable: true }),
}),
});
// Analytics types
export const AnalyticsType = builder.objectRef<{
total_videos: number;
total_likes: number;
total_plays: number;
plays_by_date: Record<string, number>;
likes_by_date: Record<string, number>;
videos: {
id: string;
title: string;
slug: string;
upload_date: Date;
likes: number;
plays: number;
completed_plays: number;
completion_rate: number;
avg_watch_time: number;
}[];
}>("Analytics").implement({
fields: (t) => ({
total_videos: t.exposeInt("total_videos"),
total_likes: t.exposeInt("total_likes"),
total_plays: t.exposeInt("total_plays"),
plays_by_date: t.expose("plays_by_date", { type: "JSON" }),
likes_by_date: t.expose("likes_by_date", { type: "JSON" }),
videos: t.expose("videos", { type: [VideoAnalyticsType] }),
}),
});
export const VideoAnalyticsType = builder.objectRef<{
id: string;
title: string;
slug: string;
upload_date: Date;
likes: number;
plays: number;
completed_plays: number;
completion_rate: number;
avg_watch_time: number;
}>("VideoAnalytics").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title"),
slug: t.exposeString("slug"),
upload_date: t.expose("upload_date", { type: "DateTime" }),
likes: t.exposeInt("likes"),
plays: t.exposeInt("plays"),
completed_plays: t.exposeInt("completed_plays"),
completion_rate: t.exposeFloat("completion_rate"),
avg_watch_time: t.exposeInt("avg_watch_time"),
}),
});
// Response types
export const VideoLikeResponseType = builder.objectRef<{
liked: boolean;
likes_count: number;
}>("VideoLikeResponse").implement({
fields: (t) => ({
liked: t.exposeBoolean("liked"),
likes_count: t.exposeInt("likes_count"),
}),
});
export const VideoPlayResponseType = builder.objectRef<{
success: boolean;
play_id: string;
plays_count: number;
}>("VideoPlayResponse").implement({
fields: (t) => ({
success: t.exposeBoolean("success"),
play_id: t.exposeString("play_id"),
plays_count: t.exposeInt("plays_count"),
}),
});
export const VideoLikeStatusType = builder.objectRef<{
liked: boolean;
}>("VideoLikeStatus").implement({
fields: (t) => ({
liked: t.exposeBoolean("liked"),
}),
});


@@ -0,0 +1,87 @@
import Fastify from "fastify";
import fastifyCookie from "@fastify/cookie";
import fastifyCors from "@fastify/cors";
import fastifyMultipart from "@fastify/multipart";
import fastifyStatic from "@fastify/static";
import { createYoga } from "graphql-yoga";
import path from "path";
import { schema } from "./graphql/index.js";
import { buildContext } from "./graphql/context.js";
import { db } from "./db/connection.js";
import { redis } from "./lib/auth.js";
const PORT = parseInt(process.env.PORT || "4000");
const UPLOAD_DIR = process.env.UPLOAD_DIR || "/data/uploads";
const CORS_ORIGIN = process.env.CORS_ORIGIN || "http://localhost:3000";
const fastify = Fastify({
logger: {
level: process.env.LOG_LEVEL || "info",
},
});
await fastify.register(fastifyCookie, {
secret: process.env.COOKIE_SECRET || "change-me-in-production",
});
await fastify.register(fastifyCors, {
origin: CORS_ORIGIN,
credentials: true,
methods: ["GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS"],
});
await fastify.register(fastifyMultipart, {
limits: {
fileSize: 5 * 1024 * 1024 * 1024, // 5 GB
},
});
await fastify.register(fastifyStatic, {
root: path.resolve(UPLOAD_DIR),
prefix: "/assets/",
decorateReply: false,
});
const yoga = createYoga({
schema,
context: buildContext,
graphqlEndpoint: "/graphql",
healthCheckEndpoint: "/health",
logging: {
debug: (...args) => fastify.log.debug(...args),
info: (...args) => fastify.log.info(...args),
warn: (...args) => fastify.log.warn(...args),
error: (...args) => fastify.log.error(...args),
},
});
fastify.route({
url: "/graphql",
method: ["GET", "POST", "OPTIONS"],
handler: async (request, reply) => {
const response = await yoga.handleNodeRequestAndResponse(request, reply, {
request,
reply,
db,
redis,
});
reply.status(response.status);
for (const [key, value] of response.headers.entries()) {
reply.header(key, value);
}
return reply.send(response.body);
},
});
fastify.get("/health", async (_request, reply) => {
return reply.send({ status: "ok", timestamp: new Date().toISOString() });
});
try {
await fastify.listen({ port: PORT, host: "0.0.0.0" });
fastify.log.info(`Backend running at http://0.0.0.0:${PORT}`);
fastify.log.info(`GraphQL at http://0.0.0.0:${PORT}/graphql`);
} catch (err) {
fastify.log.error(err);
process.exit(1);
}


@@ -0,0 +1,9 @@
import argon2 from "argon2";
export async function hash(password: string): Promise<string> {
return argon2.hash(password);
}
export async function verify(hash: string, password: string): Promise<boolean> {
return argon2.verify(hash, password);
}


@@ -0,0 +1,28 @@
import Redis from "ioredis";
export type SessionUser = {
id: string;
email: string;
role: "model" | "viewer" | "admin";
first_name: string | null;
last_name: string | null;
artist_name: string | null;
slug: string | null;
avatar: string | null;
};
export const redis = new Redis(process.env.REDIS_URL || "redis://localhost:6379");
export async function setSession(token: string, user: SessionUser): Promise<void> {
await redis.set(`session:${token}`, JSON.stringify(user), "EX", 86400);
}
export async function getSession(token: string): Promise<SessionUser | null> {
const data = await redis.get(`session:${token}`);
if (!data) return null;
return JSON.parse(data) as SessionUser;
}
export async function deleteSession(token: string): Promise<void> {
await redis.del(`session:${token}`);
}
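The Redis helpers above are a thin wrapper around a `session:<token>` → JSON mapping with a 24-hour TTL. For unit tests or local development the same contract can be satisfied in memory; a sketch (`MemorySessionStore` is a hypothetical stand-in, not part of the backend):

```typescript
type SessionUser = { id: string; email: string; role: "model" | "viewer" | "admin" };

// In-memory stand-in for the Redis-backed session helpers. Keys follow the
// same `session:<token>` scheme; expiry is checked lazily on read, roughly
// like Redis's EX option.
class MemorySessionStore {
  private entries = new Map<string, { user: SessionUser; expiresAt: number }>();

  set(token: string, user: SessionUser, ttlSeconds = 86400): void {
    this.entries.set(`session:${token}`, { user, expiresAt: Date.now() + ttlSeconds * 1000 });
  }

  get(token: string): SessionUser | null {
    const entry = this.entries.get(`session:${token}`);
    if (!entry) return null;
    if (Date.now() > entry.expiresAt) {
      this.entries.delete(`session:${token}`);
      return null;
    }
    return entry.user;
  }

  delete(token: string): void {
    this.entries.delete(`session:${token}`);
  }
}
```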


@@ -0,0 +1,32 @@
import nodemailer from "nodemailer";
const transporter = nodemailer.createTransport({
host: process.env.SMTP_HOST || "localhost",
port: parseInt(process.env.SMTP_PORT || "587"),
secure: process.env.SMTP_SECURE === "true",
auth: process.env.SMTP_USER ? {
user: process.env.SMTP_USER,
pass: process.env.SMTP_PASS,
} : undefined,
});
const FROM = process.env.EMAIL_FROM || "noreply@sexy.pivoine.art";
const BASE_URL = process.env.PUBLIC_URL || "http://localhost:3000";
export async function sendVerification(email: string, token: string): Promise<void> {
await transporter.sendMail({
from: FROM,
to: email,
subject: "Verify your email",
html: `<p>Click <a href="${BASE_URL}/signup/verify?token=${token}">here</a> to verify your email.</p>`,
});
}
export async function sendPasswordReset(email: string, token: string): Promise<void> {
await transporter.sendMail({
from: FROM,
to: email,
subject: "Reset your password",
html: `<p>Click <a href="${BASE_URL}/password/reset?token=${token}">here</a> to reset your password.</p>`,
});
}
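Both mails embed a tokenized link built from `PUBLIC_URL`. That construction is easy to factor out and unit-test; a sketch (`buildActionUrl` is a hypothetical helper, and it URL-encodes the token, which the inline template literals above do not):

```typescript
// Hypothetical helper mirroring the link construction in the mailer above.
// encodeURIComponent guards against tokens containing reserved characters.
function buildActionUrl(baseUrl: string, path: string, token: string): string {
  return `${baseUrl}${path}?token=${encodeURIComponent(token)}`;
}
```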


@@ -0,0 +1,10 @@
import ffmpeg from "fluent-ffmpeg";
export function extractDuration(filePath: string): Promise<number> {
return new Promise((resolve, reject) => {
ffmpeg.ffprobe(filePath, (err, metadata) => {
if (err) return reject(err);
resolve(Math.round(metadata.format.duration || 0));
});
});
}


@@ -0,0 +1,324 @@
import { eq, sql, and, gt, isNotNull, count, sum } from "drizzle-orm";
import type { DB } from "../db/connection.js";
import {
user_points,
user_stats,
recordings,
recording_plays,
comments,
user_achievements,
achievements,
users,
} from "../db/schema/index.js";
export const POINT_VALUES = {
RECORDING_CREATE: 50,
RECORDING_PLAY: 10,
RECORDING_COMPLETE: 5,
COMMENT_CREATE: 5,
RECORDING_FEATURED: 100,
} as const;
const DECAY_LAMBDA = 0.005;
export async function awardPoints(
db: DB,
userId: string,
action: keyof typeof POINT_VALUES,
recordingId?: string,
): Promise<void> {
const points = POINT_VALUES[action];
await db.insert(user_points).values({
user_id: userId,
action,
points,
recording_id: recordingId || null,
date_created: new Date(),
});
await updateUserStats(db, userId);
}
export async function calculateWeightedScore(db: DB, userId: string): Promise<number> {
const now = new Date();
const result = await db.execute(sql`
SELECT SUM(
points * EXP(-${DECAY_LAMBDA} * EXTRACT(EPOCH FROM (${now}::timestamptz - date_created)) / 86400)
) as weighted_score
FROM user_points
WHERE user_id = ${userId}
`);
return parseFloat((result.rows[0] as any)?.weighted_score || "0");
}
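The raw SQL above folds an exponential time decay into the sum: each grant contributes `points · exp(−λ · age_in_days)` with λ = 0.005. A plain-TypeScript sketch of the same formula (illustrative only; the production computation stays in Postgres):

```typescript
// Same decay constant as DECAY_LAMBDA above.
const LAMBDA = 0.005;

// Pure-TS mirror of the SQL weighted sum: each grant's points are scaled
// by exp(-LAMBDA * ageDays) before summing.
function weightedScore(grants: { points: number; ageDays: number }[]): number {
  return grants.reduce((sum, g) => sum + g.points * Math.exp(-LAMBDA * g.ageDays), 0);
}
```

At this λ, a grant keeps half its weight for about ln(2)/0.005 ≈ 139 days, so the leaderboard favors recent activity without ever zeroing out old points.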
export async function updateUserStats(db: DB, userId: string): Promise<void> {
const now = new Date();
const rawPointsResult = await db
.select({ total: sum(user_points.points) })
.from(user_points)
.where(eq(user_points.user_id, userId));
const totalRawPoints = parseInt(String(rawPointsResult[0]?.total || "0"));
const totalWeightedPoints = await calculateWeightedScore(db, userId);
const recordingsResult = await db
.select({ count: count() })
.from(recordings)
.where(and(eq(recordings.user_id, userId), eq(recordings.status, "published")));
const recordingsCount = recordingsResult[0]?.count || 0;
// Get playbacks count (excluding own recordings)
const ownRecordingIds = await db
.select({ id: recordings.id })
.from(recordings)
.where(eq(recordings.user_id, userId));
const ownIds = ownRecordingIds.map((r) => r.id);
let playbacksCount = 0;
if (ownIds.length > 0) {
const playbacksResult = await db.execute(sql`
SELECT COUNT(*) as count FROM recording_plays
WHERE user_id = ${userId}
AND recording_id NOT IN (${sql.join(ownIds.map(id => sql`${id}`), sql`, `)})
`);
playbacksCount = parseInt((playbacksResult.rows[0] as any)?.count || "0");
} else {
const playbacksResult = await db
.select({ count: count() })
.from(recording_plays)
.where(eq(recording_plays.user_id, userId));
playbacksCount = playbacksResult[0]?.count || 0;
}
const commentsResult = await db
.select({ count: count() })
.from(comments)
.where(and(eq(comments.user_id, userId), eq(comments.collection, "recordings")));
const commentsCount = commentsResult[0]?.count || 0;
const achievementsResult = await db
.select({ count: count() })
.from(user_achievements)
.where(and(eq(user_achievements.user_id, userId), isNotNull(user_achievements.date_unlocked)));
const achievementsCount = achievementsResult[0]?.count || 0;
const existing = await db
.select()
.from(user_stats)
.where(eq(user_stats.user_id, userId))
.limit(1);
if (existing.length > 0) {
await db
.update(user_stats)
.set({
total_raw_points: totalRawPoints,
total_weighted_points: totalWeightedPoints,
recordings_count: recordingsCount,
playbacks_count: playbacksCount,
comments_count: commentsCount,
achievements_count: achievementsCount,
last_updated: now,
})
.where(eq(user_stats.user_id, userId));
} else {
await db.insert(user_stats).values({
user_id: userId,
total_raw_points: totalRawPoints,
total_weighted_points: totalWeightedPoints,
recordings_count: recordingsCount,
playbacks_count: playbacksCount,
comments_count: commentsCount,
achievements_count: achievementsCount,
last_updated: now,
});
}
}
export async function checkAchievements(
db: DB,
userId: string,
category?: string,
): Promise<void> {
let achievementsQuery = db
.select()
.from(achievements)
.where(eq(achievements.status, "published"));
if (category) {
achievementsQuery = db
.select()
.from(achievements)
.where(and(eq(achievements.status, "published"), eq(achievements.category, category)));
}
const achievementsList = await achievementsQuery;
for (const achievement of achievementsList) {
const progress = await getAchievementProgress(db, userId, achievement);
const existing = await db
.select()
.from(user_achievements)
.where(
and(
eq(user_achievements.user_id, userId),
eq(user_achievements.achievement_id, achievement.id),
),
)
.limit(1);
const isUnlocked = progress >= achievement.required_count;
// Note: `existing[0]?.date_unlocked !== null` would be true for a missing row
// (undefined !== null), wrongly suppressing the first-unlock bonus below.
const wasUnlocked = existing.length > 0 && existing[0].date_unlocked !== null;
if (existing.length > 0) {
await db
.update(user_achievements)
.set({
progress,
date_unlocked: isUnlocked ? (existing[0].date_unlocked || new Date()) : null,
})
.where(
and(
eq(user_achievements.user_id, userId),
eq(user_achievements.achievement_id, achievement.id),
),
);
} else {
await db.insert(user_achievements).values({
user_id: userId,
achievement_id: achievement.id,
progress,
date_unlocked: isUnlocked ? new Date() : null,
});
}
if (isUnlocked && !wasUnlocked && achievement.points_reward > 0) {
await db.insert(user_points).values({
user_id: userId,
action: `ACHIEVEMENT_${achievement.code}`,
points: achievement.points_reward,
recording_id: null,
date_created: new Date(),
});
await updateUserStats(db, userId);
}
}
}
async function getAchievementProgress(
db: DB,
userId: string,
achievement: typeof achievements.$inferSelect,
): Promise<number> {
const { code } = achievement;
if (["first_recording", "recording_10", "recording_50", "recording_100"].includes(code)) {
const result = await db
.select({ count: count() })
.from(recordings)
.where(and(eq(recordings.user_id, userId), eq(recordings.status, "published")));
return result[0]?.count || 0;
}
if (code === "featured_recording") {
const result = await db
.select({ count: count() })
.from(recordings)
.where(
and(
eq(recordings.user_id, userId),
eq(recordings.status, "published"),
eq(recordings.featured, true),
),
);
return result[0]?.count || 0;
}
if (["first_play", "play_100", "play_500"].includes(code)) {
const result = await db.execute(sql`
SELECT COUNT(*) as count
FROM recording_plays rp
LEFT JOIN recordings r ON rp.recording_id = r.id
WHERE rp.user_id = ${userId}
AND r.user_id != ${userId}
`);
return parseInt((result.rows[0] as any)?.count || "0");
}
if (["completionist_10", "completionist_100"].includes(code)) {
const result = await db
.select({ count: count() })
.from(recording_plays)
.where(and(eq(recording_plays.user_id, userId), eq(recording_plays.completed, true)));
return result[0]?.count || 0;
}
if (["first_comment", "comment_50", "comment_250"].includes(code)) {
const result = await db
.select({ count: count() })
.from(comments)
.where(and(eq(comments.user_id, userId), eq(comments.collection, "recordings")));
return result[0]?.count || 0;
}
if (code === "early_adopter") {
const user = await db.select().from(users).where(eq(users.id, userId)).limit(1);
if (user[0]) {
const joinDate = new Date(user[0].date_created);
const platformLaunch = new Date("2025-01-01");
const oneMonthAfterLaunch = new Date(platformLaunch);
oneMonthAfterLaunch.setMonth(oneMonthAfterLaunch.getMonth() + 1);
return joinDate <= oneMonthAfterLaunch ? 1 : 0;
}
}
if (code === "one_year") {
const user = await db.select().from(users).where(eq(users.id, userId)).limit(1);
if (user[0]) {
const joinDate = new Date(user[0].date_created);
const oneYearAgo = new Date();
oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);
return joinDate <= oneYearAgo ? 1 : 0;
}
}
if (code === "balanced_creator") {
const recordingsResult = await db
.select({ count: count() })
.from(recordings)
.where(and(eq(recordings.user_id, userId), eq(recordings.status, "published")));
const playsResult = await db.execute(sql`
SELECT COUNT(*) as count FROM recording_plays rp
LEFT JOIN recordings r ON rp.recording_id = r.id
WHERE rp.user_id = ${userId} AND r.user_id != ${userId}
`);
const rc = recordingsResult[0]?.count || 0;
const pc = parseInt((playsResult.rows[0] as any)?.count || "0");
return rc >= 50 && pc >= 100 ? 1 : 0;
}
if (code === "top_10_rank") {
const userStat = await db
.select()
.from(user_stats)
.where(eq(user_stats.user_id, userId))
.limit(1);
if (!userStat[0]) return 0;
const rankResult = await db
.select({ count: count() })
.from(user_stats)
.where(gt(user_stats.total_weighted_points, userStat[0].total_weighted_points || 0));
const userRank = (rankResult[0]?.count || 0) + 1;
return userRank <= 10 ? 1 : 0;
}
return 0;
}
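The `top_10_rank` branch above derives a user's rank as the count of users with strictly more weighted points, plus one. Isolated as a sketch (the function name is illustrative, not from the codebase):

```typescript
// Competition ranking: 1 + number of strictly higher scores,
// so ties share the better rank (two users at the top are both rank 1).
function rankOf(score: number, allScores: number[]): number {
  return 1 + allScores.filter((s) => s > score).length;
}
```
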
export async function recalculateAllWeightedScores(db: DB): Promise<void> {
const allUsers = await db.select({ user_id: user_stats.user_id }).from(user_stats);
for (const u of allUsers) {
await updateUserStats(db, u.user_id);
}
}

View File

@@ -0,0 +1,5 @@
import slugifyLib from "slugify";
export function slugify(text: string): string {
return slugifyLib(text, { lower: true, strict: true });
}
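For reference, a dependency-free approximation of what `slugify(text, { lower: true, strict: true })` produces for typical Latin input (the real library handles a much larger character map; the helper name and exact rules here are illustrative assumptions):

```typescript
// Rough sketch of the lower+strict slug rules: lowercase, strip diacritics,
// drop non-alphanumerics, collapse whitespace/hyphens into single hyphens.
function slugifyApprox(text: string): string {
  return text
    .toLowerCase()
    .normalize("NFD")
    .replace(/[\u0300-\u036f]/g, "") // remove combining accents
    .replace(/[^a-z0-9\s-]/g, "")
    .trim()
    .replace(/[\s-]+/g, "-");
}

slugifyApprox("Héllo, World!"); // → "hello-world"
```
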

View File

@@ -0,0 +1,566 @@
/**
* Data Migration: Directus → Custom Backend
*
* Migrates data from Directus tables to the new schema.
* Run with: tsx src/scripts/data-migration.ts
*
* Environment variables:
* DATABASE_URL - PostgreSQL connection (same DB)
* OLD_UPLOAD_DIR - Path to Directus uploads (e.g. /old-uploads)
* NEW_UPLOAD_DIR - Path to new upload dir (e.g. /data/uploads)
*/
import { Pool } from "pg";
import fs from "fs";
import path from "path";
const DATABASE_URL = process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy";
const OLD_UPLOAD_DIR = process.env.OLD_UPLOAD_DIR || "/old-uploads";
const NEW_UPLOAD_DIR = process.env.NEW_UPLOAD_DIR || "/data/uploads";
const pool = new Pool({ connectionString: DATABASE_URL });
async function query(sql: string, params: unknown[] = []) {
const client = await pool.connect();
try {
return await client.query(sql, params);
} finally {
client.release();
}
}
function copyFile(src: string, dest: string) {
const dir = path.dirname(dest);
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
}
if (fs.existsSync(src)) {
fs.copyFileSync(src, dest);
return true;
}
return false;
}
async function migrateFiles() {
console.log("📁 Migrating files...");
const { rows } = await query(
`SELECT id, title, description, filename_disk, type, filesize, duration, uploaded_by, date_created
FROM directus_files`,
);
let migrated = 0;
let skipped = 0;
for (const file of rows) {
// Check if already migrated
const existing = await query("SELECT id FROM files WHERE id = $1", [file.id]);
if (existing.rows.length > 0) {
skipped++;
continue;
}
await query(
`INSERT INTO files (id, title, description, filename, mime_type, filesize, duration, uploaded_by, date_created)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)
ON CONFLICT (id) DO NOTHING`,
[
file.id,
file.title,
file.description,
file.filename_disk || `${file.id}`,
file.type,
file.filesize,
file.duration,
file.uploaded_by,
file.date_created,
],
);
// Copy file to new location
const srcPath = path.join(OLD_UPLOAD_DIR, file.filename_disk || "");
const destPath = path.join(NEW_UPLOAD_DIR, file.id, file.filename_disk || `${file.id}`);
const copied = copyFile(srcPath, destPath);
if (!copied) {
console.warn(` ⚠️ File not found on disk: ${file.filename_disk}`);
}
migrated++;
}
console.log(` ✅ Files: ${migrated} migrated, ${skipped} already existed`);
}
async function migrateUsers() {
console.log("👥 Migrating users...");
const { rows } = await query(
`SELECT u.id, u.email, u.password, u.first_name, u.last_name,
u.description, u.avatar, u.date_created,
u.artist_name, u.slug, u.email_notifications_key,
r.name as role_name
FROM directus_users u
LEFT JOIN directus_roles r ON u.role = r.id
WHERE u.status = 'active'`,
);
let migrated = 0;
for (const user of rows) {
const existing = await query("SELECT id FROM users WHERE id = $1", [user.id]);
if (existing.rows.length > 0) {
migrated++;
continue;
}
const role =
user.role_name === "Model"
? "model"
: user.role_name === "Administrator"
? "admin"
: "viewer";
// Fetch tags from custom user fields if they exist
let tags: string[] = [];
try {
const tagsRes = await query("SELECT tags FROM directus_users WHERE id = $1", [user.id]);
if (tagsRes.rows[0]?.tags) {
tags = Array.isArray(tagsRes.rows[0].tags)
? tagsRes.rows[0].tags
: JSON.parse(tagsRes.rows[0].tags || "[]");
}
} catch { /* tags column may not exist in this Directus schema */ }
await query(
`INSERT INTO users (id, email, password_hash, first_name, last_name, artist_name, slug,
description, tags, role, avatar, email_verified, date_created)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)
ON CONFLICT (id) DO NOTHING`,
[
user.id,
user.email,
user.password || "MIGRATED_NO_PASSWORD",
user.first_name,
user.last_name,
user.artist_name,
user.slug,
user.description,
JSON.stringify(tags),
role,
user.avatar,
true, // Assume existing users are verified
user.date_created,
],
);
migrated++;
}
console.log(` ✅ Users: ${migrated} migrated`);
}
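The nested role ternary inside `migrateUsers` can be factored into a small, testable helper (a sketch; the function name is an assumption, the mapping itself is taken from the code above):

```typescript
// Map a Directus role name onto the new backend's role enum.
// Anything unrecognized (including a missing role) falls back to "viewer".
function mapRole(directusRoleName: string | null): "model" | "admin" | "viewer" {
  if (directusRoleName === "Model") return "model";
  if (directusRoleName === "Administrator") return "admin";
  return "viewer";
}
```
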
async function migrateUserPhotos() {
console.log("🖼️ Migrating user photos...");
const { rows } = await query(
`SELECT directus_users_id as user_id, directus_files_id as file_id, sort
FROM junction_directus_users_files`,
);
let migrated = 0;
for (const row of rows) {
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
const fileExists = await query("SELECT id FROM files WHERE id = $1", [row.file_id]);
if (!userExists.rows.length || !fileExists.rows.length) continue;
await query(
`INSERT INTO user_photos (user_id, file_id, sort) VALUES ($1, $2, $3)
ON CONFLICT DO NOTHING`,
[row.user_id, row.file_id, row.sort || 0],
);
migrated++;
}
console.log(` ✅ User photos: ${migrated} migrated`);
}
async function migrateArticles() {
console.log("📰 Migrating articles...");
const { rows } = await query(
`SELECT id, slug, title, excerpt, content, image, tags, publish_date,
author, category, featured, date_created, date_updated
FROM sexy_articles`,
);
let migrated = 0;
for (const article of rows) {
await query(
`INSERT INTO articles (id, slug, title, excerpt, content, image, tags, publish_date,
author, category, featured, date_created, date_updated)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)
ON CONFLICT (id) DO NOTHING`,
[
article.id,
article.slug,
article.title,
article.excerpt,
article.content,
article.image,
Array.isArray(article.tags) ? JSON.stringify(article.tags) : article.tags,
article.publish_date,
article.author,
article.category,
article.featured,
article.date_created,
article.date_updated,
],
);
migrated++;
}
console.log(` ✅ Articles: ${migrated} migrated`);
}
async function migrateVideos() {
console.log("🎬 Migrating videos...");
const { rows } = await query(
`SELECT id, slug, title, description, image, movie, tags, upload_date,
premium, featured, likes_count, plays_count
FROM sexy_videos`,
);
let migrated = 0;
for (const video of rows) {
await query(
`INSERT INTO videos (id, slug, title, description, image, movie, tags, upload_date,
premium, featured, likes_count, plays_count)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12)
ON CONFLICT (id) DO NOTHING`,
[
video.id,
video.slug,
video.title,
video.description,
video.image,
video.movie,
Array.isArray(video.tags) ? JSON.stringify(video.tags) : video.tags,
video.upload_date,
video.premium,
video.featured,
video.likes_count || 0,
video.plays_count || 0,
],
);
migrated++;
}
console.log(` ✅ Videos: ${migrated} migrated`);
}
async function migrateVideoModels() {
console.log("🔗 Migrating video models...");
const { rows } = await query(
`SELECT sexy_videos_id as video_id, directus_users_id as user_id
FROM sexy_videos_models`,
);
let migrated = 0;
for (const row of rows) {
const videoExists = await query("SELECT id FROM videos WHERE id = $1", [row.video_id]);
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
if (!videoExists.rows.length || !userExists.rows.length) continue;
await query(
`INSERT INTO video_models (video_id, user_id) VALUES ($1, $2) ON CONFLICT DO NOTHING`,
[row.video_id, row.user_id],
);
migrated++;
}
console.log(` ✅ Video models: ${migrated} migrated`);
}
async function migrateVideoLikes() {
console.log("❤️ Migrating video likes...");
const { rows } = await query(
`SELECT id, video_id, user_id, date_created FROM sexy_video_likes`,
);
let migrated = 0;
for (const row of rows) {
await query(
`INSERT INTO video_likes (id, video_id, user_id, date_created) VALUES ($1, $2, $3, $4)
ON CONFLICT (id) DO NOTHING`,
[row.id, row.video_id, row.user_id, row.date_created],
);
migrated++;
}
console.log(` ✅ Video likes: ${migrated} migrated`);
}
async function migrateVideoPlays() {
console.log("▶️ Migrating video plays...");
const { rows } = await query(
`SELECT id, video_id, user_id, session_id, duration_watched, completed, date_created
FROM sexy_video_plays`,
);
let migrated = 0;
for (const row of rows) {
await query(
`INSERT INTO video_plays (id, video_id, user_id, session_id, duration_watched, completed, date_created)
VALUES ($1, $2, $3, $4, $5, $6, $7)
ON CONFLICT (id) DO NOTHING`,
[
row.id,
row.video_id,
row.user_id,
row.session_id,
row.duration_watched,
row.completed,
row.date_created,
],
);
migrated++;
}
console.log(` ✅ Video plays: ${migrated} migrated`);
}
async function migrateRecordings() {
console.log("🎙️ Migrating recordings...");
const { rows } = await query(
`SELECT id, title, description, slug, duration, events, device_info,
user_created as user_id, status, tags, linked_video, featured, public,
original_recording_id, date_created, date_updated
FROM sexy_recordings`,
);
let migrated = 0;
for (const recording of rows) {
await query(
`INSERT INTO recordings (id, title, description, slug, duration, events, device_info,
user_id, status, tags, linked_video, featured, public,
original_recording_id, date_created, date_updated)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15, $16)
ON CONFLICT (id) DO NOTHING`,
[
recording.id,
recording.title,
recording.description,
recording.slug,
recording.duration,
typeof recording.events === "string" ? recording.events : JSON.stringify(recording.events),
typeof recording.device_info === "string"
? recording.device_info
: JSON.stringify(recording.device_info),
recording.user_id,
recording.status,
Array.isArray(recording.tags) ? JSON.stringify(recording.tags) : recording.tags,
recording.linked_video,
recording.featured,
recording.public,
recording.original_recording_id,
recording.date_created,
recording.date_updated,
],
);
migrated++;
}
console.log(` ✅ Recordings: ${migrated} migrated`);
}
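The `Array.isArray(x) ? JSON.stringify(x) : x` normalization recurs for `tags`, `events`, and `device_info`. A shared helper (a sketch; the name is an assumption) would keep the INSERT parameter lists uniform:

```typescript
// Normalize a value that may arrive as a JS array/object or as
// already-serialized JSON text into a JSON string (or null).
function toJsonText(value: unknown): string | null {
  if (value == null) return null;
  return typeof value === "string" ? value : JSON.stringify(value);
}
```
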
async function migrateRecordingPlays() {
console.log("▶️ Migrating recording plays...");
const { rows } = await query(
`SELECT id, user_id, recording_id, duration_played, completed, date_created
FROM sexy_recording_plays`,
);
let migrated = 0;
for (const row of rows) {
await query(
`INSERT INTO recording_plays (id, recording_id, user_id, duration_played, completed, date_created)
VALUES ($1, $2, $3, $4, $5, $6)
ON CONFLICT (id) DO NOTHING`,
[row.id, row.recording_id, row.user_id, row.duration_played, row.completed, row.date_created],
);
migrated++;
}
console.log(` ✅ Recording plays: ${migrated} migrated`);
}
async function migrateComments() {
console.log("💬 Migrating comments...");
const { rows } = await query(
`SELECT id, collection, item, comment, user_created as user_id, date_created
FROM directus_comments
WHERE collection IN ('sexy_videos', 'sexy_recordings')`,
);
let migrated = 0;
for (const row of rows) {
// Map collection names
const collection = row.collection === "sexy_videos" ? "videos" : "recordings";
// The Directus id is not carried over (comments gets a fresh id), so this
// insert is not idempotent — re-running the migration duplicates comments.
await query(
`INSERT INTO comments (collection, item_id, comment, user_id, date_created)
VALUES ($1, $2, $3, $4, $5)`,
[collection, row.item, row.comment, row.user_id, row.date_created],
);
migrated++;
}
console.log(` ✅ Comments: ${migrated} migrated`);
}
async function migrateAchievements() {
console.log("🏆 Migrating achievements...");
const { rows } = await query(
`SELECT id, code, name, description, icon, category, required_count, points_reward, status, sort
FROM sexy_achievements`,
);
let migrated = 0;
for (const row of rows) {
await query(
`INSERT INTO achievements (id, code, name, description, icon, category, required_count, points_reward, status, sort)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10)
ON CONFLICT (id) DO NOTHING`,
[
row.id,
row.code,
row.name,
row.description,
row.icon,
row.category,
row.required_count,
row.points_reward,
row.status,
row.sort,
],
);
migrated++;
}
console.log(` ✅ Achievements: ${migrated} migrated`);
}
async function migrateUserAchievements() {
console.log("🎖️ Migrating user achievements...");
const { rows } = await query(
`SELECT user_id, achievement_id, progress, date_unlocked FROM sexy_user_achievements`,
);
let migrated = 0;
for (const row of rows) {
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
const achievementExists = await query("SELECT id FROM achievements WHERE id = $1", [
row.achievement_id,
]);
if (!userExists.rows.length || !achievementExists.rows.length) continue;
await query(
`INSERT INTO user_achievements (user_id, achievement_id, progress, date_unlocked)
VALUES ($1, $2, $3, $4)
ON CONFLICT (user_id, achievement_id) DO NOTHING`,
[row.user_id, row.achievement_id, row.progress, row.date_unlocked],
);
migrated++;
}
console.log(` ✅ User achievements: ${migrated} migrated`);
}
async function migrateUserPoints() {
console.log("💎 Migrating user points...");
const { rows } = await query(
`SELECT user_id, action, points, recording_id, date_created FROM sexy_user_points`,
);
let migrated = 0;
for (const row of rows) {
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
if (!userExists.rows.length) continue;
// No conflict key here either: re-running the migration duplicates point rows.
await query(
`INSERT INTO user_points (user_id, action, points, recording_id, date_created)
VALUES ($1, $2, $3, $4, $5)`,
[row.user_id, row.action, row.points, row.recording_id, row.date_created],
);
migrated++;
}
console.log(` ✅ User points: ${migrated} migrated`);
}
async function migrateUserStats() {
console.log("📊 Migrating user stats...");
const { rows } = await query(
`SELECT user_id, total_raw_points, total_weighted_points, recordings_count,
playbacks_count, comments_count, achievements_count, last_updated
FROM sexy_user_stats`,
);
let migrated = 0;
for (const row of rows) {
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
if (!userExists.rows.length) continue;
await query(
`INSERT INTO user_stats (user_id, total_raw_points, total_weighted_points, recordings_count,
playbacks_count, comments_count, achievements_count, last_updated)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
ON CONFLICT (user_id) DO NOTHING`,
[
row.user_id,
row.total_raw_points,
row.total_weighted_points,
row.recordings_count,
row.playbacks_count,
row.comments_count,
row.achievements_count,
row.last_updated,
],
);
migrated++;
}
console.log(` ✅ User stats: ${migrated} migrated`);
}
async function main() {
console.log("🚀 Starting data migration from Directus to custom backend...\n");
try {
// Verify connection
await query("SELECT 1");
console.log("✅ Database connected\n");
// Migration order respects FK dependencies
await migrateFiles();
await migrateUsers();
await migrateUserPhotos();
await migrateArticles();
await migrateVideos();
await migrateVideoModels();
await migrateVideoLikes();
await migrateVideoPlays();
await migrateRecordings();
await migrateRecordingPlays();
await migrateComments();
await migrateAchievements();
await migrateUserAchievements();
await migrateUserPoints();
await migrateUserStats();
console.log("\n🎉 Migration complete!");
} catch (error) {
console.error("❌ Migration failed:", error);
process.exit(1);
} finally {
await pool.end();
}
}
main();

View File

@@ -0,0 +1,20 @@
{
"compilerOptions": {
"target": "ES2022",
"module": "NodeNext",
"moduleResolution": "NodeNext",
"lib": ["ES2022"],
"outDir": "./dist",
"rootDir": "./src",
"strict": true,
"skipLibCheck": true,
"esModuleInterop": true,
"experimentalDecorators": true,
"emitDecoratorMetadata": true,
"declaration": true,
"declarationMap": true,
"sourceMap": true
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist"]
}

View File

@@ -1,54 +0,0 @@
{
"name": "@sexy.pivoine.art/bundle",
"description": "Please enter a description for your extension",
"icon": "extension",
"version": "1.0.0",
"keywords": [
"directus",
"directus-extension",
"directus-extension-bundle"
],
"type": "module",
"files": [
"dist"
],
"directus:extension": {
"type": "bundle",
"path": {
"app": "dist/app.js",
"api": "dist/api.js"
},
"entries": [
{
"name": "endpoint",
"type": "endpoint",
"source": "src/endpoint"
},
{
"name": "hook",
"type": "hook",
"source": "src/hook"
},
{
"name": "theme",
"type": "theme",
"source": "src/theme"
}
],
"host": "^11.11.0"
},
"scripts": {
"build": "directus-extension build",
"dev": "directus-extension build -w --no-minify",
"link": "directus-extension link",
"validate": "directus-extension validate",
"add": "directus-extension add"
},
"devDependencies": {
"@directus/extensions-sdk": "16.0.2"
},
"dependencies": {
"@sindresorhus/slugify": "^3.0.0",
"fluent-ffmpeg": "^2.1.3"
}
}

View File

@@ -1,336 +0,0 @@
/**
* Gamification Helper Functions
* Handles points, achievements, and user stats for recording-focused gamification system
*/
import type { Knex } from "knex";
/**
* Point values for different actions
*/
export const POINT_VALUES = {
RECORDING_CREATE: 50,
RECORDING_PLAY: 10,
RECORDING_COMPLETE: 5,
COMMENT_CREATE: 5,
RECORDING_FEATURED: 100,
} as const;
/**
* Time decay constant for weighted scoring
* λ = 0.005 means ~14% decay per month
*/
const DECAY_LAMBDA = 0.005;
/**
* Award points to a user for a specific action
*/
export async function awardPoints(
database: Knex,
userId: string,
action: keyof typeof POINT_VALUES,
recordingId?: string,
): Promise<void> {
const points = POINT_VALUES[action];
await database("sexy_user_points").insert({
user_id: userId,
action,
points,
recording_id: recordingId || null,
date_created: new Date(),
});
// Update cached stats
await updateUserStats(database, userId);
}
/**
* Calculate time-weighted score using exponential decay
* Score = Σ (points × e^(-λ × age_in_days))
*/
export async function calculateWeightedScore(
database: Knex,
userId: string,
): Promise<number> {
const now = new Date();
const result = await database("sexy_user_points")
.where({ user_id: userId })
.select(
database.raw(`
SUM(
points * EXP(-${DECAY_LAMBDA} * EXTRACT(EPOCH FROM (? - date_created)) / 86400)
) as weighted_score
`, [now]),
);
return result[0]?.weighted_score || 0;
}
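The "~14% decay per month" figure in the comment above checks out. A standalone sanity check of the decay arithmetic (illustrative, not part of the original file):

```typescript
// Weight retained by a point after `days` days under exponential decay
// with λ = 0.005 (the DECAY_LAMBDA used above).
const weightAfterDays = (days: number): number => Math.exp(-0.005 * days);

// After 30 days: e^(-0.15) ≈ 0.861, i.e. roughly 14% decay per month.
```
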
/**
* Update or create user stats cache
*/
export async function updateUserStats(database: Knex, userId: string): Promise<void> {
const now = new Date();
// Calculate raw points
const rawPointsResult = await database("sexy_user_points")
.where({ user_id: userId })
.sum("points as total");
const totalRawPoints = rawPointsResult[0]?.total || 0;
// Calculate weighted points
const totalWeightedPoints = await calculateWeightedScore(database, userId);
// Get recordings count
const recordingsResult = await database("sexy_recordings")
.where({ user_created: userId, status: "published" })
.count("* as count");
const recordingsCount = recordingsResult[0]?.count || 0;
// Get playbacks count (excluding own recordings)
const playbacksResult = await database("sexy_recording_plays")
.where({ user_id: userId })
.whereNotIn("recording_id", function () {
this.select("id").from("sexy_recordings").where("user_created", userId);
})
.count("* as count");
const playbacksCount = playbacksResult[0]?.count || 0;
// Get comments count (on recordings only)
const commentsResult = await database("comments")
.where({ user_created: userId, collection: "sexy_recordings" })
.count("* as count");
const commentsCount = commentsResult[0]?.count || 0;
// Get achievements count
const achievementsResult = await database("sexy_user_achievements")
.where({ user_id: userId })
.whereNotNull("date_unlocked")
.count("* as count");
const achievementsCount = achievementsResult[0]?.count || 0;
// Upsert stats
const existing = await database("sexy_user_stats")
.where({ user_id: userId })
.first();
if (existing) {
await database("sexy_user_stats")
.where({ user_id: userId })
.update({
total_raw_points: totalRawPoints,
total_weighted_points: totalWeightedPoints,
recordings_count: recordingsCount,
playbacks_count: playbacksCount,
comments_count: commentsCount,
achievements_count: achievementsCount,
last_updated: now,
});
} else {
await database("sexy_user_stats").insert({
user_id: userId,
total_raw_points: totalRawPoints,
total_weighted_points: totalWeightedPoints,
recordings_count: recordingsCount,
playbacks_count: playbacksCount,
comments_count: commentsCount,
achievements_count: achievementsCount,
last_updated: now,
});
}
}
/**
* Check and update achievement progress for a user
*/
export async function checkAchievements(
database: Knex,
userId: string,
category?: string,
): Promise<void> {
// Get all achievements (optionally filtered by category)
let achievementsQuery = database("sexy_achievements")
.where({ status: "published" });
if (category) {
achievementsQuery = achievementsQuery.where({ category });
}
const achievements = await achievementsQuery;
for (const achievement of achievements) {
const progress = await getAchievementProgress(database, userId, achievement);
// Check if already unlocked
const existing = await database("sexy_user_achievements")
.where({ user_id: userId, achievement_id: achievement.id })
.first();
const isUnlocked = progress >= achievement.required_count;
// `existing?.date_unlocked !== null` would be true when no row exists
// (undefined !== null), wrongly suppressing the first-unlock bonus.
const wasUnlocked = !!existing && existing.date_unlocked !== null;
if (existing) {
// Update progress
await database("sexy_user_achievements")
.where({ user_id: userId, achievement_id: achievement.id })
.update({
progress,
date_unlocked: isUnlocked ? (existing.date_unlocked || new Date()) : null,
});
} else {
// Insert new progress
await database("sexy_user_achievements").insert({
user_id: userId,
achievement_id: achievement.id,
progress,
date_unlocked: isUnlocked ? new Date() : null,
});
}
// Award bonus points if newly unlocked
if (isUnlocked && !wasUnlocked && achievement.points_reward > 0) {
await database("sexy_user_points").insert({
user_id: userId,
action: `ACHIEVEMENT_${achievement.code}`,
points: achievement.points_reward,
recording_id: null,
date_created: new Date(),
});
// Refresh stats after awarding bonus
await updateUserStats(database, userId);
}
}
}
/**
* Get progress for a specific achievement
*/
async function getAchievementProgress(
database: Knex,
userId: string,
achievement: any,
): Promise<number> {
const { code } = achievement;
// Recordings achievements
if (code === "first_recording" || code === "recording_10" || code === "recording_50" || code === "recording_100") {
const result = await database("sexy_recordings")
.where({ user_created: userId, status: "published" })
.count("* as count");
return result[0]?.count || 0;
}
// Featured recording
if (code === "featured_recording") {
const result = await database("sexy_recordings")
.where({ user_created: userId, status: "published", featured: true })
.count("* as count");
return result[0]?.count || 0;
}
// Playback achievements (excluding own recordings)
if (code === "first_play" || code === "play_100" || code === "play_500") {
const result = await database("sexy_recording_plays as rp")
.leftJoin("sexy_recordings as r", "rp.recording_id", "r.id")
.where({ "rp.user_id": userId })
.where("r.user_created", "!=", userId)
.count("* as count");
return result[0]?.count || 0;
}
// Completionist achievements
if (code === "completionist_10" || code === "completionist_100") {
const result = await database("sexy_recording_plays")
.where({ user_id: userId, completed: true })
.count("* as count");
return result[0]?.count || 0;
}
// Social achievements
if (code === "first_comment" || code === "comment_50" || code === "comment_250") {
const result = await database("comments")
.where({ user_created: userId, collection: "sexy_recordings" })
.count("* as count");
return result[0]?.count || 0;
}
// Special: Early adopter (joined in first month)
if (code === "early_adopter") {
const user = await database("directus_users")
.where({ id: userId })
.first();
if (user) {
const joinDate = new Date(user.date_created);
const platformLaunch = new Date("2025-01-01"); // Adjust to actual launch date
const oneMonthAfterLaunch = new Date(platformLaunch);
oneMonthAfterLaunch.setMonth(oneMonthAfterLaunch.getMonth() + 1);
return joinDate <= oneMonthAfterLaunch ? 1 : 0;
}
}
// Special: One year anniversary
if (code === "one_year") {
const user = await database("directus_users")
.where({ id: userId })
.first();
if (user) {
const joinDate = new Date(user.date_created);
const oneYearAgo = new Date();
oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);
return joinDate <= oneYearAgo ? 1 : 0;
}
}
// Special: Balanced creator (50 recordings + 100 plays)
if (code === "balanced_creator") {
const recordings = await database("sexy_recordings")
.where({ user_created: userId, status: "published" })
.count("* as count");
const plays = await database("sexy_recording_plays as rp")
.leftJoin("sexy_recordings as r", "rp.recording_id", "r.id")
.where({ "rp.user_id": userId })
.where("r.user_created", "!=", userId)
.count("* as count");
const recordingsCount = recordings[0]?.count || 0;
const playsCount = plays[0]?.count || 0;
return (recordingsCount >= 50 && playsCount >= 100) ? 1 : 0;
}
// Special: Top 10 rank
if (code === "top_10_rank") {
const userStats = await database("sexy_user_stats")
.where({ user_id: userId })
.first();
if (!userStats) return 0;
const rank = await database("sexy_user_stats")
.where("total_weighted_points", ">", userStats.total_weighted_points)
.count("* as count");
const userRank = (rank[0]?.count || 0) + 1;
return userRank <= 10 ? 1 : 0;
}
return 0;
}
/**
* Recalculate all weighted scores (for cron job)
*/
export async function recalculateAllWeightedScores(database: Knex): Promise<void> {
const users = await database("sexy_user_stats").select("user_id");
for (const user of users) {
await updateUserStats(database, user.user_id);
}
}

File diff suppressed because it is too large

View File

@@ -1,145 +0,0 @@
import { createRequire } from "module";
global.require = createRequire(import.meta.url);
import { defineHook } from "@directus/extensions-sdk";
import slugify from "@sindresorhus/slugify";
import ffmpeg from "fluent-ffmpeg";
import { awardPoints, checkAchievements } from "../endpoint/gamification.js";
async function processVideo(
meta,
{ schema, accountability },
services,
logger,
) {
const { FilesService } = services;
const itemId = meta.key;
const videoPath = `/directus/uploads/${meta.payload.filename_disk}`; // Adjust path as needed
const videoService = new FilesService({ schema, accountability });
try {
const durationInSeconds = await new Promise((resolve, reject) => {
ffmpeg.ffprobe(videoPath, function (err, metadata) {
if (err) {
reject(err);
return; // without this, resolve() would still run after an error
}
resolve(Math.floor(metadata.format.duration ?? 0));
});
});
// Update the item with the duration
await videoService.updateOne(itemId, { duration: durationInSeconds });
logger.info(`Video ${itemId} duration updated to ${durationInSeconds}`);
} catch (error) {
logger.error(`Error processing video ${itemId}:`, error);
}
}
export default defineHook(async ({ filter, action }, { services, logger, database, getSchema }) => {
  action("files.upload", async (meta, context) => {
    await processVideo(meta, context, services, logger);
  });

  filter(
    "users.create",
    (payload: {
      first_name: string;
      last_name: string;
      artist_name: string;
      slug: string;
    }) => {
      const artist_name = `${payload.first_name}-${new Date().getTime()}`;
      const slug = slugify(artist_name);
      const join_date = new Date();
      return { ...payload, artist_name, slug, join_date };
    },
  );

  filter(
    "users.update",
    (payload: {
      first_name: string;
      last_name: string;
      artist_name: string;
      slug: string;
    }) => {
      if (payload.artist_name) {
        const slug = slugify(payload.artist_name);
        return { ...payload, slug };
      }
      return payload;
    },
  );

  // =========================================
  // GAMIFICATION HOOKS
  // =========================================

  // Hook: Award points when a recording is created as published
  action("items.create", async (meta, { accountability }) => {
    // `collection` lives on the event meta, not on the hook context
    const { payload, key, collection } = meta;
    if (collection === "sexy_recordings") {
      if (payload.status === "published" && accountability?.user) {
        try {
          await awardPoints(database, accountability.user, "RECORDING_CREATE", key);
          await checkAchievements(database, accountability.user, "recordings");
          logger.info(`Awarded RECORDING_CREATE points to user ${accountability.user}`);
        } catch (error) {
          logger.error("Failed to award recording creation points:", error);
        }
      }
    }
  });

  // Hook: Award points when a recording's status changes to published or featured
  action("items.update", async (meta) => {
    const { payload, keys, collection } = meta;
    if (collection === "sexy_recordings") {
      try {
        const { ItemsService } = services;
        const recordingsService = new ItemsService("sexy_recordings", {
          schema: await getSchema(),
        });
        for (const key of keys) {
          const recording = await recordingsService.readOne(key);
          // Award points if status changed from non-published to published
          if (payload.status === "published" && recording.status !== "published" && recording.user_created) {
            await awardPoints(database, recording.user_created, "RECORDING_CREATE", key);
            await checkAchievements(database, recording.user_created, "recordings");
            logger.info(`Awarded RECORDING_CREATE points to user ${recording.user_created}`);
          }
          // Award bonus points if the recording becomes featured
          if (payload.featured === true && !recording.featured && recording.user_created) {
            await awardPoints(database, recording.user_created, "RECORDING_FEATURED", key);
            await checkAchievements(database, recording.user_created, "recordings");
            logger.info(`Awarded RECORDING_FEATURED points to user ${recording.user_created}`);
          }
        }
      } catch (error) {
        logger.error("Failed to award recording update points:", error);
      }
    }
  });

  // Hook: Award points when a user comments on a recording
  action("comments.create", async (meta, { accountability }) => {
    if (!accountability?.user) return;
    try {
      const { payload } = meta;
      // Only comments on recordings earn points
      if (payload.collection === "sexy_recordings") {
        await awardPoints(database, accountability.user, "COMMENT_CREATE");
        await checkAchievements(database, accountability.user, "social");
        logger.info(`Awarded COMMENT_CREATE points to user ${accountability.user}`);
      }
    } catch (error) {
      logger.error("Failed to award comment points:", error);
    }
  });
});
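The hand-rolled Promise around ffprobe in `processVideo` can be collapsed with `util.promisify`, since fluent-ffmpeg's `ffprobe(path, callback)` follows the Node callback convention. A minimal sketch — the stand-in probe function is hypothetical, used only so the example runs without ffmpeg installed; in real code you would promisify `ffmpeg.ffprobe` directly:

```typescript
import { promisify } from "node:util";

// Stand-in with the same (path, callback) shape as ffmpeg.ffprobe,
// so this sketch runs without ffmpeg; swap in ffmpeg.ffprobe in real code.
function fakeProbe(
  path: string,
  callback: (err: Error | null, metadata: { format: { duration: number } }) => void,
): void {
  callback(null, { format: { duration: 12.7 } });
}

const probe = promisify(fakeProbe);

// Resolves to the duration in whole seconds, or rejects on probe failure,
// which removes the reject-then-resolve pitfall of a manual Promise wrapper.
async function getDurationSeconds(videoPath: string): Promise<number> {
  const metadata = await probe(videoPath);
  return Math.floor(metadata.format.duration);
}
```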

View File

@@ -1,130 +0,0 @@
import { defineTheme } from "@directus/extensions-sdk";
import "./style.css";

export default defineTheme({
  id: "@sexy.pivoine.art/theme",
  name: "Sexy.Art Dark",
  appearance: "dark",
  rules: {
    borderRadius: "6px",
    borderWidth: "2px",
    foreground: "#c9d1d9",
    foregroundSubdued: "#666672",
    foregroundAccent: "#f0f6fc",
    background: "#0D1117",
    backgroundNormal: "#21262E",
    backgroundAccent: "#30363D",
    backgroundSubdued: "#161B22",
    borderColor: "#21262E",
    borderColorAccent: "#30363D",
    borderColorSubdued: "#161B22",
    primary: "#ce47eb",
    secondary: "#613dff",
    success: "#87ff66",
    warning: "#ffbf66",
    danger: "#ff6467",
    navigation: {
      background: "#21262E",
      backgroundAccent: "#30363D",
      borderWidth: "0px",
      borderColor: "transparent",
      project: {
        background: "#30363D",
        borderWidth: "0px",
        borderColor: "transparent",
      },
      modules: {
        borderWidth: "0px",
        borderColor: "transparent",
        button: {
          foregroundHover: "#fff",
          background: "transparent",
          backgroundHover: "transparent",
          backgroundActive: "#21262E",
        },
      },
      list: {
        background: "transparent",
        backgroundHover: "#30363D",
        backgroundActive: "#30363D",
        divider: {
          borderColor: "#30363D",
        },
      },
    },
    header: {
      borderWidth: "0px",
      borderColor: "transparent",
      boxShadow: "0 4px 7px -4px black",
    },
    form: {
      columnGap: "32px",
      rowGap: "40px",
      field: {
        label: {
          fontWeight: "600",
        },
        input: {
          borderColor: "#21262E",
          borderColorHover: "#30363D",
          boxShadow: "none",
          boxShadowHover: "none",
          height: "60px",
          padding: "16px",
        },
      },
    },
    sidebar: {
      background: "#21262E",
      borderWidth: "0px",
      borderColor: "transparent",
      section: {
        toggle: {
          background: "#30363D",
          borderWidth: "0px",
          borderColor: "transparent",
        },
        form: {
          field: {
            input: {
              height: "52px",
              padding: "12px",
            },
          },
        },
      },
    },
    public: {
      art: {
        background: "#21262E",
        speed: "1",
      },
    },
    popover: {
      menu: {
        background: "#30363D",
        boxShadow: "0px 0px 6px 0px black",
      },
    },
    banner: {
      background: "#161B22",
      padding: "40px",
      avatar: {
        background: "#fff",
        borderRadius: "50%",
      },
      headline: {
        foreground: "#fff",
      },
      title: {
        foreground: "#fff",
      },
      subtitle: {
        foreground: "#969696",
      },
      art: {
        foreground: "#21262E",
      },
    },
  },
});

View File

@@ -1,29 +0,0 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "lib": ["ES2022", "DOM"],
    "module": "ES2022",
    "moduleResolution": "node",
    "strict": false,
    "noFallthroughCasesInSwitch": true,
    "esModuleInterop": true,
    "noImplicitAny": false,
    "noImplicitThis": true,
    "noImplicitReturns": true,
    "noUnusedLocals": true,
    "noUncheckedIndexedAccess": true,
    "noUnusedParameters": true,
    "alwaysStrict": true,
    "strictNullChecks": true,
    "strictFunctionTypes": true,
    "strictBindCallApply": true,
    "strictPropertyInitialization": true,
    "resolveJsonModule": false,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "allowSyntheticDefaultImports": true,
    "isolatedModules": true,
    "allowJs": true
  },
  "include": ["./src/**/*.ts"]
}

View File

@@ -177,7 +177,7 @@ checksum = "46c5e41b57b8bba42a04676d81cb89e9ee8e859a1a66f80a5a72e1cb76b34d43"
[[package]]
name = "buttplug_core"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"async-stream",
"cfg-if",
@@ -203,7 +203,7 @@ dependencies = [
[[package]]
name = "buttplug_server"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"aes",
"async-trait",
@@ -243,8 +243,8 @@ dependencies = [
[[package]]
name = "buttplug_server_device_config"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
version = "10.0.1"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"buttplug_core",
"dashmap",
@@ -913,9 +913,9 @@ checksum = "4a5f13b858c8d314ee3e8f639011f7ccefe71f97f96e50151fb991f267928e2c"
[[package]]
name = "js-sys"
version = "0.3.80"
version = "0.3.87"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "852f13bec5eba4ba9afbeb93fd7c13fe56147f055939ae21c43a29a0ecb2702e"
checksum = "93f0862381daaec758576dcc22eb7bbf4d7efd67328553f3b45a412a51a3fb21"
dependencies = [
"once_cell",
"wasm-bindgen",
@@ -1860,9 +1860,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab10a69fbd0a177f5f649ad4d8d3305499c42bab9aef2f7ff592d0ec8f833819"
checksum = "1de241cdc66a9d91bd84f097039eb140cdc6eec47e0cdbaf9d932a1dd6c35866"
dependencies = [
"cfg-if",
"once_cell",
@@ -1873,27 +1873,14 @@ dependencies = [
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-backend"
version = "0.2.103"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0bb702423545a6007bbc368fde243ba47ca275e549c8a28617f56f6ba53b1d1c"
dependencies = [
"bumpalo",
"log",
"proc-macro2",
"quote",
"syn",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-futures"
version = "0.4.53"
version = "0.4.60"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a0b221ff421256839509adbb55998214a70d829d3a28c69b4a6672e9d2a42f67"
checksum = "a42e96ea38f49b191e08a1bab66c7ffdba24b06f9995b39a9dd60222e5b6f1da"
dependencies = [
"cfg-if",
"futures-util",
"js-sys",
"once_cell",
"wasm-bindgen",
@@ -1902,9 +1889,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fc65f4f411d91494355917b605e1480033152658d71f722a90647f56a70c88a0"
checksum = "e12fdf6649048f2e3de6d7d5ff3ced779cdedee0e0baffd7dff5cdfa3abc8a52"
dependencies = [
"quote",
"wasm-bindgen-macro-support",
@@ -1912,22 +1899,22 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro-support"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ffc003a991398a8ee604a401e194b6b3a39677b3173d6e74495eb51b82e99a32"
checksum = "0e63d1795c565ac3462334c1e396fd46dbf481c40f51f5072c310717bc4fb309"
dependencies = [
"bumpalo",
"proc-macro2",
"quote",
"syn",
"wasm-bindgen-backend",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-shared"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "293c37f4efa430ca14db3721dfbe48d8c33308096bd44d80ebaa775ab71ba1cf"
checksum = "e9f9cdac23a5ce71f6bf9f8824898a501e511892791ea2a0c6b8568c68b9cb53"
dependencies = [
"unicode-ident",
]
@@ -1948,9 +1935,9 @@ dependencies = [
[[package]]
name = "web-sys"
version = "0.3.80"
version = "0.3.87"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fbe734895e869dc429d78c4b433f8d17d95f8d05317440b4fad5ab2d33e596dc"
checksum = "f2c7c5718134e770ee62af3b6b4a84518ec10101aad610c024b64d6ff29bb1ff"
dependencies = [
"js-sys",
"wasm-bindgen",

View File

@@ -16,15 +16,15 @@ name = "buttplug_wasm"
path = "src/lib.rs"
[dependencies]
buttplug_core = { git = "https://github.com/valknarthing/buttplug.git", default-features = false, features = ["wasm"] }
buttplug_server = { git = "https://github.com/valknarthing/buttplug.git", default-features = false, features = ["wasm"] }
buttplug_server_device_config = { git = "https://github.com/valknarthing/buttplug.git" }
js-sys = "0.3.80"
buttplug_core = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d", default-features = false, features = ["wasm"] }
buttplug_server = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d", default-features = false, features = ["wasm"] }
buttplug_server_device_config = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d" }
js-sys = "0.3.87"
tracing-wasm = "0.2.1"
log-panics = { version = "2.1.0", features = ["with-backtrace"] }
console_error_panic_hook = "0.1.7"
wasmtimer = "0.4.3"
wasm-bindgen = { version = "0.2.103", features = ["serde-serialize"] }
wasm-bindgen = { version = "0.2.110", features = ["serde-serialize"] }
tokio = { version = "1.47.1", features = ["sync", "macros", "io-util"] }
tokio-stream = "0.1.17"
tracing = "0.1.41"
@@ -33,12 +33,12 @@ tracing-subscriber = { version = "0.3.20", features = ["json"] }
futures = "0.3.31"
futures-util = "0.3.31"
async-trait = "0.1.89"
wasm-bindgen-futures = "0.4.53"
wasm-bindgen-futures = "0.4.60"
getrandom = { version = "0.3", features = ["wasm_js"] }
parking_lot = { version = "0.11.1", features = ["wasm-bindgen"]}
[dependencies.web-sys]
version = "0.3.80"
version = "0.3.87"
# path = "../../wasm-bindgen/crates/web-sys"
#git = "https://github.com/rustwasm/wasm-bindgen"
features = [

View File

@@ -13,13 +13,13 @@
"build:wasm": "wasm-pack build --out-dir wasm --out-name index --target bundler --release"
},
"dependencies": {
"eventemitter3": "^5.0.1",
"typescript": "^5.9.2",
"vite": "^7.1.4",
"eventemitter3": "^5.0.4",
"typescript": "^5.9.3",
"vite": "^7.3.1",
"vite-plugin-wasm": "3.5.0",
"ws": "^8.18.3"
"ws": "^8.19.0"
},
"devDependencies": {
"wasm-pack": "^0.13.1"
"wasm-pack": "^0.14.0"
}
}

View File

@@ -7,7 +7,6 @@ use buttplug_server::device::hardware::communication::{
HardwareCommunicationManagerEvent,
};
use futures::future;
use js_sys::Array;
use tokio::sync::mpsc::Sender;
use wasm_bindgen::prelude::*;
use wasm_bindgen_futures::{spawn_local, JsFuture};
@@ -63,8 +62,8 @@ impl HardwareCommunicationManager for WebBluetoothCommunicationManager {
// way for anyone to add device configurations through FFI yet anyways.
let config_manager = create_test_dcm(false);
let options = web_sys::RequestDeviceOptions::new();
let filters = Array::new();
let optional_services = Array::new();
let mut filters = Vec::new();
let mut optional_services = Vec::new();
for vals in config_manager.base_communication_specifiers().iter() {
for config in vals.1.iter() {
if let ProtocolCommunicationSpecifier::BluetoothLE(btle) = &config {
@@ -77,16 +76,16 @@ impl HardwareCommunicationManager for WebBluetoothCommunicationManager {
} else {
filter.set_name(&name);
}
filters.push(&filter.into());
filters.push(filter);
}
for (service, _) in btle.services() {
optional_services.push(&service.to_string().into());
optional_services.push(js_sys::JsString::from(service.to_string()));
}
}
}
}
options.set_filters(&filters.into());
options.set_optional_services(&optional_services.into());
options.set_filters(&filters);
options.set_optional_services(&optional_services);
let nav = web_sys::window().unwrap().navigator();
//nav.bluetooth().get_availability();
//JsFuture::from(nav.bluetooth().request_device()).await;

View File

@@ -1,6 +1,4 @@
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
PUBLIC_API_URL=http://localhost:3000/api
PUBLIC_URL=http://localhost:3000
PUBLIC_UMAMI_ID=
LETTERSPACE_API_URL=
LETTERSPACE_API_KEY=
LETTERSPACE_LIST_ID=
PUBLIC_UMAMI_SCRIPT=

View File

@@ -11,39 +11,40 @@
"start": "node ./build"
},
"devDependencies": {
"@iconify-json/ri": "^1.2.5",
"@iconify/tailwind4": "^1.0.6",
"@internationalized/date": "^3.8.2",
"@lucide/svelte": "^0.544.0",
"@sveltejs/adapter-node": "^5.3.1",
"@sveltejs/adapter-static": "^3.0.9",
"@sveltejs/kit": "^2.37.0",
"@sveltejs/vite-plugin-svelte": "^6.1.4",
"@tailwindcss/forms": "^0.5.9",
"@tailwindcss/typography": "^0.5.15",
"@tailwindcss/vite": "^4.0.0",
"@tsconfig/svelte": "^5.0.5",
"bits-ui": "2.11.0",
"@iconify-json/ri": "^1.2.10",
"@iconify/tailwind4": "^1.2.1",
"@internationalized/date": "^3.11.0",
"@lucide/svelte": "^0.577.0",
"@sveltejs/adapter-node": "^5.5.4",
"@sveltejs/adapter-static": "^3.0.10",
"@sveltejs/kit": "^2.53.4",
"@sveltejs/vite-plugin-svelte": "^6.2.4",
"@tailwindcss/forms": "^0.5.11",
"@tailwindcss/typography": "^0.5.19",
"@tailwindcss/vite": "^4.2.1",
"@tsconfig/svelte": "^5.0.8",
"bits-ui": "2.16.2",
"clsx": "^2.1.1",
"glob": "^11.0.3",
"glob": "^13.0.6",
"mode-watcher": "^1.1.0",
"prettier-plugin-svelte": "^3.4.0",
"super-sitemap": "^1.0.5",
"svelte": "^5.38.6",
"svelte-sonner": "^1.0.5",
"tailwind-merge": "^3.3.1",
"tailwind-variants": "^1.0.0",
"tailwindcss": "^4.0.0",
"tw-animate-css": "^1.3.8",
"typescript": "^5.9.2",
"vite": "^7.1.4",
"prettier-plugin-svelte": "^3.5.1",
"super-sitemap": "^1.0.7",
"svelte": "^5.53.7",
"svelte-sonner": "^1.0.8",
"tailwind-merge": "^3.5.0",
"tailwind-variants": "^3.2.2",
"tailwindcss": "^4.2.1",
"tw-animate-css": "^1.4.0",
"typescript": "^5.9.3",
"vite": "^7.3.1",
"vite-plugin-wasm": "3.5.0"
},
"dependencies": {
"@directus/sdk": "^20.0.3",
"@sexy.pivoine.art/buttplug": "workspace:*",
"javascript-time-ago": "^2.5.11",
"media-chrome": "^4.13.1",
"graphql": "^16.11.0",
"graphql-request": "^7.1.2",
"javascript-time-ago": "^2.6.4",
"media-chrome": "^4.18.0",
"svelte-i18n": "^4.0.1"
}
}

View File

@@ -30,7 +30,7 @@ export const handle: Handle = async ({ event, resolve }) => {
});
// Handle authentication
const token = cookies.get("directus_session_token");
const token = cookies.get("session_token");
if (token) {
try {
@@ -42,7 +42,7 @@ export const handle: Handle = async ({ event, resolve }) => {
userId: locals.authStatus.user?.id,
context: {
email: locals.authStatus.user?.email,
role: locals.authStatus.user?.role?.name,
role: locals.authStatus.user?.role,
},
});
} else {

View File

@@ -0,0 +1,25 @@
import { GraphQLClient } from "graphql-request";
import { env } from "$env/dynamic/public";
import type { CurrentUser } from "./types";

export const apiUrl = env.PUBLIC_API_URL || "http://localhost:3000/api";

export const getGraphQLClient = (fetchFn?: typeof globalThis.fetch) =>
  new GraphQLClient(`${apiUrl}/graphql`, {
    credentials: "include",
    fetch: fetchFn || globalThis.fetch,
  });

export const getAssetUrl = (
  id: string,
  transform?: "mini" | "thumbnail" | "preview" | "medium" | "banner",
) => {
  if (!id) {
    return null;
  }
  return `${apiUrl}/assets/${id}${transform ? "?transform=" + transform : ""}`;
};

export const isModel = (user: CurrentUser) => {
  return user.role === "model";
};
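For illustration, `getAssetUrl` composes a transform preset as a query parameter on top of the API base URL. A self-contained sketch with the base URL inlined (the asset id is made up):

```typescript
const apiUrl = "http://localhost:3000/api";

type Transform = "mini" | "thumbnail" | "preview" | "medium" | "banner";

// Mirrors the helper above: null for a missing id, otherwise the asset URL
// with an optional transform preset appended as a query parameter.
const getAssetUrl = (id: string, transform?: Transform): string | null => {
  if (!id) return null;
  return `${apiUrl}/assets/${id}${transform ? "?transform=" + transform : ""}`;
};

const thumb = getAssetUrl("abc123", "thumbnail");
// → "http://localhost:3000/api/assets/abc123?transform=thumbnail"
```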

View File

@@ -1,6 +1,6 @@
<script lang="ts">
import { _ } from "svelte-i18n";
import { PUBLIC_URL } from "$env/static/public";
import { env } from "$env/dynamic/public";
interface Props {
title: string;
@@ -11,14 +11,17 @@ interface Props {
let {
title,
description,
image = `${PUBLIC_URL || "http://localhost:3000"}/img/kamasutra.jpg`,
image = `${env.PUBLIC_URL || "http://localhost:3000"}/img/kamasutra.jpg`,
}: Props = $props();
</script>
<svelte:head>
<title>{$_("head.title", { values: { title } })}</title>
<meta name="description" content={description} />
<meta property="og:title" content={$_("head.title", { values: { title } })} />
<meta
property="og:title"
content={$_("head.title", { values: { title } })}
/>
<meta property="og:description" content={description} />
<meta property="og:image" content={image} />
</svelte:head>

View File

@@ -1,119 +0,0 @@
<script lang="ts">
  import { _ } from "svelte-i18n";
  import {
    Dialog,
    DialogContent,
    DialogDescription,
    DialogHeader,
    DialogTitle,
  } from "$lib/components/ui/dialog";
  import { Button } from "$lib/components/ui/button";
  import { Separator } from "$lib/components/ui/separator";
  import type { Snippet } from "svelte";
  import Label from "../ui/label/label.svelte";
  import Input from "../ui/input/input.svelte";
  import { toast } from "svelte-sonner";

  interface Props {
    open: boolean;
    email: string;
    children?: Snippet;
  }

  let { open = $bindable(), email = $bindable() }: Props = $props();

  let isLoading = $state(false);

  async function handleSubscription(e: Event) {
    e.preventDefault();
    try {
      isLoading = true;
      await fetch("/newsletter", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ email }),
      });
      toast.success(
        $_("newsletter_signup.toast_subscribe", { values: { email } }),
      );
    } finally {
      isLoading = false;
      open = false;
    }
  }
</script>

<Dialog bind:open>
  <DialogContent class="sm:max-w-md">
    <DialogHeader class="space-y-4">
      <div class="flex items-center justify-between">
        <div class="flex items-center gap-3">
          <div
            class="w-10 h-10 rounded-full bg-gradient-to-br from-primary to-purple-600 flex items-center justify-center shrink-0 grow-0"
          >
            <span class="icon-[ri--newspaper-line]"></span>
          </div>
          <div>
            <DialogTitle
              class="text-left text-xl font-semibold text-primary-foreground"
              >{$_('newsletter_signup.title')}</DialogTitle
            >
            <DialogDescription class="text-left text-sm">
              {$_('newsletter_signup.description')}
            </DialogDescription>
          </div>
        </div>
      </div>
    </DialogHeader>
    <Separator class="my-4" />
    <form onsubmit={handleSubscription}>
      <!-- Email -->
      <div class="space-y-2 flex gap-4 items-center">
        <Label for="email" class="m-0">{$_('newsletter_signup.email')}</Label>
        <Input
          id="email"
          type="email"
          placeholder={$_('newsletter_signup.email_placeholder')}
          bind:value={email}
          required
          class="bg-background/50 border-primary/20 focus:border-primary"
        />
      </div>
      <Separator class="my-8" />
      <!-- Close Button -->
      <div class="flex justify-end gap-4">
        <Button
          variant="ghost"
          size="sm"
          onclick={() => (open = false)}
          class="text-muted-foreground hover:text-foreground cursor-pointer"
        >
          <span class="icon-[ri--close-large-line]"></span>
          {$_('newsletter_signup.close')}
        </Button>
        <Button
          variant="default"
          size="sm"
          type="submit"
          class="cursor-pointer"
          disabled={isLoading}
        >
          {#if isLoading}
            <div
              class="w-4 h-4 border-2 border-white/30 border-t-white rounded-full animate-spin mr-2"
            ></div>
            {$_('newsletter_signup.subscribing')}
          {:else}
            <span class="icon-[ri--check-line]"></span>
            {$_('newsletter_signup.subscribe')}
          {/if}
        </Button>
      </div>
    </form>
  </DialogContent>
</Dialog>

View File

@@ -1,26 +0,0 @@
<script>
  import { _ } from "svelte-i18n";
  import { Button } from "../ui/button";
  import { Card, CardContent } from "../ui/card";
  import NewsletterSignupPopup from "./newsletter-signup-popup.svelte";

  let isPopupOpen = $state(false);
  let { email = "" } = $props();
</script>

<!-- Newsletter Signup -->
<Card class="p-0 not-last:bg-gradient-to-br from-primary/10 to-accent/10">
  <CardContent class="p-6 text-center">
    <h3 class="font-semibold mb-2">{$_('newsletter_signup.title')}</h3>
    <p class="text-sm text-muted-foreground mb-4">
      {$_('newsletter_signup.description')}
    </p>
    <Button
      onclick={() => (isPopupOpen = true)}
      class="cursor-pointer w-full bg-gradient-to-r from-primary to-accent hover:from-primary/90 hover:to-accent/90"
      >{$_('newsletter_signup.cta')}</Button
    >
    <NewsletterSignupPopup bind:open={isPopupOpen} {email} />
  </CardContent>
</Card>

View File

@@ -1,35 +1,3 @@
import { authentication, createDirectus, rest } from "@directus/sdk";
import { PUBLIC_API_URL } from "$env/static/public";
import type { CurrentUser } from "./types";

export const directusApiUrl = PUBLIC_API_URL || "http://localhost:3000/api";

export const getDirectusInstance = (fetch?: typeof globalThis.fetch) => {
  const options: { globals?: { fetch: typeof globalThis.fetch } } = fetch
    ? { globals: { fetch } }
    : {};
  const directus = createDirectus(directusApiUrl, options)
    .with(rest())
    .with(authentication("session"));
  return directus;
};

export const getAssetUrl = (
  id: string,
  transform?: "mini" | "thumbnail" | "preview" | "medium" | "banner",
) => {
  if (!id) {
    return null;
  }
  return `${directusApiUrl}/assets/${id}${transform ? "?transform=" + transform : ""}`;
};

export const isModel = (user: CurrentUser) => {
  if (user.role.name === "Model") {
    return true;
  }
  if (user.policies.find((p) => p.policy.name === "Model")) {
    return true;
  }
  return false;
};
// Re-export from api.ts for backwards compatibility
// All components that import from $lib/directus continue to work
export { apiUrl as directusApiUrl, getAssetUrl, isModel, getGraphQLClient as getDirectusInstance } from "./api.js";

View File

@@ -613,8 +613,8 @@ export default {
address: {
company: "SexyArt",
name: "Sebastian Krüger",
street: "Neue Weinsteige 21",
city: "70180 Stuttgart",
street: "Berlingerstraße 48",
city: "78333 Stockach",
country: "Germany",
},
phone: {
@@ -870,18 +870,6 @@ export default {
exit: "Exit",
exit_url: "https://pivoine.art",
},
newsletter_signup: {
title: "Stay Updated",
description:
"Get the latest articles and insights delivered to your inbox.",
email: "Email",
email_placeholder: "your@email.com",
cta: "Subscribe to Newsletter",
close: "Close",
subscribe: "Subscribe",
subscribing: "Subscribing",
toast_subscribe: "Your email has been added to the newsletter list!",
},
sharing_popup_button: {
share: "Share",
},

View File

@@ -123,7 +123,7 @@ class Logger {
PUBLIC_API_URL: process.env.PUBLIC_API_URL,
PUBLIC_URL: process.env.PUBLIC_URL,
PUBLIC_UMAMI_ID: process.env.PUBLIC_UMAMI_ID ? '***set***' : 'not set',
LETTERSPACE_API_URL: process.env.LETTERSPACE_API_URL || 'not set',
PUBLIC_UMAMI_SCRIPT: process.env.PUBLIC_UMAMI_SCRIPT || 'not set',
PORT: process.env.PORT || '3000',
HOST: process.env.HOST || '0.0.0.0',
};

File diff suppressed because it is too large

View File

@@ -16,14 +16,8 @@ export interface User {
export interface CurrentUser extends User {
avatar: File;
role: {
name: string;
};
policies: {
policy: {
name: string;
};
}[];
role: "model" | "viewer" | "admin";
policies: string[];
}
export interface AuthStatus {

View File

@@ -7,7 +7,7 @@ import Footer from "$lib/components/footer/footer.svelte";
import { Toaster } from "$lib/components/ui/sonner";
import Header from "$lib/components/header/header.svelte";
import AgeVerificationDialog from "$lib/components/age-verification-dialog/age-verification-dialog.svelte";
import { PUBLIC_UMAMI_ID } from "$env/static/public";
import { env } from "$env/dynamic/public";
onMount(async () => {
await waitLocale();
@@ -17,11 +17,11 @@ let { children, data } = $props();
</script>
<svelte:head>
{#if import.meta.env.PROD && PUBLIC_UMAMI_ID}
{#if import.meta.env.PROD && env.PUBLIC_UMAMI_ID && env.PUBLIC_UMAMI_SCRIPT}
<script
defer
src="https://umami.pivoine.art/script.js"
data-website-id={PUBLIC_UMAMI_ID}
src={env.PUBLIC_UMAMI_SCRIPT}
data-website-id={env.PUBLIC_UMAMI_ID}
></script>
{/if}
</svelte:head>

View File

@@ -1,5 +1,17 @@
import { redirect } from "@sveltejs/kit";
import type { PageServerLoad } from "./$types";
import { gql } from "graphql-request";
import { getGraphQLClient } from "$lib/api";
const LEADERBOARD_QUERY = gql`
query Leaderboard($limit: Int, $offset: Int) {
leaderboard(limit: $limit, offset: $offset) {
user_id display_name avatar
total_weighted_points total_raw_points
recordings_count playbacks_count achievements_count rank
}
}
`;
export const load: PageServerLoad = async ({ fetch, url, locals }) => {
// Guard: Redirect to login if not authenticated
@@ -11,22 +23,27 @@ export const load: PageServerLoad = async ({ fetch, url, locals }) => {
const limit = parseInt(url.searchParams.get("limit") || "100");
const offset = parseInt(url.searchParams.get("offset") || "0");
const response = await fetch(
`/api/sexy/gamification/leaderboard?limit=${limit}&offset=${offset}`,
);
if (!response.ok) {
throw new Error("Failed to fetch leaderboard");
}
const data = await response.json();
const client = getGraphQLClient(fetch);
const data = await client.request<{
leaderboard: {
user_id: string;
display_name: string | null;
avatar: string | null;
total_weighted_points: number | null;
total_raw_points: number | null;
recordings_count: number | null;
playbacks_count: number | null;
achievements_count: number | null;
rank: number;
}[];
}>(LEADERBOARD_QUERY, { limit, offset });
return {
leaderboard: data.data || [],
leaderboard: data.leaderboard || [],
pagination: {
limit,
offset,
hasMore: data.data?.length === limit,
hasMore: data.leaderboard?.length === limit,
},
};
} catch (error) {
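The `hasMore` flag in the rewritten loader relies on a common offset-pagination heuristic: a full page implies there may be more rows, so no separate count query is needed. A generic sketch of that check (function and parameter names are illustrative, not part of the codebase):

```typescript
// Builds the pagination metadata the loader returns: hasMore is true
// exactly when the fetched page is full, i.e. rows.length === limit.
function paginate<T>(rows: T[], limit: number, offset: number) {
  return { limit, offset, hasMore: rows.length === limit };
}
```

The trade-off: when the total row count is an exact multiple of `limit`, the client makes one extra request that returns an empty page, which is usually acceptable in exchange for skipping a count query on every load.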

View File

@@ -9,7 +9,6 @@ import { getAssetUrl } from "$lib/directus";
import SharingPopup from "$lib/components/sharing-popup/sharing-popup.svelte";
import Meta from "$lib/components/meta/meta.svelte";
import PeonyBackground from "$lib/components/background/peony-background.svelte";
import NewsletterSignup from "$lib/components/newsletter-signup/newsletter-signup-widget.svelte";
import SharingPopupButton from "$lib/components/sharing-popup/sharing-popup-button.svelte";
const { data } = $props();
@@ -215,8 +214,6 @@ const timeAgo = new TimeAgo("en");
</Card>
-->
<!-- <NewsletterSignup email={data.authStatus.user?.email}/> -->
<!-- Back to Magazine -->
<Button
variant="outline"

View File

@@ -1,22 +0,0 @@
import {
  LETTERSPACE_API_KEY,
  LETTERSPACE_API_URL,
  LETTERSPACE_LIST_ID,
} from "$env/static/private";
import { json } from "@sveltejs/kit";

export async function POST({ request, fetch }) {
  const { email } = await request.json();
  const lists = [LETTERSPACE_LIST_ID];
  await fetch(`${LETTERSPACE_API_URL}/subscribers`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-API-Key": LETTERSPACE_API_KEY,
    },
    body: JSON.stringify({ email, lists }),
  });
  return json({ email }, { status: 201 });
}

View File

@@ -1,5 +1,25 @@
import { redirect } from "@sveltejs/kit";
import type { PageServerLoad } from "./$types";
import { gql } from "graphql-request";
import { getGraphQLClient } from "$lib/api";
const USER_PROFILE_QUERY = gql`
query UserProfile($id: String!) {
userProfile(id: $id) {
id first_name last_name email description avatar date_created
}
userGamification(userId: $id) {
stats {
user_id total_raw_points total_weighted_points
recordings_count playbacks_count comments_count achievements_count rank
}
achievements {
id code name description icon category date_unlocked progress required_count
}
recent_points { action points date_created recording_id }
}
}
`;
export const load: PageServerLoad = async ({ params, locals, fetch }) => {
// Guard: Redirect to login if not authenticated
@@ -10,38 +30,44 @@ export const load: PageServerLoad = async ({ params, locals, fetch }) => {
const { id } = params;
try {
// Fetch user profile data from Directus
const userResponse = await fetch(`/api/users/${id}?fields=id,first_name,last_name,email,description,avatar,date_created,location`);
const client = getGraphQLClient(fetch);
const data = await client.request<{
userProfile: {
id: string;
first_name: string | null;
last_name: string | null;
email: string;
description: string | null;
avatar: string | null;
date_created: string;
} | null;
userGamification: {
stats: {
user_id: string;
total_raw_points: number | null;
total_weighted_points: number | null;
recordings_count: number | null;
playbacks_count: number | null;
comments_count: number | null;
achievements_count: number | null;
rank: number;
} | null;
achievements: unknown[];
recent_points: unknown[];
} | null;
}>(USER_PROFILE_QUERY, { id });
if (!userResponse.ok) {
if (!data.userProfile) {
throw redirect(404, "/");
}
const userData = await userResponse.json();
const user = userData.data;
// Fetch user's comments count
const commentsResponse = await fetch(`/api/comments?filter[user_created][_eq]=${id}&aggregate[count]=*`);
const commentsData = await commentsResponse.json();
const commentsCount = commentsData.data?.[0]?.count || 0;
// Fetch user's video likes count
const likesResponse = await fetch(`/api/items/sexy_video_likes?filter[user_id][_eq]=${id}&aggregate[count]=*`);
const likesData = await likesResponse.json();
const likesCount = likesData.data?.[0]?.count || 0;
// Fetch gamification data
const gamificationResponse = await fetch(`/api/sexy/gamification/user/${id}`);
let gamification = null;
if (gamificationResponse.ok) {
gamification = await gamificationResponse.json();
}
const gamification = data.userGamification;
return {
user,
user: data.userProfile,
stats: {
comments_count: commentsCount,
likes_count: likesCount,
comments_count: gamification?.stats?.comments_count || 0,
likes_count: 0,
},
gamification,
isOwnProfile: locals.authStatus.user?.id === id,

View File

@@ -16,7 +16,6 @@ import { formatVideoDuration, getUserInitials } from "$lib/utils";
import { invalidateAll } from "$app/navigation";
import { toast } from "svelte-sonner";
import { createCommentForVideo, likeVideo, unlikeVideo, recordVideoPlay, updateVideoPlay } from "$lib/services";
import NewsletterSignup from "$lib/components/newsletter-signup/newsletter-signup-widget.svelte";
import SharingPopupButton from "$lib/components/sharing-popup/sharing-popup-button.svelte";
const { data } = $props();
@@ -539,8 +538,6 @@ let showPlayer = $state(false);
</CardContent>
</Card> -->
<!-- <NewsletterSignup /> -->
<!-- Back to Videos -->
<Button
variant="outline"

View File

@@ -14,7 +14,7 @@ export default defineConfig({
proxy: {
"/api": {
rewrite: (path) => path.replace(/^\/api/, ""),
target: "http://localhost:8055",
target: "http://localhost:4000",
changeOrigin: true,
secure: false,
ws: true,
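The proxy's `rewrite` strips the `/api` prefix before forwarding to the backend on port 4000, so for example `/api/graphql` reaches the backend as `/graphql`. A one-line sketch of that rewrite:

```typescript
// Same rewrite as in the Vite config: drop the leading /api segment.
const rewrite = (path: string): string => path.replace(/^\/api/, "");

const forwarded = rewrite("/api/graphql");
// → "/graphql"
```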

pnpm-lock.yaml (generated, 4452 lines)

File diff suppressed because it is too large

schema.sql (new file, 2667 lines)

File diff suppressed because it is too large