Compare commits

...

177 Commits

Author SHA1 Message Date
9b8b07c653 chore: lint and format
All checks were successful
Build and Push Backend Image / build (push) Successful in 50s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 11:36:18 +01:00
22a2e63687 chore: gitea backend workflow with email
All checks were successful
Build and Push Backend Image / build (push) Successful in 49s
2026-03-11 11:32:51 +01:00
a05a96a8aa fix: install email devDeps in builder and copy email artifacts to runner
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 11:31:01 +01:00
d2deb3a218 docs: update README with email package, buttplug CI badge, and 2026 copyright
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 11:23:12 +01:00
d0f0d865b6 refactor(email): externalize styles to email.css, inject via expandLinkTag
Some checks failed
Build and Push Backend Image / build (push) Failing after 22s
- Move @import "@maizzle/tailwindcss" + @theme tokens to packages/email/email.css
- Layout uses <link rel="stylesheet" href="{{ cssPath }}" inline> — Maizzle's
  expandLinkTag reads the absolute path and expands it to a <style> tag, which
  the second compileCss pass then processes with @tailwindcss/postcss + LightningCSS
- render.ts passes cssPath as a local so the expression resolves inside the layout
- Layout head is now clean HTML with no inline style logic

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 10:56:30 +01:00
a30692b1ac refactor(email): align templates with frontend design tokens from app.css
- @theme now mirrors all :root variables from app.css (background, foreground,
  card, muted, muted-foreground, border, primary, primary-foreground)
- Replaced all zinc-* utilities with semantic token classes (bg-background,
  bg-card, bg-muted, text-foreground, text-muted-foreground, border-border, etc.)
- Added Noto Sans via Google Fonts import (progressive enhancement — skips
  Tailwind processing via `plain` attribute)
- Font family @theme token set to Noto Sans with system-font fallbacks
- Button inline styles updated to use hex equivalent of --primary-foreground

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 10:49:23 +01:00
60531771cf feat: packages/email — Maizzle v6 + Tailwind CSS v4 HTML email templates
- New @sexy.pivoine.art/email package with @maizzle/framework@6.0.0-15
- Uses @maizzle/tailwindcss (TW v4 preset) with @theme brand tokens
  derived from the frontend's app.css oklch primary color
- LightningCSS automatically lowers oklch/lab to hex for email clients
- Real HTML template files (templates/layouts/main.html, verification.html,
  password-reset.html) — not JS template strings
- PostCSS `from` override so @import "@maizzle/tailwindcss" resolves from
  the email package's own node_modules
- Backend lib/email.ts now calls renderVerification/renderPasswordReset
  instead of inline HTML strings

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 10:41:12 +01:00
bb6bf7ca11 chore: cleanup
All checks were successful
Build and Push Buttplug Image / build (push) Successful in 3m20s
Build and Push Frontend Image / build (push) Successful in 1m17s
2026-03-11 09:32:05 +01:00
fdc16957a4 refactor: move card descriptions under page headings on profile and security pages
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 09:28:29 +01:00
f8cb365e09 chore: cleanup
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m16s
2026-03-11 09:18:45 +01:00
ad4f5b3700 fix: global Unauthorized handling — redirect to /login, suppress log spam
Some checks failed
Build and Push Frontend Image / build (push) Has been cancelled
- Add UnauthorizedError class exported from services.ts
- loggedApiCall now detects Unauthorized GraphQL errors, logs at DEBUG
  instead of ERROR, and throws UnauthorizedError (no more stack dumps)
- hooks.server.ts catches UnauthorizedError from any load function and
  redirects to /login?redirect=<original-path>
- getRecordings, getRecording, getAnalytics now accept an optional token
  and use getAuthClient server-side so cross-origin cookie forwarding works
- Update play/recordings, play/buttplug, me/analytics page.server.ts to
  pass the session token — prevents Unauthorized on auth-protected pages

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 09:01:47 +01:00
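The Unauthorized-handling pattern described in this commit can be sketched in plain TypeScript. The error shape and the helper name `throwIfUnauthorized` are assumptions for illustration, not the real `services.ts`:

```typescript
// Hypothetical sketch of the UnauthorizedError pattern; the actual
// loggedApiCall / hooks.server.ts code may differ.
interface GraphQLError {
  message: string;
}

class UnauthorizedError extends Error {
  constructor(message = "Unauthorized") {
    super(message);
    this.name = "UnauthorizedError";
  }
}

// Detect an Unauthorized GraphQL error and rethrow it as a typed error,
// so a hooks.server.ts-style handler can redirect to /login instead of
// dumping a stack trace at ERROR level.
function throwIfUnauthorized(errors: GraphQLError[] | undefined): void {
  if (errors?.some((e) => e.message.includes("Unauthorized"))) {
    throw new UnauthorizedError();
  }
}
```

A server hook would then catch `UnauthorizedError` specifically and issue the `/login?redirect=<original-path>` redirect, leaving all other errors to the normal error path.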
3fd876180a ci: link Docker images to Gitea repository via OCI source label
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m11s
docker/metadata-action uses github.* context vars which are empty in
Gitea Actions. Explicitly set org.opencontainers.image.source using
gitea.server_url and gitea.repository so the container registry links
each image back to this repository.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 17:04:45 +01:00
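The workflow change probably looks roughly like the fragment below; the step name and action version are assumptions, only the label expression follows the commit message:

```yaml
# Hypothetical excerpt: set the OCI source label explicitly, because
# docker/metadata-action reads github.* context vars that Gitea Actions
# leaves empty.
- name: Build and push
  uses: docker/build-push-action@v5
  with:
    push: true
    labels: |
      org.opencontainers.image.source=${{ gitea.server_url }}/${{ gitea.repository }}
```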
c5b04be981 fix: align How It Works card padding with leaderboard card
Use CardHeader + CardTitle instead of an h3 inside CardContent,
so both cards get the same pt-0 treatment on CardContent.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 17:03:16 +01:00
96cffb9be1 fix: tighten leaderboard entry layout for mobile
- px-2 py-2 / gap-2 on mobile (was p-4 / gap-4)
- Rank badge w-8 on mobile (was w-14), font scaled down
- Avatar h-9 w-9 on mobile (was h-12 w-12)
- Score text-lg on mobile (was text-2xl), "points" label hidden on mobile
- Stats always visible, icons/gaps scaled down for mobile
- Arrow indicator hidden on mobile (hover-only, useless on touch)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 17:01:44 +01:00
9b1771ed6a fix: prevent horizontal overflow on mobile in /play layout
Add overflow-hidden to outer wrapper so the absolutely-positioned
SexyBackground is clipped, matching how public pages handle it.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 16:55:31 +01:00
b842106e44 fix: match pagination button size to admin filter buttons (default size)
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m11s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 12:07:44 +01:00
9abcd715d7 feat: add subtitles to /play/buttplug and /play/recordings page headers
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m12s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 12:04:24 +01:00
ab0af9a773 feat: extract Pagination component and use it on all paginated pages
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m13s
- New lib/components/pagination/pagination.svelte with numbered pages,
  ellipsis for large ranges, and prev/next buttons
- All 6 admin pages (users, articles, videos, recordings, comments,
  queues) now show enumerated page numbers next to the "Showing X–Y of Z"
  label; offset is derived from page number * limit
- Public pages (videos, models, magazine) replace their inline
  totalPages/pageNumbers derived state with the shared component
- Removes ~80 lines of duplicated pagination logic across 9 files

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 12:01:13 +01:00
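The numbered-pages-with-ellipsis behavior can be sketched as a pure function; the internals of the real `pagination.svelte` component may differ, and `pageNumbers` is a hypothetical name:

```typescript
// Hypothetical sketch of numbered pagination with ellipsis for large
// ranges: always show the first and last page plus a window around the
// current page, collapsing the gaps to a single "..." entry.
type PageItem = number | "...";

function pageNumbers(current: number, total: number, windowSize = 1): PageItem[] {
  const pages: PageItem[] = [];
  for (let p = 1; p <= total; p++) {
    const nearEdge = p === 1 || p === total;
    const nearCurrent = Math.abs(p - current) <= windowSize;
    if (nearEdge || nearCurrent) {
      pages.push(p);
    } else if (pages[pages.length - 1] !== "...") {
      pages.push("..."); // collapse a run of hidden pages into one marker
    }
  }
  return pages;
}
```

For example, `pageNumbers(5, 10)` yields `[1, "...", 4, 5, 6, "...", 10]`, while small ranges render all page numbers with no ellipsis.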
fbd2efa994 feat: server-side pagination and filtering for admin queues page
- Move queue, status, and offset to URL search params (?queue=&status=&offset=)
- Load jobs server-side in +page.server.ts with auth token (matches other admin pages)
- Derive total from adminQueues counts (waiting+active+completed+failed+delayed)
  so pagination knows total without an extra query
- Add fetchFn/token params to getAdminQueueJobs for server-side use
- Retry/remove/pause/resume actions now use invalidateAll() instead of local state

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 11:49:50 +01:00
79932157bf fix: revoke points when a comment is deleted
All checks were successful
Build and Push Backend Image / build (push) Successful in 43s
- revokePoints now accepts optional recordingId; when absent it deletes
  one matching row (for actions like COMMENT_CREATE that have no recording)
- deleteComment queues revokePoints + checkAchievements so leaderboard
  and social achievements stay in sync

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 11:16:18 +01:00
04b0ec1a71 fix: revoke gamification points on recording delete + fix comment collection
- deleteRecording now queues revokePoints for RECORDING_CREATE (and
  RECORDING_FEATURED if applicable) before deleting a published recording,
  so leaderboard points are correctly removed
- Fix comment stat/achievement queries using collection "recordings" instead
  of "videos" — comments are stored under collection "videos", so the count
  was always 0, breaking COMMENT_CREATE stats and social achievements

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 11:13:01 +01:00
cc693d8be7 fix: prevent achievement points from being re-awarded on republish
All checks were successful
Build and Push Backend Image / build (push) Successful in 1m2s
Once an achievement is unlocked, preserve date_unlocked permanently
instead of clearing it to null when the user drops below the threshold
(e.g. on unpublish). This prevents the wasUnlocked check from returning
false on republish, which was causing achievement points to be re-awarded
on every publish/unpublish cycle.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 20:04:20 +01:00
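The "preserve date_unlocked permanently" rule can be sketched as a pure decision function. Field and function names are assumptions mirroring the commit message, not the real schema:

```typescript
// Hypothetical sketch of the unlock-date rule described above.
interface AchievementRow {
  date_unlocked: string | null;
}

// Compute the new date_unlocked when re-evaluating an achievement.
// Once unlocked, the date is kept even if the user later drops below
// the threshold (e.g. on unpublish), so the wasUnlocked check stays
// true on republish and points are not re-awarded.
function nextDateUnlocked(
  row: AchievementRow,
  meetsThreshold: boolean,
  now: () => string = () => new Date().toISOString(),
): string | null {
  if (row.date_unlocked !== null) return row.date_unlocked; // permanent
  return meetsThreshold ? now() : null;
}
```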
52aa00dd13 fix: embed DECAY_LAMBDA as SQL literal to avoid pg type inference failure
All checks were successful
Build and Push Backend Image / build (push) Successful in 45s
Build and Push Frontend Image / build (push) Successful in 1m12s
PostgreSQL cannot resolve the type of a parameterized $1 = 0.005 in
-$1 * EXTRACT(EPOCH ...) and fails with an operator type error. Using
sql.raw() embeds the constant directly in the query string so userId
is the only parameter.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 19:55:45 +01:00
8085b40af8 fix: use NOW() in weighted score query instead of JS Date parameter
Passing a JS Date to a Drizzle sql template serializes it as a locale
string (e.g. "Mon Mar 09 2026 19:51:22 GMT+0100") which PostgreSQL
cannot parse as timestamptz, causing the gamification worker to fail.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 19:52:03 +01:00
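The failure mode is easy to reproduce in plain TypeScript: a `Date` coerced to a string yields the locale-style form PostgreSQL cannot parse as `timestamptz`, whereas `toISOString()` would be unambiguous. Using SQL-side `NOW()`, as the commit does, sidesteps serializing a JS `Date` entirely:

```typescript
// Demonstrates why passing a raw JS Date into a SQL template breaks:
// its default string form is locale-style, not a parseable timestamptz.
const d = new Date("2026-03-09T19:51:22+01:00");

const localeForm = String(d); // e.g. "Mon Mar 09 2026 ..." (host-dependent)
const isoForm = d.toISOString(); // unambiguous: "2026-03-09T18:51:22.000Z"

// Only the ISO form looks like something PostgreSQL accepts as timestamptz.
const looksLikeTimestamptz = /^\d{4}-\d{2}-\d{2}T/.test(isoForm);
```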
5f40a812d3 feat: gamification queue with deduplication and unpublish revoke
- Add migration 0004: partial unique index on user_points (user_id, action, recording_id)
  for RECORDING_CREATE and RECORDING_FEATURED to prevent earn-on-republish farming
- Add revokePoints() to gamification lib; awardPoints() now uses onConflictDoNothing
- Add gamificationQueue (BullMQ) with 3-attempt exponential backoff
- Add gamification worker handling awardPoints, revokePoints, checkAchievements jobs
- Move all inline gamification calls in recordings + comments resolvers to queue
- Revoke RECORDING_CREATE points when a recording is unpublished (published → draft)
- Register gamification worker at server startup alongside mail worker

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 19:50:33 +01:00
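The retry timing of the 3-attempt exponential backoff can be sketched as follows; BullMQ's built-in exponential strategy waits roughly `delay * 2^(attempt - 1)` between attempts, and the 1-second base delay here is a hypothetical value the commit does not state:

```typescript
// Sketch of retry delays for an N-attempt exponential backoff.
// With attempts = 3 there are two retries after the initial run.
function backoffDelays(attempts: number, baseDelayMs: number): number[] {
  const delays: number[] = [];
  for (let attempt = 1; attempt < attempts; attempt++) {
    delays.push(baseDelayMs * 2 ** (attempt - 1)); // 1x, 2x, 4x, ...
  }
  return delays;
}
```

For example, `backoffDelays(3, 1000)` gives a 1s wait before the second attempt and 2s before the third.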
1b724e86c9 chore: lint and format
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 19:38:37 +01:00
a9e4ed6049 feat: refactor play area into sidebar layout with buttplug, recordings, and leaderboard sub-pages
- Add /play sidebar layout (mobile nav + desktop sidebar) with SexyBackground
- Move buttplug device control to /play/buttplug with Empty component and scan button
- Move recordings from /me/recordings to /play/recordings
- Move leaderboard to /play/leaderboard; redirect /leaderboard → /play/leaderboard
- Redirect /me/recordings → /play/recordings and /play → /play/buttplug
- Remove recordings entry from /me sidebar nav
- Rename "SexyPlay" → "Play", swap bluetooth icon for rocket, remove subtitle
- Add play.nav i18n keys (play, recordings, leaderboard, back_to_site, back_mobile)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 19:33:28 +01:00
66179d7ba8 style: streamline draft/published badge across recording card and admin table
- Replace custom inline span+getStatusColor with Badge component in recording card
- Align admin recordings table badge to same style (outline, green/yellow)
- Use i18n label in admin table instead of raw status string
- Remove unused cn import and getStatusColor helper

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 18:43:18 +01:00
3a8fa7d8ce style: refine admin edit forms and fix mobile padding
- Remove back button from admin entity edit pages (sidebar handles navigation)
- Remove cancel button from video/article forms, make submit button full-width
- Show actual entity title + subtitle on video/article edit pages
- Remove asterisks from Title/Slug field labels in i18n
- Remove px-3 sm:px-0 from all admin list page headers/filters (fixes mobile padding)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 18:38:38 +01:00
fddc3f15d0 feat: fix recording save and add publish/unpublish support
- Fix broken fetch("/api/sexy/recordings") → use createRecording GraphQL service
- Round duration to integer before sending (GraphQL Int type)
- Add updateRecording mutation to services
- Add publish/unpublish buttons to RecordingCard (draft ↔ published)
- Remove "Go to Play" button from recordings page header
- Add publish/unpublish i18n keys

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 18:26:42 +01:00
d9a60f0572 style: refine admin & me UI — card forms, back arrows, avatar in admin sidebar, Empty component
- Replace ← text with icon-[ri--arrow-left-line] in admin and me layouts
- Add avatar + admin shield badge to admin sidebar header
- Wrap all admin edit forms in Card (bg-card/50 border-primary/20) with styled inputs
- Fix sm:pl-6 → lg:pl-6 so extra left padding only applies when sidebar is visible
- Update security form submit button to gradient style matching profile
- Remove "View Public Profile" button from me/profile
- Use shadcn-svelte Empty component for recordings empty state
- Install empty component via shadcn-svelte

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 18:16:39 +01:00
ba648c796a feat: refactor /me into admin-style layout with profile, security, recordings, analytics sub-pages
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 17:47:00 +01:00
27e2ff5f66 feat: add Meta title tags to all admin pages
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 17:23:35 +01:00
b7a29c55b3 style: fix header nav gap
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m16s
2026-03-09 10:24:14 +01:00
99b2ed7f2b feat: show login/signup buttons on mobile header
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m24s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 10:00:02 +01:00
8357aecf98 feat: add ring border to logo icon matching avatar style
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 09:58:27 +01:00
ab3d9f4118 feat: show auth icon strip on mobile header, move burger outside pill
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m16s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 09:50:25 +01:00
5219fae36a feat: add structured logging to BullMQ queues and workers
All checks were successful
Build and Push Backend Image / build (push) Successful in 43s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-09 09:33:43 +01:00
7de1bf7a03 style: fix header
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m46s
2026-03-09 08:56:49 +01:00
a4fd1ff18b fix: soften header shadow glow
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m15s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 20:32:17 +01:00
6605980a43 style: move page gradient to global background so it shows behind header
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m19s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:32:58 +01:00
15d9708072 Revert "feat: fixed header with hero section extending behind it"
This reverts commit fc97c1b84b.
2026-03-08 19:25:15 +01:00
89c4c390fa Revert "feat: extend page-hero behind fixed header on all pages"
This reverts commit f5ff59b910.
2026-03-08 19:25:15 +01:00
f5ff59b910 feat: extend page-hero behind fixed header on all pages
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:24:22 +01:00
fc97c1b84b feat: fixed header with hero section extending behind it
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:22:41 +01:00
e2abb0794a style: use text-foreground color for burger menu lines
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:18:39 +01:00
2644e033b4 style: remove underline from logout button hover
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:16:43 +01:00
ee1cea6d01 style: match logout button hover to other icon buttons, destructive color on hover
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:16:04 +01:00
1496399b96 style: remove header border, keep glow shadow only
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m15s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:09:55 +01:00
075f64f4e3 style: single crisp primary border with soft glow on header
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:06:43 +01:00
8c6c98d612 style: edgy glowing bottom border on header
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:05:41 +01:00
28be084781 style: transparent header, no scroll tracking
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:04:25 +01:00
21b8d2c223 feat: transparent header at top, solid on scroll
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:03:39 +01:00
b315062d43 revert: restore original header styling
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 19:02:42 +01:00
5bef996dbc fix: use derived override pattern for selectedQueue to avoid captured state warning
All checks were successful
Build and Push Backend Image / build (push) Successful in 43s
Build and Push Frontend Image / build (push) Successful in 1m38s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 18:36:50 +01:00
da2484d232 fix: replace nested button with div[role=button] on queue cards
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 18:35:16 +01:00
722392d19e chore: lint and format
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 18:32:39 +01:00
a07a5cb091 fix: suppress false-positive svelte state warning on queues page
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 18:31:35 +01:00
ea23233645 feat: add BullMQ job queue with admin monitoring UI
All checks were successful
Build and Push Backend Image / build (push) Successful in 48s
Build and Push Buttplug Image / build (push) Successful in 3m26s
Build and Push Frontend Image / build (push) Successful in 1m11s
- Add BullMQ to backend; mail jobs (verification, password reset) now enqueued instead of sent inline
- Mail worker processes jobs with 3-attempt exponential backoff retry
- Admin GraphQL resolvers: adminQueues, adminQueueJobs, adminRetryJob, adminRemoveJob, adminPauseQueue, adminResumeQueue
- Admin frontend page at /admin/queues: queue cards with counts, job table with status filter, retry/remove/pause actions

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 18:25:09 +01:00
6dcdc0130b chore: streamline package.json files
2026-03-08 17:46:04 +01:00
8508e1f6e9 style: reduce header background opacity for softer appearance
All checks were successful
Build and Push Frontend Image / build (push) Successful in 1m13s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 17:39:47 +01:00
6abcfc7363 chore: remove unused super-sitemap dependency
All checks were successful
Build and Push Buttplug Image / build (push) Successful in 3m38s
Build and Push Frontend Image / build (push) Successful in 1m12s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 17:22:55 +01:00
d4b3968518 fix: suppress Rust compiler warnings in buttplug
- #[allow(dead_code)] on FFICallbackContextWrapper, Connected variant,
  Unsubscribe variant, and device_event_receiver field
- let _ = for unused Result from callback.call1() (x2) and .send().await

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 17:21:58 +01:00
8f4999f127 chore: upgrade pnpm from 10.19.0 to 10.31.0
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 17:19:39 +01:00
4b53a25fa3 fix: proper build script calls in package.json
2026-03-08 17:16:46 +01:00
4f85637875 fix: upgrade Node.js to 22.14.0, add svelte-kit sync before build
All checks were successful
Build and Push Backend Image / build (push) Successful in 1m9s
Build and Push Buttplug Image / build (push) Successful in 4m21s
Build and Push Frontend Image / build (push) Successful in 1m15s
- Node 22.11.0 is below Vite's minimum requirement of 22.12+
- svelte-kit sync must run before vite build to generate
  .svelte-kit/tsconfig.json which tsconfig.json extends

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 17:14:51 +01:00
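Given the ordering constraint above, the frontend's build script presumably ends up in this shape (hypothetical; the exact script names may differ):

```json
{
  "scripts": {
    "build": "svelte-kit sync && vite build"
  }
}
```

Running `svelte-kit sync` first generates `.svelte-kit/tsconfig.json`, which the project `tsconfig.json` extends, so the subsequent `vite build` type-checks against a config that actually exists.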
1175b4d0e6 chore: update @internationalized/date to 3.12.0
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 17:12:57 +01:00
2afa3c6e9b fix: replace raw HTML buttons with Button component in admin, remove vite-plugin-wasm
- Use Button component for photo remove, editor tab toggle, and model
  pill buttons across admin/users, admin/articles, admin/videos
- Remove vite-plugin-wasm from frontend devDependencies (no longer
  needed since WASM is served by the buttplug nginx container)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 17:11:24 +01:00
b55cebea4e ci: add path filters to all workflow triggers
Each image only builds when its relevant source changes:
- backend: packages/backend/**, packages/types/**, Dockerfile.backend
- frontend: packages/frontend/**, packages/types/**, Dockerfile
- buttplug: packages/buttplug/**, Dockerfile.buttplug, nginx.buttplug.conf

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 16:49:48 +01:00
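For the backend workflow, the trigger filter described above would look roughly like this (a hypothetical excerpt; branch filters and other trigger details are omitted):

```yaml
# Hypothetical excerpt: only build the backend image when backend-relevant
# files change.
on:
  push:
    paths:
      - "packages/backend/**"
      - "packages/types/**"
      - "Dockerfile.backend"
```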
9845553d49 fix: switch buttplug WASM to --target web for browser compatibility
All checks were successful
Build and Push Backend Image / build (push) Successful in 16s
Build and Push Buttplug Image / build (push) Successful in 3m37s
Build and Push Frontend Image / build (push) Successful in 1m15s
--target bundler generates static WASM ESM imports that only work
through a bundler. --target web generates fetch-based WASM loading
via import.meta.url which browsers handle natively.

- Change wasm-pack build target from bundler to web
- Call wasmModule.default() (init) after import in maybeLoadWasm
- Add .gitignore to exclude dist/ and wasm/ build outputs

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 14:12:01 +01:00
ced0a08da3 fix: serve buttplug locally in dev instead of Docker
Add a minimal Node.js static server (serve.mjs) to the buttplug package
that serves dist/ and wasm/ on port 8080 with correct MIME types.
Update dev:buttplug to use it instead of docker compose, avoiding a
full Rust/WASM Docker build on every dev start.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 13:55:31 +01:00
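A minimal static server of the kind this commit describes could be sketched as below; this is an assumption-laden approximation of `serve.mjs` (the real file, root directory, and MIME table may differ), with `application/wasm` being the type browsers need for WASM:

```typescript
// Hypothetical sketch of a serve.mjs-style static server with correct
// MIME types, serving the current directory on port 8080.
import { createServer } from "node:http";
import { readFile } from "node:fs/promises";
import { extname, join, normalize } from "node:path";

const MIME: Record<string, string> = {
  ".html": "text/html",
  ".js": "text/javascript",
  ".mjs": "text/javascript",
  ".wasm": "application/wasm", // required for WebAssembly streaming compile
  ".json": "application/json",
};

function contentType(path: string): string {
  return MIME[extname(path)] ?? "application/octet-stream";
}

const server = createServer(async (req, res) => {
  try {
    // Resolve under the serve root; normalize() blunts "../" traversal.
    const path = join(".", normalize(req.url ?? "/"));
    const body = await readFile(path);
    res.writeHead(200, { "content-type": contentType(path) });
    res.end(body);
  } catch {
    res.writeHead(404);
    res.end("not found");
  }
});

// server.listen(8080); // left unstarted so this sketch stays inert
```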
f880aa5957 feat: externalize buttplug as separate nginx container
- Add Dockerfile.buttplug: builds Rust/WASM + TS, serves via nginx
- Add nginx.buttplug.conf: serves /dist and /wasm with correct MIME types
- Add .gitea/workflows/docker-build-buttplug.yml: path-filtered CI workflow
- Strip Rust toolchain and buttplug build from frontend Dockerfile
- Move buttplug to devDependencies (types only at build time)
- Remove vite-plugin-wasm from frontend (WASM now served by nginx)
- Add /buttplug proxy in vite.config (dev: localhost:8080)
- Add buttplug service to compose.yml
- Load buttplug dynamically in play page via runtime import
- Fix faq page: suppress no-unnecessary-state-wrap for reassigned SvelteSet

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 13:49:38 +01:00
239128bf5e fix: buttplug lint errors
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m34s
2026-03-08 13:28:59 +01:00
0a50c3efd8 fix: remove sitemap.xml
All checks were successful
Build and Push Backend Image / build (push) Successful in 16s
Build and Push Frontend Image / build (push) Successful in 4m29s
2026-03-08 12:04:50 +01:00
af4a11b73c style: apply prettier formatting to svelte and ts files
Some checks failed
Build and Push Backend Image / build (push) Successful in 1m3s
Build and Push Frontend Image / build (push) Has been cancelled
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 11:49:43 +01:00
627ce75719 fix: restore $state on SvelteMap in device-mapping-dialog
The variable is fully reassigned in an $effect, so $state is required
for reactivity. Suppress the no-unnecessary-state-wrap lint rule with a
comment explaining the reason.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 11:47:21 +01:00
446e9f835b fix: use writable $derived for search inputs, remove unnecessary $state wrap
- Replace $state + $effect pattern with writable $derived (Svelte 5.25+)
  for all searchValue instances across list pages — cleaner and lint-compliant
- Remove now-unused untrack imports from those files
- Drop $state() wrapper around SvelteMap in device-mapping-dialog
  (SvelteMap is already reactive)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 11:46:15 +01:00
422f97417e fix: resolve vite-plugin-svelte warnings
- image-viewer: replace backdrop div with button for a11y
- file-drop-zone: wrap prop check in $effect to avoid state_referenced_locally
- about: use $derived for stats array
- magazine: use $derived for featuredArticle
- play: add role/keyboard support to seek bar slider; fix $state on SvelteMap in device-mapping-dialog
- admin/videos/[id]: add <track kind="captions"> to video element

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 11:41:58 +01:00
edee98b552 fix: use untrack() in $state initialisers to silence state_referenced_locally warnings
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 11:31:51 +01:00
b9b98f178f fix: sync reactive state with data prop using $effect
Replaces bare $state(data.x) initialisers (which only capture the
initial value) with $state + $effect pairs so that state stays in sync
whenever page data is invalidated or the URL changes. Affects all list
pages (searchValue) and all edit/detail pages (form fields).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 11:06:30 +01:00
dc1850126b fix: run DB migrations automatically at backend startup
Instead of relying on a manual `pnpm db:migrate` step (which was
connecting to a different postgres than the Docker container), the
backend now calls drizzle migrate() before the server starts. This
ensures migrations always run against the correct database on startup.

Also fixes the Dockerfile to copy migrations into dist/migrations so
the path resolves correctly in the compiled output.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 11:00:36 +01:00
4d81266cb1 feat: add dedicated model photo separate from avatar
Adds a `photo` field to the users table (and a migration) that serves
as a dedicated profile/card image for models. This is now used in model
cards and on the model single page, while `avatar` is reserved for
comments, article authors, and the user profile page.

- DB: `photo` column on `users` with FK to `files`
- GraphQL: exposed on ModelType, UserType, AdminUserDetailType; photoId arg on adminUpdateUser
- Services: photo field in MODELS_QUERY, MODEL_BY_SLUG_QUERY, ADMIN_GET/UPDATE_USER
- Frontend: model cards and single page use `photo ?? avatar` fallback
- Admin: model photo upload section in user edit page

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 10:54:27 +01:00
2980c0b637 fix: remove brand name text from mobile flyout header
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 10:42:22 +01:00
7af9c0d7ca fix: admin mobile nav overflow breaking layout on small screens
Mobile nav now scrolls horizontally with hidden scrollbar; nav items
don't shrink and show icon-only on xs, icon+label on sm and up.
Added scrollbar-none utility to app.css.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-08 10:39:12 +01:00
76d71ee7c3 fix: remove comments list timestamp gap
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m25s
2026-03-07 19:37:52 +01:00
90497e9e7c fix: resolve TypeScript build errors from leftJoin nullable types
Some checks failed
Build and Push Backend Image / build (push) Successful in 42s
Build and Push Frontend Image / build (push) Has been cancelled
Non-null assert photo/achievement ids that are structurally non-null
due to FK constraints but nullable in Drizzle's leftJoin return type.
Add missing description field to enrichVideo model select and map.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 19:33:16 +01:00
a558449964 chore: remove eslint warn rule
Some checks failed
Build and Push Backend Image / build (push) Failing after 29s
Build and Push Frontend Image / build (push) Successful in 4m12s
2026-03-07 19:28:17 +01:00
e236ced12a refactor: replace all explicit any types with proper TypeScript types
Backend resolvers: typed enrichArticle/enrichVideo/enrichModel with DB
and $inferSelect types, SQL<unknown>[] for conditions arrays, proper
enum casts for status/role fields, $inferInsert for .set() updates,
typed raw SQL result rows in gamification, ReplyLike interface for
ctx.reply in auth. Frontend: typed catch blocks with Error/interface
casts, isActiveLink param, adminGetUser response, tags filter callback.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 19:25:04 +01:00
8313664d70 chore: fix all lint errors and format codebase
- Remove unused `or` import in comments resolver
- Remove unused `users` import in recordings resolver
- Add index keys to pagination {#each} loops in videos, models, magazine
- Remove stale svelte-ignore comment in header (a11y warnings no longer fired)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 19:06:57 +01:00
ae0929ad06 fix: replace arrow symbol with icon css in author profile link
All checks were successful
Build and Push Backend Image / build (push) Successful in 43s
Build and Push Frontend Image / build (push) Successful in 4m12s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 19:03:41 +01:00
b78831231d fix: select description from users in article enrichArticle query
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 19:02:53 +01:00
f90b045ca5 fix: add description to VideoModel type and GraphQL schema
Requesting description on the article author caused a GraphQL error
which the page.server.ts caught as a 404.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 19:01:52 +01:00
d2cbb1004f fix: show author description on magazine article page
Add description field to ARTICLE_BY_SLUG_QUERY and render it in the
author bio card below the name.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:58:42 +01:00
77ebccf6fa feat: redesign avatar upload as circular click-to-change UI
Replace generic file drop zone + tiny thumbnail with a 96px circular
avatar that shows a camera overlay on hover, upgrades preview to
thumbnail quality, and adds a compact remove button.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:55:56 +01:00
1c101406f6 fix: match admin background gradient to rest of the app
All checks were successful
Build and Push Backend Image / build (push) Successful in 44s
Build and Push Frontend Image / build (push) Successful in 4m4s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:51:40 +01:00
cb7720ca9c fix: smooth hero-to-content transition with transparent gradient fade
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:47:24 +01:00
df099b2700 fix: remove extra closing div in models pagination
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:42:39 +01:00
291f72381f feat: improve UX across all listing pages and homepage
- Make model/video cards fully clickable on homepage, models, videos, magazine, and tags pages
- Replace inline blob divs with SexyBackground component on magazine and play pages
- Replace magazine hero section with PageHero component for consistency
- Remove redundant action buttons from cards (cards are now the link targets)
- Fix nested anchor/button invalid HTML in magazine featured article
- Convert inner overlay anchors to aria-hidden divs to avoid nested <a> elements
- Add bg-muted skeleton placeholder to all card images
- Update magazine pagination to smart numbered style with ellipsis (matching videos/models)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:41:39 +01:00
1a2fab3e37 refactor: UX and styling improvements across frontend
- Fix login spinner (isLoading never set to true before await)
- Extract PageHero component, replace copy-pasted hero sections on videos/models/tags pages
- Replace inline plasma blobs with SexyBackground on videos/models/tags pages
- Make video/model/tag cards fully clickable (wrap in <a>), remove redundant Watch/View Profile buttons
- Convert inner overlay anchors to divs to avoid nested <a> elements
- Fix home page model avatar preset: mini → thumbnail (correct size for 96px display)
- Reduce home hero height: min-h-screen → min-h-[70vh]
- Remove dead hideName prop from Logo, simplify component
- Add brand name to mobile flyout panel header with gradient styling
- Remove dead _relatedVideos array, isBookmarked state, _handleBookmark from video detail page
- Clean up commented-out code blocks in video detail and models pages
- Note: tag card inner tag links converted to spans to avoid nested anchors

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:33:32 +01:00
56b57486dc fix: add upload/delete file endpoints and wire avatar update through profile
- Add POST /upload and DELETE /assets/:id routes to backend (session auth via session_token cookie)
- Add avatar arg to updateProfile GraphQL mutation and resolver
- Fix frontend to pass avatarId correctly on save, preserve existing avatar when unchanged
- Ignore 404 on file delete (already gone is fine)
- Remove broken folder lookup (getFolders is a stub, backend has no folder concept)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:22:22 +01:00
a050e886cb feat: replace slide-to-logout with avatar + name + logout button in header
Removes the drag-to-logout widget in favour of a clean inline layout:
- Desktop: avatar (links to /me), artist name, and a logout icon button
- Mobile flyout: user card with avatar, name, email, and logout button

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:12:18 +01:00
519fd45d8d feat: rebrand to Sexy, restyle logo with gradient icon and updated assets
- Rename brand from SexyArt to Sexy throughout i18n locale
- Apply primary→accent gradient to SVG icon stroke
- Remove brand name text from logo, icon-only display
- Switch logo font to Noto Sans bold (default), drop Dancing Script
- Update favicons, app icons, webmanifest, and add logo.svg

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 18:04:14 +01:00
0592d27a15 fix: redirect authenticated users away from login, signup, and password pages
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 15:58:12 +01:00
a38883e631 fix: align admin filter toggle buttons with search input height
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 15:51:13 +01:00
798495c3d6 fix: remove archived badge from recording card and i18n
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m19s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 11:57:40 +01:00
fde0d63271 feat: remove archived status from recordings, deletions are now immediate
All checks were successful
Build and Push Backend Image / build (push) Successful in 45s
Build and Push Frontend Image / build (push) Successful in 4m3s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 11:42:13 +01:00
754a236e51 feat: add admin tables for comments and recordings
All checks were successful
Build and Push Backend Image / build (push) Successful in 44s
Build and Push Frontend Image / build (push) Successful in 4m20s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 11:29:48 +01:00
dfe49b5882 feat: allow users to delete their own comments on videos
All checks were successful
Build and Push Backend Image / build (push) Successful in 16s
Build and Push Frontend Image / build (push) Successful in 4m7s
Shows a delete button on each comment for the comment author and admins.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 11:19:50 +01:00
9ba848372a fix: make gamification calls non-blocking so errors don't fail core mutations
Some checks failed
Build and Push Backend Image / build (push) Successful in 43s
Build and Push Frontend Image / build (push) Has been cancelled
awardPoints/checkAchievements were awaited inline, so any gamification error
(DB constraint, missing table, etc.) would propagate as INTERNAL_SERVER_ERROR
on comment creation, recording plays, etc. Now they run fire-and-forget with
error logging, so the core action always succeeds.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 11:16:27 +01:00
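The fire-and-forget pattern described in this commit can be sketched as follows. Names here are illustrative stand-ins (`awardPoints`, `createComment`) for the real resolver code:

```typescript
// Hypothetical stand-in for the real gamification call, hard-wired to fail
// the way a DB constraint violation or missing table would.
async function awardPoints(userId: string): Promise<void> {
  throw new Error("points_ledger constraint violated");
}

async function createComment(userId: string, body: string): Promise<{ ok: boolean }> {
  // ...insert the comment here (the core mutation)...

  // Fire-and-forget: the promise is NOT awaited, so a gamification error
  // can never propagate as INTERNAL_SERVER_ERROR on the core mutation.
  void awardPoints(userId).catch((err) =>
    console.error("gamification failed (non-fatal):", err),
  );

  return { ok: true };
}
```

The `void` marker plus an explicit `.catch` keeps the rejection handled (no unhandled-rejection crash) while the caller's success path stays independent of the side effect.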
dcf2fbd3d4 feat: enhance session security and freshness
All checks were successful
Build and Push Backend Image / build (push) Successful in 43s
Build and Push Frontend Image / build (push) Successful in 4m15s
- Sliding expiration: reset 24h TTL on every Redis session access
- SameSite=Strict on login and logout cookies (was Lax)
- Secure flag on logout cookie in production (was missing)
- Re-fetch user from DB on every request in buildContext so role/avatar/
  admin changes take effect immediately without requiring re-login

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 11:10:01 +01:00
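The sliding-expiration behavior can be sketched with an in-memory store; in the real backend this would be a `redis.expire(key, ttl)` issued on every session read rather than a timestamp field:

```typescript
const TTL_MS = 24 * 60 * 60 * 1000; // 24h sliding window

interface SessionEntry {
  userId: string;
  expiresAt: number;
}

const sessions = new Map<string, SessionEntry>();

function getSession(token: string, now = Date.now()): SessionEntry | null {
  const entry = sessions.get(token);
  if (!entry || entry.expiresAt <= now) return null;
  // Sliding expiration: every successful access resets the full TTL,
  // so active users are never logged out mid-session.
  entry.expiresAt = now + TTL_MS;
  return entry;
}
```

The same access path is where the commit's "re-fetch user from DB in buildContext" would hang off, so role or avatar changes apply on the very next request.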
bff354094e fix: add adminGetVideo/adminGetArticle queries to fix 404 on edit pages
Some checks failed
Build and Push Backend Image / build (push) Successful in 43s
Build and Push Frontend Image / build (push) Has been cancelled
The edit page loaders were calling adminListVideos/adminListArticles with the
old pre-pagination signatures and filtering by ID client-side, which broke
after pagination limited results to 50. Now fetches the single item by ID
directly via new adminGetVideo and adminGetArticle backend queries.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 11:05:21 +01:00
6f2f3b3529 fix: deduplicate model photos in public resolver to match admin behavior
All checks were successful
Build and Push Backend Image / build (push) Successful in 43s
Build and Push Frontend Image / build (push) Successful in 4m10s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 10:54:29 +01:00
f2871b98db style: apply prettier formatting across frontend components and pages
Some checks failed
Build and Push Backend Image / build (push) Successful in 1m4s
Build and Push Frontend Image / build (push) Has been cancelled
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 10:49:35 +01:00
9c5dba5c90 feat: add server-side pagination, search, and filtering to all collection and admin pages
- Public pages (videos, magazine, models): URL-driven search, sort, category/duration
  filters, and Prev/Next pagination (page size 24)
- Admin tables (videos, articles): search input, toggle filters, and pagination (page size 50)
- Tags page: tag filtering now done server-side via DB arrayContains query instead of
  fetching all items and filtering client-side
- Backend resolvers updated for videos, articles, models with paginated { items, total }
  responses and filter/sort/tag args

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-07 10:43:26 +01:00
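The paginated `{ items, total }` response shape introduced here can be sketched in memory; the real resolvers push `LIMIT`/`OFFSET` and the count into SQL rather than slicing arrays:

```typescript
interface Page<T> {
  items: T[];
  total: number;
}

// page is 1-based; pageSize is 24 for public pages, 50 for admin tables.
function paginate<T>(all: T[], page: number, pageSize: number): Page<T> {
  const offset = (page - 1) * pageSize;
  return {
    items: all.slice(offset, offset + pageSize),
    total: all.length, // total lets the client render numbered pagination
  };
}
```

Returning `total` alongside the slice is what makes the smart numbered pagination with ellipsis possible on the client.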
c90c09da9a chore: cleanup
All checks were successful
Build and Push Backend Image / build (push) Successful in 49s
Build and Push Frontend Image / build (push) Successful in 4m18s
2026-03-06 19:25:07 +01:00
aed7b4a16f fix: restore admin role in User type, use image logo
- Add "admin" back to User.role union to fix backend TS build
- Replace SVG PeonyIcon with logo.png image in Logo component

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 18:04:49 +01:00
454c477c40 style: use rocket icon for all Go to Play buttons
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:42:49 +01:00
3cf81bd381 style: apply gradient to primary buttons in admin area
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:39:36 +01:00
ac63e59906 style: remove card wrapper from error page
Some checks failed
Build and Push Backend Image / build (push) Failing after 27s
Build and Push Frontend Image / build (push) Successful in 5m7s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:28:11 +01:00
19d29cbfc6 fix: replace flyout profile card with logout slider, i18n auth errors
- Replace static account card in mobile flyout with swipe-to-logout widget
- Remove redundant logout button from flyout bottom
- Make LogoutButton full-width via class prop and dynamic maxSlide
- Extract clean GraphQL error messages instead of raw JSON in all auth forms
- Add i18n keys for known backend errors (invalid credentials, email taken, invalid token)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:26:14 +01:00
0ec27117ae style: streamline /me page header to match admin dashboard style
Replace large gradient title with simple text-2xl font-bold heading,
matching the header pattern used across admin pages.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:13:19 +01:00
ed9eb6ef22 style: fix admin table padding — edge-to-edge on mobile, no right pad on desktop
Mobile: remove horizontal padding so tables fill full width with top/bottom
borders only. Desktop: keep left padding, table extends to right edge.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:08:16 +01:00
609f116b5d feat: replace native date inputs with shadcn date picker
Add calendar + popover components and a custom DateTimePicker wrapper.
Video forms use date-only; article forms include a time picker.
Also add video player preview to the video edit form.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:03:35 +01:00
e943876e70 fix: prevent age verification dialog flicker on page load
Initialize isOpen as false and only open in onMount if not yet verified,
instead of opening immediately and closing after localStorage check.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 16:52:48 +01:00
7d373b3aa3 i18n: internationalize all admin pages
Add full i18n coverage for the admin section — locale keys, layout nav,
users, videos, and articles pages (list, new, edit).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 16:49:30 +01:00
95fd9f48fc refactor: align article author with VideoModel, streamline selects, fix flyout inert
- Remove ArticleAuthor type; article.author now reuses VideoModel (id, artist_name, slug, avatar)
- updateArticle accepts authorId; author selectable in admin article edit page
- Article edit: single Select with bind:value + $derived selectedAuthor display
- Video edit: replace pill toggles with Select type="multiple" bind:value for models
- Video table: replace inline badge spans with Badge component
- Magazine: display artist_name throughout, author bio links to model profile
- Fix flyout aria-hidden warning: replace with inert attribute

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 16:31:41 +01:00
670c18bcb7 feat: refactor role system to is_admin flag, add Badge component, fix native dialogs
- Separate admin identity from role: viewer|model + is_admin boolean flag
- DB migration 0001_is_admin: adds column, migrates former admin role users
- Update ACL helpers, auth session, GraphQL types and all resolvers
- Admin layout guard and header links check is_admin instead of role
- Admin users table: show Admin badge next to name, remove admin from role select
- Admin user edit page: is_admin checkbox toggle
- Install shadcn Badge component; use in admin users table
- Fix duplicate photo keys in adminGetUser resolver
- Replace confirm() in /me recordings with Dialog component

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 16:14:00 +01:00
9ef490c1e5 fix: make deleteRecording a hard delete instead of soft archive
Previously deleteRecording set status to "archived", leaving the row
in the DB and visible in queries without a status filter. Now it hard-
deletes the row. Also excludes archived recordings from the default
recordings query so any pre-existing archived rows no longer appear.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:45:59 +01:00
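The soft-archive vs hard-delete difference can be sketched with an in-memory stand-in for the recordings table; the real code issues the equivalent of `db.delete(recordings).where(eq(recordings.id, id))` through Drizzle:

```typescript
interface Recording {
  id: string;
  status: "active" | "archived";
}

const recordings = new Map<string, Recording>();

// Old behavior: soft archive left the row behind, visible to any
// query that forgot to filter on status.
function archiveRecording(id: string): void {
  const r = recordings.get(id);
  if (r) r.status = "archived";
}

// New behavior: hard delete removes the row entirely.
function deleteRecording(id: string): void {
  recordings.delete(id);
}
```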
434e926f77 style: use primary color for scrollbar thumbs
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m0s
40% opacity at rest, 70% on hover, adapts to light/dark theme.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:42:15 +01:00
7a9ce0c3b1 fix: explicitly style html root scrollbar for Firefox
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:39:44 +01:00
ff1e1f6679 feat: style scrollbars globally using theme colors
Thin scrollbars using --border for thumb and transparent track,
with --muted-foreground on hover. Uses both scrollbar-color (Firefox)
and ::-webkit-scrollbar (Chrome/Safari).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:38:02 +01:00
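A minimal sketch of the dual-engine scrollbar styling described above, assuming `--border` and `--muted-foreground` are the theme variables from app.css:

```css
/* Firefox: scrollbar-color takes thumb then track */
* {
  scrollbar-width: thin;
  scrollbar-color: var(--border) transparent;
}
*:hover {
  scrollbar-color: var(--muted-foreground) transparent;
}

/* Chrome/Safari: ::-webkit-scrollbar pseudo-elements */
::-webkit-scrollbar {
  width: 8px;
}
::-webkit-scrollbar-track {
  background: transparent;
}
::-webkit-scrollbar-thumb {
  background: var(--border);
  border-radius: 9999px;
}
::-webkit-scrollbar-thumb:hover {
  background: var(--muted-foreground);
}
```

Both mechanisms are needed because neither engine honors the other's syntax.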
648123fab5 feat: mobile-optimize admin section
- Layout: sidebar hidden on mobile, replaced with horizontal top nav strip
- Tables: overflow-x-auto + hide secondary columns (email/category/dates/
  plays/likes) on small screens; show email inline under name on mobile
- Forms: grid-cols-2 → grid-cols-1 sm:grid-cols-2 on all admin forms
- Markdown editor: Write/Preview tab toggle on mobile, side-by-side on sm+
- Padding: p-3 sm:p-6 on all admin pages for tighter mobile layout

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:36:52 +01:00
a7fafaf7c5 refactor: replace native select with shadcn Select for user role in admin
All checks were successful
Build and Push Backend Image / build (push) Successful in 1m10s
Build and Push Frontend Image / build (push) Successful in 5m8s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:34:39 +01:00
b71d7dc559 refactor: remove duplicate utils/utils.ts, consolidate into utils.ts
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:31:35 +01:00
f764e27d59 fix: shrink flyout account card, remove online indicator
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:29:33 +01:00
d7eb2acc6c fix: match mobile flyout header height to main header (h-16)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:27:31 +01:00
fb38d6b9a9 fix: constrain admin layout to container width matching rest of site
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:22:15 +01:00
d021acaf0b feat: add admin user edit page with avatar, banner, and photo gallery
- Backend: adminGetUser query returns user + photos; adminUpdateUser now
  accepts avatarId/bannerId; new adminAddUserPhoto and adminRemoveUserPhoto
  mutations; AdminUserDetailType added to GraphQL schema
- Frontend: /admin/users/[id] page for editing name, avatar, banner, and
  managing the model photo gallery (upload multiple, delete individually)
- Admin users list: edit button per row linking to the detail page

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:18:43 +01:00
e06a1915f2 fix: remove backdrop-blur overlay causing blurry text site-wide
The full-screen glassmorphism overlay had backdrop-blur-[0.5px] which
triggered GPU compositing on the entire viewport, degrading subpixel
text rendering inconsistently. Also use globalThis.fetch (not SvelteKit
fetch) when forwarding session token in admin SSR calls to avoid header
stripping.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:11:54 +01:00
ebab3405b1 fix: forward session token in admin SSR load functions
Admin list queries (users, videos, articles) were using getGraphQLClient
without auth credentials, causing silent 403s on server-side loads. Now
extract session_token cookie and pass it to getAuthClient so the backend
sees the admin session on SSR requests.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 12:56:47 +01:00
ad7ceee5f8 fix: resolve lint errors from ACL/admin implementation
- Remove unused requireOwnerOrAdmin import from videos.ts
- Remove unused requireAuth import from users.ts
- Remove unused GraphQLError import from articles.ts
- Replace URLSearchParams with SvelteURLSearchParams in admin users page
- Apply prettier formatting to all changed files

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 12:35:11 +01:00
c1770ab9c9 feat: role-based ACL + admin management UI
Backend:
- Add acl.ts with requireAuth/requireRole/requireOwnerOrAdmin helpers
- Gate premium videos from unauthenticated users in videos query/resolver
- Fix updateVideoPlay to verify ownership before updating
- Add admin mutations: adminListUsers, adminUpdateUser, adminDeleteUser
- Add admin mutations: createVideo, updateVideo, deleteVideo, setVideoModels, adminListVideos
- Add admin mutations: createArticle, updateArticle, deleteArticle, adminListArticles
- Add deleteComment mutation (owner or admin only)
- Add AdminUserListType to GraphQL types
- Fix featured filter on articles query

Frontend:
- Install marked for markdown rendering
- Add /admin/* section with sidebar layout and admin-only guard
- Admin users page: paginated table with search, role filter, inline role change, delete
- Admin videos pages: list, create form, edit form with file upload and model assignment
- Admin articles pages: list, create form, edit form with split-pane markdown editor
- Add admin nav link in header (desktop + mobile) for admin users
- Render article content through marked in magazine detail page
- Add all admin GraphQL service functions to services.ts

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 12:31:33 +01:00
b200498a10 fix: correct CI badge URLs to use Gitea workflow badge format
All checks were successful
Build and Push Backend Image / build (push) Successful in 16s
Build and Push Frontend Image / build (push) Successful in 15s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 12:57:47 +01:00
1369d5c228 fix: copy packages/types into backend Docker build
All checks were successful
Build and Push Backend Image / build (push) Successful in 46s
Build and Push Frontend Image / build (push) Successful in 16s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 12:51:52 +01:00
e200514347 refactor: move flyout to left side, restore logo, remove close button
Some checks failed
Build and Push Backend Image / build (push) Failing after 47s
Build and Push Frontend Image / build (push) Successful in 5m13s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 12:34:57 +01:00
d7057c3681 refactor: improve mobile flyout header
- Replace inline mobile dropdown with sliding flyout panel from right
- Hide burger menu on lg breakpoint, desktop auth buttons use hidden lg:flex
- Add backdrop overlay with opacity transition
- Remove logo from flyout panel header
- Fix backdrop div accessibility with role="presentation"

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 11:44:23 +01:00
d820a8f6be chore: relative uploads dir
2026-03-05 11:05:30 +01:00
9bef2469d1 refactor: rename RecordedEvent fields to snake_case
deviceIndex → device_index
deviceName → device_name
actuatorIndex → actuator_index
actuatorType → actuator_type

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 11:04:36 +01:00
97269788ee feat: add shared @sexy.pivoine.art/types package and fix type safety across frontend/backend
- Create packages/types with shared TypeScript domain model interfaces (User, Video, Model, Article, Comment, Recording, etc.)
- Wire both frontend and backend packages to use @sexy.pivoine.art/types via workspace:*
- Update backend Pothos objectRef types to use shared interfaces instead of inline types
- Update frontend $lib/types.ts to re-export from shared package
- Fix all type errors introduced by more accurate nullable types (avatar/banner as string|null UUIDs, author nullable, events/device_info as object[])
- Add artist_name to comment user select in backend resolver
- Widen utility function signatures (getAssetUrl, getUserInitials, calcReadingTime) to accept null/undefined

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 11:01:11 +01:00
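Wiring both packages to the shared types works through pnpm's workspace protocol; a sketch of the relevant `package.json` fragment in each consumer (exact fields assumed):

```json
{
  "dependencies": {
    "@sexy.pivoine.art/types": "workspace:*"
  }
}
```

`workspace:*` tells pnpm to always link the local `packages/types` directory instead of resolving from the registry, so frontend and backend compile against one source of truth for the domain interfaces.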
c6126c13e9 feat: add backend logger matching frontend text format
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 10:22:49 +01:00
fd4050a49f refactor: remove directus.ts shim, import directly from api
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 10:19:05 +01:00
efc7624ba3 style: apply prettier formatting to all files
All checks were successful
Build and Push Backend Image / build (push) Successful in 46s
Build and Push Frontend Image / build (push) Successful in 5m12s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 22:27:54 +01:00
18116072c9 feat: add formidable ESLint + Prettier linting setup
Some checks failed
Build and Push Backend Image / build (push) Successful in 47s
Build and Push Frontend Image / build (push) Has been cancelled
- Root-level eslint.config.js (flat config): typescript-eslint,
  eslint-plugin-svelte, eslint-config-prettier, @eslint/js
- Root-level prettier.config.js with prettier-plugin-svelte
- svelte-check added to frontend for Svelte/TS type checking
- lint, lint:fix, format, format:check, check scripts in root
  and both packages
- All 60 lint errors fixed across backend and frontend:
  - Consistent type imports
  - Removed unused imports/variables
  - Added keys to all {#each} blocks for Svelte performance
  - Replaced mutable Set/Map with SvelteSet/SvelteMap
  - Fixed useless assignments and empty catch blocks
- 64 remaining warnings are intentional `any` usages in the
  Pothos/Drizzle GraphQL resolver layer

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 22:24:55 +01:00
741e0c3387 docs: update README for custom backend replacing Directus
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 17s
Replace all Directus references with the new Fastify + GraphQL Yoga
stack, update CI/CD references to dev.pivoine.art Gitea Actions,
add DB schema overview, auth flow, image transform presets table,
and fix all links to use https and correct registry URLs.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 22:01:46 +01:00
662e3e8fe2 fix: model join date used join_date but API returns date_created
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m13s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:42:34 +01:00
fa159feffa fix: remove black border below video controls
Some checks failed
Build and Push Backend Image / build (push) Successful in 16s
Build and Push Frontend Image / build (push) Has been cancelled
- video: inline → block w-full (eliminates baseline descender gap)
- media-controller: fill parent container with absolute inset-0 w-full h-full

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:41:07 +01:00
124f0bfb22 fix: video src used movie.id but movie is already the UUID string
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m18s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:27:06 +01:00
df89cc59f5 fix: use preview transform for home page video teasers
Some checks failed
Build and Push Backend Image / build (push) Has been cancelled
Build and Push Frontend Image / build (push) Has been cancelled
thumbnail (300x300 square) was double-cropping inside the wide h-48
container. preview (800px, aspect-ratio preserved) lets object-cover
do the only crop, matching the videos and model pages.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:24:59 +01:00
845e3df223 fix: image transforms — preserve aspect ratio, increase quality
Some checks failed
Build and Push Backend Image / build (push) Successful in 40s
Build and Push Frontend Image / build (push) Has been cancelled
- preview/medium use fit:inside (no forced crop, preserves aspect ratio)
- Only mini/thumbnail/banner force square/fixed crops
- Increase WebP quality 85 → 92
- Increase preview width 480 → 800, medium 960 → 1400

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:22:30 +01:00
05cb6a66e3 fix: image transforms via Sharp, model photos crash, video duration
All checks were successful
Build and Push Backend Image / build (push) Successful in 46s
Build and Push Frontend Image / build (push) Successful in 5m7s
- Backend: add Sharp image transform endpoint (/assets/:id?transform=X)
  with presets: mini(64), thumbnail(200), preview(480), medium(960), banner(1280)
  Transformed images are cached as webp next to originals
- Frontend: fix model photos crash (p.directus_files_id → p)
- Frontend: fix model banner URL (data.model.banner.id → data.model.banner)
- Frontend: fix video duration display (video.movie.duration → video.movie_file?.duration)
  across models/[slug], videos, videos/[slug], and home pages

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:56:33 +01:00
273aa42510 fix: serve assets via DB lookup to resolve file path correctly
All checks were successful
Build and Push Backend Image / build (push) Successful in 38s
Build and Push Frontend Image / build (push) Successful in 4m11s
Files are stored as <UPLOAD_DIR>/<id>/<filename>. The previous static
serving attempted to serve <UPLOAD_DIR>/<id> (a directory) which failed.
Custom /assets/:id route now looks up filename from DB and uses sendFile.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:40:22 +01:00
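The path-resolution bug fixed here comes down to the on-disk layout: the URL carries only the asset id, so the filename has to come from a DB lookup before the full path can be built. A sketch (function name assumed):

```typescript
import path from "node:path";

// Files live at <UPLOAD_DIR>/<id>/<filename>. Serving <UPLOAD_DIR>/<id>
// statically hits a directory, not a file — hence the custom route that
// looks up filename from the DB first, then calls sendFile on this path.
function assetPath(uploadDir: string, id: string, filename: string): string {
  return path.join(uploadDir, id, filename);
}
```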
1e930baccb fix: resolve GraphQL request hang in Fastify integration
All checks were successful
Build and Push Backend Image / build (push) Successful in 39s
Build and Push Frontend Image / build (push) Successful in 4m7s
- Pass FastifyRequest/FastifyReply directly to yoga.handleNodeRequestAndResponse
  per the official graphql-yoga Fastify integration docs. Yoga v5 uses req.body
  (already parsed by Fastify) when available, avoiding the dead raw stream issue.
- Add proper TypeScript generics for server context including db and redis
- Wrap sendVerification/sendPasswordReset in try/catch so missing SMTP
  does not crash register/requestPasswordReset mutations
- Fix migrate.ts path resolution to work with both tsx (src/) and compiled (dist/)
- Expose postgres:5432 and redis:6379 ports in compose.yml for local dev

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:31:18 +01:00
012bb176d9 fix: convert Web API ReadableStream to Node.js Readable for Fastify
Some checks failed
Build and Push Backend Image / build (push) Failing after 26s
Build and Push Frontend Image / build (push) Successful in 4m17s
graphql-yoga's handleNodeRequestAndResponse returns a Response with a
Web API ReadableStream body. Fastify's reply.send() requires a Node.js
Readable stream, causing all GraphQL requests to hang indefinitely.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:11:08 +01:00
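The conversion this commit introduced is a one-liner in modern Node via `stream.Readable.fromWeb` (available since Node 17); a sketch:

```typescript
import { Readable } from "node:stream";
import type { ReadableStream as WebReadableStream } from "node:stream/web";

// graphql-yoga returns a WHATWG Response whose body is a Web API
// ReadableStream; Fastify's reply.send() expects a Node.js Readable.
function toNodeStream(response: Response): Readable {
  if (!response.body) return Readable.from([]); // empty body
  // Cast bridges the DOM vs node:stream/web ReadableStream typings.
  return Readable.fromWeb(response.body as unknown as WebReadableStream);
}
```

Without this bridge the Web stream is never consumed, which is exactly the "requests hang indefinitely" symptom described above.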
ed7ac0c573 fix: downgrade nanoid to v3 for CommonJS compatibility
All checks were successful
Build and Push Backend Image / build (push) Successful in 2m28s
Build and Push Frontend Image / build (push) Successful in 5m12s
nanoid v5 is ESM-only and cannot be require()'d in a CommonJS module.
v3 is the last version with native CJS support.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:43:44 +01:00
4565038be3 fix: cast recording duration float to integer in data migration
Some checks failed
Build and Push Backend Image / build (push) Successful in 39s
Build and Push Frontend Image / build (push) Has been cancelled
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:39:52 +01:00
fbafbeca5d fix: pass tags as native arrays not JSON strings in data migration
Some checks failed
Build and Push Backend Image / build (push) Successful in 38s
Build and Push Frontend Image / build (push) Has been cancelled
PostgreSQL text[] columns require native array values, not JSON strings.
Parse string tags from Directus and pass as JS arrays directly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:36:04 +01:00
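The tag-conversion step can be sketched as below; the driver then serializes the JS array into a PostgreSQL array literal for the `text[]` column, whereas passing the raw JSON string fails. Input shapes are assumptions about what Directus exported:

```typescript
// Directus stored tags as a JSON string like '["solo","outdoor"]'.
// PostgreSQL text[] columns need a native JS array, not that string.
function toNativeTags(raw: string | string[] | null): string[] {
  if (raw == null) return [];
  if (Array.isArray(raw)) return raw; // already native, pass through
  try {
    const parsed = JSON.parse(raw);
    return Array.isArray(parsed) ? parsed.map(String) : [];
  } catch {
    return []; // malformed source data: fall back to empty tags
  }
}
```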
480369aa4e fix: correct column names in data migration script to match actual Directus schema
Some checks failed
Build and Push Backend Image / build (push) Successful in 37s
Build and Push Frontend Image / build (push) Has been cancelled
- directus_files: uploaded_on → date_created alias
- directus_users: join_date → date_created, remove email_notifications_key
- junction_directus_users_files: remove non-existent sort column
- sexy_videos: remove non-existent likes_count/plays_count (default 0)
- sexy_recordings: remove non-existent featured column (schema has default false)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:33:33 +01:00
ceb57ec1c4 fix: copy root node_modules to runner stage in backend Dockerfile
All checks were successful
Build and Push Backend Image / build (push) Successful in 30s
Build and Push Frontend Image / build (push) Successful in 15s
pnpm hoists workspace dependencies to the root node_modules.
Without copying it, modules like pg are not found at runtime.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:29:36 +01:00
4f8271217c fix: set CI=true for pnpm install in backend Dockerfile
All checks were successful
Build and Push Backend Image / build (push) Successful in 1m4s
Build and Push Frontend Image / build (push) Successful in 15s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:26:25 +01:00
046689e363 fix: set CI=true for pnpm install -rP in frontend Dockerfile
Some checks failed
Build and Push Backend Image / build (push) Failing after 28s
Build and Push Frontend Image / build (push) Successful in 4m50s
pnpm requires CI=true to allow non-interactive removal of node_modules
in CI environments without a TTY.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:19:45 +01:00
9ba71239b7 fix: use correct 'file' parameter in docker/build-push-action (not 'dockerfile')
Some checks failed
Build and Push Frontend Image / build (push) Failing after 4m12s
Build and Push Backend Image / build (push) Failing after 25s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:09:58 +01:00
757bbe9e3b fix: skip buttplug Rust build in backend Dockerfile via --ignore-scripts
Some checks failed
Build and Push Backend Image / build (push) Has been cancelled
Build and Push Frontend Image / build (push) Has been cancelled
Copy all workspace package.json files so pnpm can resolve the lockfile,
but install with --ignore-scripts to prevent buttplug's Rust/WASM build
from running. Only explicitly rebuild argon2 native bindings.
Also restore the missing migrations COPY line.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:08:48 +01:00
73f7a4f2f0 ci: replace combined workflow with separate frontend and backend workflow files
Some checks failed
Build and Push Frontend Image / build (push) Has been cancelled
Build and Push Backend Image / build (push) Has been cancelled
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:05:54 +01:00
3bd8d95576 ci: disable registry cache for backend build to fix poisoned buildcache
All checks were successful
Build and Push Docker Image to Gitea / build-frontend (push) Successful in 15s
Build and Push Docker Image to Gitea / build-backend (push) Successful in 4m44s
The backend buildcache was contaminated with frontend image layers, causing
the backend image to be built with the wrong content. Using no-cache forces
a fresh build until the cache can be reliably separated.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:57:39 +01:00
14e816241d ci: split frontend and backend into separate jobs to fix image tag mix-up
All checks were successful
Build and Push Docker Image to Gitea / build-frontend (push) Successful in 17s
Build and Push Docker Image to Gitea / build-backend (push) Successful in 16s
Both builds in the same job shared the same docker buildx instance,
causing the backend tag to point at the frontend image.
Separate jobs get isolated buildx instances and separate build caches.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:53:16 +01:00
4102f9990c fix: switch backend to CommonJS, generate Drizzle migrations, add migrate script
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 4m23s
- Remove "type": "module" and switch tsconfig to CommonJS/Node resolution
  to fix drizzle-kit ESM/CJS incompatibility
- Strip .js extensions from all backend TypeScript imports
- Fix gamification resolver: combine two .where() calls using and()
- Fix index.ts: wrap top-level awaits in async main(), fix Fastify+yoga
  request handling via handleNodeRequestAndResponse
- Generate initial Drizzle SQL migration (0000_pale_hellion.sql) for all
  15 tables
- Add src/scripts/migrate.ts: programmatic Drizzle migrator for production
- Copy migrations folder into Docker image (Dockerfile.backend)
- Add schema:migrate npm script

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:42:58 +01:00
2565e6c28b fix: resolve pnpm frozen-lockfile error and argon2 native build
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m19s
- Run pnpm install to update lockfile with packages/backend dependencies
- Add argon2 to root onlyBuiltDependencies (pnpm-workspace.yaml + package.json)
- Add explicit `pnpm rebuild argon2` in Dockerfile.backend to ensure native
  bindings compile regardless of pnpm v10 build approval state
- Remove pnpm.onlyBuiltDependencies from packages/backend/package.json
  (ineffective in workspace packages, warned by pnpm)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:19:52 +01:00
330 changed files with 26716 additions and 20352 deletions


@@ -0,0 +1,72 @@
name: Build and Push Backend Image
on:
  push:
    branches:
      - main
      - develop
    tags:
      - "v*.*.*"
    paths:
      - "packages/backend/**"
      - "packages/types/**"
      - "packages/email/**"
      - "Dockerfile.backend"
  pull_request:
    branches:
      - main
    paths:
      - "packages/backend/**"
      - "packages/types/**"
      - "packages/email/**"
      - "Dockerfile.backend"
  workflow_dispatch:
env:
  REGISTRY: dev.pivoine.art
  IMAGE_NAME: valknar/sexy-backend
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        with:
          platforms: linux/amd64
      - name: Log in to Gitea Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ gitea.actor }}
          password: ${{ secrets.REGISTRY_TOKEN }}
      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          labels: |
            org.opencontainers.image.source=${{ gitea.server_url }}/${{ gitea.repository }}
          tags: |
            type=raw,value=latest,enable={{is_default_branch}}
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=sha,prefix={{branch}}-
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: Dockerfile.backend
          platforms: linux/amd64
          push: ${{ gitea.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache
          cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache,mode=max


@@ -0,0 +1,70 @@
name: Build and Push Buttplug Image
on:
  push:
    branches:
      - main
      - develop
    tags:
      - "v*.*.*"
    paths:
      - "packages/buttplug/**"
      - "Dockerfile.buttplug"
      - "nginx.buttplug.conf"
  pull_request:
    branches:
      - main
    paths:
      - "packages/buttplug/**"
      - "Dockerfile.buttplug"
      - "nginx.buttplug.conf"
  workflow_dispatch:
env:
  REGISTRY: dev.pivoine.art
  IMAGE_NAME: valknar/sexy-buttplug
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        with:
          platforms: linux/amd64
      - name: Log in to Gitea Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ gitea.actor }}
          password: ${{ secrets.REGISTRY_TOKEN }}
      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          labels: |
            org.opencontainers.image.source=${{ gitea.server_url }}/${{ gitea.repository }}
          tags: |
            type=raw,value=latest,enable={{is_default_branch}}
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=sha,prefix={{branch}}-
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: Dockerfile.buttplug
          platforms: linux/amd64
          push: ${{ gitea.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache
          cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache,mode=max


@@ -0,0 +1,70 @@
name: Build and Push Frontend Image
on:
  push:
    branches:
      - main
      - develop
    tags:
      - "v*.*.*"
    paths:
      - "packages/frontend/**"
      - "packages/types/**"
      - "Dockerfile"
  pull_request:
    branches:
      - main
    paths:
      - "packages/frontend/**"
      - "packages/types/**"
      - "Dockerfile"
  workflow_dispatch:
env:
  REGISTRY: dev.pivoine.art
  IMAGE_NAME: valknar/sexy
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        with:
          platforms: linux/amd64
      - name: Log in to Gitea Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ gitea.actor }}
          password: ${{ secrets.REGISTRY_TOKEN }}
      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          labels: |
            org.opencontainers.image.source=${{ gitea.server_url }}/${{ gitea.repository }}
          tags: |
            type=raw,value=latest,enable={{is_default_branch}}
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=sha,prefix={{branch}}-
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: Dockerfile
          platforms: linux/amd64
          push: ${{ gitea.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache
          cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache,mode=max


@@ -1,159 +0,0 @@
name: Build and Push Docker Image to Gitea
on:
  push:
    branches:
      - main
      - develop
    tags:
      - 'v*.*.*'
  pull_request:
    branches:
      - main
  workflow_dispatch:
    inputs:
      tag:
        description: 'Custom tag for the image'
        required: false
        default: 'manual'
env:
  REGISTRY: dev.pivoine.art
  IMAGE_NAME: valknar/sexy
  BACKEND_IMAGE_NAME: valknar/sexy-backend
jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        with:
          platforms: linux/amd64
      - name: Log in to Gitea Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ gitea.actor }}
          password: ${{ secrets.REGISTRY_TOKEN }}
      - name: Extract metadata (tags, labels)
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            # Tag as 'latest' for main branch
            type=raw,value=latest,enable={{is_default_branch}}
            # Tag with branch name
            type=ref,event=branch
            # Tag with PR number
            type=ref,event=pr
            # Tag with git tag (semver)
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=semver,pattern={{major}}
            # Tag with commit SHA
            type=sha,prefix={{branch}}-
            # Custom tag from workflow_dispatch
            type=raw,value=${{ gitea.event.inputs.tag }},enable=${{ gitea.event_name == 'workflow_dispatch' }}
          labels: |
            org.opencontainers.image.title=sexy.pivoine.art
            org.opencontainers.image.description=Adult content platform with SvelteKit, Directus, and hardware integration
            org.opencontainers.image.vendor=valknar
            org.opencontainers.image.source=https://dev.pivoine.art/${{ gitea.repository }}
      - name: Build and push frontend Docker image
        uses: docker/build-push-action@v5
        with:
          context: .
          dockerfile: Dockerfile
          platforms: linux/amd64
          push: ${{ gitea.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache
          cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache,mode=max
          build-args: |
            NODE_ENV=production
            CI=true
      - name: Extract metadata for backend image
        id: meta-backend
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.BACKEND_IMAGE_NAME }}
          tags: |
            type=raw,value=latest,enable={{is_default_branch}}
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=semver,pattern={{major}}
            type=sha,prefix={{branch}}-
            type=raw,value=${{ gitea.event.inputs.tag }},enable=${{ gitea.event_name == 'workflow_dispatch' }}
          labels: |
            org.opencontainers.image.title=sexy.pivoine.art backend
            org.opencontainers.image.description=GraphQL backend for sexy.pivoine.art (Fastify + Drizzle + Pothos)
            org.opencontainers.image.vendor=valknar
            org.opencontainers.image.source=https://dev.pivoine.art/${{ gitea.repository }}
      - name: Build and push backend Docker image
        uses: docker/build-push-action@v5
        with:
          context: .
          dockerfile: Dockerfile.backend
          platforms: linux/amd64
          push: ${{ gitea.event_name != 'pull_request' }}
          tags: ${{ steps.meta-backend.outputs.tags }}
          labels: ${{ steps.meta-backend.outputs.labels }}
          cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.BACKEND_IMAGE_NAME }}:buildcache
          cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.BACKEND_IMAGE_NAME }}:buildcache,mode=max
          build-args: |
            NODE_ENV=production
            CI=true
      - name: Generate image digest
        if: gitea.event_name != 'pull_request'
        run: |
          echo "### Docker Images Published :rocket:" >> $GITEA_STEP_SUMMARY
          echo "" >> $GITEA_STEP_SUMMARY
          echo "**Registry:** \`${{ env.REGISTRY }}\`" >> $GITEA_STEP_SUMMARY
          echo "" >> $GITEA_STEP_SUMMARY
          echo "**Frontend (\`${{ env.IMAGE_NAME }}\`):**" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY
          echo "${{ steps.meta.outputs.tags }}" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY
          echo "" >> $GITEA_STEP_SUMMARY
          echo "**Backend (\`${{ env.BACKEND_IMAGE_NAME }}\`):**" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY
          echo "${{ steps.meta-backend.outputs.tags }}" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY
          echo "" >> $GITEA_STEP_SUMMARY
          echo "**Pull commands:**" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`bash" >> $GITEA_STEP_SUMMARY
          echo "docker pull ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:latest" >> $GITEA_STEP_SUMMARY
          echo "docker pull ${{ env.REGISTRY }}/${{ env.BACKEND_IMAGE_NAME }}:latest" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY
      - name: PR Comment - Images built but not pushed
        if: gitea.event_name == 'pull_request'
        run: |
          echo "### Docker Images Built Successfully :white_check_mark:" >> $GITEA_STEP_SUMMARY
          echo "" >> $GITEA_STEP_SUMMARY
          echo "Images were built successfully but **not pushed** (PR builds are not published)." >> $GITEA_STEP_SUMMARY
          echo "" >> $GITEA_STEP_SUMMARY
          echo "**Frontend would be tagged as:**" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY
          echo "${{ steps.meta.outputs.tags }}" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY
          echo "" >> $GITEA_STEP_SUMMARY
          echo "**Backend would be tagged as:**" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY
          echo "${{ steps.meta-backend.outputs.tags }}" >> $GITEA_STEP_SUMMARY
          echo "\`\`\`" >> $GITEA_STEP_SUMMARY

.gitignore

@@ -4,3 +4,4 @@ target/
pkg/
.claude/
.data/

.prettierignore

@@ -0,0 +1,6 @@
build/
.svelte-kit/
dist/
node_modules/
migrations/
pnpm-lock.yaml

CLAUDE.md

@@ -2,176 +2,93 @@
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
This is a monorepo for an adult content platform built with SvelteKit, Directus CMS, and hardware integration via Buttplug.io. The project uses pnpm workspaces with three main packages.
## Overview
`sexy.pivoine.art` is a self-hosted adult content platform (18+) built as a pnpm monorepo with three packages: `frontend` (SvelteKit 5), `backend` (Fastify + GraphQL), and `buttplug` (hardware integration via WebBluetooth/WASM).
## Prerequisites
1. Install Node.js 20.19.1
2. Enable corepack: `corepack enable`
3. Install dependencies: `pnpm install`
4. Install Rust toolchain and wasm-bindgen: `cargo install wasm-bindgen-cli`
## Project Structure
### Packages
- **`packages/frontend`**: SvelteKit application (main frontend)
- **`packages/bundle`**: Directus extension bundle (custom endpoints, hooks, themes)
- **`packages/buttplug`**: Hardware control library with TypeScript/WebAssembly bindings
### Frontend (SvelteKit + Tailwind CSS 4)
- **Framework**: SvelteKit 2 with adapter-node
- **Styling**: Tailwind CSS v4 via @tailwindcss/vite
- **UI Components**: bits-ui, custom components in `src/lib/components/ui/`
- **Backend**: Directus headless CMS
- **Routes**: File-based routing in `src/routes/`
- `+page.server.ts`: Server-side data loading
- `+layout.server.ts`: Layout data (authentication, etc.)
- **Authentication**: Session-based via Directus SDK (cookies)
- **API Proxy**: Dev server proxies `/api` to `http://localhost:8055` (Directus)
- **i18n**: svelte-i18n for internationalization
Key files:
- `src/lib/directus.ts`: Directus client configuration
- `src/lib/types.ts`: Shared TypeScript types
- `src/hooks.server.ts`: Server-side auth middleware
- `vite.config.ts`: Dev server on port 3000 with API proxy
### Bundle (Directus Extensions)
Custom Directus extensions providing:
- **Endpoint** (`src/endpoint/index.ts`): `/sexy/stats` endpoint for platform statistics
- **Hook** (`src/hook/index.ts`):
- Auto-generates slugs for users based on artist_name
- Processes uploaded videos with ffmpeg to extract duration
- **Theme** (`src/theme/index.ts`): Custom Directus admin theme
### Buttplug (Hardware Control)
Hybrid TypeScript/Rust package for intimate hardware control:
- **TypeScript**: Client library, connectors (WebSocket, Browser WebSocket)
- **Rust/WASM**: Core buttplug implementation compiled to WebAssembly
- Provides browser-based Bluetooth device control via WebBluetooth API
Key concepts:
- `ButtplugClient`: Main client interface
- `ButtplugClientDevice`: Device abstraction
- `ButtplugWasmClientConnector`: WASM-based connector
- Messages defined in `src/core/Messages.ts`
## Common Commands
### Development
Run from the repo root unless otherwise noted.
Start full development environment (data + Directus + frontend):
```bash
pnpm dev
# Development
pnpm dev:data # Start postgres & redis via Docker
pnpm dev:backend # Start backend on http://localhost:4000
pnpm dev # Start backend + frontend (frontend on :3000)
# Linting & Formatting
pnpm lint # ESLint across all packages
pnpm lint:fix # Auto-fix ESLint issues
pnpm format # Prettier format all files
pnpm format:check # Check formatting without changes
# Build
pnpm build:frontend # SvelteKit production build
pnpm build:backend # Compile backend TypeScript to dist/
# Database migrations (from packages/backend/)
pnpm migrate # Run pending Drizzle migrations
```
Individual services:
## Architecture
### Monorepo Layout
```
packages/
frontend/ # SvelteKit 2 + Svelte 5 + Tailwind CSS 4
backend/ # Fastify v5 + GraphQL Yoga v5 + Drizzle ORM
buttplug/ # TypeScript/Rust hybrid, compiles to WASM
```
### Backend (`packages/backend/src/`)
- **`index.ts`** — Fastify server entry: registers plugins (CORS, multipart, static), mounts GraphQL at `/graphql`, serves transformed assets at `/assets/:id`
- **`graphql/builder.ts`** — Pothos schema builder (code-first GraphQL)
- **`graphql/context.ts`** — Injects `currentUser` from Redis session into every request
- **`lib/auth.ts`** — Session management: `nanoid(32)` token stored in Redis with 24h TTL, set as httpOnly cookie
- **`db/schema/`** — Drizzle ORM table definitions (users, videos, files, comments, gamification, etc.)
- **`migrations/`** — SQL migration files managed by Drizzle Kit
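The session scheme described for `lib/auth.ts` (random 32-character token stored in Redis with a 24h TTL, set as an httpOnly cookie) can be sketched roughly as follows. This is an illustration only: it uses `node:crypto` with nanoid's URL-safe alphabet rather than the actual `nanoid(32)` call, and the `createSessionToken` name and Redis key shape are hypothetical:

```typescript
import { randomBytes } from "node:crypto";

const SESSION_TTL_SECONDS = 24 * 60 * 60; // 24h TTL, as described above

// Stand-in for nanoid(32): 32 URL-safe characters from a CSPRNG.
// The alphabet is nanoid's 64-character url-safe alphabet, so masking a
// random byte with 63 picks a character without modulo bias.
function createSessionToken(): string {
  const alphabet =
    "useandom-26T198340PX75pxJACKVERYMINDBUSHWOLF_GQZbfghjklqvwyzrict";
  const bytes = randomBytes(32);
  let token = "";
  for (const b of bytes) token += alphabet[b & 63];
  return token;
}

// In the real resolver the token would then be persisted and set as a cookie,
// e.g. (sketch): redis.set(`session:${token}`, userId, "EX", SESSION_TTL_SECONDS)
// and reply.setCookie("session", token, { httpOnly: true }).
```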
### Frontend (`packages/frontend/src/`)
- **`lib/api.ts`** — GraphQL client (graphql-request)
- **`lib/services.ts`** — All API calls (login, videos, comments, models, etc.)
- **`lib/types.ts`** — Shared TypeScript types
- **`hooks.server.ts`** — Auth guard: reads session cookie, fetches `me` query, redirects if needed
- **`routes/`** — SvelteKit file-based routing: `/`, `/login`, `/signup`, `/me`, `/models`, `/models/[slug]`, `/videos`, `/play/[slug]`, `/magazine`, `/leaderboard`
### Asset Pipeline
Backend serves images with server-side Sharp transforms, cached to disk as WebP. Presets: `mini` (80×80), `thumbnail` (300×300), `preview` (800px wide), `medium` (1400px wide), `banner` (1600×480 cropped).
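Expressed as data, the presets above map to fixed transform parameters. A sketch; the `PRESETS` table and `transformFor` helper are hypothetical names, and the `fit` values are assumptions about how Sharp would be invoked for fixed-aspect versus width-only presets:

```typescript
type Preset = "mini" | "thumbnail" | "preview" | "medium" | "banner";

interface TransformSpec {
  width: number;
  height?: number; // only the fixed-aspect presets constrain height
  fit: "cover" | "inside";
}

const PRESETS: Record<Preset, TransformSpec> = {
  mini: { width: 80, height: 80, fit: "cover" },
  thumbnail: { width: 300, height: 300, fit: "cover" },
  preview: { width: 800, fit: "inside" }, // 800px wide
  medium: { width: 1400, fit: "inside" }, // 1400px wide
  banner: { width: 1600, height: 480, fit: "cover" }, // cropped
};

function transformFor(preset: Preset): TransformSpec {
  return PRESETS[preset];
}
```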
### Gamification
Points + achievements system tracked in `user_points` and `user_stats` tables. Logic in `packages/backend/src/lib/gamification.ts` and the `gamification` resolver.
## Code Style
- **TypeScript strict mode** in all packages
- **ESLint flat config** (`eslint.config.js` at root) — `any` is allowed but discouraged; enforces consistent type imports
- **Prettier**: 2-space indent, trailing commas, 100-char line width, Svelte plugin
- Migrations folder (`packages/backend/src/migrations/`) is excluded from lint
## Environment Variables (Backend)
| Variable | Purpose |
| --------------------------- | ---------------------------- |
| `DATABASE_URL` | PostgreSQL connection string |
| `REDIS_URL` | Redis connection string |
| `COOKIE_SECRET` | Session cookie signing |
| `CORS_ORIGIN` | Frontend origin URL |
| `UPLOAD_DIR` | File storage path |
| `SMTP_HOST/PORT/EMAIL_FROM` | Email (Nodemailer) |
## Docker
```bash
pnpm dev:data # Start Docker Compose data services
docker compose up -d # Start all services (postgres, redis, backend, frontend)
pnpm dev:directus # Start Directus in Docker
arty up -d <service> # Preferred way to manage containers in this project
pnpm --filter @sexy.pivoine.art/frontend dev # Frontend dev server only
```
### Building
Production images are built and pushed to `dev.pivoine.art` via Gitea Actions on push to `main`.
Build all packages:
```bash
pnpm install # Ensure dependencies are installed first
```
Build specific packages:
```bash
pnpm build:frontend # Pulls git, installs, builds frontend
pnpm build:bundle # Pulls git, installs, builds Directus extensions
```
Individual package builds:
```bash
pnpm --filter @sexy.pivoine.art/frontend build
pnpm --filter @sexy.pivoine.art/bundle build
pnpm --filter @sexy.pivoine.art/buttplug build # TypeScript build
pnpm --filter @sexy.pivoine.art/buttplug build:wasm # Rust WASM build
```
### Production
Start production frontend server (local):
```bash
pnpm --filter @sexy.pivoine.art/frontend start
```
Docker Compose deployment (recommended for production):
```bash
# Local development (with Postgres, Redis, Directus)
docker-compose up -d
# Production (with Traefik, external DB, Redis)
docker-compose -f compose.production.yml --env-file .env.production up -d
```
See `COMPOSE.md` for Docker Compose guide and `DOCKER.md` for standalone Docker deployment.
## Architecture Notes
### Data Flow
1. **Frontend** → `/api/*` (proxied) → **Directus CMS**
2. Directus uses **bundle extensions** for custom logic (stats, video processing, user management)
3. Frontend uses **Directus SDK** with session authentication
4. Hardware control uses **buttplug package** (TypeScript → WASM → Bluetooth)
### Authentication
- Session tokens stored in `directus_session_token` cookie
- `hooks.server.ts` validates token on every request via `isAuthenticated()`
- User roles: Model, Viewer (checked via role or policy)
- `isModel()` helper in `src/lib/directus.ts` checks user permissions
### Content Types
Core types in `packages/frontend/src/lib/types.ts`:
- **User/CurrentUser**: User profiles with roles and policies
- **Video**: Videos with models, tags, premium flag
- **Model**: Creator profiles with photos and banner
- **Article**: Magazine/blog content
- **BluetoothDevice**: Hardware device state
### Docker Environment
Development uses Docker Compose in `../compose/` directory:
- `../compose/data`: Database/storage services
- `../compose/sexy`: Directus instance (uses `.env.local`)
### Asset URLs
Assets served via Directus with transforms:
```typescript
getAssetUrl(id, "thumbnail" | "preview" | "medium" | "banner")
// Returns: ${directusApiUrl}/assets/${id}?transform=...
```
## Development Workflow
1. Ensure Docker services are running: `pnpm dev:data && pnpm dev:directus`
2. Start frontend dev server: `pnpm --filter @sexy.pivoine.art/frontend dev`
3. Access frontend at `http://localhost:3000`
4. Access Directus admin at `http://localhost:8055`
When modifying:
- **Frontend code**: Hot reload via Vite
- **Bundle extensions**: Rebuild with `pnpm --filter @sexy.pivoine.art/bundle build` and restart Directus
- **Buttplug library**: Rebuild TypeScript (`pnpm build`) and/or WASM (`pnpm build:wasm`)
## Important Notes
- This is a pnpm workspace; always use `pnpm` not `npm` or `yarn`
- Package manager is locked to `pnpm@10.17.0`
- Buttplug package requires Rust toolchain for WASM builds
- Frontend uses SvelteKit's adapter-node for production deployment
- All TypeScript packages use ES modules (`"type": "module"`)


@@ -3,7 +3,7 @@
# ============================================================================
# Base stage - shared dependencies
# ============================================================================
FROM node:22.11.0-slim AS base
FROM node:22.14.0-slim AS base
# Enable corepack for pnpm
RUN npm install -g corepack@latest && corepack enable
@@ -20,57 +20,31 @@ RUN mkdir -p ./packages/frontend && \
printf 'PUBLIC_API_URL=\nPUBLIC_URL=\nPUBLIC_UMAMI_ID=\nPUBLIC_UMAMI_SCRIPT=\n' > ./packages/frontend/.env
# ============================================================================
# Builder stage - compile application with Rust/WASM support
# Builder stage - compile frontend
# ============================================================================
FROM base AS builder
ARG CI=false
ENV CI=$CI
# Install build dependencies for Rust and native modules
RUN apt-get update && apt-get install -y \
curl \
build-essential \
pkg-config \
libssl-dev \
ca-certificates \
&& rm -rf /var/lib/apt/lists/*
# Install Rust toolchain
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
--default-toolchain stable \
--profile minimal \
--target wasm32-unknown-unknown
# Add Rust to PATH
ENV PATH="/root/.cargo/bin:${PATH}"
# Install wasm-bindgen-cli
RUN cargo install wasm-bindgen-cli
# Copy source files
COPY packages ./packages
# Install all dependencies
RUN pnpm install --frozen-lockfile
# Build packages in correct order with WASM support
# Generate SvelteKit type definitions (creates .svelte-kit/tsconfig.json)
# 1. Build buttplug WASM
RUN pnpm --filter @sexy.pivoine.art/frontend exec svelte-kit sync
RUN RUSTFLAGS='--cfg getrandom_backend="wasm_js" --cfg=web_sys_unstable_apis' \
pnpm --filter @sexy.pivoine.art/buttplug build:wasm
# 2. Build buttplug TypeScript
# Build frontend
RUN pnpm --filter @sexy.pivoine.art/buttplug build
# 3. Build frontend
RUN pnpm --filter @sexy.pivoine.art/frontend build
# Prune dev dependencies for production
RUN pnpm install -rP
RUN CI=true pnpm install -rP
# ============================================================================
# Runner stage - minimal production image
# ============================================================================
FROM node:22.11.0-slim AS runner
FROM node:22.14.0-slim AS runner
# Install dumb-init for proper signal handling
RUN apt-get update && apt-get install -y \
@@ -91,19 +65,14 @@ COPY --from=builder --chown=node:node /app/package.json ./package.json
COPY --from=builder --chown=node:node /app/pnpm-lock.yaml ./pnpm-lock.yaml
COPY --from=builder --chown=node:node /app/pnpm-workspace.yaml ./pnpm-workspace.yaml
# Create package directories
# Create package directory
RUN mkdir -p packages/frontend packages/buttplug
RUN mkdir -p packages/frontend
# Copy frontend artifacts
COPY --from=builder --chown=node:node /app/packages/frontend/build ./packages/frontend/build
COPY --from=builder --chown=node:node /app/packages/frontend/node_modules ./packages/frontend/node_modules
COPY --from=builder --chown=node:node /app/packages/frontend/package.json ./packages/frontend/package.json
# Copy buttplug artifacts
COPY --from=builder --chown=node:node /app/packages/buttplug/dist ./packages/buttplug/dist
COPY --from=builder --chown=node:node /app/packages/buttplug/node_modules ./packages/buttplug/node_modules
COPY --from=builder --chown=node:node /app/packages/buttplug/package.json ./packages/buttplug/package.json
# Switch to non-root user
USER node


@@ -3,27 +3,42 @@
# ============================================================================
# Builder stage
# ============================================================================
FROM node:22.11.0-slim AS builder
FROM node:22.14.0-slim AS builder
RUN npm install -g corepack@latest && corepack enable
WORKDIR /app
# Copy all package manifests so pnpm can resolve the workspace lockfile,
# but use --ignore-scripts to skip buttplug's Rust/WASM build entirely.
COPY pnpm-workspace.yaml package.json pnpm-lock.yaml ./
COPY packages/backend/package.json ./packages/backend/package.json
COPY packages/frontend/package.json ./packages/frontend/package.json
COPY packages/buttplug/package.json ./packages/buttplug/package.json
COPY packages/types/package.json ./packages/types/package.json
COPY packages/email/package.json ./packages/email/package.json
RUN pnpm install --frozen-lockfile --filter @sexy.pivoine.art/backend
RUN pnpm install --frozen-lockfile --filter @sexy.pivoine.art/backend --filter @sexy.pivoine.art/email --ignore-scripts
# Rebuild native bindings (argon2, sharp)
RUN pnpm rebuild argon2 sharp
COPY packages/types ./packages/types
COPY packages/email ./packages/email
COPY packages/backend ./packages/backend
RUN pnpm --filter @sexy.pivoine.art/email build
RUN pnpm --filter @sexy.pivoine.art/backend build
RUN pnpm install -rP --filter @sexy.pivoine.art/backend
RUN CI=true pnpm install --frozen-lockfile --filter @sexy.pivoine.art/backend --prod --ignore-scripts
RUN pnpm rebuild argon2 sharp
# ============================================================================
# Runner stage
# ============================================================================
FROM node:22.11.0-slim AS runner
FROM node:22.14.0-slim AS runner
RUN apt-get update && apt-get install -y \
dumb-init \
@@ -37,11 +52,19 @@ RUN userdel -r node && \
WORKDIR /home/node/app
RUN mkdir -p packages/backend
RUN mkdir -p packages/backend packages/email
COPY --from=builder --chown=node:node /app/node_modules ./node_modules
COPY --from=builder --chown=node:node /app/package.json ./package.json
COPY --from=builder --chown=node:node /app/packages/backend/dist ./packages/backend/dist
COPY --from=builder --chown=node:node /app/packages/backend/node_modules ./packages/backend/node_modules
COPY --from=builder --chown=node:node /app/packages/backend/package.json ./packages/backend/package.json
COPY --from=builder --chown=node:node /app/packages/backend/src/migrations ./packages/backend/dist/migrations
COPY --from=builder --chown=node:node /app/packages/email/dist ./packages/email/dist
COPY --from=builder --chown=node:node /app/packages/email/node_modules ./packages/email/node_modules
COPY --from=builder --chown=node:node /app/packages/email/email.css ./packages/email/email.css
COPY --from=builder --chown=node:node /app/packages/email/templates ./packages/email/templates
COPY --from=builder --chown=node:node /app/packages/email/package.json ./packages/email/package.json
RUN mkdir -p /data/uploads && chown node:node /data/uploads

Dockerfile.buttplug (new file)

@@ -0,0 +1,65 @@
# syntax=docker/dockerfile:1
# ============================================================================
# Builder stage - compile Rust/WASM and TypeScript
# ============================================================================
FROM node:22.14.0-slim AS builder
# Install build dependencies for Rust
RUN apt-get update && apt-get install -y \
curl \
build-essential \
pkg-config \
libssl-dev \
ca-certificates \
&& rm -rf /var/lib/apt/lists/*
# Enable corepack for pnpm
RUN npm install -g corepack@latest && corepack enable
# Install Rust toolchain
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
--default-toolchain stable \
--profile minimal \
--target wasm32-unknown-unknown
ENV PATH="/root/.cargo/bin:${PATH}"
# Install wasm-bindgen-cli
RUN cargo install wasm-bindgen-cli
WORKDIR /app
# Copy workspace configuration
COPY pnpm-workspace.yaml package.json pnpm-lock.yaml ./
COPY packages/buttplug ./packages/buttplug
# Install dependencies
RUN pnpm install --frozen-lockfile --filter @sexy.pivoine.art/buttplug
# Build WASM
RUN RUSTFLAGS='--cfg getrandom_backend="wasm_js" --cfg=web_sys_unstable_apis' \
pnpm --filter @sexy.pivoine.art/buttplug build:wasm
# Build TypeScript
RUN pnpm --filter @sexy.pivoine.art/buttplug build
# ============================================================================
# Runner stage - nginx serving dist/ and wasm/
# ============================================================================
FROM nginx:1.27-alpine AS runner
# Remove default nginx config
RUN rm /etc/nginx/conf.d/default.conf
# Copy nginx config
COPY nginx.buttplug.conf /etc/nginx/conf.d/buttplug.conf
# Copy built artifacts
COPY --from=builder /app/packages/buttplug/dist /usr/share/nginx/html/dist
COPY --from=builder /app/packages/buttplug/wasm /usr/share/nginx/html/wasm
EXPOSE 80
HEALTHCHECK --interval=30s --timeout=3s --retries=3 \
CMD wget --no-verbose --tries=1 --spider http://localhost/dist/index.js || exit 1

README.md

@@ -4,7 +4,7 @@
![sexy lips tongue mouth american apparel moist lip gloss ](https://i.gifer.com/1pYe.gif)

_"Lust and love belong together - whoever demonizes the one destroys the other."_

**Beate Uhse**, pioneer of sexual liberation ✈️

---
@@ -13,10 +13,11 @@
Built with passion, technology, and the fearless spirit of sexual empowerment

[![Build Frontend](https://dev.pivoine.art/valknar/sexy/actions/workflows/docker-build-frontend.yml/badge.svg)](https://dev.pivoine.art/valknar/sexy/actions)
[![Build Backend](https://dev.pivoine.art/valknar/sexy/actions/workflows/docker-build-backend.yml/badge.svg)](https://dev.pivoine.art/valknar/sexy/actions)
[![Build Buttplug](https://dev.pivoine.art/valknar/sexy/actions/workflows/docker-build-buttplug.yml/badge.svg)](https://dev.pivoine.art/valknar/sexy/actions)
[![License](https://img.shields.io/badge/License-For_Pleasure-FF1493?style=for-the-badge&logo=heart&logoColor=white&labelColor=8B008B)](LICENSE)
[![Made with Love](https://img.shields.io/badge/Made_with-💜_Love-FF69B4?style=for-the-badge&labelColor=8B008B)](https://sexy.pivoine.art)

</div>
@@ -24,20 +25,24 @@ Built with passion, technology, and the fearless spirit of sexual empowerment
## 👅 What Is This Delicious Creation?

Welcome, dear pleasure-seeker! This is **sexy.pivoine.art** — a modern, sensual platform built from the ground up with full control over every intimate detail. A **SvelteKit** frontend caresses a purpose-built **Fastify + GraphQL** backend, while **Buttplug.io** hardware integration brings the experience into the physical world.

Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom to explore, create, and celebrate sexuality without shame. This platform is built for **models**, **creators**, and **connoisseurs** of adult content who deserve technology as sophisticated as their desires.

### ♉ Features That'll Make You Blush ♊

- 💖 **Sensual SvelteKit Frontend** with Tailwind CSS 4 styling
- ⚡ **Purpose-built GraphQL Backend** — lean, fast, no CMS overhead
- 🔐 **Session-based Auth** with Redis & Argon2 — discretion guaranteed
- 🖼️ **Smart Image Transforms** via Sharp (WebP, multiple presets, cached)
- 🎮 **Hardware Integration** via Buttplug.io (yes, really!)
- 📱 **Responsive Design** that looks sexy on any device
- 🌍 **Internationalization** — pleasure speaks all languages
- 🏆 **Gamification** — achievements, leaderboards, and reward points
- 💬 **Comments & Social** — build your community
- 💌 **Professional HTML Emails** — Maizzle v6 + Tailwind CSS 4 templated email (verification, password reset)
- 📊 **Analytics Integration** (Umami) — know your admirers
- 🐳 **Self-hosted CI/CD** via Gitea Actions on `dev.pivoine.art`

<div align="center">
@@ -48,25 +53,36 @@ Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom
```
┌─────────────────────────────────────────────────────────────┐
│ 💋 Frontend Layer                                           │
│ ├─ SvelteKit 2           → Smooth as silk                   │
│ ├─ Tailwind CSS 4        → Styled to seduce                 │
│ ├─ bits-ui Components    → Building blocks of pleasure      │
│ ├─ graphql-request v7    → Whispering to the backend        │
│ └─ Vite                  → Fast and furious                 │
├─────────────────────────────────────────────────────────────┤
│ 🍷 Backend Layer                                            │
│ ├─ Fastify v5            → The fastest penetration          │
│ ├─ GraphQL Yoga v5       → Flexible positions               │
│ ├─ Pothos (code-first)   → Schema with intention            │
│ ├─ Drizzle ORM           → Data with grace                  │
│ ├─ PostgreSQL 16         → Deep and persistent              │
│ ├─ Redis                 → Sessions that never forget       │
│ ├─ Sharp                 → Images transformed beautifully   │
│ └─ Argon2                → Passwords hashed with passion    │
├─────────────────────────────────────────────────────────────┤
│ 🎀 Hardware Layer                                           │
│ ├─ Buttplug.io           → Real connections                 │
│ ├─ TypeScript + Rust     → Power and precision              │
│ └─ WebBluetooth API      → Wireless intimacy                │
├─────────────────────────────────────────────────────────────┤
│ 💌 Email Layer                                              │
│ ├─ Maizzle v6            → HTML email framework             │
│ ├─ @maizzle/tailwindcss  → Email-safe Tailwind CSS 4        │
│ └─ Nodemailer            → SMTP delivery                    │
├─────────────────────────────────────────────────────────────┤
│ 🌸 DevOps Layer                                             │
│ ├─ Docker                → Containerized ecstasy            │
│ ├─ Gitea Actions         → Self-hosted seduction            │
│ └─ dev.pivoine.art       → Our own pleasure palace          │
└─────────────────────────────────────────────────────────────┘
```
@@ -74,147 +90,177 @@ Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom
## 🔥 Quick Start — Get Intimate Fast

### 💕 Option 1: Using Docker Compose (Recommended)

```bash
# Clone the repository
git clone https://dev.pivoine.art/valknar/sexy.git
cd sexy.pivoine.art

# Configure your secrets
cp .env.example .env
# Edit .env with your intimate details

# Awaken all services (postgres, redis, backend, frontend)
docker compose up -d

# Visit your creation at http://localhost:3000 💋
```

### 💜 Option 2: Local Development

**Prerequisites:**

1. Node.js 20.19.1 — _the foundation_
2. `corepack enable` — _unlock the tools_
3. `pnpm install` — _gather your ingredients_
4. PostgreSQL 16 + Redis — _the data lovers_

**Start your pleasure journey:**

```bash
# Awaken data services
pnpm dev:data

# Start the backend (port 4000)
pnpm dev:backend

# Start the frontend (port 3000, proxied to :4000)
pnpm --filter @sexy.pivoine.art/frontend dev
```

Visit `http://localhost:3000` and let the experience begin... 💋

GraphQL playground is available at `http://localhost:4000/graphql` — explore every query.
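Under the hood, every frontend call to that endpoint is just a POST with a JSON `{ query, variables }` body. A minimal sketch of what `graphql-request` issues on the wire; the `Login` mutation and its fields are illustrative assumptions, not the backend's actual schema:

```typescript
// Hypothetical mutation for illustration; field names are assumptions,
// not the real backend schema.
const LOGIN_MUTATION = /* GraphQL */ `
  mutation Login($email: String!, $password: String!) {
    login(email: $email, password: $password) {
      id
      email
    }
  }
`;

// Build the request the way graphql-request does under the hood:
// a JSON body with { query, variables }, cookies included so the
// session token travels with every call.
function buildGraphQLRequest(query: string, variables: Record<string, unknown>) {
  return {
    method: "POST",
    headers: { "content-type": "application/json" },
    credentials: "include", // send the httpOnly session cookie
    body: JSON.stringify({ query, variables }),
  };
}
```

Handing the result to `fetch("http://localhost:4000/graphql", ...)` is essentially what the client library does for you, minus error handling.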
---

## 🌹 Project Structure

This monorepo contains four packages, each serving its purpose:

```
sexy.pivoine.art/
├─ 💄 packages/frontend/ → SvelteKit app (the seduction)
├─ ⚡ packages/backend/  → Fastify + GraphQL API (the engine)
├─ 🎮 packages/buttplug/ → Hardware control (the connection)
└─ 💌 packages/email/    → Maizzle HTML email templates
```
### 💄 Frontend (`packages/frontend/`)
SvelteKit 2 application with server-side rendering, i18n, and a clean component library.
Communicates with the backend exclusively via GraphQL using `graphql-request`.
Assets served via `/api/assets/:id?transform=<preset>` — no CDN, no Directus, just raw power.
### ⚡ Backend (`packages/backend/`)
Purpose-built Fastify v5 + GraphQL Yoga server. All business logic lives here:
auth, file uploads, video processing, comments, gamification, and analytics.
Files stored as `<UPLOAD_DIR>/<uuid>/<filename>` with on-demand WebP transforms cached on disk.
### 🎮 Buttplug (`packages/buttplug/`)
Hybrid TypeScript/Rust package for intimate hardware control via WebBluetooth.
Compiled to WebAssembly for browser-based Bluetooth device communication.
### 💌 Email (`packages/email/`)
Professional HTML email templates built with **Maizzle v6** + **Tailwind CSS 4** (`@maizzle/tailwindcss`).
Design tokens mirror the frontend's `app.css` exactly — same oklch colors, Noto Sans font, semantic classes.
LightningCSS automatically converts oklch values to hex for email client compatibility.
Exported functions: `renderVerification({ token })` and `renderPasswordReset({ token })` — each returns `{ subject, html }`.
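A sketch of that contract, with a stub standing in for the real Maizzle-compiled template (the subject line and the `/verify` link path are hypothetical; only the `{ subject, html }` shape comes from the package):

```typescript
// Shape returned by renderVerification / renderPasswordReset.
interface RenderedEmail {
  subject: string;
  html: string;
}

// Stub in place of the real Maizzle-compiled template; subject text and
// the /verify path are illustrative assumptions.
function renderVerificationStub({ token }: { token: string }): RenderedEmail {
  const link = `https://sexy.pivoine.art/verify?token=${encodeURIComponent(token)}`;
  return {
    subject: "Verify your email",
    html: `<a href="${link}">Verify</a>`,
  };
}

// The backend can hand the result straight to Nodemailer's sendMail():
function toMessage(from: string, to: string, email: RenderedEmail) {
  return { from, to, subject: email.subject, html: email.html };
}
```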
---
## 🗃️ Database Schema
Built with Drizzle ORM — clean tables, no `directus_` prefix, full control:
```
users → profiles, roles (model/viewer/admin), auth tokens
files → uploaded assets with metadata and duration
videos → content with model junctions, likes, plays
articles → magazine / editorial content
recordings → user-created content with play tracking
comments → threaded by collection + item_id
achievements → gamification goals
user_points → points ledger
user_stats → cached leaderboard data
```

---
## 🔐 Authentication Flow

```
POST /graphql (login mutation)
→ verify argon2 password hash
→ nanoid(32) session token
→ SET session:<token> <user JSON> EX 86400 in Redis
→ set httpOnly cookie: session_token
→ return CurrentUser

Every request:
→ read session_token cookie
→ GET session:<token> from Redis
→ inject currentUser into GraphQL context
```
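That flow can be sketched in a few lines; here an in-memory Map stands in for Redis and a stdlib token for nanoid, so the names are illustrative rather than the actual backend code:

```typescript
import { randomBytes } from "node:crypto";

const TTL_SECONDS = 86400;
// In-memory stand-in for Redis SET session:<token> <user JSON> EX 86400.
const sessions = new Map<string, { user: string; expiresAt: number }>();

// The real backend uses nanoid(32); a base64url token trimmed to the
// same length has comparable entropy.
function createSession(userJson: string): string {
  const token = randomBytes(32).toString("base64url").slice(0, 32);
  sessions.set(`session:${token}`, {
    user: userJson,
    expiresAt: Date.now() + TTL_SECONDS * 1000,
  });
  return token; // set as the httpOnly session_token cookie
}

// Per request: read the cookie, look the session up, and inject the
// result into the GraphQL context as currentUser (or null).
function resolveCurrentUser(token: string | undefined): string | null {
  if (!token) return null;
  const entry = sessions.get(`session:${token}`);
  if (!entry || entry.expiresAt < Date.now()) return null;
  return entry.user;
}
```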
---

## 🖼️ Image Transforms
Assets are transformed on first request and cached as WebP:
| Preset | Size | Fit | Use |
| ----------- | ----------- | ------ | ---------------- |
| `mini` | 80×80 | cover | Avatars in lists |
| `thumbnail` | 300×300 | cover | Profile photos |
| `preview` | 800px wide | inside | Video teasers |
| `medium` | 1400px wide | inside | Full-size images |
| `banner` | 1600×480 | cover | Profile banners |
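Expressed as data, the preset table might look like this in the backend (a sketch: the option names mirror Sharp's `resize({ width, height, fit })` API, and the cache-file layout is an assumption, not the documented path scheme):

```typescript
type Fit = "cover" | "inside";

interface Preset {
  width: number;
  height?: number; // width-only presets constrain just one dimension
  fit: Fit;
}

// The preset table above, as the backend might consult it before
// calling sharp(input).resize(preset).webp().
const PRESETS: Record<string, Preset> = {
  mini: { width: 80, height: 80, fit: "cover" },
  thumbnail: { width: 300, height: 300, fit: "cover" },
  preview: { width: 800, fit: "inside" },
  medium: { width: 1400, fit: "inside" },
  banner: { width: 1600, height: 480, fit: "cover" },
};

// Hypothetical cache layout: transform once on first request, then
// serve the cached WebP from disk on every later hit.
function cachePath(uploadDir: string, fileId: string, preset: string): string {
  return `${uploadDir}/${fileId}/.cache/${preset}.webp`;
}
```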
---
## 🚀 Deployment
### Production with Docker Compose

```bash
# Configure your secrets
cp .env.example .env.production
# Edit .env.production — set DB credentials, SMTP, cookie secret, CORS origin

# Deploy
docker compose --env-file .env.production up -d
```

Key environment variables for the backend:

```env
DATABASE_URL=postgresql://sexy:sexy@postgres:5432/sexy
REDIS_URL=redis://redis:6379
COOKIE_SECRET=your-very-secret-key
CORS_ORIGIN=https://sexy.pivoine.art
UPLOAD_DIR=/data/uploads
SMTP_HOST=your.smtp.host
SMTP_PORT=587
EMAIL_FROM=noreply@sexy.pivoine.art
PUBLIC_URL=https://sexy.pivoine.art
```
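A backend like this should fail fast at startup when one of these is missing. A hedged sketch of such a check (the helper is illustrative, not the actual backend code; the variable names come from the example above):

```typescript
// Illustrative startup validation, not the real backend implementation.
function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env,
): string {
  const value = env[name];
  if (!value) throw new Error(`Missing required environment variable: ${name}`);
  return value;
}

const REQUIRED = [
  "DATABASE_URL",
  "REDIS_URL",
  "COOKIE_SECRET",
  "CORS_ORIGIN",
  "UPLOAD_DIR",
] as const;

// Collect all required settings up front so a misconfigured container
// dies immediately instead of failing on the first request.
function loadConfig(env: Record<string, string | undefined> = process.env) {
  return Object.fromEntries(REQUIRED.map((name) => [name, requireEnv(name, env)]));
}
```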
### 🎬 CI/CD — Self-Hosted Seduction

Automated builds run on **[dev.pivoine.art](https://dev.pivoine.art/valknar/sexy)** via Gitea Actions:

- ✅ Frontend image → `dev.pivoine.art/valknar/sexy:latest`
- ✅ Backend image → `dev.pivoine.art/valknar/sexy-backend:latest`
- ✅ Buttplug image → `dev.pivoine.art/valknar/sexy-buttplug:latest`
- ✅ Triggers on push to `main`, `develop`, or version tags (`v*.*.*`)
- ✅ Build cache via registry for fast successive builds

Images are pulled on the production server via Watchtower or a manual `docker compose pull && docker compose up -d`.

---
@@ -225,60 +271,54 @@ graph LR
A[💡 Idea] --> B[💻 Code]
B --> C[🧪 Test Locally]
C --> D[🌿 Feature Branch]
D --> E[📤 Push to dev.pivoine.art]
E --> F{✅ Build Pass?}
F -->|Yes| G[🔀 Merge to Main]
F -->|No| B
G --> H[🚀 Images Built & Pushed]
H --> I[🎉 Deploy to Production]
```

1. Create → `git checkout -b feature/my-sexy-feature`
2. Develop → Write beautiful code
3. Test → `pnpm dev:data && pnpm dev:backend && pnpm dev`
4. Push → `git push` to `dev.pivoine.art` (triggers CI build)
5. Merge → Images published, deploy to production
6. Release → `git tag v1.0.0 && git push origin v1.0.0`
---

## 🌈 Environment Variables

### Backend (required)

| Variable        | Description                   |
| --------------- | ----------------------------- |
| `DATABASE_URL`  | PostgreSQL connection string  |
| `REDIS_URL`     | Redis connection string       |
| `COOKIE_SECRET` | Session cookie signing secret |
| `CORS_ORIGIN`   | Allowed frontend origin       |
| `UPLOAD_DIR`    | Path for uploaded files       |

### Backend (optional)

| Variable     | Default | Description                    |
| ------------ | ------- | ------------------------------ |
| `PORT`       | `4000`  | Backend listen port            |
| `LOG_LEVEL`  | `info`  | Fastify log level              |
| `SMTP_HOST`  | —       | Email server for auth flows    |
| `SMTP_PORT`  | `587`   | Email server port              |
| `EMAIL_FROM` | —       | Sender address                 |
| `PUBLIC_URL` | —       | Frontend URL (for email links) |

### Frontend

| Variable              | Description                                   |
| --------------------- | --------------------------------------------- |
| `PUBLIC_API_URL`      | Backend URL (e.g. `http://sexy_backend:4000`) |
| `PUBLIC_URL`          | Frontend public URL                           |
| `PUBLIC_UMAMI_ID`     | Umami analytics site ID (optional)            |
| `PUBLIC_UMAMI_SCRIPT` | Umami script URL (optional)                   |

---
@@ -288,20 +328,27 @@ Our GitHub Actions workflows handle:
### 🌸 Created with Love by 🌸 ### 🌸 Created with Love by 🌸
**[Palina](http://sexy.pivoine.art) & [Valknar](http://sexy.pivoine.art)** **[Palina](https://sexy.pivoine.art) & [Valknar](https://sexy.pivoine.art)**
*Für die Mäuse...* 🐭💕 _Für die Mäuse..._ 🐭💕
---

### 🙏 Built With

| Technology                                                | Purpose              |
| --------------------------------------------------------- | -------------------- |
| [SvelteKit](https://kit.svelte.dev/)                      | Frontend framework   |
| [Fastify](https://fastify.dev/)                           | HTTP server          |
| [GraphQL Yoga](https://the-guild.dev/graphql/yoga-server) | GraphQL server       |
| [Pothos](https://pothos-graphql.dev/)                     | Code-first schema    |
| [Drizzle ORM](https://orm.drizzle.team/)                  | Database             |
| [Sharp](https://sharp.pixelplumbing.com/)                 | Image transforms     |
| [Buttplug.io](https://buttplug.io/)                       | Hardware             |
| [Maizzle](https://maizzle.com/)                           | HTML email framework |
| [Nodemailer](https://nodemailer.com/)                     | Email delivery       |
| [bits-ui](https://www.bits-ui.com/)                       | UI components        |
| [Gitea](https://dev.pivoine.art)                          | Self-hosted VCS & CI |

---
@@ -310,7 +357,7 @@ Our GitHub Actions workflows handle:
Pioneer of sexual liberation (1919-2001)
Pilot, Entrepreneur, Freedom Fighter

_"A woman who lives her sexuality on her own terms is a free woman."_

![Beate Uhse Quote](https://img.shields.io/badge/Beate_Uhse-Sexual_Liberation_Pioneer-FF1493?style=for-the-badge&logo=heart&logoColor=white&labelColor=8B008B)
@@ -331,9 +378,9 @@ Pilot, Entrepreneur, Freedom Fighter
<div align="center">

[![Repository](https://img.shields.io/badge/🐙_Repository-dev.pivoine.art-FF69B4?style=for-the-badge&labelColor=8B008B)](https://dev.pivoine.art/valknar/sexy)
[![Issues](https://img.shields.io/badge/🐛_Issues-Report_Here-DA70D6?style=for-the-badge&labelColor=8B008B)](https://dev.pivoine.art/valknar/sexy/issues)
[![Website](https://img.shields.io/badge/🌐_Website-Visit_Here-FF1493?style=for-the-badge&labelColor=8B008B)](https://sexy.pivoine.art)

</div>
@@ -352,8 +399,8 @@ Pilot, Entrepreneur, Freedom Fighter
╚═════╝ ╚══════╝╚═╝ ╚═╝ ╚═╝
</pre>

_Pleasure is a human right. Technology is freedom. Together, they are power._

**[sexy.pivoine.art](https://sexy.pivoine.art)** | © 2026 Palina & Valknar

</div>


@@ -4,6 +4,8 @@ services:
    image: postgres:16-alpine
    container_name: sexy_postgres
    restart: unless-stopped
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
@@ -19,6 +21,8 @@ services:
    image: redis:7-alpine
    container_name: sexy_redis
    restart: unless-stopped
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    command: redis-server --appendonly yes
@@ -60,6 +64,21 @@ services:
      timeout: 10s
      retries: 3
      start_period: 20s

  buttplug:
    build:
      context: .
      dockerfile: Dockerfile.buttplug
    container_name: sexy_buttplug
    restart: unless-stopped
    ports:
      - "8080:80"
    healthcheck:
      test:
        ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost/dist/index.js"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 10s

  frontend:
    build:
      context: .
@@ -74,9 +93,12 @@
      HOST: 0.0.0.0
      PUBLIC_API_URL: http://sexy_backend:4000
      PUBLIC_URL: http://localhost:3000
      BUTTPLUG_URL: http://sexy_buttplug:80
    depends_on:
      backend:
        condition: service_healthy
      buttplug:
        condition: service_healthy

volumes:
  uploads_data:

File diff suppressed because it is too large

eslint.config.js (new file)

@@ -0,0 +1,57 @@
import js from "@eslint/js";
import ts from "typescript-eslint";
import svelte from "eslint-plugin-svelte";
import prettier from "eslint-config-prettier";
import globals from "globals";

export default ts.config(
  js.configs.recommended,
  ...ts.configs.recommended,
  ...svelte.configs["flat/recommended"],
  prettier,
  ...svelte.configs["flat/prettier"],
  {
    languageOptions: {
      globals: {
        ...globals.browser,
        ...globals.node,
      },
    },
  },
  {
    files: ["**/*.svelte"],
    languageOptions: {
      parserOptions: {
        parser: ts.parser,
      },
    },
  },
  {
    rules: {
      // Allow unused vars prefixed with _ (common pattern for intentional ignores)
      "@typescript-eslint/no-unused-vars": [
        "error",
        { argsIgnorePattern: "^_", varsIgnorePattern: "^_" },
      ],
      // Enforce consistent type imports
      "@typescript-eslint/consistent-type-imports": [
        "error",
        { prefer: "type-imports", fixStyle: "inline-type-imports" },
      ],
      // This rule is meant for onNavigate() callbacks only; standard SvelteKit href/goto is fine
      "svelte/no-navigation-without-resolve": "off",
      // {@html} is used intentionally for trusted content (e.g. legal page)
      "svelte/no-at-html-tags": "warn",
    },
  },
  {
    ignores: [
      "**/build/",
      "**/.svelte-kit/",
      "**/dist/",
      "**/node_modules/",
      "**/migrations/",
      "**/wasm/",
    ],
  },
);

nginx.buttplug.conf (new file)

@@ -0,0 +1,23 @@
server {
    listen 80;
    server_name _;
    root /usr/share/nginx/html;

    # WASM MIME type
    include /etc/nginx/mime.types;
    types {
        application/wasm wasm;
    }

    # Cache JS and WASM aggressively (content-addressed by build)
    location ~* \.(js|wasm)$ {
        add_header Cache-Control "public, max-age=31536000, immutable";
        add_header Cross-Origin-Resource-Policy "cross-origin";
        add_header Cross-Origin-Embedder-Policy "require-corp";
    }

    location / {
        try_files $uri =404;
    }
}


@@ -1,33 +1,50 @@
{
  "name": "sexy.pivoine.art",
  "version": "1.0.0",
  "description": "",
  "type": "module",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "build:frontend": "pnpm --filter @sexy.pivoine.art/frontend build",
    "build:backend": "pnpm --filter @sexy.pivoine.art/backend build",
    "dev:buttplug": "pnpm --filter @sexy.pivoine.art/buttplug serve",
    "dev:data": "docker compose up -d postgres redis",
    "dev:backend": "pnpm --filter @sexy.pivoine.art/backend dev",
    "dev": "pnpm dev:data && pnpm dev:backend & pnpm dev:buttplug & pnpm --filter @sexy.pivoine.art/frontend dev",
    "lint": "eslint .",
    "lint:fix": "eslint . --fix",
    "format": "prettier --write .",
    "format:check": "prettier --check .",
    "check": "pnpm -r --filter=!sexy.pivoine.art check"
  },
  "keywords": [],
  "author": {
    "name": "Valknar",
    "email": "valknar@pivoine.art"
  },
  "license": "MIT",
  "packageManager": "pnpm@10.31.0",
  "pnpm": {
    "onlyBuiltDependencies": [
      "argon2",
      "es5-ext",
      "esbuild",
      "svelte-preprocess",
      "wasm-pack"
    ],
    "ignoredBuiltDependencies": [
      "@tailwindcss/oxide",
      "node-sass"
    ]
  },
  "devDependencies": {
    "@eslint/js": "^10.0.1",
    "eslint": "^10.0.2",
    "eslint-config-prettier": "^10.1.8",
    "eslint-plugin-svelte": "^3.15.0",
    "globals": "^17.4.0",
    "prettier": "^3.8.1",
    "prettier-plugin-svelte": "^3.5.1",
    "typescript-eslint": "^8.56.1"
  }
}


@@ -1,7 +1,7 @@
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  schema: "./src/db/schema/*.ts",
  out: "./src/migrations",
  dialect: "postgresql",
  dbCredentials: {


@@ -1,16 +1,17 @@
 {
   "name": "@sexy.pivoine.art/backend",
   "version": "1.0.0",
+  "type": "module",
   "private": true,
   "scripts": {
-    "dev": "tsx watch src/index.ts",
+    "dev": "UPLOAD_DIR=../../.data/uploads DATABASE_URL=postgresql://sexy:sexy@localhost:5432/sexy REDIS_URL=redis://localhost:6379 tsx watch src/index.ts",
     "build": "tsc",
     "start": "node dist/index.js",
     "db:generate": "drizzle-kit generate",
     "db:migrate": "drizzle-kit migrate",
     "db:studio": "drizzle-kit studio",
-    "migrate": "tsx src/scripts/data-migration.ts"
+    "schema:migrate": "tsx src/scripts/migrate.ts",
+    "migrate": "tsx src/scripts/data-migration.ts",
+    "check": "tsc --noEmit"
   },
   "dependencies": {
     "@fastify/cookie": "^11.0.2",
@@ -19,7 +20,10 @@
"@fastify/static": "^8.1.1", "@fastify/static": "^8.1.1",
"@pothos/core": "^4.4.0", "@pothos/core": "^4.4.0",
"@pothos/plugin-errors": "^4.2.0", "@pothos/plugin-errors": "^4.2.0",
"@sexy.pivoine.art/email": "workspace:*",
"@sexy.pivoine.art/types": "workspace:*",
"argon2": "^0.43.0", "argon2": "^0.43.0",
"bullmq": "^5.70.4",
"drizzle-orm": "^0.44.1", "drizzle-orm": "^0.44.1",
"fastify": "^5.4.0", "fastify": "^5.4.0",
"fluent-ffmpeg": "^2.1.3", "fluent-ffmpeg": "^2.1.3",
@@ -28,21 +32,18 @@
"graphql-ws": "^6.0.4", "graphql-ws": "^6.0.4",
"graphql-yoga": "^5.13.4", "graphql-yoga": "^5.13.4",
"ioredis": "^5.6.1", "ioredis": "^5.6.1",
"nanoid": "^5.1.5", "nanoid": "^3.3.11",
"nodemailer": "^7.0.3", "nodemailer": "^7.0.3",
"pg": "^8.16.0", "pg": "^8.16.0",
"sharp": "^0.33.5",
"slugify": "^1.6.6", "slugify": "^1.6.6",
"uuid": "^11.1.0" "uuid": "^11.1.0"
}, },
"pnpm": {
"onlyBuiltDependencies": [
"argon2"
]
},
"devDependencies": { "devDependencies": {
"@types/fluent-ffmpeg": "^2.1.27", "@types/fluent-ffmpeg": "^2.1.27",
"@types/nodemailer": "^6.4.17", "@types/nodemailer": "^6.4.17",
"@types/pg": "^8.15.4", "@types/pg": "^8.15.4",
"@types/sharp": "^0.32.0",
"@types/uuid": "^10.0.0", "@types/uuid": "^10.0.0",
"drizzle-kit": "^0.31.1", "drizzle-kit": "^0.31.1",
"tsx": "^4.19.4", "tsx": "^4.19.4",

View File

@@ -1,6 +1,6 @@
 import { drizzle } from "drizzle-orm/node-postgres";
 import { Pool } from "pg";
-import * as schema from "./schema/index.js";
+import * as schema from "./schema/index";

 const pool = new Pool({
   connectionString: process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy",

View File

@@ -1,18 +1,13 @@
-import {
-  pgTable,
-  text,
-  timestamp,
-  boolean,
-  index,
-  uniqueIndex,
-} from "drizzle-orm/pg-core";
-import { users } from "./users.js";
-import { files } from "./files.js";
+import { pgTable, text, timestamp, boolean, index, uniqueIndex } from "drizzle-orm/pg-core";
+import { users } from "./users";
+import { files } from "./files";

 export const articles = pgTable(
   "articles",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     slug: text("slug").notNull(),
     title: text("title").notNull(),
     excerpt: text("excerpt"),

View File

@@ -1,11 +1,5 @@
-import {
-  pgTable,
-  text,
-  timestamp,
-  index,
-  integer,
-} from "drizzle-orm/pg-core";
-import { users } from "./users.js";
+import { pgTable, text, timestamp, index, integer } from "drizzle-orm/pg-core";
+import { users } from "./users";

 export const comments = pgTable(
   "comments",

View File

@@ -1,16 +1,11 @@
-import {
-  pgTable,
-  text,
-  timestamp,
-  bigint,
-  integer,
-  index,
-} from "drizzle-orm/pg-core";
+import { pgTable, text, timestamp, bigint, integer, index } from "drizzle-orm/pg-core";

 export const files = pgTable(
   "files",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     title: text("title"),
     description: text("description"),
     filename: text("filename").notNull(),

View File

@@ -8,18 +8,18 @@ import {
   pgEnum,
   uniqueIndex,
 } from "drizzle-orm/pg-core";
-import { users } from "./users.js";
-import { recordings } from "./recordings.js";
+import { sql } from "drizzle-orm";
+import { users } from "./users";
+import { recordings } from "./recordings";

-export const achievementStatusEnum = pgEnum("achievement_status", [
-  "draft",
-  "published",
-]);
+export const achievementStatusEnum = pgEnum("achievement_status", ["draft", "published"]);

 export const achievements = pgTable(
   "achievements",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     code: text("code").notNull(),
     name: text("name").notNull(),
     description: text("description"),
@@ -69,6 +69,11 @@ export const user_points = pgTable(
   (t) => [
     index("user_points_user_idx").on(t.user_id),
     index("user_points_date_idx").on(t.date_created),
+    uniqueIndex("user_points_unique_action_recording")
+      .on(t.user_id, t.action, t.recording_id)
+      .where(
+        sql`"action" IN ('RECORDING_CREATE', 'RECORDING_FEATURED') AND "recording_id" IS NOT NULL`,
+      ),
   ],
 );

View File

@@ -1,7 +1,7 @@
export * from "./files.js"; export * from "./files";
export * from "./users.js"; export * from "./users";
export * from "./videos.js"; export * from "./videos";
export * from "./articles.js"; export * from "./articles";
export * from "./recordings.js"; export * from "./recordings";
export * from "./comments.js"; export * from "./comments";
export * from "./gamification.js"; export * from "./gamification";

View File

@@ -9,19 +9,17 @@ import {
   uniqueIndex,
   jsonb,
 } from "drizzle-orm/pg-core";
-import { users } from "./users.js";
-import { videos } from "./videos.js";
+import { users } from "./users";
+import { videos } from "./videos";

-export const recordingStatusEnum = pgEnum("recording_status", [
-  "draft",
-  "published",
-  "archived",
-]);
+export const recordingStatusEnum = pgEnum("recording_status", ["draft", "published"]);

 export const recordings = pgTable(
   "recordings",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     title: text("title").notNull(),
     description: text("description"),
     slug: text("slug").notNull(),
@@ -53,7 +51,9 @@ export const recordings = pgTable(
 export const recording_plays = pgTable(
   "recording_plays",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     recording_id: text("recording_id")
       .notNull()
       .references(() => recordings.id, { onDelete: "cascade" }),

View File

@@ -8,14 +8,16 @@ import {
   uniqueIndex,
   integer,
 } from "drizzle-orm/pg-core";
-import { files } from "./files.js";
+import { files } from "./files";

 export const roleEnum = pgEnum("user_role", ["model", "viewer", "admin"]);

 export const users = pgTable(
   "users",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     email: text("email").notNull(),
     password_hash: text("password_hash").notNull(),
     first_name: text("first_name"),
@@ -27,6 +29,8 @@ export const users = pgTable(
     role: roleEnum("role").notNull().default("viewer"),
     avatar: text("avatar").references(() => files.id, { onDelete: "set null" }),
     banner: text("banner").references(() => files.id, { onDelete: "set null" }),
+    photo: text("photo").references(() => files.id, { onDelete: "set null" }),
+    is_admin: boolean("is_admin").notNull().default(false),
     email_verified: boolean("email_verified").notNull().default(false),
     email_verify_token: text("email_verify_token"),
     password_reset_token: text("password_reset_token"),

View File

@@ -8,13 +8,15 @@ import {
   uniqueIndex,
   primaryKey,
 } from "drizzle-orm/pg-core";
-import { users } from "./users.js";
-import { files } from "./files.js";
+import { users } from "./users";
+import { files } from "./files";

 export const videos = pgTable(
   "videos",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     slug: text("slug").notNull(),
     title: text("title").notNull(),
     description: text("description"),
@@ -50,7 +52,9 @@ export const video_models = pgTable(
 export const video_likes = pgTable(
   "video_likes",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     video_id: text("video_id")
       .notNull()
       .references(() => videos.id, { onDelete: "cascade" }),
@@ -68,7 +72,9 @@ export const video_likes = pgTable(
 export const video_plays = pgTable(
   "video_plays",
   {
-    id: text("id").primaryKey().$defaultFn(() => crypto.randomUUID()),
+    id: text("id")
+      .primaryKey()
+      .$defaultFn(() => crypto.randomUUID()),
     video_id: text("video_id")
       .notNull()
       .references(() => videos.id, { onDelete: "cascade" }),

View File

@@ -1,7 +1,7 @@
 import SchemaBuilder from "@pothos/core";
 import ErrorsPlugin from "@pothos/plugin-errors";
-import type { DB } from "../db/connection.js";
-import type { SessionUser } from "../lib/auth.js";
+import type { DB } from "../db/connection";
+import type { SessionUser } from "../lib/auth";
 import type Redis from "ioredis";
 import { GraphQLDateTime, GraphQLJSON } from "graphql-scalars";

View File

@@ -1,10 +1,20 @@
 import type { YogaInitialContext } from "graphql-yoga";
-import type { Context } from "./builder.js";
-import { getSession } from "../lib/auth.js";
-import { db } from "../db/connection.js";
-import { redis } from "../lib/auth.js";
+import type { FastifyRequest, FastifyReply } from "fastify";
+import type { Context } from "./builder";
+import { getSession, setSession } from "../lib/auth";
+import { db } from "../db/connection";
+import { redis } from "../lib/auth";
+import { users } from "../db/schema/index";
+import { eq } from "drizzle-orm";

-export async function buildContext(ctx: YogaInitialContext & { request: Request; reply: unknown; db: typeof db; redis: typeof redis }): Promise<Context> {
+type ServerContext = {
+  req: FastifyRequest;
+  reply: FastifyReply;
+  db: typeof db;
+  redis: typeof redis;
+};
+
+export async function buildContext(ctx: YogaInitialContext & ServerContext): Promise<Context> {
   const request = ctx.request;
   const cookieHeader = request.headers.get("cookie") || "";
@@ -17,7 +27,34 @@ export async function buildContext(ctx: YogaInitialContext & { request: Request;
   );

   const token = cookies["session_token"];
-  const currentUser = token ? await getSession(token) : null;
+  let currentUser = null;
+  if (token) {
+    const session = await getSession(token); // also slides TTL
+    if (session) {
+      const dbInstance = ctx.db || db;
+      const [dbUser] = await dbInstance
+        .select()
+        .from(users)
+        .where(eq(users.id, session.id))
+        .limit(1);
+      if (dbUser) {
+        currentUser = {
+          id: dbUser.id,
+          email: dbUser.email,
+          role: (dbUser.role === "admin" ? "viewer" : dbUser.role) as "model" | "viewer",
+          is_admin: dbUser.is_admin,
+          first_name: dbUser.first_name,
+          last_name: dbUser.last_name,
+          artist_name: dbUser.artist_name,
+          slug: dbUser.slug,
+          avatar: dbUser.avatar,
+        };
+        // Refresh cached session with up-to-date data
+        await setSession(token, currentUser);
+      }
+    }
+  }

   return {
     db: ctx.db || db,

View File

@@ -9,6 +9,7 @@ import "./resolvers/recordings.js";
import "./resolvers/comments.js"; import "./resolvers/comments.js";
import "./resolvers/gamification.js"; import "./resolvers/gamification.js";
import "./resolvers/stats.js"; import "./resolvers/stats.js";
import { builder } from "./builder.js"; import "./resolvers/queues.js";
import { builder } from "./builder";
export const schema = builder.toSchema(); export const schema = builder.toSchema();

View File

@@ -1,47 +1,75 @@
import { builder } from "../builder.js"; import { builder } from "../builder";
import { ArticleType } from "../types/index.js"; import { ArticleType, ArticleListType, AdminArticleListType } from "../types/index";
import { articles, users } from "../../db/schema/index.js"; import { articles, users } from "../../db/schema/index";
import { eq, and, lte, desc } from "drizzle-orm"; import { eq, and, lte, desc, asc, ilike, or, count, arrayContains, type SQL } from "drizzle-orm";
import { requireAdmin } from "../../lib/acl";
import type { DB } from "../../db/connection";
async function enrichArticle(db: DB, article: typeof articles.$inferSelect) {
let author = null;
if (article.author) {
const authorUser = await db
.select({
id: users.id,
artist_name: users.artist_name,
slug: users.slug,
avatar: users.avatar,
description: users.description,
})
.from(users)
.where(eq(users.id, article.author))
.limit(1);
author = authorUser[0] || null;
}
return { ...article, author };
}
builder.queryField("articles", (t) => builder.queryField("articles", (t) =>
t.field({ t.field({
type: [ArticleType], type: ArticleListType,
args: { args: {
featured: t.arg.boolean(), featured: t.arg.boolean(),
limit: t.arg.int(), limit: t.arg.int(),
search: t.arg.string(),
category: t.arg.string(),
offset: t.arg.int(),
sortBy: t.arg.string(),
tag: t.arg.string(),
}, },
resolve: async (_root, args, ctx) => { resolve: async (_root, args, ctx) => {
let query = ctx.db const pageSize = args.limit ?? 24;
.select() const offset = args.offset ?? 0;
.from(articles)
.where(lte(articles.publish_date, new Date()))
.orderBy(desc(articles.publish_date));
if (args.limit) { const conditions: SQL<unknown>[] = [lte(articles.publish_date, new Date())];
query = (query as any).limit(args.limit); if (args.featured !== null && args.featured !== undefined) {
conditions.push(eq(articles.featured, args.featured));
}
if (args.category) conditions.push(eq(articles.category, args.category));
if (args.tag) conditions.push(arrayContains(articles.tags, [args.tag]));
if (args.search) {
conditions.push(
or(
ilike(articles.title, `%${args.search}%`),
ilike(articles.excerpt, `%${args.search}%`),
) as SQL<unknown>,
);
} }
const articleList = await query; const where = and(...conditions);
const baseQuery = ctx.db.select().from(articles).where(where);
const ordered =
args.sortBy === "name"
? baseQuery.orderBy(asc(articles.title))
: args.sortBy === "featured"
? baseQuery.orderBy(desc(articles.featured), desc(articles.publish_date))
: baseQuery.orderBy(desc(articles.publish_date));
return Promise.all( const [articleList, totalRows] = await Promise.all([
articleList.map(async (article: any) => { ordered.limit(pageSize).offset(offset),
let author = null; ctx.db.select({ total: count() }).from(articles).where(where),
if (article.author) { ]);
const authorUser = await ctx.db const items = await Promise.all(articleList.map((article) => enrichArticle(ctx.db, article)));
.select({ return { items, total: totalRows[0]?.total ?? 0 };
first_name: users.first_name,
last_name: users.last_name,
avatar: users.avatar,
description: users.description,
})
.from(users)
.where(eq(users.id, article.author))
.limit(1);
author = authorUser[0] || null;
}
return { ...article, author };
}),
);
}, },
}), }),
); );
@@ -61,23 +89,163 @@ builder.queryField("article", (t) =>
         .limit(1);

       if (!article[0]) return null;
-
-      let author = null;
-      if (article[0].author) {
-        const authorUser = await ctx.db
-          .select({
-            first_name: users.first_name,
-            last_name: users.last_name,
-            avatar: users.avatar,
-            description: users.description,
-          })
-          .from(users)
-          .where(eq(users.id, article[0].author))
-          .limit(1);
-        author = authorUser[0] || null;
-      }
-
-      return { ...article[0], author };
+      return enrichArticle(ctx.db, article[0]);
+    },
+  }),
+);
+
+builder.queryField("adminGetArticle", (t) =>
+  t.field({
+    type: ArticleType,
+    nullable: true,
+    args: {
+      id: t.arg.string({ required: true }),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      const article = await ctx.db.select().from(articles).where(eq(articles.id, args.id)).limit(1);
+      if (!article[0]) return null;
+      return enrichArticle(ctx.db, article[0]);
+    },
+  }),
+);
+
+// ─── Admin queries & mutations ────────────────────────────────────────────────
+
+builder.queryField("adminListArticles", (t) =>
+  t.field({
+    type: AdminArticleListType,
+    args: {
+      search: t.arg.string(),
+      category: t.arg.string(),
+      featured: t.arg.boolean(),
+      limit: t.arg.int(),
+      offset: t.arg.int(),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      const limit = args.limit ?? 50;
+      const offset = args.offset ?? 0;
+
+      const conditions: SQL<unknown>[] = [];
+      if (args.search) {
+        conditions.push(
+          or(
+            ilike(articles.title, `%${args.search}%`),
+            ilike(articles.excerpt, `%${args.search}%`),
+          ) as SQL<unknown>,
+        );
+      }
+      if (args.category) conditions.push(eq(articles.category, args.category));
+      if (args.featured !== null && args.featured !== undefined)
+        conditions.push(eq(articles.featured, args.featured));
+
+      const where = conditions.length > 0 ? and(...conditions) : undefined;
+
+      const [articleList, totalRows] = await Promise.all([
+        ctx.db
+          .select()
+          .from(articles)
+          .where(where)
+          .orderBy(desc(articles.publish_date))
+          .limit(limit)
+          .offset(offset),
+        ctx.db.select({ total: count() }).from(articles).where(where),
+      ]);
+
+      const items = await Promise.all(articleList.map((article) => enrichArticle(ctx.db, article)));
+      return { items, total: totalRows[0]?.total ?? 0 };
+    },
+  }),
+);
+
+builder.mutationField("createArticle", (t) =>
+  t.field({
+    type: ArticleType,
+    args: {
+      title: t.arg.string({ required: true }),
+      slug: t.arg.string({ required: true }),
+      excerpt: t.arg.string(),
+      content: t.arg.string(),
+      imageId: t.arg.string(),
+      tags: t.arg.stringList(),
+      category: t.arg.string(),
+      featured: t.arg.boolean(),
+      publishDate: t.arg.string(),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      const inserted = await ctx.db
+        .insert(articles)
+        .values({
+          title: args.title,
+          slug: args.slug,
+          excerpt: args.excerpt || null,
+          content: args.content || null,
+          image: args.imageId || null,
+          tags: args.tags || [],
+          category: args.category || null,
+          featured: args.featured ?? false,
+          publish_date: args.publishDate ? new Date(args.publishDate) : new Date(),
+          author: ctx.currentUser!.id,
+        })
+        .returning();
+      return enrichArticle(ctx.db, inserted[0]);
+    },
+  }),
+);
+
+builder.mutationField("updateArticle", (t) =>
+  t.field({
+    type: ArticleType,
+    nullable: true,
+    args: {
+      id: t.arg.string({ required: true }),
+      title: t.arg.string(),
+      slug: t.arg.string(),
+      excerpt: t.arg.string(),
+      content: t.arg.string(),
+      imageId: t.arg.string(),
+      authorId: t.arg.string(),
+      tags: t.arg.stringList(),
+      category: t.arg.string(),
+      featured: t.arg.boolean(),
+      publishDate: t.arg.string(),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      const updates: Record<string, unknown> = { date_updated: new Date() };
+      if (args.title !== undefined && args.title !== null) updates.title = args.title;
+      if (args.slug !== undefined && args.slug !== null) updates.slug = args.slug;
+      if (args.excerpt !== undefined) updates.excerpt = args.excerpt;
+      if (args.content !== undefined) updates.content = args.content;
+      if (args.imageId !== undefined) updates.image = args.imageId;
+      if (args.authorId !== undefined) updates.author = args.authorId;
+      if (args.tags !== undefined && args.tags !== null) updates.tags = args.tags;
+      if (args.category !== undefined) updates.category = args.category;
+      if (args.featured !== undefined && args.featured !== null) updates.featured = args.featured;
+      if (args.publishDate !== undefined && args.publishDate !== null)
+        updates.publish_date = new Date(args.publishDate);
+
+      const updated = await ctx.db
+        .update(articles)
+        .set(updates as Partial<typeof articles.$inferInsert>)
+        .where(eq(articles.id, args.id))
+        .returning();
+      if (!updated[0]) return null;
+      return enrichArticle(ctx.db, updated[0]);
+    },
+  }),
+);
+
+builder.mutationField("deleteArticle", (t) =>
+  t.field({
+    type: "Boolean",
+    args: {
+      id: t.arg.string({ required: true }),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      await ctx.db.delete(articles).where(eq(articles.id, args.id));
+      return true;
     },
   }),
 );

View File

@@ -1,12 +1,16 @@
 import { GraphQLError } from "graphql";
-import { builder } from "../builder.js";
-import { CurrentUserType } from "../types/index.js";
-import { users } from "../../db/schema/index.js";
+import { builder } from "../builder";
+import { CurrentUserType } from "../types/index";
+import { users } from "../../db/schema/index";
 import { eq } from "drizzle-orm";
-import { hash, verify as verifyArgon } from "../../lib/argon.js";
-import { setSession, deleteSession } from "../../lib/auth.js";
-import { sendVerification, sendPasswordReset } from "../../lib/email.js";
-import { slugify } from "../../lib/slugify.js";
+import { hash, verify as verifyArgon } from "../../lib/argon";
+import { setSession, deleteSession } from "../../lib/auth";
+import { enqueueVerification, enqueuePasswordReset } from "../../lib/email";
+import { slugify } from "../../lib/slugify";
 import { nanoid } from "nanoid";
+
+interface ReplyLike {
+  header?: (name: string, value: string) => void;
+}
builder.mutationField("login", (t) => builder.mutationField("login", (t) =>
@@ -32,7 +36,8 @@ builder.mutationField("login", (t) =>
       const sessionUser = {
         id: user[0].id,
         email: user[0].email,
-        role: user[0].role,
+        role: (user[0].role === "admin" ? "viewer" : user[0].role) as "model" | "viewer",
+        is_admin: user[0].is_admin,
         first_name: user[0].first_name,
         last_name: user[0].last_name,
         artist_name: user[0].artist_name,
@@ -44,13 +49,8 @@ builder.mutationField("login", (t) =>
       // Set session cookie
       const isProduction = process.env.NODE_ENV === "production";
-      const cookieValue = `session_token=${token}; HttpOnly; Path=/; SameSite=Lax; Max-Age=86400${isProduction ? "; Secure" : ""}`;
-      (ctx.reply as any).header?.("Set-Cookie", cookieValue);
-
-      // For graphql-yoga response
-      if ((ctx as any).serverResponse) {
-        (ctx as any).serverResponse.setHeader("Set-Cookie", cookieValue);
-      }
+      const cookieValue = `session_token=${token}; HttpOnly; Path=/; SameSite=Strict; Max-Age=86400${isProduction ? "; Secure" : ""}`;
+      (ctx.reply as ReplyLike).header?.("Set-Cookie", cookieValue);

       return user[0];
     },
@@ -73,8 +73,9 @@ builder.mutationField("logout", (t) =>
         await deleteSession(token);
       }

       // Clear cookie
-      const cookieValue = "session_token=; HttpOnly; Path=/; Max-Age=0";
-      (ctx.reply as any).header?.("Set-Cookie", cookieValue);
+      const isProduction = process.env.NODE_ENV === "production";
+      const cookieValue = `session_token=; HttpOnly; Path=/; SameSite=Strict; Max-Age=0${isProduction ? "; Secure" : ""}`;
+      (ctx.reply as ReplyLike).header?.("Set-Cookie", cookieValue);

       return true;
     },
   }),
@@ -129,7 +130,11 @@ builder.mutationField("register", (t) =>
         email_verified: false,
       });

-      await sendVerification(args.email, verifyToken);
+      try {
+        await enqueueVerification(args.email, verifyToken);
+      } catch (e) {
+        console.warn("Failed to enqueue verification email:", (e as Error).message);
+      }

       return true;
     },
   }),
@@ -184,7 +189,11 @@ builder.mutationField("requestPasswordReset", (t) =>
         .set({ password_reset_token: token, password_reset_expiry: expiry })
         .where(eq(users.id, user[0].id));

-      await sendPasswordReset(args.email, token);
+      try {
+        await enqueuePasswordReset(args.email, token);
+      } catch (e) {
+        console.warn("Failed to enqueue password reset email:", (e as Error).message);
+      }

       return true;
     },
   }),

View File

@@ -1,9 +1,10 @@
 import { GraphQLError } from "graphql";
-import { builder } from "../builder.js";
-import { CommentType } from "../types/index.js";
-import { comments, users } from "../../db/schema/index.js";
-import { eq, and, desc } from "drizzle-orm";
-import { awardPoints, checkAchievements } from "../../lib/gamification.js";
+import { builder } from "../builder";
+import { CommentType, AdminCommentListType } from "../types/index";
+import { comments, users } from "../../db/schema/index";
+import { eq, and, desc, ilike, count } from "drizzle-orm";
+import { requireOwnerOrAdmin, requireAdmin } from "../../lib/acl";
+import { gamificationQueue } from "../../queues/index";

 builder.queryField("commentsForVideo", (t) =>
   t.field({
@@ -19,9 +20,15 @@ builder.queryField("commentsForVideo", (t) =>
         .orderBy(desc(comments.date_created));

       return Promise.all(
-        commentList.map(async (c: any) => {
+        commentList.map(async (c) => {
           const user = await ctx.db
-            .select({ id: users.id, first_name: users.first_name, last_name: users.last_name, avatar: users.avatar })
+            .select({
+              id: users.id,
+              first_name: users.first_name,
+              last_name: users.last_name,
+              artist_name: users.artist_name,
+              avatar: users.avatar,
+            })
             .from(users)
             .where(eq(users.id, c.user_id))
             .limit(1);
@@ -52,12 +59,25 @@ builder.mutationField("createCommentForVideo", (t) =>
         })
         .returning();

-      // Gamification
-      await awardPoints(ctx.db, ctx.currentUser.id, "COMMENT_CREATE");
-      await checkAchievements(ctx.db, ctx.currentUser.id, "social");
+      await gamificationQueue.add("awardPoints", {
+        job: "awardPoints",
+        userId: ctx.currentUser.id,
+        action: "COMMENT_CREATE",
+      });
+      await gamificationQueue.add("checkAchievements", {
+        job: "checkAchievements",
+        userId: ctx.currentUser.id,
+        category: "social",
+      });

       const user = await ctx.db
-        .select({ id: users.id, first_name: users.first_name, last_name: users.last_name, avatar: users.avatar })
+        .select({
+          id: users.id,
+          first_name: users.first_name,
+          last_name: users.last_name,
+          artist_name: users.artist_name,
+          avatar: users.avatar,
+        })
         .from(users)
         .where(eq(users.id, ctx.currentUser.id))
         .limit(1);
@@ -66,3 +86,80 @@ builder.mutationField("createCommentForVideo", (t) =>
     },
   }),
 );
+
+builder.mutationField("deleteComment", (t) =>
+  t.field({
+    type: "Boolean",
+    args: {
+      id: t.arg.int({ required: true }),
+    },
+    resolve: async (_root, args, ctx) => {
+      const comment = await ctx.db.select().from(comments).where(eq(comments.id, args.id)).limit(1);
+      if (!comment[0]) throw new GraphQLError("Comment not found");
+      requireOwnerOrAdmin(ctx, comment[0].user_id);
+
+      await ctx.db.delete(comments).where(eq(comments.id, args.id));
+
+      await gamificationQueue.add("revokePoints", {
+        job: "revokePoints",
+        userId: comment[0].user_id,
+        action: "COMMENT_CREATE",
+      });
+      await gamificationQueue.add("checkAchievements", {
+        job: "checkAchievements",
+        userId: comment[0].user_id,
+        category: "social",
+      });
+
+      return true;
+    },
+  }),
+);
+
+builder.queryField("adminListComments", (t) =>
+  t.field({
+    type: AdminCommentListType,
+    args: {
+      search: t.arg.string(),
+      limit: t.arg.int(),
+      offset: t.arg.int(),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      const limit = args.limit ?? 50;
+      const offset = args.offset ?? 0;
+
+      const conditions = args.search ? [ilike(comments.comment, `%${args.search}%`)] : [];
+      const where = conditions.length > 0 ? and(...conditions) : undefined;
+
+      const [commentList, totalRows] = await Promise.all([
+        ctx.db
+          .select()
+          .from(comments)
+          .where(where)
+          .orderBy(desc(comments.date_created))
+          .limit(limit)
+          .offset(offset),
+        ctx.db.select({ total: count() }).from(comments).where(where),
+      ]);
+
+      const items = await Promise.all(
+        commentList.map(async (c) => {
+          const user = await ctx.db
+            .select({
+              id: users.id,
+              first_name: users.first_name,
+              last_name: users.last_name,
+              artist_name: users.artist_name,
+              avatar: users.avatar,
+            })
+            .from(users)
+            .where(eq(users.id, c.user_id))
+            .limit(1);
+          return { ...c, user: user[0] || null };
+        }),
+      );
+
+      return { items, total: totalRows[0]?.total ?? 0 };
+    },
+  }),
+);

View File

@@ -1,7 +1,13 @@
import { builder } from "../builder.js"; import { builder } from "../builder";
import { LeaderboardEntryType, UserGamificationType, AchievementType } from "../types/index.js"; import { LeaderboardEntryType, UserGamificationType, AchievementType } from "../types/index";
import { user_stats, users, user_achievements, achievements, user_points } from "../../db/schema/index.js"; import {
import { eq, desc, gt, count, isNotNull } from "drizzle-orm"; user_stats,
users,
user_achievements,
achievements,
user_points,
} from "../../db/schema/index";
import { eq, desc, gt, count, isNotNull, and } from "drizzle-orm";
builder.queryField("leaderboard", (t) => builder.queryField("leaderboard", (t) =>
t.field({ t.field({
@@ -31,7 +37,7 @@ builder.queryField("leaderboard", (t) =>
         .limit(limit)
         .offset(offset);

-      return entries.map((e: any, i: number) => ({ ...e, rank: offset + i + 1 }));
+      return entries.map((e, i) => ({ ...e, rank: offset + i + 1 }));
     },
   }),
 );
@@ -73,8 +79,12 @@ builder.queryField("userGamification", (t) =>
         })
         .from(user_achievements)
         .leftJoin(achievements, eq(user_achievements.achievement_id, achievements.id))
-        .where(eq(user_achievements.user_id, args.userId))
-        .where(isNotNull(user_achievements.date_unlocked))
+        .where(
+          and(
+            eq(user_achievements.user_id, args.userId),
+            isNotNull(user_achievements.date_unlocked),
+          ),
+        )
         .orderBy(desc(user_achievements.date_unlocked));

       const recentPoints = await ctx.db
@@ -91,8 +101,15 @@ builder.queryField("userGamification", (t) =>
return { return {
stats: stats[0] ? { ...stats[0], rank } : null, stats: stats[0] ? { ...stats[0], rank } : null,
achievements: userAchievements.map((a: any) => ({ achievements: userAchievements.map((a) => ({
...a, id: a.id!,
code: a.code!,
name: a.name!,
description: a.description!,
icon: a.icon!,
category: a.category!,
required_count: a.required_count!,
progress: a.progress!,
date_unlocked: a.date_unlocked!, date_unlocked: a.date_unlocked!,
})), })),
recent_points: recentPoints, recent_points: recentPoints,

View File

@@ -1,9 +1,10 @@
-import { builder } from "../builder.js";
-import { ModelType } from "../types/index.js";
-import { users, user_photos, files } from "../../db/schema/index.js";
-import { eq, and, desc } from "drizzle-orm";
+import { builder } from "../builder";
+import { ModelType, ModelListType } from "../types/index";
+import { users, user_photos, files } from "../../db/schema/index";
+import { eq, and, desc, asc, ilike, count, arrayContains, type SQL } from "drizzle-orm";
+import type { DB } from "../../db/connection";
 
-async function enrichModel(db: any, user: any) {
+async function enrichModel(db: DB, user: typeof users.$inferSelect) {
   // Fetch photos
   const photoRows = await db
     .select({ id: files.id, filename: files.filename })
@@ -12,32 +13,42 @@ async function enrichModel(db: any, user: any) {
     .where(eq(user_photos.user_id, user.id))
     .orderBy(user_photos.sort);
 
-  return {
-    ...user,
-    photos: photoRows.map((p: any) => ({ id: p.id, filename: p.filename })),
-  };
+  const seen = new Set<string>();
+  const photos = photoRows
+    .filter((p) => p.id !== null && !seen.has(p.id!) && seen.add(p.id!))
+    .map((p) => ({ id: p.id!, filename: p.filename! }));
+
+  return { ...user, photos };
 }
 
 builder.queryField("models", (t) =>
   t.field({
-    type: [ModelType],
+    type: ModelListType,
     args: {
       featured: t.arg.boolean(),
       limit: t.arg.int(),
+      search: t.arg.string(),
+      offset: t.arg.int(),
+      sortBy: t.arg.string(),
+      tag: t.arg.string(),
     },
     resolve: async (_root, args, ctx) => {
-      let query = ctx.db
-        .select()
-        .from(users)
-        .where(eq(users.role, "model"))
-        .orderBy(desc(users.date_created));
-
-      if (args.limit) {
-        query = (query as any).limit(args.limit);
-      }
-
-      const modelList = await query;
-      return Promise.all(modelList.map((m: any) => enrichModel(ctx.db, m)));
+      const pageSize = args.limit ?? 24;
+      const offset = args.offset ?? 0;
+
+      const conditions: SQL<unknown>[] = [eq(users.role, "model")];
+      if (args.search) conditions.push(ilike(users.artist_name, `%${args.search}%`));
+      if (args.tag) conditions.push(arrayContains(users.tags, [args.tag]));
+      const order = args.sortBy === "recent" ? desc(users.date_created) : asc(users.artist_name);
+
+      const where = and(...conditions);
+      const [modelList, totalRows] = await Promise.all([
+        ctx.db.select().from(users).where(where).orderBy(order).limit(pageSize).offset(offset),
+        ctx.db.select({ total: count() }).from(users).where(where),
+      ]);
+
+      const items = await Promise.all(modelList.map((m) => enrichModel(ctx.db, m)));
+      return { items, total: totalRows[0]?.total ?? 0 };
     },
   }),
 );
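The left join in `enrichModel` can yield rows with a null file id, and repeats of the same id; the new code collapses them with a first-seen `Set` inside the filter predicate. A minimal standalone sketch of that step (`dedupePhotos` is an illustrative name, not an export of the codebase):

```typescript
// Sketch of the dedupe step enrichModel performs after the left join:
// rows whose file id is null are dropped, and repeated ids are collapsed
// while preserving first-seen order.
type PhotoRow = { id: string | null; filename: string | null };

function dedupePhotos(rows: PhotoRow[]): { id: string; filename: string }[] {
  const seen = new Set<string>();
  return rows
    // Set.add returns the Set itself (truthy), so `!!seen.add(...)` both
    // marks the id as seen and lets the predicate pass on first sight.
    .filter((p) => p.id !== null && !seen.has(p.id) && !!seen.add(p.id))
    .map((p) => ({ id: p.id as string, filename: p.filename as string }));
}
```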

View File

@@ -0,0 +1,151 @@
import { GraphQLError } from "graphql";
import type { Job } from "bullmq";
import { builder } from "../builder.js";
import { JobType, QueueInfoType } from "../types/index.js";
import { queues } from "../../queues/index.js";
import { requireAdmin } from "../../lib/acl.js";

const JOB_STATUSES = ["waiting", "active", "completed", "failed", "delayed"] as const;
type JobStatus = (typeof JOB_STATUSES)[number];

async function toJobData(job: Job, queueName: string) {
  const status = await job.getState();
  return {
    id: job.id ?? "",
    name: job.name,
    queue: queueName,
    status,
    data: job.data as unknown,
    result: job.returnvalue as unknown,
    failedReason: job.failedReason ?? null,
    attemptsMade: job.attemptsMade,
    createdAt: new Date(job.timestamp),
    processedAt: job.processedOn ? new Date(job.processedOn) : null,
    finishedAt: job.finishedOn ? new Date(job.finishedOn) : null,
    progress: typeof job.progress === "number" ? job.progress : null,
  };
}

builder.queryField("adminQueues", (t) =>
  t.field({
    type: [QueueInfoType],
    resolve: async (_root, _args, ctx) => {
      requireAdmin(ctx);
      return Promise.all(
        Object.entries(queues).map(async ([name, queue]) => {
          const counts = await queue.getJobCounts(
            "waiting",
            "active",
            "completed",
            "failed",
            "delayed",
            "paused",
          );
          const isPaused = await queue.isPaused();
          return {
            name,
            counts: {
              waiting: counts.waiting ?? 0,
              active: counts.active ?? 0,
              completed: counts.completed ?? 0,
              failed: counts.failed ?? 0,
              delayed: counts.delayed ?? 0,
              paused: counts.paused ?? 0,
            },
            isPaused,
          };
        }),
      );
    },
  }),
);

builder.queryField("adminQueueJobs", (t) =>
  t.field({
    type: [JobType],
    args: {
      queue: t.arg.string({ required: true }),
      status: t.arg.string(),
      limit: t.arg.int(),
      offset: t.arg.int(),
    },
    resolve: async (_root, args, ctx) => {
      requireAdmin(ctx);
      const queue = queues[args.queue];
      if (!queue) throw new GraphQLError(`Queue "${args.queue}" not found`);

      const limit = args.limit ?? 25;
      const offset = args.offset ?? 0;
      const statuses: JobStatus[] = args.status ? [args.status as JobStatus] : [...JOB_STATUSES];

      const jobs = await queue.getJobs(statuses, offset, offset + limit - 1);
      return Promise.all(jobs.map((job) => toJobData(job, args.queue)));
    },
  }),
);

builder.mutationField("adminRetryJob", (t) =>
  t.field({
    type: "Boolean",
    args: {
      queue: t.arg.string({ required: true }),
      jobId: t.arg.string({ required: true }),
    },
    resolve: async (_root, args, ctx) => {
      requireAdmin(ctx);
      const queue = queues[args.queue];
      if (!queue) throw new GraphQLError(`Queue "${args.queue}" not found`);
      const job = await queue.getJob(args.jobId);
      if (!job) throw new GraphQLError(`Job "${args.jobId}" not found`);
      await job.retry();
      return true;
    },
  }),
);

builder.mutationField("adminRemoveJob", (t) =>
  t.field({
    type: "Boolean",
    args: {
      queue: t.arg.string({ required: true }),
      jobId: t.arg.string({ required: true }),
    },
    resolve: async (_root, args, ctx) => {
      requireAdmin(ctx);
      const queue = queues[args.queue];
      if (!queue) throw new GraphQLError(`Queue "${args.queue}" not found`);
      const job = await queue.getJob(args.jobId);
      if (!job) throw new GraphQLError(`Job "${args.jobId}" not found`);
      await job.remove();
      return true;
    },
  }),
);

builder.mutationField("adminPauseQueue", (t) =>
  t.field({
    type: "Boolean",
    args: { queue: t.arg.string({ required: true }) },
    resolve: async (_root, args, ctx) => {
      requireAdmin(ctx);
      const queue = queues[args.queue];
      if (!queue) throw new GraphQLError(`Queue "${args.queue}" not found`);
      await queue.pause();
      return true;
    },
  }),
);

builder.mutationField("adminResumeQueue", (t) =>
  t.field({
    type: "Boolean",
    args: { queue: t.arg.string({ required: true }) },
    resolve: async (_root, args, ctx) => {
      requireAdmin(ctx);
      const queue = queues[args.queue];
      if (!queue) throw new GraphQLError(`Queue "${args.queue}" not found`);
      await queue.resume();
      return true;
    },
  }),
);
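`adminQueueJobs` pages with `limit`/`offset`, while BullMQ's `queue.getJobs` takes inclusive start and end indexes. A sketch of that mapping, mirroring the resolver's `queue.getJobs(statuses, offset, offset + limit - 1)` call and its defaults (`jobWindow` is an illustrative name, not part of the codebase):

```typescript
// Maps limit/offset paging onto BullMQ's inclusive [start, end] job window,
// with the resolver's defaults of limit 25 and offset 0.
function jobWindow(limit = 25, offset = 0): { start: number; end: number } {
  return { start: offset, end: offset + limit - 1 };
}
```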

View File

@@ -1,10 +1,11 @@
 import { GraphQLError } from "graphql";
-import { builder } from "../builder.js";
-import { RecordingType } from "../types/index.js";
-import { recordings, recording_plays } from "../../db/schema/index.js";
-import { eq, and, desc } from "drizzle-orm";
-import { slugify } from "../../lib/slugify.js";
-import { awardPoints, checkAchievements } from "../../lib/gamification.js";
+import { builder } from "../builder";
+import { RecordingType, AdminRecordingListType } from "../types/index";
+import { recordings, recording_plays } from "../../db/schema/index";
+import { eq, and, desc, ilike, count, type SQL } from "drizzle-orm";
+import { slugify } from "../../lib/slugify";
+import { requireAdmin } from "../../lib/acl";
+import { gamificationQueue } from "../../queues/index";
 
 builder.queryField("recordings", (t) =>
   t.field({
@@ -20,7 +21,7 @@ builder.queryField("recordings", (t) =>
       if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
 
       const conditions = [eq(recordings.user_id, ctx.currentUser.id)];
-      if (args.status) conditions.push(eq(recordings.status, args.status as any));
+      if (args.status) conditions.push(eq(recordings.status, args.status as "draft" | "published"));
       if (args.linkedVideoId) conditions.push(eq(recordings.linked_video, args.linkedVideoId));
 
       const limit = args.limit || 50;
@@ -114,17 +115,25 @@ builder.mutationField("createRecording", (t) =>
           user_id: ctx.currentUser.id,
           tags: args.tags || [],
           linked_video: args.linkedVideoId || null,
-          status: (args.status as any) || "draft",
+          status: (args.status as "draft" | "published") || "draft",
           public: false,
         })
         .returning();
 
       const recording = newRecording[0];
 
-      // Gamification: award points if published
       if (recording.status === "published") {
-        await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_CREATE", recording.id);
-        await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
+        await gamificationQueue.add("awardPoints", {
+          job: "awardPoints",
+          userId: ctx.currentUser.id,
+          action: "RECORDING_CREATE",
+          recordingId: recording.id,
+        });
+        await gamificationQueue.add("checkAchievements", {
+          job: "checkAchievements",
+          userId: ctx.currentUser.id,
+          category: "recordings",
+        });
       }
 
       return recording;
@@ -162,28 +171,61 @@ builder.mutationField("updateRecording", (t) =>
         updates.title = args.title;
         updates.slug = slugify(args.title);
       }
-      if (args.description !== null && args.description !== undefined) updates.description = args.description;
+      if (args.description !== null && args.description !== undefined)
+        updates.description = args.description;
       if (args.tags !== null && args.tags !== undefined) updates.tags = args.tags;
       if (args.status !== null && args.status !== undefined) updates.status = args.status;
       if (args.public !== null && args.public !== undefined) updates.public = args.public;
-      if (args.linkedVideoId !== null && args.linkedVideoId !== undefined) updates.linked_video = args.linkedVideoId;
+      if (args.linkedVideoId !== null && args.linkedVideoId !== undefined)
+        updates.linked_video = args.linkedVideoId;
 
       const updated = await ctx.db
         .update(recordings)
-        .set(updates as any)
+        .set(updates as Partial<typeof recordings.$inferInsert>)
         .where(eq(recordings.id, args.id))
         .returning();
 
       const recording = updated[0];
 
-      // Gamification: if newly published
       if (args.status === "published" && existing[0].status !== "published") {
-        await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_CREATE", recording.id);
-        await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
-      }
-      if (args.status === "published" && recording.featured && !existing[0].featured) {
-        await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_FEATURED", recording.id);
-        await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
+        // draft → published: award creation points
+        await gamificationQueue.add("awardPoints", {
+          job: "awardPoints",
+          userId: ctx.currentUser.id,
+          action: "RECORDING_CREATE",
+          recordingId: recording.id,
+        });
+        await gamificationQueue.add("checkAchievements", {
+          job: "checkAchievements",
+          userId: ctx.currentUser.id,
+          category: "recordings",
+        });
+      } else if (args.status === "draft" && existing[0].status === "published") {
+        // published → draft: revoke creation points
+        await gamificationQueue.add("revokePoints", {
+          job: "revokePoints",
+          userId: ctx.currentUser.id,
+          action: "RECORDING_CREATE",
+          recordingId: recording.id,
+        });
+        await gamificationQueue.add("checkAchievements", {
+          job: "checkAchievements",
+          userId: ctx.currentUser.id,
+          category: "recordings",
+        });
+      } else if (args.status === "published" && recording.featured && !existing[0].featured) {
+        // newly featured while published: award featured bonus
+        await gamificationQueue.add("awardPoints", {
+          job: "awardPoints",
+          userId: ctx.currentUser.id,
+          action: "RECORDING_FEATURED",
+          recordingId: recording.id,
+        });
+        await gamificationQueue.add("checkAchievements", {
+          job: "checkAchievements",
+          userId: ctx.currentUser.id,
+          category: "recordings",
+        });
       }
 
       return recording;
@@ -209,10 +251,29 @@ builder.mutationField("deleteRecording", (t) =>
       if (!existing[0]) throw new GraphQLError("Recording not found");
       if (existing[0].user_id !== ctx.currentUser.id) throw new GraphQLError("Forbidden");
 
-      await ctx.db
-        .update(recordings)
-        .set({ status: "archived", date_updated: new Date() })
-        .where(eq(recordings.id, args.id));
+      if (existing[0].status === "published") {
+        await gamificationQueue.add("revokePoints", {
+          job: "revokePoints",
+          userId: ctx.currentUser.id,
+          action: "RECORDING_CREATE",
+          recordingId: args.id,
+        });
+        if (existing[0].featured) {
+          await gamificationQueue.add("revokePoints", {
+            job: "revokePoints",
+            userId: ctx.currentUser.id,
+            action: "RECORDING_FEATURED",
+            recordingId: args.id,
+          });
+        }
+        await gamificationQueue.add("checkAchievements", {
+          job: "checkAchievements",
+          userId: ctx.currentUser.id,
+          category: "content",
+        });
+      }
+
+      await ctx.db.delete(recordings).where(eq(recordings.id, args.id));
 
       return true;
     },
@@ -288,10 +349,18 @@ builder.mutationField("recordRecordingPlay", (t) =>
         })
         .returning({ id: recording_plays.id });
 
-      // Gamification
       if (ctx.currentUser && recording[0].user_id !== ctx.currentUser.id) {
-        await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_PLAY", args.recordingId);
-        await checkAchievements(ctx.db, ctx.currentUser.id, "playback");
+        await gamificationQueue.add("awardPoints", {
+          job: "awardPoints",
+          userId: ctx.currentUser.id,
+          action: "RECORDING_PLAY",
+          recordingId: args.recordingId,
+        });
+        await gamificationQueue.add("checkAchievements", {
+          job: "checkAchievements",
+          userId: ctx.currentUser.id,
+          category: "playback",
+        });
       }
 
       return { success: true, play_id: play[0].id };
@@ -319,15 +388,77 @@ builder.mutationField("updateRecordingPlay", (t) =>
       await ctx.db
         .update(recording_plays)
-        .set({ duration_played: args.durationPlayed, completed: args.completed, date_updated: new Date() })
+        .set({
+          duration_played: args.durationPlayed,
+          completed: args.completed,
+          date_updated: new Date(),
+        })
         .where(eq(recording_plays.id, args.playId));
 
       if (args.completed && !wasCompleted && ctx.currentUser) {
-        await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_COMPLETE", existing[0].recording_id);
-        await checkAchievements(ctx.db, ctx.currentUser.id, "playback");
+        await gamificationQueue.add("awardPoints", {
+          job: "awardPoints",
+          userId: ctx.currentUser.id,
+          action: "RECORDING_COMPLETE",
+          recordingId: existing[0].recording_id,
+        });
+        await gamificationQueue.add("checkAchievements", {
+          job: "checkAchievements",
+          userId: ctx.currentUser.id,
+          category: "playback",
+        });
       }
 
       return true;
     },
   }),
 );
+
+builder.queryField("adminListRecordings", (t) =>
+  t.field({
+    type: AdminRecordingListType,
+    args: {
+      search: t.arg.string(),
+      status: t.arg.string(),
+      limit: t.arg.int(),
+      offset: t.arg.int(),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+
+      const limit = args.limit ?? 50;
+      const offset = args.offset ?? 0;
+
+      const conditions: SQL<unknown>[] = [];
+      if (args.search) conditions.push(ilike(recordings.title, `%${args.search}%`));
+      if (args.status) conditions.push(eq(recordings.status, args.status as "draft" | "published"));
+      const where = conditions.length > 0 ? and(...conditions) : undefined;
+
+      const [rows, totalRows] = await Promise.all([
+        ctx.db
+          .select()
+          .from(recordings)
+          .where(where)
+          .orderBy(desc(recordings.date_created))
+          .limit(limit)
+          .offset(offset),
+        ctx.db.select({ total: count() }).from(recordings).where(where),
+      ]);
+
+      return { items: rows, total: totalRows[0]?.total ?? 0 };
+    },
+  }),
+);
+
+builder.mutationField("adminDeleteRecording", (t) =>
+  t.field({
+    type: "Boolean",
+    args: {
+      id: t.arg.string({ required: true }),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      await ctx.db.delete(recordings).where(eq(recordings.id, args.id));
+      return true;
+    },
+  }),
+);
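The updateRecording branching above maps a status transition onto a set of queue jobs. The same logic can be summarized as a pure decision function — a sketch only; the real resolver also handles the featured-bonus branch and attaches `userId`/`recordingId` payloads:

```typescript
// Decision sketch for the updateRecording gamification branches: given the
// previous and next status, which jobs get enqueued. Illustrative helper,
// not part of the codebase.
type Transition = { job: string; action?: string; category?: string };

function jobsForStatusChange(prev: string, next: string): Transition[] {
  if (next === "published" && prev !== "published") {
    // draft → published: award creation points
    return [
      { job: "awardPoints", action: "RECORDING_CREATE" },
      { job: "checkAchievements", category: "recordings" },
    ];
  }
  if (next === "draft" && prev === "published") {
    // published → draft: revoke creation points
    return [
      { job: "revokePoints", action: "RECORDING_CREATE" },
      { job: "checkAchievements", category: "recordings" },
    ];
  }
  // no status change worth acting on
  return [];
}
```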

View File

@@ -1,6 +1,6 @@
-import { builder } from "../builder.js";
-import { StatsType } from "../types/index.js";
-import { users, videos } from "../../db/schema/index.js";
+import { builder } from "../builder";
+import { StatsType } from "../types/index";
+import { users, videos } from "../../db/schema/index";
 import { eq, count } from "drizzle-orm";
 
 builder.queryField("stats", (t) =>
@@ -15,9 +15,7 @@ builder.queryField("stats", (t) =>
         .select({ count: count() })
         .from(users)
         .where(eq(users.role, "viewer"));
-      const videosCount = await ctx.db
-        .select({ count: count() })
-        .from(videos);
+      const videosCount = await ctx.db.select({ count: count() }).from(videos);
 
       return {
         models_count: modelsCount[0]?.count || 0,

View File

@@ -1,8 +1,9 @@
 import { GraphQLError } from "graphql";
-import { builder } from "../builder.js";
-import { CurrentUserType, UserType } from "../types/index.js";
-import { users } from "../../db/schema/index.js";
-import { eq } from "drizzle-orm";
+import { builder } from "../builder";
+import { CurrentUserType, UserType, AdminUserListType, AdminUserDetailType } from "../types/index";
+import { users, user_photos, files } from "../../db/schema/index";
+import { eq, ilike, or, count, and, asc, type SQL } from "drizzle-orm";
+import { requireAdmin } from "../../lib/acl";
 
 builder.queryField("me", (t) =>
   t.field({
@@ -28,11 +29,7 @@ builder.queryField("userProfile", (t) =>
       id: t.arg.string({ required: true }),
     },
     resolve: async (_root, args, ctx) => {
-      const user = await ctx.db
-        .select()
-        .from(users)
-        .where(eq(users.id, args.id))
-        .limit(1);
+      const user = await ctx.db.select().from(users).where(eq(users.id, args.id)).limit(1);
       return user[0] || null;
     },
   }),
@@ -48,18 +45,26 @@ builder.mutationField("updateProfile", (t) =>
       artistName: t.arg.string(),
       description: t.arg.string(),
       tags: t.arg.stringList(),
+      avatar: t.arg.string(),
     },
     resolve: async (_root, args, ctx) => {
       if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
 
       const updates: Record<string, unknown> = { date_updated: new Date() };
-      if (args.firstName !== undefined && args.firstName !== null) updates.first_name = args.firstName;
+      if (args.firstName !== undefined && args.firstName !== null)
+        updates.first_name = args.firstName;
       if (args.lastName !== undefined && args.lastName !== null) updates.last_name = args.lastName;
-      if (args.artistName !== undefined && args.artistName !== null) updates.artist_name = args.artistName;
-      if (args.description !== undefined && args.description !== null) updates.description = args.description;
+      if (args.artistName !== undefined && args.artistName !== null)
+        updates.artist_name = args.artistName;
+      if (args.description !== undefined && args.description !== null)
+        updates.description = args.description;
       if (args.tags !== undefined && args.tags !== null) updates.tags = args.tags;
+      if (args.avatar !== undefined) updates.avatar = args.avatar;
 
-      await ctx.db.update(users).set(updates as any).where(eq(users.id, ctx.currentUser.id));
+      await ctx.db
+        .update(users)
+        .set(updates as Partial<typeof users.$inferInsert>)
+        .where(eq(users.id, ctx.currentUser.id));
 
       const updated = await ctx.db
         .select()
@@ -70,3 +75,163 @@ builder.mutationField("updateProfile", (t) =>
     },
   }),
 );
+
+// ─── Admin queries & mutations ────────────────────────────────────────────────
+
+builder.queryField("adminListUsers", (t) =>
+  t.field({
+    type: AdminUserListType,
+    args: {
+      role: t.arg.string(),
+      search: t.arg.string(),
+      limit: t.arg.int(),
+      offset: t.arg.int(),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+
+      const limit = args.limit ?? 50;
+      const offset = args.offset ?? 0;
+
+      const conditions: SQL<unknown>[] = [];
+      if (args.role) {
+        conditions.push(eq(users.role, args.role as "model" | "viewer" | "admin"));
+      }
+      if (args.search) {
+        const pattern = `%${args.search}%`;
+        conditions.push(
+          or(ilike(users.email, pattern), ilike(users.artist_name, pattern)) as SQL<unknown>,
+        );
+      }
+      const where = conditions.length > 0 ? and(...conditions) : undefined;
+
+      const [items, totalRows] = await Promise.all([
+        ctx.db
+          .select()
+          .from(users)
+          .where(where)
+          .orderBy(asc(users.artist_name))
+          .limit(limit)
+          .offset(offset),
+        ctx.db.select({ total: count() }).from(users).where(where),
+      ]);
+
+      return { items, total: totalRows[0]?.total ?? 0 };
+    },
+  }),
+);
+
+builder.mutationField("adminUpdateUser", (t) =>
+  t.field({
+    type: UserType,
+    nullable: true,
+    args: {
+      userId: t.arg.string({ required: true }),
+      role: t.arg.string(),
+      isAdmin: t.arg.boolean(),
+      firstName: t.arg.string(),
+      lastName: t.arg.string(),
+      artistName: t.arg.string(),
+      avatarId: t.arg.string(),
+      bannerId: t.arg.string(),
+      photoId: t.arg.string(),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+
+      const updates: Record<string, unknown> = { date_updated: new Date() };
+      if (args.role !== undefined && args.role !== null)
+        updates.role = args.role as "model" | "viewer" | "admin";
+      if (args.isAdmin !== undefined && args.isAdmin !== null) updates.is_admin = args.isAdmin;
+      if (args.firstName !== undefined && args.firstName !== null)
+        updates.first_name = args.firstName;
+      if (args.lastName !== undefined && args.lastName !== null) updates.last_name = args.lastName;
+      if (args.artistName !== undefined && args.artistName !== null)
+        updates.artist_name = args.artistName;
+      if (args.avatarId !== undefined && args.avatarId !== null) updates.avatar = args.avatarId;
+      if (args.bannerId !== undefined && args.bannerId !== null) updates.banner = args.bannerId;
+      if (args.photoId !== undefined && args.photoId !== null) updates.photo = args.photoId;
+
+      const updated = await ctx.db
+        .update(users)
+        .set(updates as Partial<typeof users.$inferInsert>)
+        .where(eq(users.id, args.userId))
+        .returning();
+
+      return updated[0] || null;
+    },
+  }),
+);
+
+builder.mutationField("adminDeleteUser", (t) =>
+  t.field({
+    type: "Boolean",
+    args: {
+      userId: t.arg.string({ required: true }),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      if (args.userId === ctx.currentUser!.id) throw new GraphQLError("Cannot delete yourself");
+      await ctx.db.delete(users).where(eq(users.id, args.userId));
+      return true;
+    },
+  }),
+);
+
+builder.queryField("adminGetUser", (t) =>
+  t.field({
+    type: AdminUserDetailType,
+    nullable: true,
+    args: {
+      userId: t.arg.string({ required: true }),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      const user = await ctx.db.select().from(users).where(eq(users.id, args.userId)).limit(1);
+      if (!user[0]) return null;
+
+      const photoRows = await ctx.db
+        .select({ id: files.id, filename: files.filename })
+        .from(user_photos)
+        .leftJoin(files, eq(user_photos.file_id, files.id))
+        .where(eq(user_photos.user_id, args.userId))
+        .orderBy(user_photos.sort);
+
+      const seen = new Set<string>();
+      const photos = photoRows
+        .filter((p) => p.id !== null && !seen.has(p.id!) && seen.add(p.id!))
+        .map((p) => ({ id: p.id!, filename: p.filename! }));
+
+      return { ...user[0], photos };
+    },
+  }),
+);
+
+builder.mutationField("adminAddUserPhoto", (t) =>
+  t.field({
+    type: "Boolean",
+    args: {
+      userId: t.arg.string({ required: true }),
+      fileId: t.arg.string({ required: true }),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      await ctx.db.insert(user_photos).values({ user_id: args.userId, file_id: args.fileId });
+      return true;
+    },
+  }),
+);
+
+builder.mutationField("adminRemoveUserPhoto", (t) =>
+  t.field({
+    type: "Boolean",
+    args: {
+      userId: t.arg.string({ required: true }),
+      fileId: t.arg.string({ required: true }),
+    },
+    resolve: async (_root, args, ctx) => {
+      requireAdmin(ctx);
+      await ctx.db
+        .delete(user_photos)
+        .where(and(eq(user_photos.user_id, args.userId), eq(user_photos.file_id, args.fileId)));
+      return true;
+    },
+  }),
+);

View File

@@ -1,10 +1,39 @@
import { GraphQLError } from "graphql"; import { GraphQLError } from "graphql";
import { builder } from "../builder.js"; import { builder } from "../builder";
import { VideoType, VideoLikeResponseType, VideoPlayResponseType, VideoLikeStatusType } from "../types/index.js"; import {
import { videos, video_models, video_likes, video_plays, users, files } from "../../db/schema/index.js"; VideoType,
import { eq, and, lte, desc, inArray, count } from "drizzle-orm"; VideoListType,
AdminVideoListType,
VideoLikeResponseType,
VideoPlayResponseType,
VideoLikeStatusType,
} from "../types/index";
import {
videos,
video_models,
video_likes,
video_plays,
users,
files,
} from "../../db/schema/index";
import {
eq,
and,
lte,
desc,
asc,
inArray,
count,
ilike,
lt,
gte,
arrayContains,
type SQL,
} from "drizzle-orm";
import { requireAdmin } from "../../lib/acl";
import type { DB } from "../../db/connection";
async function enrichVideo(db: any, video: any) { async function enrichVideo(db: DB, video: typeof videos.$inferSelect) {
// Fetch models // Fetch models
const modelRows = await db const modelRows = await db
.select({ .select({
@@ -12,6 +41,7 @@ async function enrichVideo(db: any, video: any) {
artist_name: users.artist_name, artist_name: users.artist_name,
slug: users.slug, slug: users.slug,
avatar: users.avatar, avatar: users.avatar,
description: users.description,
}) })
.from(video_models) .from(video_models)
.leftJoin(users, eq(video_models.user_id, users.id)) .leftJoin(users, eq(video_models.user_id, users.id))
@@ -25,12 +55,28 @@ async function enrichVideo(db: any, video: any) {
} }
// Count likes // Count likes
const likesCount = await db.select({ count: count() }).from(video_likes).where(eq(video_likes.video_id, video.id)); const likesCount = await db
const playsCount = await db.select({ count: count() }).from(video_plays).where(eq(video_plays.video_id, video.id)); .select({ count: count() })
.from(video_likes)
.where(eq(video_likes.video_id, video.id));
const playsCount = await db
.select({ count: count() })
.from(video_plays)
.where(eq(video_plays.video_id, video.id));
const models = modelRows
.filter((m) => m.id !== null)
.map((m) => ({
id: m.id!,
artist_name: m.artist_name,
slug: m.slug,
avatar: m.avatar,
description: m.description,
}));
return { return {
...video, ...video,
models: modelRows, models,
movie_file: movieFile, movie_file: movieFile,
likes_count: likesCount[0]?.count || 0, likes_count: likesCount[0]?.count || 0,
plays_count: playsCount[0]?.count || 0, plays_count: playsCount[0]?.count || 0,
@@ -39,55 +85,93 @@ async function enrichVideo(db: any, video: any) {
builder.queryField("videos", (t) => builder.queryField("videos", (t) =>
t.field({ t.field({
type: [VideoType], type: VideoListType,
args: { args: {
modelId: t.arg.string(), modelId: t.arg.string(),
featured: t.arg.boolean(), featured: t.arg.boolean(),
limit: t.arg.int(), limit: t.arg.int(),
search: t.arg.string(),
offset: t.arg.int(),
sortBy: t.arg.string(),
duration: t.arg.string(),
tag: t.arg.string(),
}, },
resolve: async (_root, args, ctx) => { resolve: async (_root, args, ctx) => {
let query = ctx.db const pageSize = args.limit ?? 24;
.select({ v: videos }) const offset = args.offset ?? 0;
.from(videos)
.where(lte(videos.upload_date, new Date()))
.orderBy(desc(videos.upload_date));
const conditions: SQL<unknown>[] = [lte(videos.upload_date, new Date())];
if (!ctx.currentUser) conditions.push(eq(videos.premium, false));
if (args.featured !== null && args.featured !== undefined) {
conditions.push(eq(videos.featured, args.featured));
}
if (args.search) {
conditions.push(ilike(videos.title, `%${args.search}%`));
}
if (args.tag) {
conditions.push(arrayContains(videos.tags, [args.tag]));
}
if (args.modelId) { if (args.modelId) {
const videoIds = await ctx.db const videoIds = await ctx.db
.select({ video_id: video_models.video_id }) .select({ video_id: video_models.video_id })
.from(video_models) .from(video_models)
.where(eq(video_models.user_id, args.modelId)); .where(eq(video_models.user_id, args.modelId));
if (videoIds.length === 0) return { items: [], total: 0 };
if (videoIds.length === 0) return []; conditions.push(
inArray(
query = ctx.db videos.id,
.select({ v: videos }) videoIds.map((v) => v.video_id),
.from(videos) ),
.where(and( );
lte(videos.upload_date, new Date()),
inArray(videos.id, videoIds.map((v: any) => v.video_id)),
))
.orderBy(desc(videos.upload_date));
} }
if (args.featured !== null && args.featured !== undefined) { const order =
query = ctx.db args.sortBy === "most_liked"
.select({ v: videos }) ? desc(videos.likes_count)
.from(videos) : args.sortBy === "most_played"
.where(and( ? desc(videos.plays_count)
lte(videos.upload_date, new Date()), : args.sortBy === "name"
eq(videos.featured, args.featured), ? asc(videos.title)
)) : desc(videos.upload_date);
.orderBy(desc(videos.upload_date));
const where = and(...conditions);
// Duration filter requires JOIN to files table
if (args.duration && args.duration !== "all") {
const durationCond =
args.duration === "short"
? lt(files.duration, 600)
: args.duration === "medium"
? and(gte(files.duration, 600), lt(files.duration, 1200))
: gte(files.duration, 1200);
const fullWhere = and(where, durationCond);
const [rows, totalRows] = await Promise.all([
ctx.db
.select({ v: videos })
.from(videos)
.leftJoin(files, eq(videos.movie, files.id))
.where(fullWhere)
.orderBy(order)
.limit(pageSize)
.offset(offset),
ctx.db
.select({ total: count() })
.from(videos)
.leftJoin(files, eq(videos.movie, files.id))
.where(fullWhere),
]);
const videoList = rows.map((r) => r.v);
const items = await Promise.all(videoList.map((v) => enrichVideo(ctx.db, v)));
return { items, total: totalRows[0]?.total ?? 0 };
}

const [rows, totalRows] = await Promise.all([
ctx.db.select().from(videos).where(where).orderBy(order).limit(pageSize).offset(offset),
ctx.db.select({ total: count() }).from(videos).where(where),
]);
const items = await Promise.all(rows.map((v) => enrichVideo(ctx.db, v)));
return { items, total: totalRows[0]?.total ?? 0 };
},
}),
);
@@ -107,6 +191,27 @@ builder.queryField("video", (t) =>
.limit(1);
if (!video[0]) return null;
if (video[0].premium && !ctx.currentUser) {
throw new GraphQLError("Unauthorized");
}
return enrichVideo(ctx.db, video[0]);
},
}),
);
builder.queryField("adminGetVideo", (t) =>
t.field({
type: VideoType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const video = await ctx.db.select().from(videos).where(eq(videos.id, args.id)).limit(1);
if (!video[0]) return null;
return enrichVideo(ctx.db, video[0]);
},
}),
@@ -123,7 +228,9 @@ builder.queryField("videoLikeStatus", (t) =>
const existing = await ctx.db
.select()
.from(video_likes)
.where(
and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)),
)
.limit(1);
return { liked: existing.length > 0 };
},
@@ -142,7 +249,9 @@ builder.mutationField("likeVideo", (t) =>
const existing = await ctx.db
.select()
.from(video_likes)
.where(
and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)),
)
.limit(1);
if (existing.length > 0) throw new GraphQLError("Already liked");
@@ -154,10 +263,22 @@ builder.mutationField("likeVideo", (t) =>
await ctx.db
.update(videos)
.set({
likes_count:
((
await ctx.db
.select({ c: videos.likes_count })
.from(videos)
.where(eq(videos.id, args.videoId))
.limit(1)
)[0]?.c as number) + 1 || 1,
})
.where(eq(videos.id, args.videoId));

const likesCount = await ctx.db
.select({ count: count() })
.from(video_likes)
.where(eq(video_likes.video_id, args.videoId));
return { liked: true, likes_count: likesCount[0]?.count || 1 };
},
}),
@@ -175,21 +296,39 @@ builder.mutationField("unlikeVideo", (t) =>
const existing = await ctx.db
.select()
.from(video_likes)
.where(
and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)),
)
.limit(1);
if (existing.length === 0) throw new GraphQLError("Not liked");

await ctx.db
.delete(video_likes)
.where(
and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)),
);

await ctx.db
.update(videos)
.set({
likes_count: Math.max(
(((
await ctx.db
.select({ c: videos.likes_count })
.from(videos)
.where(eq(videos.id, args.videoId))
.limit(1)
)[0]?.c as number) || 1) - 1,
0,
),
})
.where(eq(videos.id, args.videoId));

const likesCount = await ctx.db
.select({ count: count() })
.from(video_likes)
.where(eq(video_likes.video_id, args.videoId));
return { liked: false, likes_count: likesCount[0]?.count || 0 };
},
}),
@@ -203,13 +342,19 @@ builder.mutationField("recordVideoPlay", (t) =>
sessionId: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
const play = await ctx.db
.insert(video_plays)
.values({
video_id: args.videoId,
user_id: ctx.currentUser?.id || null,
session_id: args.sessionId || null,
})
.returning({ id: video_plays.id });

const playsCount = await ctx.db
.select({ count: count() })
.from(video_plays)
.where(eq(video_plays.video_id, args.videoId));

await ctx.db
.update(videos)
@@ -235,9 +380,26 @@ builder.mutationField("updateVideoPlay", (t) =>
completed: t.arg.boolean({ required: true }),
},
resolve: async (_root, args, ctx) => {
const play = await ctx.db
.select()
.from(video_plays)
.where(eq(video_plays.id, args.playId))
.limit(1);
if (!play[0]) return false;
// If play belongs to a user, verify ownership
if (play[0].user_id && (!ctx.currentUser || play[0].user_id !== ctx.currentUser.id)) {
throw new GraphQLError("Forbidden");
}
await ctx.db
.update(video_plays)
.set({
duration_watched: args.durationWatched,
completed: args.completed,
date_updated: new Date(),
})
.where(eq(video_plays.id, args.playId));
return true;
},
@@ -262,25 +424,38 @@ builder.queryField("analytics", (t) =>
.where(eq(video_models.user_id, userId));

if (modelVideoIds.length === 0) {
return {
total_videos: 0,
total_likes: 0,
total_plays: 0,
plays_by_date: {},
likes_by_date: {},
videos: [],
};
}

const videoIds = modelVideoIds.map((v) => v.video_id);
const videoList = await ctx.db.select().from(videos).where(inArray(videos.id, videoIds));
const plays = await ctx.db
.select()
.from(video_plays)
.where(inArray(video_plays.video_id, videoIds));
const likes = await ctx.db
.select()
.from(video_likes)
.where(inArray(video_likes.video_id, videoIds));

const totalLikes = videoList.reduce((sum, v) => sum + (v.likes_count || 0), 0);
const totalPlays = videoList.reduce((sum, v) => sum + (v.plays_count || 0), 0);

const playsByDate = plays.reduce((acc: Record<string, number>, play) => {
const date = new Date(play.date_created).toISOString().split("T")[0];
if (!acc[date]) acc[date] = 0;
acc[date]++;
return acc;
}, {});

const likesByDate = likes.reduce((acc: Record<string, number>, like) => {
const date = new Date(like.date_created).toISOString().split("T")[0];
if (!acc[date]) acc[date] = 0;
acc[date]++;
@@ -290,9 +465,10 @@ builder.queryField("analytics", (t) =>
const videoAnalytics = videoList.map((video) => {
const vPlays = plays.filter((p) => p.video_id === video.id);
const completedPlays = vPlays.filter((p) => p.completed).length;
const avgWatchTime =
vPlays.length > 0
? vPlays.reduce((sum, p) => sum + (p.duration_watched || 0), 0) / vPlays.length
: 0;

return {
id: video.id,
@@ -318,3 +494,157 @@ builder.queryField("analytics", (t) =>
},
}),
);
// ─── Admin queries & mutations ────────────────────────────────────────────────
builder.queryField("adminListVideos", (t) =>
t.field({
type: AdminVideoListType,
args: {
search: t.arg.string(),
premium: t.arg.boolean(),
featured: t.arg.boolean(),
limit: t.arg.int(),
offset: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const limit = args.limit ?? 50;
const offset = args.offset ?? 0;
const conditions: SQL<unknown>[] = [];
if (args.search) conditions.push(ilike(videos.title, `%${args.search}%`));
if (args.premium !== null && args.premium !== undefined)
conditions.push(eq(videos.premium, args.premium));
if (args.featured !== null && args.featured !== undefined)
conditions.push(eq(videos.featured, args.featured));
const where = conditions.length > 0 ? and(...conditions) : undefined;
const [rows, totalRows] = await Promise.all([
ctx.db
.select()
.from(videos)
.where(where)
.orderBy(desc(videos.upload_date))
.limit(limit)
.offset(offset),
ctx.db.select({ total: count() }).from(videos).where(where),
]);
const items = await Promise.all(rows.map((v) => enrichVideo(ctx.db, v)));
return { items, total: totalRows[0]?.total ?? 0 };
},
}),
);
builder.mutationField("createVideo", (t) =>
t.field({
type: VideoType,
args: {
title: t.arg.string({ required: true }),
slug: t.arg.string({ required: true }),
description: t.arg.string(),
imageId: t.arg.string(),
movieId: t.arg.string(),
tags: t.arg.stringList(),
premium: t.arg.boolean(),
featured: t.arg.boolean(),
uploadDate: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const inserted = await ctx.db
.insert(videos)
.values({
title: args.title,
slug: args.slug,
description: args.description || null,
image: args.imageId || null,
movie: args.movieId || null,
tags: args.tags || [],
premium: args.premium ?? false,
featured: args.featured ?? false,
upload_date: args.uploadDate ? new Date(args.uploadDate) : new Date(),
})
.returning();
return enrichVideo(ctx.db, inserted[0]);
},
}),
);
builder.mutationField("updateVideo", (t) =>
t.field({
type: VideoType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
title: t.arg.string(),
slug: t.arg.string(),
description: t.arg.string(),
imageId: t.arg.string(),
movieId: t.arg.string(),
tags: t.arg.stringList(),
premium: t.arg.boolean(),
featured: t.arg.boolean(),
uploadDate: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const updates: Record<string, unknown> = {};
if (args.title !== undefined && args.title !== null) updates.title = args.title;
if (args.slug !== undefined && args.slug !== null) updates.slug = args.slug;
if (args.description !== undefined) updates.description = args.description;
if (args.imageId !== undefined) updates.image = args.imageId;
if (args.movieId !== undefined) updates.movie = args.movieId;
if (args.tags !== undefined && args.tags !== null) updates.tags = args.tags;
if (args.premium !== undefined && args.premium !== null) updates.premium = args.premium;
if (args.featured !== undefined && args.featured !== null) updates.featured = args.featured;
if (args.uploadDate !== undefined && args.uploadDate !== null)
updates.upload_date = new Date(args.uploadDate);
const updated = await ctx.db
.update(videos)
.set(updates as Partial<typeof videos.$inferInsert>)
.where(eq(videos.id, args.id))
.returning();
if (!updated[0]) return null;
return enrichVideo(ctx.db, updated[0]);
},
}),
);
builder.mutationField("deleteVideo", (t) =>
t.field({
type: "Boolean",
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
await ctx.db.delete(videos).where(eq(videos.id, args.id));
return true;
},
}),
);
builder.mutationField("setVideoModels", (t) =>
t.field({
type: "Boolean",
args: {
videoId: t.arg.string({ required: true }),
userIds: t.arg.stringList({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
await ctx.db.delete(video_models).where(eq(video_models.video_id, args.videoId));
if (args.userIds.length > 0) {
await ctx.db.insert(video_models).values(
args.userIds.map((userId) => ({
video_id: args.videoId,
user_id: userId,
})),
);
}
return true;
},
}),
);


@@ -1,17 +1,33 @@
import type {
MediaFile,
User,
VideoModel,
VideoFile,
Video,
ModelPhoto,
Model,
Article,
CommentUser,
Comment,
Stats,
Recording,
VideoLikeStatus,
VideoLikeResponse,
VideoPlayResponse,
VideoAnalytics,
Analytics,
LeaderboardEntry,
UserStats,
UserAchievement,
RecentPoint,
UserGamification,
Achievement,
} from "@sexy.pivoine.art/types";
type AdminUserDetail = User & { photos: ModelPhoto[] };

import { builder } from "../builder";

export const FileType = builder.objectRef<MediaFile>("File").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title", { nullable: true }),
@@ -25,22 +41,7 @@ export const FileType = builder.objectRef<{
}),
});
export const UserType = builder.objectRef<User>("User").implement({
fields: (t) => ({
id: t.exposeString("id"),
email: t.exposeString("email"),
@@ -51,29 +52,17 @@ export const UserType = builder.objectRef<{
description: t.exposeString("description", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
role: t.exposeString("role"),
is_admin: t.exposeBoolean("is_admin"),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
photo: t.exposeString("photo", { nullable: true }),
email_verified: t.exposeBoolean("email_verified"),
date_created: t.expose("date_created", { type: "DateTime" }),
}),
});

// CurrentUser is the same shape as User
export const CurrentUserType = builder.objectRef<User>("CurrentUser").implement({
fields: (t) => ({
id: t.exposeString("id"),
email: t.exposeString("email"),
@@ -84,30 +73,35 @@ export const CurrentUserType = builder.objectRef<{
description: t.exposeString("description", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
role: t.exposeString("role"),
is_admin: t.exposeBoolean("is_admin"),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
photo: t.exposeString("photo", { nullable: true }),
email_verified: t.exposeBoolean("email_verified"),
date_created: t.expose("date_created", { type: "DateTime" }),
}),
});
export const VideoModelType = builder.objectRef<VideoModel>("VideoModel").implement({
fields: (t) => ({
id: t.exposeString("id"),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
}),
});

export const VideoFileType = builder.objectRef<VideoFile>("VideoFile").implement({
fields: (t) => ({
id: t.exposeString("id"),
filename: t.exposeString("filename"),
mime_type: t.exposeString("mime_type", { nullable: true }),
duration: t.exposeInt("duration", { nullable: true }),
}),
});
export const VideoType = builder.objectRef<Video>("Video").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug"),
@@ -126,46 +120,14 @@ export const VideoType = builder.objectRef<{
}),
});
export const ModelPhotoType = builder.objectRef<ModelPhoto>("ModelPhoto").implement({
fields: (t) => ({
id: t.exposeString("id"),
filename: t.exposeString("filename"),
}),
});
export const ModelType = builder.objectRef<Model>("Model").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug", { nullable: true }),
@@ -173,36 +135,14 @@ export const ModelType = builder.objectRef<{
description: t.exposeString("description", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
photo: t.exposeString("photo", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
date_created: t.expose("date_created", { type: "DateTime" }),
photos: t.expose("photos", { type: [ModelPhotoType], nullable: true }),
}),
});
export const ArticleType = builder.objectRef<Article>("Article").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug"),
@@ -214,42 +154,41 @@ export const ArticleType = builder.objectRef<{
publish_date: t.expose("publish_date", { type: "DateTime" }),
category: t.exposeString("category", { nullable: true }),
featured: t.exposeBoolean("featured", { nullable: true }),
author: t.expose("author", { type: VideoModelType, nullable: true }),
}),
});
export const CommentUserType = builder.objectRef<CommentUser>("CommentUser").implement({
fields: (t) => ({
id: t.exposeString("id"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
}),
});
export const CommentType = builder.objectRef<Comment>("Comment").implement({
fields: (t) => ({
id: t.exposeInt("id"),
collection: t.exposeString("collection"),
item_id: t.exposeString("item_id"),
comment: t.exposeString("comment"),
user_id: t.exposeString("user_id"),
date_created: t.expose("date_created", { type: "DateTime" }),
user: t.expose("user", { type: CommentUserType, nullable: true }),
}),
});

export const StatsType = builder.objectRef<Stats>("Stats").implement({
fields: (t) => ({
videos_count: t.exposeInt("videos_count"),
models_count: t.exposeInt("models_count"),
viewers_count: t.exposeInt("viewers_count"),
}),
});

export const RecordingType = builder.objectRef<Recording>("Recording").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title"),
@@ -269,237 +208,32 @@ export const RecordingType = builder.objectRef<{
}),
});
export const VideoLikeResponseType = builder
.objectRef<VideoLikeResponse>("VideoLikeResponse")
.implement({
fields: (t) => ({
liked: t.exposeBoolean("liked"),
likes_count: t.exposeInt("likes_count"),
}),
});

export const VideoPlayResponseType = builder
.objectRef<VideoPlayResponse>("VideoPlayResponse")
.implement({
fields: (t) => ({
success: t.exposeBoolean("success"),
play_id: t.exposeString("play_id"),
plays_count: t.exposeInt("plays_count"),
}),
});

export const VideoLikeStatusType = builder.objectRef<VideoLikeStatus>("VideoLikeStatus").implement({
fields: (t) => ({
liked: t.exposeBoolean("liked"),
}),
});
export const VideoAnalyticsType = builder.objectRef<VideoAnalytics>("VideoAnalytics").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title"),
@@ -513,33 +247,249 @@ export const VideoAnalyticsType = builder.objectRef<{
}),
});
// Response types
export const VideoLikeResponseType = builder.objectRef<{
liked: boolean;
likes_count: number;
}>("VideoLikeResponse").implement({
fields: (t) => ({
liked: t.exposeBoolean("liked"),
likes_count: t.exposeInt("likes_count"),
}),
});
export const VideoPlayResponseType = builder.objectRef<{
success: boolean;
play_id: string;
plays_count: number;
}>("VideoPlayResponse").implement({
fields: (t) => ({
success: t.exposeBoolean("success"),
play_id: t.exposeString("play_id"),
plays_count: t.exposeInt("plays_count"),
}),
});
export const VideoLikeStatusType = builder.objectRef<{
liked: boolean;
}>("VideoLikeStatus").implement({
fields: (t) => ({
liked: t.exposeBoolean("liked"),
}),
});
export const AnalyticsType = builder.objectRef<Analytics>("Analytics").implement({
fields: (t) => ({
total_videos: t.exposeInt("total_videos"),
total_likes: t.exposeInt("total_likes"),
total_plays: t.exposeInt("total_plays"),
plays_by_date: t.expose("plays_by_date", { type: "JSON" }),
likes_by_date: t.expose("likes_by_date", { type: "JSON" }),
videos: t.expose("videos", { type: [VideoAnalyticsType] }),
}),
});
export const LeaderboardEntryType = builder
.objectRef<LeaderboardEntry>("LeaderboardEntry")
.implement({
fields: (t) => ({
user_id: t.exposeString("user_id"),
display_name: t.exposeString("display_name", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
total_weighted_points: t.exposeFloat("total_weighted_points", { nullable: true }),
total_raw_points: t.exposeInt("total_raw_points", { nullable: true }),
recordings_count: t.exposeInt("recordings_count", { nullable: true }),
playbacks_count: t.exposeInt("playbacks_count", { nullable: true }),
achievements_count: t.exposeInt("achievements_count", { nullable: true }),
rank: t.exposeInt("rank"),
}),
});
export const UserStatsType = builder.objectRef<UserStats>("UserStats").implement({
fields: (t) => ({
user_id: t.exposeString("user_id"),
total_raw_points: t.exposeInt("total_raw_points", { nullable: true }),
total_weighted_points: t.exposeFloat("total_weighted_points", { nullable: true }),
recordings_count: t.exposeInt("recordings_count", { nullable: true }),
playbacks_count: t.exposeInt("playbacks_count", { nullable: true }),
comments_count: t.exposeInt("comments_count", { nullable: true }),
achievements_count: t.exposeInt("achievements_count", { nullable: true }),
rank: t.exposeInt("rank"),
}),
});
export const UserAchievementType = builder.objectRef<UserAchievement>("UserAchievement").implement({
fields: (t) => ({
id: t.exposeString("id"),
code: t.exposeString("code"),
name: t.exposeString("name"),
description: t.exposeString("description", { nullable: true }),
icon: t.exposeString("icon", { nullable: true }),
category: t.exposeString("category", { nullable: true }),
date_unlocked: t.expose("date_unlocked", { type: "DateTime" }),
progress: t.exposeInt("progress", { nullable: true }),
required_count: t.exposeInt("required_count"),
}),
});
export const RecentPointType = builder.objectRef<RecentPoint>("RecentPoint").implement({
fields: (t) => ({
action: t.exposeString("action"),
points: t.exposeInt("points"),
date_created: t.expose("date_created", { type: "DateTime" }),
recording_id: t.exposeString("recording_id", { nullable: true }),
}),
});
export const UserGamificationType = builder
.objectRef<UserGamification>("UserGamification")
.implement({
fields: (t) => ({
stats: t.expose("stats", { type: UserStatsType, nullable: true }),
achievements: t.expose("achievements", { type: [UserAchievementType] }),
recent_points: t.expose("recent_points", { type: [RecentPointType] }),
}),
});
export const AchievementType = builder.objectRef<Achievement>("Achievement").implement({
fields: (t) => ({
id: t.exposeString("id"),
code: t.exposeString("code"),
name: t.exposeString("name"),
description: t.exposeString("description", { nullable: true }),
icon: t.exposeString("icon", { nullable: true }),
category: t.exposeString("category", { nullable: true }),
required_count: t.exposeInt("required_count"),
points_reward: t.exposeInt("points_reward"),
}),
});
// --- Queue / Job types (admin only, not in shared types package) ---
type JobCounts = {
waiting: number;
active: number;
completed: number;
failed: number;
delayed: number;
paused: number;
};
type JobData = {
id: string;
name: string;
queue: string;
status: string;
data: unknown;
result: unknown;
failedReason: string | null;
attemptsMade: number;
createdAt: Date;
processedAt: Date | null;
finishedAt: Date | null;
progress: number | null;
};
type QueueInfoData = {
name: string;
counts: JobCounts;
isPaused: boolean;
};
export const JobCountsType = builder.objectRef<JobCounts>("JobCounts").implement({
fields: (t) => ({
waiting: t.exposeInt("waiting"),
active: t.exposeInt("active"),
completed: t.exposeInt("completed"),
failed: t.exposeInt("failed"),
delayed: t.exposeInt("delayed"),
paused: t.exposeInt("paused"),
}),
});
export const JobType = builder.objectRef<JobData>("Job").implement({
fields: (t) => ({
id: t.exposeString("id"),
name: t.exposeString("name"),
queue: t.exposeString("queue"),
status: t.exposeString("status"),
data: t.expose("data", { type: "JSON" }),
result: t.expose("result", { type: "JSON", nullable: true }),
failedReason: t.exposeString("failedReason", { nullable: true }),
attemptsMade: t.exposeInt("attemptsMade"),
createdAt: t.expose("createdAt", { type: "DateTime" }),
processedAt: t.expose("processedAt", { type: "DateTime", nullable: true }),
finishedAt: t.expose("finishedAt", { type: "DateTime", nullable: true }),
progress: t.exposeFloat("progress", { nullable: true }),
}),
});
export const QueueInfoType = builder.objectRef<QueueInfoData>("QueueInfo").implement({
fields: (t) => ({
name: t.exposeString("name"),
counts: t.expose("counts", { type: JobCountsType }),
isPaused: t.exposeBoolean("isPaused"),
}),
});
export const VideoListType = builder
.objectRef<{ items: Video[]; total: number }>("VideoList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [VideoType] }),
total: t.exposeInt("total"),
}),
});
export const ArticleListType = builder
.objectRef<{ items: Article[]; total: number }>("ArticleList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [ArticleType] }),
total: t.exposeInt("total"),
}),
});
export const ModelListType = builder
.objectRef<{ items: Model[]; total: number }>("ModelList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [ModelType] }),
total: t.exposeInt("total"),
}),
});
export const AdminVideoListType = builder
.objectRef<{ items: Video[]; total: number }>("AdminVideoList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [VideoType] }),
total: t.exposeInt("total"),
}),
});
export const AdminArticleListType = builder
.objectRef<{ items: Article[]; total: number }>("AdminArticleList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [ArticleType] }),
total: t.exposeInt("total"),
}),
});
export const AdminCommentListType = builder
.objectRef<{ items: Comment[]; total: number }>("AdminCommentList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [CommentType] }),
total: t.exposeInt("total"),
}),
});
export const AdminRecordingListType = builder
.objectRef<{ items: Recording[]; total: number }>("AdminRecordingList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [RecordingType] }),
total: t.exposeInt("total"),
}),
});
export const AdminUserListType = builder
.objectRef<{ items: User[]; total: number }>("AdminUserList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [UserType] }),
total: t.exposeInt("total"),
}),
});
export const AdminUserDetailType = builder.objectRef<AdminUserDetail>("AdminUserDetail").implement({
fields: (t) => ({
id: t.exposeString("id"),
email: t.exposeString("email"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
role: t.exposeString("role"),
is_admin: t.exposeBoolean("is_admin"),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
photo: t.exposeString("photo", { nullable: true }),
email_verified: t.exposeBoolean("email_verified"),
date_created: t.expose("date_created", { type: "DateTime" }),
photos: t.expose("photos", { type: [ModelPhotoType] }),
}),
});


@@ -1,87 +1,203 @@
import Fastify from "fastify";
import fastifyCookie from "@fastify/cookie";
import fastifyCors from "@fastify/cors";
import fastifyMultipart from "@fastify/multipart";
import fastifyStatic from "@fastify/static";
import { createYoga } from "graphql-yoga";
import path from "path";
import { schema } from "./graphql/index.js";
import { buildContext } from "./graphql/context.js";
import { db } from "./db/connection.js";
import { redis } from "./lib/auth.js";

const PORT = parseInt(process.env.PORT || "4000");
const UPLOAD_DIR = process.env.UPLOAD_DIR || "/data/uploads";
const CORS_ORIGIN = process.env.CORS_ORIGIN || "http://localhost:3000";

const fastify = Fastify({
logger: {
level: process.env.LOG_LEVEL || "info",
},
});

await fastify.register(fastifyCookie, {
secret: process.env.COOKIE_SECRET || "change-me-in-production",
});

await fastify.register(fastifyCors, {
origin: CORS_ORIGIN,
credentials: true,
methods: ["GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS"],
});

await fastify.register(fastifyMultipart, {
limits: {
fileSize: 5 * 1024 * 1024 * 1024, // 5 GB
},
});

await fastify.register(fastifyStatic, {
root: path.resolve(UPLOAD_DIR),
prefix: "/assets/",
decorateReply: false,
});

const yoga = createYoga({
schema,
context: buildContext,
graphqlEndpoint: "/graphql",
healthCheckEndpoint: "/health",
logging: {
debug: (...args) => fastify.log.debug(...args),
info: (...args) => fastify.log.info(...args),
warn: (...args) => fastify.log.warn(...args),
error: (...args) => fastify.log.error(...args),
},
});

fastify.route({
url: "/graphql",
method: ["GET", "POST", "OPTIONS"],
handler: async (request, reply) => {
const response = await yoga.handleNodeRequestAndResponse(request, reply, {
request,
reply,
db,
redis,
});
reply.status(response.status);
for (const [key, value] of response.headers.entries()) {
reply.header(key, value);
}
return reply.send(response.body);
},
});

fastify.get("/health", async (_request, reply) => {
return reply.send({ status: "ok", timestamp: new Date().toISOString() });
});

try {
await fastify.listen({ port: PORT, host: "0.0.0.0" });
fastify.log.info(`Backend running at http://0.0.0.0:${PORT}`);
fastify.log.info(`GraphQL at http://0.0.0.0:${PORT}/graphql`);
} catch (err) {
fastify.log.error(err);
process.exit(1);
}

import Fastify, { type FastifyRequest, type FastifyReply } from "fastify";
import fastifyCookie from "@fastify/cookie";
import fastifyCors from "@fastify/cors";
import fastifyMultipart from "@fastify/multipart";
import fastifyStatic from "@fastify/static";
import { createYoga } from "graphql-yoga";
import { eq } from "drizzle-orm";
import { files } from "./db/schema/index";
import path from "path";
import { existsSync, mkdirSync } from "fs";
import { writeFile, rm } from "fs/promises";
import sharp from "sharp";
import { schema } from "./graphql/index";
import { buildContext } from "./graphql/context";
import { db } from "./db/connection";
import { redis } from "./lib/auth";
import { logger } from "./lib/logger";
import { migrate } from "drizzle-orm/node-postgres/migrator";
import { startMailWorker } from "./queues/workers/mail";
import { startGamificationWorker } from "./queues/workers/gamification";

const PORT = parseInt(process.env.PORT || "4000");
const UPLOAD_DIR = process.env.UPLOAD_DIR || "/data/uploads";
const CORS_ORIGIN = process.env.CORS_ORIGIN || "http://localhost:3000";

async function main() {
// Run pending DB migrations before starting the server
const migrationsFolder = path.join(__dirname, "migrations");
logger.info(`Running migrations from ${migrationsFolder}`);
await migrate(db, { migrationsFolder });
logger.info("Migrations complete");

// Start background workers
startMailWorker();
startGamificationWorker();
logger.info("Queue workers started");

const fastify = Fastify({ loggerInstance: logger });

await fastify.register(fastifyCookie, {
secret: process.env.COOKIE_SECRET || "change-me-in-production",
});

await fastify.register(fastifyCors, {
origin: CORS_ORIGIN,
credentials: true,
methods: ["GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS"],
});

await fastify.register(fastifyMultipart, {
limits: {
fileSize: 5 * 1024 * 1024 * 1024, // 5 GB
},
});

// fastify-static provides reply.sendFile(); files are stored as <UPLOAD_DIR>/<id>/<filename>
await fastify.register(fastifyStatic, {
root: path.resolve(UPLOAD_DIR),
prefix: "/assets/",
serve: false, // disable auto-serving; we use a custom route below
decorateReply: true,
});

const yoga = createYoga<{
req: FastifyRequest;
reply: FastifyReply;
db: typeof db;
redis: typeof redis;
}>({
schema,
context: buildContext,
graphqlEndpoint: "/graphql",
healthCheckEndpoint: "/health",
logging: {
debug: (...args) => args.forEach((arg) => fastify.log.debug(arg)),
info: (...args) => args.forEach((arg) => fastify.log.info(arg)),
warn: (...args) => args.forEach((arg) => fastify.log.warn(arg)),
error: (...args) => args.forEach((arg) => fastify.log.error(arg)),
},
});

fastify.route({
url: "/graphql",
method: ["GET", "POST", "OPTIONS"],
handler: (req, reply) =>
yoga.handleNodeRequestAndResponse(req, reply, { req, reply, db, redis }),
});
// Transform presets — only banner/thumbnail force a crop; others preserve aspect ratio
const TRANSFORMS: Record<string, { width: number; height?: number; fit?: "cover" | "inside" }> = {
mini: { width: 80, height: 80, fit: "cover" },
thumbnail: { width: 300, height: 300, fit: "cover" },
preview: { width: 800, fit: "inside" },
medium: { width: 1400, fit: "inside" },
banner: { width: 1600, height: 480, fit: "cover" },
};
// Serve uploaded files: GET /assets/:id?transform=<preset>
// Files are stored as <UPLOAD_DIR>/<id>/<filename> — look up filename in DB
fastify.get("/assets/:id", async (request, reply) => {
const { id } = request.params as { id: string };
const { transform } = request.query as { transform?: string };
const result = await db
.select({ filename: files.filename, mime_type: files.mime_type })
.from(files)
.where(eq(files.id, id))
.limit(1);
if (!result[0]) return reply.status(404).send({ error: "File not found" });
const { filename, mime_type } = result[0];
reply.header("Cache-Control", "public, max-age=31536000, immutable");
const preset = transform ? TRANSFORMS[transform] : null;
if (preset && mime_type?.startsWith("image/")) {
const cacheFile = path.join(UPLOAD_DIR, id, `${transform}.webp`);
if (!existsSync(cacheFile)) {
const originalPath = path.join(UPLOAD_DIR, id, filename);
await sharp(originalPath)
.resize({
width: preset.width,
height: preset.height,
fit: preset.fit ?? "inside",
withoutEnlargement: true,
})
.webp({ quality: 92 })
.toFile(cacheFile);
}
reply.header("Content-Type", "image/webp");
return reply.sendFile(path.join(id, `${transform}.webp`));
}

reply.header("Content-Type", mime_type);
return reply.sendFile(path.join(id, filename));
});

// Upload a file: POST /upload (multipart, requires session)
fastify.post("/upload", async (request, reply) => {
const token = request.cookies["session_token"];
if (!token) return reply.status(401).send({ error: "Unauthorized" });

const sessionData = await redis.get(`session:${token}`);
if (!sessionData) return reply.status(401).send({ error: "Unauthorized" });
const { id: userId } = JSON.parse(sessionData);
const data = await request.file();
if (!data) return reply.status(400).send({ error: "No file provided" });
const id = crypto.randomUUID();
const filename = data.filename;
const mime_type = data.mimetype;
const dir = path.join(UPLOAD_DIR, id);
mkdirSync(dir, { recursive: true });
const buffer = await data.toBuffer();
await writeFile(path.join(dir, filename), buffer);
const [file] = await db
.insert(files)
.values({ id, filename, mime_type, filesize: buffer.byteLength, uploaded_by: userId })
.returning();
return reply.status(201).send(file);
});
// Delete a file: DELETE /assets/:id (requires session)
fastify.delete("/assets/:id", async (request, reply) => {
const token = request.cookies["session_token"];
if (!token) return reply.status(401).send({ error: "Unauthorized" });
const sessionData = await redis.get(`session:${token}`);
if (!sessionData) return reply.status(401).send({ error: "Unauthorized" });
const { id } = request.params as { id: string };
const result = await db.select().from(files).where(eq(files.id, id)).limit(1);
if (!result[0]) return reply.status(404).send({ error: "File not found" });
await db.delete(files).where(eq(files.id, id));
const dir = path.join(UPLOAD_DIR, id);
await rm(dir, { recursive: true, force: true });
return reply.status(200).send({ ok: true });
});
fastify.get("/health", async (_request, reply) => {
return reply.send({ status: "ok", timestamp: new Date().toISOString() });
});
try {
await fastify.listen({ port: PORT, host: "0.0.0.0" });
fastify.log.info(`Backend running at http://0.0.0.0:${PORT}`);
fastify.log.info(`GraphQL at http://0.0.0.0:${PORT}/graphql`);
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
}
main().catch((err) => {
console.error("Fatal error:", err);
process.exit(1);
});
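The transform presets above distinguish `cover` (fixed crop to width × height) from `inside` (fit within the box, aspect ratio preserved). A minimal sketch of the `inside` sizing rule, assuming sharp's documented semantics for `fit: "inside"` with `withoutEnlargement: true` (hypothetical helper, not part of the codebase):

```typescript
// Compute the output size for fit: "inside" with withoutEnlargement: true —
// scale to fit inside the box, never upscaling beyond the source dimensions.
function insideSize(
  src: { width: number; height: number },
  box: { width: number; height?: number },
): { width: number; height: number } {
  const scale = Math.min(
    box.width / src.width,
    box.height !== undefined ? box.height / src.height : Infinity,
    1, // withoutEnlargement: never scale up
  );
  return { width: Math.round(src.width * scale), height: Math.round(src.height * scale) };
}
```

A 1600×1200 source run through the `preview` preset (width 800) comes out 800×600, while a 400×300 source is left at its original size.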


@@ -0,0 +1,18 @@
import { GraphQLError } from "graphql";
import type { Context } from "../graphql/builder";
export function requireAuth(ctx: Context): void {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
}
export function requireAdmin(ctx: Context): void {
requireAuth(ctx);
if (!ctx.currentUser!.is_admin) throw new GraphQLError("Forbidden");
}
export function requireOwnerOrAdmin(ctx: Context, ownerId: string): void {
requireAuth(ctx);
if (ctx.currentUser!.id !== ownerId && !ctx.currentUser!.is_admin) {
throw new GraphQLError("Forbidden");
}
}
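These guards compose: `requireOwnerOrAdmin` first asserts a session exists, then checks ownership or the admin flag. A self-contained sketch of the same logic (plain `Error` in place of `GraphQLError`, simplified context shape, for illustration only):

```typescript
type Ctx = { currentUser: { id: string; is_admin: boolean } | null };

function requireAuth(ctx: Ctx): void {
  if (!ctx.currentUser) throw new Error("Unauthorized");
}

function requireOwnerOrAdmin(ctx: Ctx, ownerId: string): void {
  requireAuth(ctx);
  // Owners may act on their own resources; admins may act on anyone's.
  if (ctx.currentUser!.id !== ownerId && !ctx.currentUser!.is_admin) {
    throw new Error("Forbidden");
  }
}

// Convenience wrapper so the outcome can be inspected as a boolean.
function allowed(ctx: Ctx, ownerId: string): boolean {
  try {
    requireOwnerOrAdmin(ctx, ownerId);
    return true;
  } catch {
    return false;
  }
}
```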


@@ -3,7 +3,8 @@ import Redis from "ioredis";
export type SessionUser = {
id: string;
email: string;
role: "model" | "viewer";
is_admin: boolean;
first_name: string | null;
last_name: string | null;
artist_name: string | null;
@@ -20,6 +21,8 @@ export async function setSession(token: string, user: SessionUser): Promise<void
export async function getSession(token: string): Promise<SessionUser | null> {
const data = await redis.get(`session:${token}`);
if (!data) return null;
// Slide the expiration window on every access
await redis.expire(`session:${token}`, 86400);
return JSON.parse(data) as SessionUser;
}
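The `expire` call gives sessions a sliding window: every read pushes the 24-hour expiry forward, so only genuinely idle sessions die. The behaviour can be sketched against an in-memory stand-in for Redis (illustrative only; the real code uses ioredis `get`/`expire`):

```typescript
const TTL_MS = 86_400_000; // 24 h, mirroring the 86400-second TTL above

// Minimal stand-in: a store of value + absolute expiry, refreshed on read.
class SlidingStore {
  private store = new Map<string, { value: string; expiresAt: number }>();

  set(key: string, value: string, now: number): void {
    this.store.set(key, { value, expiresAt: now + TTL_MS });
  }

  get(key: string, now: number): string | null {
    const entry = this.store.get(key);
    if (!entry || entry.expiresAt <= now) return null;
    entry.expiresAt = now + TTL_MS; // slide the window on every access
    return entry.value;
  }
}
```

A session read 23 hours in is good for another 24 hours; one left untouched for 25 hours is gone.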


@@ -1,32 +1,37 @@
import nodemailer from "nodemailer";

const transporter = nodemailer.createTransport({
host: process.env.SMTP_HOST || "localhost",
port: parseInt(process.env.SMTP_PORT || "587"),
secure: process.env.SMTP_SECURE === "true",
auth: process.env.SMTP_USER ? {
user: process.env.SMTP_USER,
pass: process.env.SMTP_PASS,
} : undefined,
});

const FROM = process.env.EMAIL_FROM || "noreply@sexy.pivoine.art";
const BASE_URL = process.env.PUBLIC_URL || "http://localhost:3000";

export async function sendVerification(email: string, token: string): Promise<void> {
await transporter.sendMail({
from: FROM,
to: email,
subject: "Verify your email",
html: `<p>Click <a href="${BASE_URL}/signup/verify?token=${token}">here</a> to verify your email.</p>`,
});
}

export async function sendPasswordReset(email: string, token: string): Promise<void> {
await transporter.sendMail({
from: FROM,
to: email,
subject: "Reset your password",
html: `<p>Click <a href="${BASE_URL}/password/reset?token=${token}">here</a> to reset your password.</p>`,
});
}

import nodemailer from "nodemailer";
import { renderVerification, renderPasswordReset } from "@sexy.pivoine.art/email";
import { mailQueue } from "../queues/index.js";

const transporter = nodemailer.createTransport({
host: process.env.SMTP_HOST || "localhost",
port: parseInt(process.env.SMTP_PORT || "587"),
secure: process.env.SMTP_SECURE === "true",
auth: process.env.SMTP_USER
? {
user: process.env.SMTP_USER,
pass: process.env.SMTP_PASS,
}
: undefined,
});

const FROM = process.env.EMAIL_FROM || "noreply@sexy.pivoine.art";

export async function sendVerification(email: string, token: string): Promise<void> {
const { subject, html } = await renderVerification({ token });
await transporter.sendMail({ from: FROM, to: email, subject, html });
}

export async function sendPasswordReset(email: string, token: string): Promise<void> {
const { subject, html } = await renderPasswordReset({ token });
await transporter.sendMail({ from: FROM, to: email, subject, html });
}

const jobOpts = { attempts: 3, backoff: { type: "exponential" as const, delay: 5000 } };

export async function enqueueVerification(email: string, token: string): Promise<void> {
await mailQueue.add("sendVerification", { email, token }, jobOpts);
}

export async function enqueuePasswordReset(email: string, token: string): Promise<void> {
await mailQueue.add("sendPasswordReset", { email, token }, jobOpts);
}
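With `attempts: 3` and exponential backoff, BullMQ retries a failed mail job with a delay that doubles on each attempt. A sketch of that progression, assuming BullMQ's documented formula of `delay * 2^(retry - 1)`:

```typescript
// Exponential backoff schedule: delay before each retry, doubling per attempt.
function backoffSchedule(baseDelayMs: number, attempts: number): number[] {
  return Array.from({ length: attempts }, (_, i) => baseDelayMs * 2 ** i);
}
```

For the `jobOpts` above, retries fire roughly 5 s, 10 s, and 20 s after each failure.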


@@ -1,5 +1,5 @@
import { eq, sql, and, gt, isNull, isNotNull, count, sum } from "drizzle-orm";
import type { DB } from "../db/connection";
import {
user_points,
user_stats,
@@ -9,7 +9,7 @@ import {
user_achievements,
achievements,
users,
} from "../db/schema/index";
export const POINT_VALUES = {
RECORDING_CREATE: 50,
@@ -28,26 +28,62 @@ export async function awardPoints(
recordingId?: string,
): Promise<void> {
const points = POINT_VALUES[action];
await db
.insert(user_points)
.values({
user_id: userId,
action,
points,
recording_id: recordingId || null,
date_created: new Date(),
})
.onConflictDoNothing();
await updateUserStats(db, userId);
}
export async function revokePoints(
db: DB,
userId: string,
action: keyof typeof POINT_VALUES,
recordingId?: string,
): Promise<void> {
const recordingCondition = recordingId
? eq(user_points.recording_id, recordingId)
: isNull(user_points.recording_id);
// When no recordingId (e.g. COMMENT_CREATE), delete only one row so each
// revoke undoes exactly one prior award.
if (!recordingId) {
const row = await db
.select({ id: user_points.id })
.from(user_points)
.where(
and(eq(user_points.user_id, userId), eq(user_points.action, action), recordingCondition),
)
.limit(1);
if (row[0]) {
await db.delete(user_points).where(eq(user_points.id, row[0].id));
}
} else {
await db
.delete(user_points)
.where(
and(eq(user_points.user_id, userId), eq(user_points.action, action), recordingCondition),
);
}
await updateUserStats(db, userId);
}
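The single-row branch keeps awards and revokes symmetric for repeatable actions like comments: each revoke undoes exactly one prior award, while recording-scoped revokes remove every row for that recording. A small in-memory sketch of that semantics (hypothetical row shape, for illustration):

```typescript
type PointRow = { action: string; recording_id: string | null };

// Mirror revokePoints: with a recordingId, drop every matching row;
// without one, drop exactly one matching row (one revoke per award).
function revoke(rows: PointRow[], action: string, recordingId?: string): PointRow[] {
  if (recordingId) {
    return rows.filter((r) => !(r.action === action && r.recording_id === recordingId));
  }
  const i = rows.findIndex((r) => r.action === action && r.recording_id === null);
  return i === -1 ? rows : [...rows.slice(0, i), ...rows.slice(i + 1)];
}
```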
export async function calculateWeightedScore(db: DB, userId: string): Promise<number> {
const result = await db.execute(sql`
SELECT SUM(
points * EXP(${sql.raw(String(-DECAY_LAMBDA))} * EXTRACT(EPOCH FROM (NOW() - date_created)) / 86400)
) as weighted_score
FROM user_points
WHERE user_id = ${userId}
`);
return parseFloat((result.rows[0] as { weighted_score?: string })?.weighted_score || "0");
}
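The SQL computes a time-decayed score: each award contributes `points * e^(-λ · age_days)`, so recent activity dominates the leaderboard. The same formula in plain TypeScript (λ taken as a parameter here; `DECAY_LAMBDA` itself is defined elsewhere in the file):

```typescript
// Sum of awards weighted by exponential decay over their age in days.
function weightedScore(entries: { points: number; ageDays: number }[], lambda: number): number {
  return entries.reduce((sum, e) => sum + e.points * Math.exp(-lambda * e.ageDays), 0);
}
```

With λ = 0.1, a 50-point award is worth 50 today and about 24.8 points a week later.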
export async function updateUserStats(db: DB, userId: string): Promise<void> {
@@ -74,14 +110,17 @@ export async function updateUserStats(db: DB, userId: string): Promise<void> {
.where(eq(recordings.user_id, userId));
const ownIds = ownRecordingIds.map((r) => r.id);

let playbacksCount: number;
if (ownIds.length > 0) {
const playbacksResult = await db.execute(sql`
SELECT COUNT(*) as count FROM recording_plays
WHERE user_id = ${userId}
AND recording_id NOT IN (${sql.join(
ownIds.map((id) => sql`${id}`),
sql`, `,
)})
`);
playbacksCount = parseInt((playbacksResult.rows[0] as { count?: string })?.count || "0");
} else {
const playbacksResult = await db
.select({ count: count() })
@@ -93,7 +132,7 @@ export async function updateUserStats(db: DB, userId: string): Promise<void> {
const commentsResult = await db
.select({ count: count() })
.from(comments)
.where(and(eq(comments.user_id, userId), eq(comments.collection, "videos")));
const commentsCount = commentsResult[0]?.count || 0;

const achievementsResult = await db
@@ -135,11 +174,7 @@ export async function updateUserStats(db: DB, userId: string): Promise<void> {
}
}

export async function checkAchievements(db: DB, userId: string, category?: string): Promise<void> {
let achievementsQuery = db
.select()
.from(achievements)
@@ -176,7 +211,9 @@ export async function checkAchievements(
.update(user_achievements)
.set({
progress,
date_unlocked: isUnlocked
? (existing[0].date_unlocked ?? new Date())
: existing[0].date_unlocked,
})
.where(
and(
@@ -243,7 +280,7 @@ async function getAchievementProgress(
WHERE rp.user_id = ${userId}
AND r.user_id != ${userId}
`);
return parseInt((result.rows[0] as { count?: string })?.count || "0");
}
if (["completionist_10", "completionist_100"].includes(code)) {
@@ -258,7 +295,7 @@ async function getAchievementProgress(
const result = await db
.select({ count: count() })
.from(comments)
.where(and(eq(comments.user_id, userId), eq(comments.collection, "videos")));
return result[0]?.count || 0;
}
@@ -294,7 +331,7 @@ async function getAchievementProgress(
WHERE rp.user_id = ${userId} AND r.user_id != ${userId}
`);
const rc = recordingsResult[0]?.count || 0;
const pc = parseInt((playsResult.rows[0] as { count?: string })?.count || "0");
return rc >= 50 && pc >= 100 ? 1 : 0;
}


@@ -0,0 +1,101 @@
type LogLevel = "trace" | "debug" | "info" | "warn" | "error" | "fatal";
const LEVEL_VALUES: Record<LogLevel, number> = {
trace: 10,
debug: 20,
info: 30,
warn: 40,
error: 50,
fatal: 60,
};
function createLogger(bindings: Record<string, unknown> = {}, initialLevel: LogLevel = "info") {
let currentLevel = initialLevel;
function shouldLog(level: LogLevel): boolean {
return LEVEL_VALUES[level] >= LEVEL_VALUES[currentLevel];
}
function formatMessage(level: LogLevel, arg: unknown, msg?: string): string {
const timestamp = new Date().toISOString();
let message: string;
const meta: Record<string, unknown> = { ...bindings };
if (typeof arg === "string") {
message = arg;
} else if (arg !== null && typeof arg === "object") {
// Pino-style: log(obj, msg?) — strip internal pino keys
const {
msg: m,
level: _l,
time: _t,
pid: _p,
hostname: _h,
req: _req,
res: _res,
reqId,
...rest
} = arg as Record<string, unknown>;
message = msg || (typeof m === "string" ? m : "");
if (reqId) meta.reqId = reqId;
Object.assign(meta, rest);
} else {
message = String(arg ?? "");
}
const parts = [`[${timestamp}]`, `[${level.toUpperCase()}]`, message];
let result = parts.join(" ");
const metaEntries = Object.entries(meta).filter(([k]) => k !== "reqId");
const reqId = meta.reqId;
if (reqId) result = `[${timestamp}] [${level.toUpperCase()}] [${reqId}] ${message}`;
if (metaEntries.length > 0) {
result += " " + JSON.stringify(Object.fromEntries(metaEntries));
}
return result;
}
function write(level: LogLevel, arg: unknown, msg?: string) {
if (!shouldLog(level)) return;
const formatted = formatMessage(level, arg, msg);
switch (level) {
case "trace":
case "debug":
console.debug(formatted);
break;
case "info":
console.info(formatted);
break;
case "warn":
console.warn(formatted);
break;
case "error":
case "fatal":
console.error(formatted);
break;
}
}
return {
get level() {
return currentLevel;
},
set level(l: string) {
currentLevel = l as LogLevel;
},
trace: (arg: unknown, msg?: string) => write("trace", arg, msg),
debug: (arg: unknown, msg?: string) => write("debug", arg, msg),
info: (arg: unknown, msg?: string) => write("info", arg, msg),
warn: (arg: unknown, msg?: string) => write("warn", arg, msg),
error: (arg: unknown, msg?: string) => write("error", arg, msg),
fatal: (arg: unknown, msg?: string) => write("fatal", arg, msg),
silent: () => {},
child: (newBindings: Record<string, unknown>) =>
createLogger({ ...bindings, ...newBindings }, currentLevel),
};
}
export const logger = createLogger({}, (process.env.LOG_LEVEL as LogLevel) || "info");
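The numeric level table gives pino-compatible filtering: a call is emitted only when its level value meets the logger's current threshold. The gate in isolation:

```typescript
const LEVEL_VALUES = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 } as const;
type LogLevel = keyof typeof LEVEL_VALUES;

// Emit iff the message's level is at or above the configured threshold.
function shouldLog(message: LogLevel, threshold: LogLevel): boolean {
  return LEVEL_VALUES[message] >= LEVEL_VALUES[threshold];
}
```

So a logger set to `info` drops `trace` and `debug` calls but lets `info` and everything above through.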


@@ -0,0 +1,233 @@
CREATE TYPE "public"."achievement_status" AS ENUM('draft', 'published');--> statement-breakpoint
CREATE TYPE "public"."user_role" AS ENUM('model', 'viewer', 'admin');--> statement-breakpoint
CREATE TYPE "public"."recording_status" AS ENUM('draft', 'published', 'archived');--> statement-breakpoint
CREATE TABLE "articles" (
"id" text PRIMARY KEY NOT NULL,
"slug" text NOT NULL,
"title" text NOT NULL,
"excerpt" text,
"content" text,
"image" text,
"tags" text[] DEFAULT '{}',
"publish_date" timestamp DEFAULT now() NOT NULL,
"author" text,
"category" text,
"featured" boolean DEFAULT false,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "comments" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "comments_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"collection" text NOT NULL,
"item_id" text NOT NULL,
"comment" text NOT NULL,
"user_id" text NOT NULL,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "files" (
"id" text PRIMARY KEY NOT NULL,
"title" text,
"description" text,
"filename" text NOT NULL,
"mime_type" text,
"filesize" bigint,
"duration" integer,
"uploaded_by" text,
"date_created" timestamp DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "achievements" (
"id" text PRIMARY KEY NOT NULL,
"code" text NOT NULL,
"name" text NOT NULL,
"description" text,
"icon" text,
"category" text,
"required_count" integer DEFAULT 1 NOT NULL,
"points_reward" integer DEFAULT 0 NOT NULL,
"status" "achievement_status" DEFAULT 'published' NOT NULL,
"sort" integer DEFAULT 0
);
--> statement-breakpoint
CREATE TABLE "user_achievements" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "user_achievements_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"user_id" text NOT NULL,
"achievement_id" text NOT NULL,
"progress" integer DEFAULT 0,
"date_unlocked" timestamp
);
--> statement-breakpoint
CREATE TABLE "user_points" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "user_points_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"user_id" text NOT NULL,
"action" text NOT NULL,
"points" integer NOT NULL,
"recording_id" text,
"date_created" timestamp DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "user_stats" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "user_stats_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"user_id" text NOT NULL,
"total_raw_points" integer DEFAULT 0,
"total_weighted_points" real DEFAULT 0,
"recordings_count" integer DEFAULT 0,
"playbacks_count" integer DEFAULT 0,
"comments_count" integer DEFAULT 0,
"achievements_count" integer DEFAULT 0,
"last_updated" timestamp DEFAULT now()
);
--> statement-breakpoint
CREATE TABLE "user_photos" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "user_photos_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"user_id" text NOT NULL,
"file_id" text NOT NULL,
"sort" integer DEFAULT 0
);
--> statement-breakpoint
CREATE TABLE "users" (
"id" text PRIMARY KEY NOT NULL,
"email" text NOT NULL,
"password_hash" text NOT NULL,
"first_name" text,
"last_name" text,
"artist_name" text,
"slug" text,
"description" text,
"tags" text[] DEFAULT '{}',
"role" "user_role" DEFAULT 'viewer' NOT NULL,
"avatar" text,
"banner" text,
"email_verified" boolean DEFAULT false NOT NULL,
"email_verify_token" text,
"password_reset_token" text,
"password_reset_expiry" timestamp,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "video_likes" (
"id" text PRIMARY KEY NOT NULL,
"video_id" text NOT NULL,
"user_id" text NOT NULL,
"date_created" timestamp DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "video_models" (
"video_id" text NOT NULL,
"user_id" text NOT NULL,
CONSTRAINT "video_models_video_id_user_id_pk" PRIMARY KEY("video_id","user_id")
);
--> statement-breakpoint
CREATE TABLE "video_plays" (
"id" text PRIMARY KEY NOT NULL,
"video_id" text NOT NULL,
"user_id" text,
"session_id" text,
"duration_watched" integer,
"completed" boolean DEFAULT false,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "videos" (
"id" text PRIMARY KEY NOT NULL,
"slug" text NOT NULL,
"title" text NOT NULL,
"description" text,
"image" text,
"movie" text,
"tags" text[] DEFAULT '{}',
"upload_date" timestamp DEFAULT now() NOT NULL,
"premium" boolean DEFAULT false,
"featured" boolean DEFAULT false,
"likes_count" integer DEFAULT 0,
"plays_count" integer DEFAULT 0
);
--> statement-breakpoint
CREATE TABLE "recording_plays" (
"id" text PRIMARY KEY NOT NULL,
"recording_id" text NOT NULL,
"user_id" text,
"duration_played" integer DEFAULT 0,
"completed" boolean DEFAULT false,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "recordings" (
"id" text PRIMARY KEY NOT NULL,
"title" text NOT NULL,
"description" text,
"slug" text NOT NULL,
"duration" integer NOT NULL,
"events" jsonb DEFAULT '[]'::jsonb,
"device_info" jsonb DEFAULT '[]'::jsonb,
"user_id" text NOT NULL,
"status" "recording_status" DEFAULT 'draft' NOT NULL,
"tags" text[] DEFAULT '{}',
"linked_video" text,
"featured" boolean DEFAULT false,
"public" boolean DEFAULT false,
"original_recording_id" text,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
ALTER TABLE "articles" ADD CONSTRAINT "articles_image_files_id_fk" FOREIGN KEY ("image") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "articles" ADD CONSTRAINT "articles_author_users_id_fk" FOREIGN KEY ("author") REFERENCES "public"."users"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "comments" ADD CONSTRAINT "comments_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_achievements" ADD CONSTRAINT "user_achievements_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_achievements" ADD CONSTRAINT "user_achievements_achievement_id_achievements_id_fk" FOREIGN KEY ("achievement_id") REFERENCES "public"."achievements"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_points" ADD CONSTRAINT "user_points_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_points" ADD CONSTRAINT "user_points_recording_id_recordings_id_fk" FOREIGN KEY ("recording_id") REFERENCES "public"."recordings"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_stats" ADD CONSTRAINT "user_stats_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_photos" ADD CONSTRAINT "user_photos_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_photos" ADD CONSTRAINT "user_photos_file_id_files_id_fk" FOREIGN KEY ("file_id") REFERENCES "public"."files"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "users" ADD CONSTRAINT "users_avatar_files_id_fk" FOREIGN KEY ("avatar") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "users" ADD CONSTRAINT "users_banner_files_id_fk" FOREIGN KEY ("banner") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_likes" ADD CONSTRAINT "video_likes_video_id_videos_id_fk" FOREIGN KEY ("video_id") REFERENCES "public"."videos"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_likes" ADD CONSTRAINT "video_likes_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_models" ADD CONSTRAINT "video_models_video_id_videos_id_fk" FOREIGN KEY ("video_id") REFERENCES "public"."videos"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_models" ADD CONSTRAINT "video_models_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_plays" ADD CONSTRAINT "video_plays_video_id_videos_id_fk" FOREIGN KEY ("video_id") REFERENCES "public"."videos"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_plays" ADD CONSTRAINT "video_plays_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "videos" ADD CONSTRAINT "videos_image_files_id_fk" FOREIGN KEY ("image") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "videos" ADD CONSTRAINT "videos_movie_files_id_fk" FOREIGN KEY ("movie") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "recording_plays" ADD CONSTRAINT "recording_plays_recording_id_recordings_id_fk" FOREIGN KEY ("recording_id") REFERENCES "public"."recordings"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "recording_plays" ADD CONSTRAINT "recording_plays_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "recordings" ADD CONSTRAINT "recordings_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "recordings" ADD CONSTRAINT "recordings_linked_video_videos_id_fk" FOREIGN KEY ("linked_video") REFERENCES "public"."videos"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
CREATE UNIQUE INDEX "articles_slug_idx" ON "articles" USING btree ("slug");--> statement-breakpoint
CREATE INDEX "articles_publish_date_idx" ON "articles" USING btree ("publish_date");--> statement-breakpoint
CREATE INDEX "articles_featured_idx" ON "articles" USING btree ("featured");--> statement-breakpoint
CREATE INDEX "comments_collection_item_idx" ON "comments" USING btree ("collection","item_id");--> statement-breakpoint
CREATE INDEX "comments_user_idx" ON "comments" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "files_uploaded_by_idx" ON "files" USING btree ("uploaded_by");--> statement-breakpoint
CREATE UNIQUE INDEX "achievements_code_idx" ON "achievements" USING btree ("code");--> statement-breakpoint
CREATE INDEX "user_achievements_user_idx" ON "user_achievements" USING btree ("user_id");--> statement-breakpoint
CREATE UNIQUE INDEX "user_achievements_unique_idx" ON "user_achievements" USING btree ("user_id","achievement_id");--> statement-breakpoint
CREATE INDEX "user_points_user_idx" ON "user_points" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "user_points_date_idx" ON "user_points" USING btree ("date_created");--> statement-breakpoint
CREATE UNIQUE INDEX "user_stats_user_idx" ON "user_stats" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "user_photos_user_idx" ON "user_photos" USING btree ("user_id");--> statement-breakpoint
CREATE UNIQUE INDEX "users_email_idx" ON "users" USING btree ("email");--> statement-breakpoint
CREATE UNIQUE INDEX "users_slug_idx" ON "users" USING btree ("slug");--> statement-breakpoint
CREATE INDEX "users_role_idx" ON "users" USING btree ("role");--> statement-breakpoint
CREATE INDEX "video_likes_video_idx" ON "video_likes" USING btree ("video_id");--> statement-breakpoint
CREATE INDEX "video_likes_user_idx" ON "video_likes" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "video_plays_video_idx" ON "video_plays" USING btree ("video_id");--> statement-breakpoint
CREATE INDEX "video_plays_user_idx" ON "video_plays" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "video_plays_date_idx" ON "video_plays" USING btree ("date_created");--> statement-breakpoint
CREATE UNIQUE INDEX "videos_slug_idx" ON "videos" USING btree ("slug");--> statement-breakpoint
CREATE INDEX "videos_upload_date_idx" ON "videos" USING btree ("upload_date");--> statement-breakpoint
CREATE INDEX "videos_featured_idx" ON "videos" USING btree ("featured");--> statement-breakpoint
CREATE INDEX "recording_plays_recording_idx" ON "recording_plays" USING btree ("recording_id");--> statement-breakpoint
CREATE INDEX "recording_plays_user_idx" ON "recording_plays" USING btree ("user_id");--> statement-breakpoint
CREATE UNIQUE INDEX "recordings_slug_idx" ON "recordings" USING btree ("slug");--> statement-breakpoint
CREATE INDEX "recordings_user_idx" ON "recordings" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "recordings_status_idx" ON "recordings" USING btree ("status");--> statement-breakpoint
CREATE INDEX "recordings_public_idx" ON "recordings" USING btree ("public");


@@ -0,0 +1,3 @@
ALTER TABLE "users" ADD COLUMN "is_admin" boolean NOT NULL DEFAULT false;--> statement-breakpoint
UPDATE "users" SET "is_admin" = true WHERE "role" = 'admin';--> statement-breakpoint
UPDATE "users" SET "role" = 'viewer' WHERE "role" = 'admin';


@@ -0,0 +1,8 @@
-- Update any archived recordings to draft before removing the status
UPDATE "recordings" SET "status" = 'draft' WHERE "status" = 'archived';--> statement-breakpoint
-- Recreate enum without 'archived'
ALTER TYPE "public"."recording_status" RENAME TO "recording_status_old";--> statement-breakpoint
CREATE TYPE "public"."recording_status" AS ENUM('draft', 'published');--> statement-breakpoint
ALTER TABLE "recordings" ALTER COLUMN "status" TYPE "public"."recording_status" USING "status"::text::"public"."recording_status";--> statement-breakpoint
DROP TYPE "public"."recording_status_old";


@@ -0,0 +1 @@
ALTER TABLE "users" ADD COLUMN "photo" text REFERENCES "files"("id") ON DELETE set null;


@@ -0,0 +1,6 @@
-- Partial unique index: prevents duplicate RECORDING_CREATE / RECORDING_FEATURED points
-- for the same recording. RECORDING_PLAY / RECORDING_COMPLETE are excluded so a user
-- can earn play points across multiple sessions.
CREATE UNIQUE INDEX "user_points_unique_action_recording"
ON "user_points" ("user_id", "action", "recording_id")
WHERE "action" IN ('RECORDING_CREATE', 'RECORDING_FEATURED') AND "recording_id" IS NOT NULL;
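As an illustration (not part of the migration), an award insert can lean on this partial index to stay idempotent for the one-shot actions; a bare `ON CONFLICT DO NOTHING` catches violations of partial unique indexes too, so no conflict target is needed. The literal values here are hypothetical.

```sql
-- Hypothetical idempotent award: a second RECORDING_CREATE insert for the
-- same (user, recording) hits the partial index and becomes a no-op.
INSERT INTO "user_points" ("user_id", "action", "points", "recording_id")
VALUES ('user-1', 'RECORDING_CREATE', 10, 'rec-1')
ON CONFLICT DO NOTHING;
```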

File diff suppressed because it is too large.


@@ -0,0 +1,34 @@
{
  "version": "7",
  "dialect": "postgresql",
  "entries": [
    {
      "idx": 0,
      "version": "7",
      "when": 1772645674513,
      "tag": "0000_pale_hellion",
      "breakpoints": true
    },
    {
      "idx": 1,
      "version": "7",
      "when": 1772645674514,
      "tag": "0001_is_admin",
      "breakpoints": true
    },
    {
      "idx": 2,
      "version": "7",
      "when": 1741337600000,
      "tag": "0002_remove_archived_recording_status",
      "breakpoints": true
    },
    {
      "idx": 3,
      "version": "7",
      "when": 1741420000000,
      "tag": "0003_model_photo",
      "breakpoints": true
    }
  ]
}


@@ -0,0 +1,16 @@
function parseRedisUrl(url: string): { host: string; port: number; password?: string } {
  const parsed = new URL(url);
  return {
    host: parsed.hostname,
    port: parseInt(parsed.port) || 6379,
    password: parsed.password || undefined,
  };
}

// BullMQ creates its own IORedis connections from these options.
// maxRetriesPerRequest: null is required for workers.
export const redisConnectionOpts = {
  ...parseRedisUrl(process.env.REDIS_URL || "redis://localhost:6379"),
  maxRetriesPerRequest: null as null,
  enableReadyCheck: false,
};
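A quick standalone check of how `parseRedisUrl` decomposes a connection URL (the function is copied verbatim so the sketch runs on its own; the URL is a made-up example):

```typescript
// Copied from the module above so this sketch is self-contained.
function parseRedisUrl(url: string): { host: string; port: number; password?: string } {
  const parsed = new URL(url);
  return {
    host: parsed.hostname,
    port: parseInt(parsed.port) || 6379, // empty port string → NaN → default 6379
    password: parsed.password || undefined,
  };
}

const opts = parseRedisUrl("redis://:s3cret@redis.internal:6380");
console.log(opts);
// → { host: "redis.internal", port: 6380, password: "s3cret" }
```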


@@ -0,0 +1,25 @@
import { Queue } from "bullmq";
import { redisConnectionOpts } from "./connection.js";
import { logger } from "../lib/logger.js";

const log = logger.child({ component: "queues" });

export const mailQueue = new Queue("mail", { connection: redisConnectionOpts });
mailQueue.on("error", (err) => {
  log.error({ queue: "mail", err: err.message }, "Queue error");
});

export const gamificationQueue = new Queue("gamification", {
  connection: redisConnectionOpts,
  defaultJobOptions: { attempts: 3, backoff: { type: "exponential", delay: 2000 } },
});
gamificationQueue.on("error", (err) => {
  log.error({ queue: "gamification", err: err.message }, "Queue error");
});

log.info("Queues initialized");

export const queues: Record<string, Queue> = {
  mail: mailQueue,
  gamification: gamificationQueue,
};


@@ -0,0 +1,52 @@
import { Worker } from "bullmq";
import { redisConnectionOpts } from "../connection.js";
import { awardPoints, revokePoints, checkAchievements } from "../../lib/gamification.js";
import { db } from "../../db/connection.js";
import { logger } from "../../lib/logger.js";
import type { POINT_VALUES } from "../../lib/gamification.js";

const log = logger.child({ component: "gamification-worker" });

export type GamificationJobData =
  | { job: "awardPoints"; userId: string; action: keyof typeof POINT_VALUES; recordingId?: string }
  | { job: "revokePoints"; userId: string; action: keyof typeof POINT_VALUES; recordingId?: string }
  | { job: "checkAchievements"; userId: string; category?: string };

export function startGamificationWorker(): Worker {
  const worker = new Worker(
    "gamification",
    async (bullJob) => {
      const data = bullJob.data as GamificationJobData;
      log.info(
        { jobId: bullJob.id, job: data.job, userId: data.userId },
        "Processing gamification job",
      );
      switch (data.job) {
        case "awardPoints":
          await awardPoints(db, data.userId, data.action, data.recordingId);
          break;
        case "revokePoints":
          await revokePoints(db, data.userId, data.action, data.recordingId);
          break;
        case "checkAchievements":
          await checkAchievements(db, data.userId, data.category);
          break;
        default:
          throw new Error(`Unknown gamification job: ${(data as GamificationJobData).job}`);
      }
      log.info({ jobId: bullJob.id, job: data.job }, "Gamification job completed");
    },
    { connection: redisConnectionOpts },
  );
  worker.on("failed", (bullJob, err) => {
    log.error(
      { jobId: bullJob?.id, job: (bullJob?.data as GamificationJobData)?.job, err: err.message },
      "Gamification job failed",
    );
  });
  return worker;
}
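The worker's `switch` relies on TypeScript narrowing the `GamificationJobData` discriminated union on the `job` field. A minimal self-contained sketch of that pattern (types simplified, `describe` is our own name, not from the source):

```typescript
// Simplified two-variant version of the job-data union used by the worker.
type JobData =
  | { job: "awardPoints"; userId: string; action: string; recordingId?: string }
  | { job: "checkAchievements"; userId: string; category?: string };

function describe(data: JobData): string {
  switch (data.job) {
    case "awardPoints":
      // data is narrowed here, so data.action is accessible without a cast.
      return `award ${data.action} to ${data.userId}`;
    case "checkAchievements":
      return `check ${data.category ?? "all"} for ${data.userId}`;
  }
}

console.log(describe({ job: "awardPoints", userId: "u1", action: "RECORDING_CREATE" }));
// → award RECORDING_CREATE to u1
```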


@@ -0,0 +1,33 @@
import { Worker } from "bullmq";
import { redisConnectionOpts } from "../connection.js";
import { sendVerification, sendPasswordReset } from "../../lib/email.js";
import { logger } from "../../lib/logger.js";

const log = logger.child({ component: "mail-worker" });

export function startMailWorker(): Worker {
  const worker = new Worker(
    "mail",
    async (job) => {
      log.info({ jobId: job.id, jobName: job.name }, "Processing mail job");
      switch (job.name) {
        case "sendVerification":
          await sendVerification(job.data.email as string, job.data.token as string);
          break;
        case "sendPasswordReset":
          await sendPasswordReset(job.data.email as string, job.data.token as string);
          break;
        default:
          throw new Error(`Unknown mail job: ${job.name}`);
      }
      log.info({ jobId: job.id, jobName: job.name }, "Mail job completed");
    },
    { connection: redisConnectionOpts },
  );
  worker.on("failed", (job, err) => {
    log.error({ jobId: job?.id, jobName: job?.name, err: err.message }, "Mail job failed");
  });
  return worker;
}


@@ -44,7 +44,7 @@ function copyFile(src: string, dest: string) {
 async function migrateFiles() {
   console.log("📁 Migrating files...");
   const { rows } = await query(
-    `SELECT id, title, description, filename_disk, type, filesize, duration, uploaded_by, date_created
+    `SELECT id, title, description, filename_disk, type, filesize, duration, uploaded_by, uploaded_on as date_created
      FROM directus_files`,
   );
@@ -95,8 +95,8 @@ async function migrateUsers() {
   console.log("👥 Migrating users...");
   const { rows } = await query(
     `SELECT u.id, u.email, u.password, u.first_name, u.last_name,
-            u.description, u.avatar, u.date_created,
-            u.artist_name, u.slug, u.email_notifications_key,
+            u.description, u.avatar, u.join_date as date_created,
+            u.artist_name, u.slug,
             r.name as role_name
      FROM directus_users u
      LEFT JOIN directus_roles r ON u.role = r.id
@@ -126,9 +126,11 @@ async function migrateUsers() {
       if (tagsRes.rows[0]?.tags) {
         tags = Array.isArray(tagsRes.rows[0].tags)
           ? tagsRes.rows[0].tags
-          : JSON.parse(tagsRes.rows[0].tags || "[]");
+          : JSON.parse(String(tagsRes.rows[0].tags || "[]"));
       }
-    } catch {}
+    } catch {
+      /* tags column may not exist on older Directus installs */
+    }
     await query(
       `INSERT INTO users (id, email, password_hash, first_name, last_name, artist_name, slug,
@@ -144,10 +146,10 @@ async function migrateUsers() {
         user.artist_name,
         user.slug,
         user.description,
-        JSON.stringify(tags),
+        tags,
         role,
         user.avatar,
-        true, // Assume existing users are verified
+        true,
         user.date_created,
       ],
     );
@@ -160,7 +162,7 @@ async function migrateUsers() {
 async function migrateUserPhotos() {
   console.log("🖼️ Migrating user photos...");
   const { rows } = await query(
-    `SELECT directus_users_id as user_id, directus_files_id as file_id, sort
+    `SELECT directus_users_id as user_id, directus_files_id as file_id
      FROM junction_directus_users_files`,
   );
@@ -173,7 +175,7 @@ async function migrateUserPhotos() {
     await query(
       `INSERT INTO user_photos (user_id, file_id, sort) VALUES ($1, $2, $3)
        ON CONFLICT DO NOTHING`,
-      [row.user_id, row.file_id, row.sort || 0],
+      [row.user_id, row.file_id, 0],
     );
     migrated++;
   }
@@ -203,7 +205,7 @@ async function migrateArticles() {
         article.excerpt,
         article.content,
         article.image,
-        Array.isArray(article.tags) ? JSON.stringify(article.tags) : article.tags,
+        Array.isArray(article.tags) ? article.tags : JSON.parse(String(article.tags || "[]")),
         article.publish_date,
         article.author,
         article.category,
@@ -222,7 +224,7 @@ async function migrateVideos() {
   console.log("🎬 Migrating videos...");
   const { rows } = await query(
     `SELECT id, slug, title, description, image, movie, tags, upload_date,
-            premium, featured, likes_count, plays_count
+            premium, featured
      FROM sexy_videos`,
   );
@@ -240,12 +242,12 @@ async function migrateVideos() {
         video.description,
         video.image,
         video.movie,
-        Array.isArray(video.tags) ? JSON.stringify(video.tags) : video.tags,
+        Array.isArray(video.tags) ? video.tags : JSON.parse(String(video.tags || "[]")),
         video.upload_date,
         video.premium,
         video.featured,
-        video.likes_count || 0,
-        video.plays_count || 0,
+        0,
+        0,
       ],
     );
     migrated++;
@@ -279,9 +281,7 @@ async function migrateVideoModels() {
 async function migrateVideoLikes() {
   console.log("❤️ Migrating video likes...");
-  const { rows } = await query(
-    `SELECT id, video_id, user_id, date_created FROM sexy_video_likes`,
-  );
+  const { rows } = await query(`SELECT id, video_id, user_id, date_created FROM sexy_video_likes`);
   let migrated = 0;
   for (const row of rows) {
@@ -329,7 +329,7 @@ async function migrateRecordings() {
   console.log("🎙️ Migrating recordings...");
   const { rows } = await query(
     `SELECT id, title, description, slug, duration, events, device_info,
-            user_created as user_id, status, tags, linked_video, featured, public,
+            user_created as user_id, status, tags, linked_video, public,
             original_recording_id, date_created, date_updated
      FROM sexy_recordings`,
   );
@@ -338,25 +338,24 @@ async function migrateRecordings() {
   for (const recording of rows) {
     await query(
       `INSERT INTO recordings (id, title, description, slug, duration, events, device_info,
-        user_id, status, tags, linked_video, featured, public,
+        user_id, status, tags, linked_video, public,
         original_recording_id, date_created, date_updated)
-       VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15, $16)
+       VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15)
        ON CONFLICT (id) DO NOTHING`,
       [
         recording.id,
         recording.title,
         recording.description,
         recording.slug,
-        recording.duration,
+        recording.duration != null ? Math.round(Number(recording.duration)) : null,
        typeof recording.events === "string" ? recording.events : JSON.stringify(recording.events),
        typeof recording.device_info === "string"
          ? recording.device_info
          : JSON.stringify(recording.device_info),
        recording.user_id,
        recording.status,
-        Array.isArray(recording.tags) ? JSON.stringify(recording.tags) : recording.tags,
+        Array.isArray(recording.tags) ? recording.tags : JSON.parse(String(recording.tags || "[]")),
        recording.linked_video,
-        recording.featured,
        recording.public,
        recording.original_recording_id,
        recording.date_created,


@@ -0,0 +1,27 @@
import { Pool } from "pg";
import { drizzle } from "drizzle-orm/node-postgres";
import { migrate } from "drizzle-orm/node-postgres/migrator";
import path from "path";

const pool = new Pool({
  connectionString: process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy",
});
const db = drizzle(pool);

async function main() {
  console.log("Running schema migrations...");
  // In dev (tsx): __dirname = src/scripts → migrations are at src/migrations
  // In prod (node dist): __dirname = dist/scripts → migrations are at ../../migrations (package root)
  const migrationsFolder = __dirname.includes("/src/")
    ? path.join(__dirname, "../migrations")
    : path.join(__dirname, "../../migrations");
  await migrate(db, { migrationsFolder });
  console.log("Schema migrations complete.");
  await pool.end();
}

main().catch((err) => {
  console.error("Migration failed:", err);
  process.exit(1);
});
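The dev/prod branch in the comments above can be illustrated in isolation; `resolveMigrations` is a hypothetical extraction of that expression, and the paths are example values:

```typescript
import path from "path";

// Hypothetical pure-function version of the migrations-folder resolution:
// dev runs from src/scripts, prod from dist/scripts (see comments above).
function resolveMigrations(dirname: string): string {
  return dirname.includes("/src/")
    ? path.join(dirname, "../migrations")   // dev: sibling of scripts/ inside src/
    : path.join(dirname, "../../migrations"); // prod: package root
}

console.log(resolveMigrations("/app/src/scripts"));  // → /app/src/migrations
console.log(resolveMigrations("/app/dist/scripts")); // → /app/migrations
```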


@@ -1,8 +1,8 @@
 {
   "compilerOptions": {
     "target": "ES2022",
-    "module": "NodeNext",
-    "moduleResolution": "NodeNext",
+    "module": "CommonJS",
+    "moduleResolution": "Node",
     "lib": ["ES2022"],
     "outDir": "./dist",
     "rootDir": "./src",

packages/buttplug/.gitignore (vendored, new file)

@@ -0,0 +1,5 @@
node_modules/
dist/
wasm/
target/
pkg/


@@ -1,25 +1,27 @@
 {
   "name": "@sexy.pivoine.art/buttplug",
   "version": "1.0.0",
   "type": "module",
+  "private": true,
   "main": "./dist/index.js",
   "module": "./dist/index.js",
   "types": "./dist/index.d.ts",
   "files": [
     "dist"
   ],
   "scripts": {
     "build": "vite build",
-    "build:wasm": "wasm-pack build --out-dir wasm --out-name index --target bundler --release"
+    "build:wasm": "wasm-pack build --out-dir wasm --out-name index --target web --release",
+    "serve": "node serve.mjs"
   },
   "dependencies": {
     "eventemitter3": "^5.0.4",
     "typescript": "^5.9.3",
     "vite": "^7.3.1",
     "vite-plugin-wasm": "3.5.0",
     "ws": "^8.19.0"
   },
   "devDependencies": {
     "wasm-pack": "^0.14.0"
   }
 }


@@ -0,0 +1,39 @@
#!/usr/bin/env node
// Simple static server for local development — serves dist/ and wasm/ on port 8080
import http from "http";
import fs from "fs";
import path from "path";
import { fileURLToPath } from "url";

const __dirname = path.dirname(fileURLToPath(import.meta.url));
const PORT = process.env.PORT ?? 8080;
// Note: path.extname() only returns the final extension, so a ".d.ts" key
// would never match — declaration files fall under ".ts" like everything else.
const MIME = {
  ".js": "application/javascript",
  ".wasm": "application/wasm",
  ".ts": "text/plain",
};

http
  .createServer((req, res) => {
    const filePath = path.join(__dirname, decodeURIComponent(req.url.split("?")[0]));
    // Refuse paths that escape the package directory (e.g. "/../../etc/passwd")
    if (filePath !== __dirname && !filePath.startsWith(__dirname + path.sep)) {
      res.writeHead(403);
      res.end("Forbidden");
      return;
    }
    const ext = path.extname(filePath);
    fs.readFile(filePath, (err, data) => {
      if (err) {
        res.writeHead(404);
        res.end("Not found");
        return;
      }
      res.writeHead(200, {
        "Content-Type": MIME[ext] ?? "application/octet-stream",
        "Cache-Control": "no-cache",
        "Cross-Origin-Resource-Policy": "cross-origin",
      });
      res.end(data);
    });
  })
  .listen(PORT, () => {
    console.log(`[buttplug] serving on http://localhost:${PORT}`);
  });


@@ -6,11 +6,11 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-'use strict';
-import { IButtplugClientConnector } from './IButtplugClientConnector';
-import { ButtplugMessage } from '../core/Messages';
-import { ButtplugBrowserWebsocketConnector } from '../utils/ButtplugBrowserWebsocketConnector';
+"use strict";
+import { type IButtplugClientConnector } from "./IButtplugClientConnector";
+import { type ButtplugMessage } from "../core/Messages";
+import { ButtplugBrowserWebsocketConnector } from "../utils/ButtplugBrowserWebsocketConnector";
 export class ButtplugBrowserWebsocketClientConnector
   extends ButtplugBrowserWebsocketConnector
@@ -18,7 +18,7 @@ export class ButtplugBrowserWebsocketClientConnector
 {
   public send = (msg: ButtplugMessage): void => {
     if (!this.Connected) {
-      throw new Error('ButtplugClient not connected');
+      throw new Error("ButtplugClient not connected");
     }
     this.sendMessage(msg);
   };


@@ -6,20 +6,16 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-'use strict';
+"use strict";
-import { ButtplugLogger } from '../core/Logging';
+import { ButtplugLogger } from "../core/Logging";
-import { EventEmitter } from 'eventemitter3';
+import { EventEmitter } from "eventemitter3";
-import { ButtplugClientDevice } from './ButtplugClientDevice';
+import { ButtplugClientDevice } from "./ButtplugClientDevice";
-import { IButtplugClientConnector } from './IButtplugClientConnector';
+import { type IButtplugClientConnector } from "./IButtplugClientConnector";
-import { ButtplugMessageSorter } from '../utils/ButtplugMessageSorter';
+import { ButtplugMessageSorter } from "../utils/ButtplugMessageSorter";
-import * as Messages from '../core/Messages';
+import * as Messages from "../core/Messages";
-import {
-  ButtplugError,
-  ButtplugInitError,
-  ButtplugMessageError,
-} from '../core/Exceptions';
-import { ButtplugClientConnectorException } from './ButtplugClientConnectorException';
+import { ButtplugError, ButtplugInitError, ButtplugMessageError } from "../core/Exceptions";
+import { ButtplugClientConnectorException } from "./ButtplugClientConnectorException";
 export class ButtplugClient extends EventEmitter {
   protected _pingTimer: NodeJS.Timeout | null = null;
@@ -30,7 +26,7 @@ export class ButtplugClient extends EventEmitter {
   protected _isScanning = false;
   private _sorter: ButtplugMessageSorter = new ButtplugMessageSorter(true);
-  constructor(clientName = 'Generic Buttplug Client') {
+  constructor(clientName = "Generic Buttplug Client") {
     super();
     this._clientName = clientName;
     this._logger.Debug(`ButtplugClient: Client ${clientName} created.`);
@@ -52,18 +48,16 @@ export class ButtplugClient extends EventEmitter {
   }
   public connect = async (connector: IButtplugClientConnector) => {
-    this._logger.Info(
-      `ButtplugClient: Connecting using ${connector.constructor.name}`
-    );
+    this._logger.Info(`ButtplugClient: Connecting using ${connector.constructor.name}`);
     await connector.connect();
     this._connector = connector;
-    this._connector.addListener('message', this.parseMessages);
+    this._connector.addListener("message", this.parseMessages);
-    this._connector.addListener('disconnect', this.disconnectHandler);
+    this._connector.addListener("disconnect", this.disconnectHandler);
     await this.initializeConnection();
   };
   public disconnect = async () => {
-    this._logger.Debug('ButtplugClient: Disconnect called');
+    this._logger.Debug("ButtplugClient: Disconnect called");
     this._devices.clear();
     this.checkConnector();
     await this.shutdownConnection();
@@ -71,25 +65,33 @@ export class ButtplugClient extends EventEmitter {
   };
   public startScanning = async () => {
-    this._logger.Debug('ButtplugClient: StartScanning called');
+    this._logger.Debug("ButtplugClient: StartScanning called");
     this._isScanning = true;
     await this.sendMsgExpectOk({ StartScanning: { Id: 1 } });
   };
   public stopScanning = async () => {
-    this._logger.Debug('ButtplugClient: StopScanning called');
+    this._logger.Debug("ButtplugClient: StopScanning called");
     this._isScanning = false;
     await this.sendMsgExpectOk({ StopScanning: { Id: 1 } });
   };
   public stopAllDevices = async () => {
-    this._logger.Debug('ButtplugClient: StopAllDevices');
+    this._logger.Debug("ButtplugClient: StopAllDevices");
-    await this.sendMsgExpectOk({ StopCmd: { Id: 1, DeviceIndex: undefined, FeatureIndex: undefined, Inputs: true, Outputs: true } });
+    await this.sendMsgExpectOk({
+      StopCmd: {
+        Id: 1,
+        DeviceIndex: undefined,
+        FeatureIndex: undefined,
+        Inputs: true,
+        Outputs: true,
+      },
+    });
   };
   protected disconnectHandler = () => {
-    this._logger.Info('ButtplugClient: Disconnect event receieved.');
+    this._logger.Info("ButtplugClient: Disconnect event receieved.");
-    this.emit('disconnect');
+    this.emit("disconnect");
   };
   protected parseMessages = (msgs: Messages.ButtplugMessage[]) => {
@@ -100,10 +102,10 @@ export class ButtplugClient extends EventEmitter {
         break;
       } else if (x.ScanningFinished !== undefined) {
         this._isScanning = false;
-        this.emit('scanningfinished', x);
+        this.emit("scanningfinished", x);
       } else if (x.InputReading !== undefined) {
         // TODO this should be emitted from the device or feature, not the client
-        this.emit('inputreading', x);
+        this.emit("inputreading", x);
       } else {
         console.log(`Unhandled message: ${x}`);
       }
@@ -112,21 +114,17 @@ export class ButtplugClient extends EventEmitter {
   protected initializeConnection = async (): Promise<boolean> => {
     this.checkConnector();
-    const msg = await this.sendMessage(
-      {
-        RequestServerInfo: {
-          ClientName: this._clientName,
-          Id: 1,
-          ProtocolVersionMajor: Messages.MESSAGE_SPEC_VERSION_MAJOR,
-          ProtocolVersionMinor: Messages.MESSAGE_SPEC_VERSION_MINOR
-        }
-      }
-    );
+    const msg = await this.sendMessage({
+      RequestServerInfo: {
+        ClientName: this._clientName,
+        Id: 1,
+        ProtocolVersionMajor: Messages.MESSAGE_SPEC_VERSION_MAJOR,
+        ProtocolVersionMinor: Messages.MESSAGE_SPEC_VERSION_MINOR,
+      },
+    });
     if (msg.ServerInfo !== undefined) {
       const serverinfo = msg as Messages.ServerInfo;
-      this._logger.Info(
-        `ButtplugClient: Connected to Server ${serverinfo.ServerName}`
-      );
+      this._logger.Info(`ButtplugClient: Connected to Server ${serverinfo.ServerName}`);
       // TODO: maybe store server name, do something with message template version?
       const ping = serverinfo.MaxPingTime;
       // If the server version is lower than the client version, the server will disconnect here.
@@ -153,42 +151,37 @@ export class ButtplugClient extends EventEmitter {
       throw ButtplugError.LogAndError(
         ButtplugInitError,
         this._logger,
-        `Cannot connect to server. ${err.ErrorMessage}`
+        `Cannot connect to server. ${err.ErrorMessage}`,
       );
     }
     return false;
-  }
+  };
   private parseDeviceList = (list: Messages.DeviceList) => {
-    for (let [_, d] of Object.entries(list.Devices)) {
+    for (const [_, d] of Object.entries(list.Devices)) {
       if (!this._devices.has(d.DeviceIndex)) {
-        const device = ButtplugClientDevice.fromMsg(
-          d,
-          this.sendMessageClosure
-        );
+        const device = ButtplugClientDevice.fromMsg(d, this.sendMessageClosure);
         this._logger.Debug(`ButtplugClient: Adding Device: ${device}`);
         this._devices.set(d.DeviceIndex, device);
-        this.emit('deviceadded', device);
+        this.emit("deviceadded", device);
       } else {
         this._logger.Debug(`ButtplugClient: Device already added: ${d}`);
       }
     }
-    for (let [index, device] of this._devices.entries()) {
+    for (const [index, device] of this._devices.entries()) {
-      if (!list.Devices.hasOwnProperty(index.toString())) {
+      if (!Object.prototype.hasOwnProperty.call(list.Devices, index.toString())) {
        this._devices.delete(index);
-        this.emit('deviceremoved', device);
+        this.emit("deviceremoved", device);
      }
    }
-  }
+  };
   protected requestDeviceList = async () => {
     this.checkConnector();
-    this._logger.Debug('ButtplugClient: ReceiveDeviceList called');
+    this._logger.Debug("ButtplugClient: ReceiveDeviceList called");
-    const response = (await this.sendMessage(
-      {
-        RequestDeviceList: { Id: 1 }
-      }
-    ));
+    const response = await this.sendMessage({
+      RequestDeviceList: { Id: 1 },
+    });
     this.parseDeviceList(response.DeviceList!);
   };
@@ -200,9 +193,7 @@ export class ButtplugClient extends EventEmitter {
     }
   };
-  protected async sendMessage(
-    msg: Messages.ButtplugMessage
-  ): Promise<Messages.ButtplugMessage> {
+  protected async sendMessage(msg: Messages.ButtplugMessage): Promise<Messages.ButtplugMessage> {
     this.checkConnector();
     const p = this._sorter.PrepareOutgoingMessage(msg);
     await this._connector!.send(msg);
@@ -211,15 +202,11 @@ export class ButtplugClient extends EventEmitter {
   protected checkConnector() {
     if (!this.connected) {
-      throw new ButtplugClientConnectorException(
-        'ButtplugClient not connected'
-      );
+      throw new ButtplugClientConnectorException("ButtplugClient not connected");
     }
   }
-  protected sendMsgExpectOk = async (
-    msg: Messages.ButtplugMessage
-  ): Promise<void> => {
+  protected sendMsgExpectOk = async (msg: Messages.ButtplugMessage): Promise<void> => {
     const response = await this.sendMessage(msg);
     if (response.Ok !== undefined) {
       return;
@@ -229,13 +216,13 @@ export class ButtplugClient extends EventEmitter {
     throw ButtplugError.LogAndError(
       ButtplugMessageError,
       this._logger,
-      `Message ${response} not handled by SendMsgExpectOk`
+      `Message ${response} not handled by SendMsgExpectOk`,
     );
     }
   };
   protected sendMessageClosure = async (
-    msg: Messages.ButtplugMessage
+    msg: Messages.ButtplugMessage,
   ): Promise<Messages.ButtplugMessage> => {
     return await this.sendMessage(msg);
   };


@@ -6,8 +6,8 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-import { ButtplugError } from '../core/Exceptions';
+import { ButtplugError } from "../core/Exceptions";
-import * as Messages from '../core/Messages';
+import * as Messages from "../core/Messages";
 export class ButtplugClientConnectorException extends ButtplugError {
   public constructor(message: string) {


@@ -6,22 +6,17 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-'use strict';
+"use strict";
-import * as Messages from '../core/Messages';
+import * as Messages from "../core/Messages";
-import {
-  ButtplugDeviceError,
-  ButtplugError,
-  ButtplugMessageError,
-} from '../core/Exceptions';
-import { EventEmitter } from 'eventemitter3';
-import { ButtplugClientDeviceFeature } from './ButtplugClientDeviceFeature';
-import { DeviceOutputCommand } from './ButtplugClientDeviceCommand';
+import { ButtplugDeviceError, ButtplugError, ButtplugMessageError } from "../core/Exceptions";
+import { EventEmitter } from "eventemitter3";
+import { ButtplugClientDeviceFeature } from "./ButtplugClientDeviceFeature";
+import { type DeviceOutputCommand } from "./ButtplugClientDeviceCommand";
 /**
  * Represents an abstract device, capable of taking certain kinds of messages.
  */
 export class ButtplugClientDevice extends EventEmitter {
   private _features: Map<number, ButtplugClientDeviceFeature>;
   /**
@@ -58,9 +53,7 @@ export class ButtplugClientDevice extends EventEmitter {
   public static fromMsg(
     msg: Messages.DeviceInfo,
-    sendClosure: (
-      msg: Messages.ButtplugMessage
-    ) => Promise<Messages.ButtplugMessage>
+    sendClosure: (msg: Messages.ButtplugMessage) => Promise<Messages.ButtplugMessage>,
   ): ButtplugClientDevice {
     return new ButtplugClientDevice(msg, sendClosure);
   }
@@ -72,25 +65,29 @@ export class ButtplugClientDevice extends EventEmitter {
    */
   private constructor(
     private _deviceInfo: Messages.DeviceInfo,
-    private _sendClosure: (
-      msg: Messages.ButtplugMessage
-    ) => Promise<Messages.ButtplugMessage>
+    private _sendClosure: (msg: Messages.ButtplugMessage) => Promise<Messages.ButtplugMessage>,
   ) {
     super();
-    this._features = new Map(Object.entries(_deviceInfo.DeviceFeatures).map(([index, v]) => [parseInt(index), new ButtplugClientDeviceFeature(_deviceInfo.DeviceIndex, _deviceInfo.DeviceName, v, _sendClosure)]));
+    this._features = new Map(
+      Object.entries(_deviceInfo.DeviceFeatures).map(([index, v]) => [
+        parseInt(index),
+        new ButtplugClientDeviceFeature(
+          _deviceInfo.DeviceIndex,
+          _deviceInfo.DeviceName,
+          v,
+          _sendClosure,
+        ),
+      ]),
+    );
   }
-  public async send(
-    msg: Messages.ButtplugMessage
-  ): Promise<Messages.ButtplugMessage> {
+  public async send(msg: Messages.ButtplugMessage): Promise<Messages.ButtplugMessage> {
     // Assume we're getting the closure from ButtplugClient, which does all of
     // the index/existence/connection/message checks for us.
     return await this._sendClosure(msg);
   }
-  protected sendMsgExpectOk = async (
-    msg: Messages.ButtplugMessage
-  ): Promise<void> => {
+  protected sendMsgExpectOk = async (msg: Messages.ButtplugMessage): Promise<void> => {
     const response = await this.send(msg);
     if (response.Ok !== undefined) {
       return;
@@ -108,25 +105,50 @@ export class ButtplugClientDevice extends EventEmitter {
   };
   protected isOutputValid(featureIndex: number, type: Messages.OutputType) {
-    if (!this._deviceInfo.DeviceFeatures.hasOwnProperty(featureIndex.toString())) {
-      throw new ButtplugDeviceError(`Feature index ${featureIndex} does not exist for device ${this.name}`);
+    if (
+      !Object.prototype.hasOwnProperty.call(
+        this._deviceInfo.DeviceFeatures,
+        featureIndex.toString(),
+      )
+    ) {
+      throw new ButtplugDeviceError(
+        `Feature index ${featureIndex} does not exist for device ${this.name}`,
+      );
     }
-    if (this._deviceInfo.DeviceFeatures[featureIndex.toString()].Outputs !== undefined && !this._deviceInfo.DeviceFeatures[featureIndex.toString()].Outputs.hasOwnProperty(type)) {
-      throw new ButtplugDeviceError(`Feature index ${featureIndex} does not support type ${type} for device ${this.name}`);
+    if (
+      this._deviceInfo.DeviceFeatures[featureIndex.toString()].Outputs !== undefined &&
+      !Object.prototype.hasOwnProperty.call(
+        this._deviceInfo.DeviceFeatures[featureIndex.toString()].Outputs,
+        type,
+      )
+    ) {
+      throw new ButtplugDeviceError(
+        `Feature index ${featureIndex} does not support type ${type} for device ${this.name}`,
+      );
     }
   }
   public hasOutput(type: Messages.OutputType): boolean {
-    return this._features.values().filter((f) => f.hasOutput(type)).toArray().length > 0;
+    return (
+      this._features
+        .values()
+        .filter((f) => f.hasOutput(type))
+        .toArray().length > 0
+    );
   }
   public hasInput(type: Messages.InputType): boolean {
-    return this._features.values().filter((f) => f.hasInput(type)).toArray().length > 0;
+    return (
+      this._features
+        .values()
+        .filter((f) => f.hasInput(type))
+        .toArray().length > 0
+    );
   }
   public async runOutput(cmd: DeviceOutputCommand): Promise<void> {
-    let p: Promise<void>[] = [];
+    const p: Promise<void>[] = [];
-    for (let f of this._features.values()) {
+    for (const f of this._features.values()) {
       if (f.hasOutput(cmd.outputType)) {
         p.push(f.runOutput(cmd));
       }
@@ -138,15 +160,26 @@ export class ButtplugClientDevice extends EventEmitter {
   }
   public async stop(): Promise<void> {
-    await this.sendMsgExpectOk({StopCmd: { Id: 1, DeviceIndex: this.index, FeatureIndex: undefined, Inputs: true, Outputs: true}});
+    await this.sendMsgExpectOk({
+      StopCmd: {
+        Id: 1,
+        DeviceIndex: this.index,
+        FeatureIndex: undefined,
+        Inputs: true,
+        Outputs: true,
+      },
+    });
   }
   public async battery(): Promise<number> {
-    let p: Promise<void>[] = [];
+    const _p: Promise<void>[] = [];
-    for (let f of this._features.values()) {
+    for (const f of this._features.values()) {
       if (f.hasInput(Messages.InputType.Battery)) {
         // Right now, we only have one battery per device, so assume the first one we find is it.
-        let response = await f.runInput(Messages.InputType.Battery, Messages.InputCommandType.Read);
+        const response = await f.runInput(
+          Messages.InputType.Battery,
+          Messages.InputCommandType.Read,
+        );
         if (response === undefined) {
           throw new ButtplugMessageError("Got incorrect message back.");
         }
@@ -160,6 +193,6 @@ export class ButtplugClientDevice extends EventEmitter {
   }
   public emitDisconnected() {
-    this.emit('deviceremoved');
+    this.emit("deviceremoved");
   }
 }
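A recurring change in these hunks is replacing `obj.hasOwnProperty(key)` with `Object.prototype.hasOwnProperty.call(obj, key)`, matching the eslint `no-prototype-builtins` rule. A minimal sketch of why the indirect call is safer (the `features` object below is illustrative, not from the library):

```typescript
// Objects created with a null prototype inherit nothing from Object.prototype,
// so they have no hasOwnProperty method of their own to call.
const features: Record<string, number> = Object.create(null);
features["0"] = 1;

// features.hasOwnProperty("0") would throw a TypeError here; borrowing the
// method from Object.prototype and invoking it with .call works regardless
// of the object's prototype chain.
const hasFeature = Object.prototype.hasOwnProperty.call(features, "0");
console.log(hasFeature); // true
```

The same pattern also guards against objects that shadow `hasOwnProperty` with their own property, which matters when keys come from external message data.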


@@ -14,7 +14,7 @@ class PercentOrSteps {
   }
   public static createSteps(s: number): PercentOrSteps {
-    let v = new PercentOrSteps;
+    const v = new PercentOrSteps();
     v._steps = s;
     return v;
   }
@@ -24,7 +24,7 @@ class PercentOrSteps {
       throw new ButtplugDeviceError(`Percent value ${p} is not in the range 0.0 <= x <= 1.0`);
     }
-    let v = new PercentOrSteps;
+    const v = new PercentOrSteps();
     v._percent = p;
     return v;
   }
@@ -35,8 +35,7 @@ export class DeviceOutputCommand {
     private _outputType: OutputType,
     private _value: PercentOrSteps,
     private _duration?: number,
-  )
-  {}
+  ) {}
   public get outputType() {
     return this._outputType;
@@ -52,26 +51,36 @@ export class DeviceOutputCommand {
 }
 export class DeviceOutputValueConstructor {
-  public constructor(
-    private _outputType: OutputType)
-  {}
+  public constructor(private _outputType: OutputType) {}
   public steps(steps: number): DeviceOutputCommand {
     return new DeviceOutputCommand(this._outputType, PercentOrSteps.createSteps(steps), undefined);
   }
   public percent(percent: number): DeviceOutputCommand {
-    return new DeviceOutputCommand(this._outputType, PercentOrSteps.createPercent(percent), undefined);
+    return new DeviceOutputCommand(
+      this._outputType,
+      PercentOrSteps.createPercent(percent),
+      undefined,
+    );
   }
 }
 export class DeviceOutputPositionWithDurationConstructor {
   public steps(steps: number, duration: number): DeviceOutputCommand {
-    return new DeviceOutputCommand(OutputType.Position, PercentOrSteps.createSteps(steps), duration);
+    return new DeviceOutputCommand(
+      OutputType.Position,
+      PercentOrSteps.createSteps(steps),
+      duration,
+    );
   }
   public percent(percent: number, duration: number): DeviceOutputCommand {
-    return new DeviceOutputCommand(OutputType.HwPositionWithDuration, PercentOrSteps.createPercent(percent), duration);
+    return new DeviceOutputCommand(
+      OutputType.HwPositionWithDuration,
+      PercentOrSteps.createPercent(percent),
+      duration,
+    );
   }
 }
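The `percent`/`steps` split in `PercentOrSteps` is resolved into a single actuator value later, in the feature's `sendOutputCmd`, where a percent is scaled against the feature's maximum step count with `Math.ceil`. A hedged sketch of that arithmetic in isolation (the `maxSteps` parameter stands in for the feature's `Output[type].Value[1]` bound and is not a real API):

```typescript
// Mirrors the scaling seen in sendOutputCmd: value = Math.ceil(max * percent).
// Percent is validated the same way PercentOrSteps.createPercent validates it.
function percentToSteps(maxSteps: number, percent: number): number {
  if (percent < 0 || percent > 1) {
    throw new Error(`Percent value ${percent} is not in the range 0.0 <= x <= 1.0`);
  }
  return Math.ceil(maxSteps * percent);
}

console.log(percentToSteps(20, 0.5)); // 10
console.log(percentToSteps(20, 0.51)); // 11 — ceil rounds any partial step up
```

Using `Math.ceil` means any nonzero percent maps to at least one step, so a small but nonzero command never silently becomes zero output.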


@@ -1,25 +1,20 @@
 import { ButtplugDeviceError, ButtplugError, ButtplugMessageError } from "../core/Exceptions";
 import * as Messages from "../core/Messages";
-import { DeviceOutputCommand } from "./ButtplugClientDeviceCommand";
+import { type DeviceOutputCommand } from "./ButtplugClientDeviceCommand";
 export class ButtplugClientDeviceFeature {
   constructor(
     private _deviceIndex: number,
     private _deviceName: string,
     private _feature: Messages.DeviceFeature,
-    private _sendClosure: (
-      msg: Messages.ButtplugMessage
-    ) => Promise<Messages.ButtplugMessage>) {
-  }
+    private _sendClosure: (msg: Messages.ButtplugMessage) => Promise<Messages.ButtplugMessage>,
+  ) {}
   protected send = async (msg: Messages.ButtplugMessage): Promise<Messages.ButtplugMessage> => {
     return await this._sendClosure(msg);
-  }
+  };
-  protected sendMsgExpectOk = async (
-    msg: Messages.ButtplugMessage
-  ): Promise<void> => {
+  protected sendMsgExpectOk = async (msg: Messages.ButtplugMessage): Promise<void> => {
     const response = await this.send(msg);
     if (response.Ok !== undefined) {
       return;
@@ -31,14 +26,24 @@ export class ButtplugClientDeviceFeature {
   };
   protected isOutputValid(type: Messages.OutputType) {
-    if (this._feature.Output !== undefined && !this._feature.Output.hasOwnProperty(type)) {
-      throw new ButtplugDeviceError(`Feature index ${this._feature.FeatureIndex} does not support type ${type} for device ${this._deviceName}`);
+    if (
+      this._feature.Output !== undefined &&
+      !Object.prototype.hasOwnProperty.call(this._feature.Output, type)
+    ) {
+      throw new ButtplugDeviceError(
+        `Feature index ${this._feature.FeatureIndex} does not support type ${type} for device ${this._deviceName}`,
+      );
     }
   }
   protected isInputValid(type: Messages.InputType) {
-    if (this._feature.Input !== undefined && !this._feature.Input.hasOwnProperty(type)) {
-      throw new ButtplugDeviceError(`Feature index ${this._feature.FeatureIndex} does not support type ${type} for device ${this._deviceName}`);
+    if (
+      this._feature.Input !== undefined &&
+      !Object.prototype.hasOwnProperty.call(this._feature.Input, type)
+    ) {
+      throw new ButtplugDeviceError(
+        `Feature index ${this._feature.FeatureIndex} does not support type ${type} for device ${this._deviceName}`,
+      );
     }
   }
@@ -49,7 +54,7 @@ export class ButtplugClientDeviceFeature {
       throw new ButtplugDeviceError(`${command.outputType} requires value defined`);
     }
-    let type = command.outputType;
+    const type = command.outputType;
     let duration: undefined | number = undefined;
     if (type == Messages.OutputType.HwPositionWithDuration) {
       if (command.duration === undefined) {
@@ -58,24 +63,24 @@ export class ButtplugClientDeviceFeature {
       duration = command.duration;
     }
     let value: number;
-    let p = command.value;
+    const p = command.value;
     if (p.percent === undefined) {
       // TODO Check step limits here
       value = command.value.steps!;
     } else {
       value = Math.ceil(this._feature.Output[type]!.Value![1] * p.percent);
     }
-    let newCommand: Messages.DeviceFeatureOutput = { Value: value, Duration: duration };
+    const newCommand: Messages.DeviceFeatureOutput = { Value: value, Duration: duration };
-    let outCommand = {};
+    const outCommand = {};
     outCommand[type.toString()] = newCommand;
-    let cmd: Messages.ButtplugMessage = {
+    const cmd: Messages.ButtplugMessage = {
       OutputCmd: {
         Id: 1,
         DeviceIndex: this._deviceIndex,
         FeatureIndex: this._feature.FeatureIndex,
-        Command: outCommand
+        Command: outCommand,
-      }
+      },
     };
     await this.sendMsgExpectOk(cmd);
   }
@@ -112,43 +117,51 @@ export class ButtplugClientDeviceFeature {
   public hasOutput(type: Messages.OutputType): boolean {
     if (this._feature.Output !== undefined) {
-      return this._feature.Output.hasOwnProperty(type.toString());
+      return Object.prototype.hasOwnProperty.call(this._feature.Output, type.toString());
     }
     return false;
   }
   public hasInput(type: Messages.InputType): boolean {
     if (this._feature.Input !== undefined) {
-      return this._feature.Input.hasOwnProperty(type.toString());
+      return Object.prototype.hasOwnProperty.call(this._feature.Input, type.toString());
     }
     return false;
   }
   public async runOutput(cmd: DeviceOutputCommand): Promise<void> {
-    if (this._feature.Output !== undefined && this._feature.Output.hasOwnProperty(cmd.outputType.toString())) {
+    if (
+      this._feature.Output !== undefined &&
+      Object.prototype.hasOwnProperty.call(this._feature.Output, cmd.outputType.toString())
+    ) {
      return this.sendOutputCmd(cmd);
    }
     throw new ButtplugDeviceError(`Output type ${cmd.outputType} not supported by feature.`);
   }
-  public async runInput(inputType: Messages.InputType, inputCommand: Messages.InputCommandType): Promise<Messages.InputReading | undefined> {
+  public async runInput(
+    inputType: Messages.InputType,
+    inputCommand: Messages.InputCommandType,
+  ): Promise<Messages.InputReading | undefined> {
     // Make sure the requested feature is valid
     this.isInputValid(inputType);
-    let inputAttributes = this._feature.Input[inputType];
+    const inputAttributes = this._feature.Input[inputType];
-    console.log(this._feature.Input);
-    if ((inputCommand === Messages.InputCommandType.Unsubscribe && !inputAttributes.Command.includes(Messages.InputCommandType.Subscribe)) && !inputAttributes.Command.includes(inputCommand)) {
+    if (
+      inputCommand === Messages.InputCommandType.Unsubscribe &&
+      !inputAttributes.Command.includes(Messages.InputCommandType.Subscribe) &&
+      !inputAttributes.Command.includes(inputCommand)
+    ) {
       throw new ButtplugDeviceError(`${inputType} does not support command ${inputCommand}`);
     }
-    let cmd: Messages.ButtplugMessage = {
+    const cmd: Messages.ButtplugMessage = {
       InputCmd: {
         Id: 1,
         DeviceIndex: this._deviceIndex,
         FeatureIndex: this._feature.FeatureIndex,
         Type: inputType,
         Command: inputCommand,
-      }
+      },
     };
     if (inputCommand == Messages.InputCommandType.Read) {
       const response = await this.send(cmd);


@@ -6,12 +6,11 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-'use strict';
+"use strict";
-import { ButtplugBrowserWebsocketClientConnector } from './ButtplugBrowserWebsocketClientConnector';
+import { ButtplugBrowserWebsocketClientConnector } from "./ButtplugBrowserWebsocketClientConnector";
-import { WebSocket as NodeWebSocket } from 'ws';
+import { WebSocket as NodeWebSocket } from "ws";
 export class ButtplugNodeWebsocketClientConnector extends ButtplugBrowserWebsocketClientConnector {
-  protected _websocketConstructor =
-    NodeWebSocket as unknown as typeof WebSocket;
+  protected _websocketConstructor = NodeWebSocket as unknown as typeof WebSocket;
 }


@@ -6,8 +6,8 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-import { ButtplugMessage } from '../core/Messages';
+import { type ButtplugMessage } from "../core/Messages";
-import { EventEmitter } from 'eventemitter3';
+import { type EventEmitter } from "eventemitter3";
 export interface IButtplugClientConnector extends EventEmitter {
   connect: () => Promise<void>;


@@ -6,8 +6,8 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-import * as Messages from './Messages';
+import * as Messages from "./Messages";
-import { ButtplugLogger } from './Logging';
+import { type ButtplugLogger } from "./Logging";
 export class ButtplugError extends Error {
   public get ErrorClass(): Messages.ErrorClass {
@@ -27,16 +27,16 @@ export class ButtplugError extends Error {
       Error: {
         Id: this.Id,
         ErrorCode: this.ErrorClass,
-        ErrorMessage: this.message
+        ErrorMessage: this.message,
-      }
+      },
-    }
+    };
   }
   public static LogAndError<T extends ButtplugError>(
     constructor: new (str: string, num: number) => T,
     logger: ButtplugLogger,
     message: string,
-    id: number = Messages.SYSTEM_MESSAGE_ID
+    id: number = Messages.SYSTEM_MESSAGE_ID,
   ): T {
     logger.Error(message);
     return new constructor(message, id);
@@ -67,7 +67,7 @@ export class ButtplugError extends Error {
     message: string,
     errorClass: Messages.ErrorClass,
     id: number = Messages.SYSTEM_MESSAGE_ID,
-    inner?: Error
+    inner?: Error,
   ) {
     super(message);
     this.errorClass = errorClass;
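`LogAndError` takes an error constructor as a generic parameter so call sites can log and build a correctly typed error subclass in one step (`throw ButtplugError.LogAndError(ButtplugInitError, ...)`). A simplified, self-contained sketch of the pattern; the `MiniLogger` and `InitError` classes below are illustrative stand-ins, not part of the library:

```typescript
// Minimal stand-in for ButtplugLogger: just records error messages.
class MiniLogger {
  public messages: string[] = [];
  Error(msg: string): void {
    this.messages.push(msg);
  }
}

// Stand-in for a ButtplugError subclass with a (message, id) constructor.
class InitError extends Error {
  constructor(
    message: string,
    public id: number,
  ) {
    super(message);
  }
}

// Same shape as ButtplugError.LogAndError: log first, then construct an
// instance of whichever error subclass the caller passed in.
function logAndError<T extends Error>(
  ctor: new (msg: string, id: number) => T,
  logger: MiniLogger,
  message: string,
  id = 0,
): T {
  logger.Error(message);
  return new ctor(message, id);
}

const logger = new MiniLogger();
const err = logAndError(InitError, logger, "Cannot connect to server.");
console.log(err instanceof InitError); // true
```

Because the constructor is a type parameter, the return type is the concrete subclass, so the caller can `throw` it without a cast.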


@@ -6,7 +6,7 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-import { EventEmitter } from 'eventemitter3';
+import { EventEmitter } from "eventemitter3";
 export enum ButtplugLogLevel {
   Off,
@@ -69,9 +69,7 @@ export class LogMessage {
    * Returns a formatted string with timestamp, level, and message.
    */
   public get FormattedMessage() {
-    return `${ButtplugLogLevel[this.logLevel]} : ${this.timestamp} : ${
-      this.logMessage
-    }`;
+    return `${ButtplugLogLevel[this.logLevel]} : ${this.timestamp} : ${this.logMessage}`;
   }
 }
@@ -176,10 +174,7 @@ export class ButtplugLogger extends EventEmitter {
    */
   protected AddLogMessage(msg: string, level: ButtplugLogLevel) {
     // If nothing wants the log message we have, ignore it.
-    if (
-      level > this.maximumEventLogLevel &&
-      level > this.maximumConsoleLogLevel
-    ) {
+    if (level > this.maximumEventLogLevel && level > this.maximumConsoleLogLevel) {
       return;
     }
     const logMsg = new LogMessage(msg, level);
@@ -191,7 +186,7 @@ export class ButtplugLogger extends EventEmitter {
       console.log(logMsg.FormattedMessage);
     }
     if (level <= this.maximumEventLogLevel) {
-      this.emit('log', logMsg);
+      this.emit("log", logMsg);
     }
   }
 }


@@ -7,9 +7,9 @@
  */
 // tslint:disable:max-classes-per-file
-'use strict';
+"use strict";
-import { ButtplugMessageError } from './Exceptions';
+import { ButtplugMessageError } from "./Exceptions";
 export const SYSTEM_MESSAGE_ID = 0;
 export const DEFAULT_MESSAGE_ID = 1;
@@ -36,7 +36,7 @@ export interface ButtplugMessage {
 }
 export function msgId(msg: ButtplugMessage): number {
-  for (let [_, entry] of Object.entries(msg)) {
+  for (const [_, entry] of Object.entries(msg)) {
     if (entry != undefined) {
       return entry.Id;
     }
@@ -45,7 +45,7 @@ export function msgId(msg: ButtplugMessage): number {
 }
 export function setMsgId(msg: ButtplugMessage, id: number) {
-  for (let [_, entry] of Object.entries(msg)) {
+  for (const [_, entry] of Object.entries(msg)) {
     if (entry != undefined) {
       entry.Id = id;
       return;
@@ -132,34 +132,34 @@ export interface DeviceList {
 }
 export enum OutputType {
-  Unknown = 'Unknown',
-  Vibrate = 'Vibrate',
-  Rotate = 'Rotate',
-  Oscillate = 'Oscillate',
-  Constrict = 'Constrict',
-  Inflate = 'Inflate',
-  Position = 'Position',
-  HwPositionWithDuration = 'HwPositionWithDuration',
-  Temperature = 'Temperature',
-  Spray = 'Spray',
-  Led = 'Led',
+  Unknown = "Unknown",
+  Vibrate = "Vibrate",
+  Rotate = "Rotate",
+  Oscillate = "Oscillate",
+  Constrict = "Constrict",
+  Inflate = "Inflate",
+  Position = "Position",
+  HwPositionWithDuration = "HwPositionWithDuration",
+  Temperature = "Temperature",
+  Spray = "Spray",
+  Led = "Led",
 }
 export enum InputType {
-  Unknown = 'Unknown',
-  Battery = 'Battery',
-  RSSI = 'RSSI',
-  Button = 'Button',
-  Pressure = 'Pressure',
+  Unknown = "Unknown",
+  Battery = "Battery",
+  RSSI = "RSSI",
+  Button = "Button",
+  Pressure = "Pressure",
   // Temperature,
   // Accelerometer,
   // Gyro,
 }
 export enum InputCommandType {
-  Read = 'Read',
-  Subscribe = 'Subscribe',
-  Unsubscribe = 'Unsubscribe',
+  Read = "Read",
+  Subscribe = "Subscribe",
+  Unsubscribe = "Unsubscribe",
 }
 export interface DeviceFeatureInput {
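The `let` to `const` changes above touch `msgId`/`setMsgId`, which scan the message wrapper object for whichever single variant is populated and read or rewrite its `Id`. A self-contained sketch of that lookup, with an illustrative wrapper type and a `-1` fallback standing in for the library's real missing-variant handling:

```typescript
// The wrapper holds exactly one populated variant keyed by message name;
// scanning Object.entries finds it regardless of which variant it is.
interface WrappedMessage {
  [key: string]: { Id: number } | undefined;
}

function msgId(msg: WrappedMessage): number {
  for (const [, entry] of Object.entries(msg)) {
    if (entry != undefined) {
      return entry.Id;
    }
  }
  return -1; // illustrative fallback for an empty wrapper
}

function setMsgId(msg: WrappedMessage, id: number): void {
  for (const [, entry] of Object.entries(msg)) {
    if (entry != undefined) {
      entry.Id = id;
      return;
    }
  }
}

const msg: WrappedMessage = { Ping: { Id: 1 } };
setMsgId(msg, 7);
console.log(msgId(msg)); // 7
```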


@@ -1,4 +1,4 @@
 declare module "*.json" {
   const content: string;
   export default content;
 }


@@ -6,27 +6,24 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-import { ButtplugMessage } from './core/Messages';
+import { type ButtplugMessage } from "./core/Messages";
-import { IButtplugClientConnector } from './client/IButtplugClientConnector';
+import { type IButtplugClientConnector } from "./client/IButtplugClientConnector";
-import { EventEmitter } from 'eventemitter3';
+import { EventEmitter } from "eventemitter3";
-export * from './client/ButtplugClient';
+export * from "./client/ButtplugClient";
-export * from './client/ButtplugClientDevice';
+export * from "./client/ButtplugClientDevice";
-export * from './client/ButtplugBrowserWebsocketClientConnector';
+export * from "./client/ButtplugBrowserWebsocketClientConnector";
-export * from './client/ButtplugNodeWebsocketClientConnector';
+export * from "./client/ButtplugNodeWebsocketClientConnector";
-export * from './client/ButtplugClientConnectorException';
+export * from "./client/ButtplugClientConnectorException";
-export * from './utils/ButtplugMessageSorter';
+export * from "./utils/ButtplugMessageSorter";
-export * from './client/ButtplugClientDeviceCommand';
+export * from "./client/ButtplugClientDeviceCommand";
-export * from './client/ButtplugClientDeviceFeature';
+export * from "./client/ButtplugClientDeviceFeature";
-export * from './client/IButtplugClientConnector';
+export * from "./client/IButtplugClientConnector";
-export * from './core/Messages';
+export * from "./core/Messages";
-export * from './core/Logging';
+export * from "./core/Logging";
-export * from './core/Exceptions';
+export * from "./core/Exceptions";
-export class ButtplugWasmClientConnector
-  extends EventEmitter
-  implements IButtplugClientConnector
-{
+export class ButtplugWasmClientConnector extends EventEmitter implements IButtplugClientConnector {
   private static _loggingActivated = false;
   private static wasmInstance;
   private _connected: boolean = false;
@@ -43,35 +40,32 @@ export class ButtplugWasmClientConnector
   private static maybeLoadWasm = async () => {
     if (ButtplugWasmClientConnector.wasmInstance == undefined) {
-      ButtplugWasmClientConnector.wasmInstance = await import(
-        '../wasm/index.js'
-      );
+      const wasmModule = await import("../wasm/index.js");
+      await wasmModule.default(); // --target web requires calling init() before using exports
+      ButtplugWasmClientConnector.wasmInstance = wasmModule;
     }
   };
-  public static activateLogging = async (logLevel: string = 'debug') => {
+  public static activateLogging = async (logLevel: string = "debug") => {
     await ButtplugWasmClientConnector.maybeLoadWasm();
     if (this._loggingActivated) {
-      console.log('Logging already activated, ignoring.');
+      console.log("Logging already activated, ignoring.");
       return;
     }
-    console.log('Turning on logging.');
+    console.log("Turning on logging.");
-    ButtplugWasmClientConnector.wasmInstance.buttplug_activate_env_logger(
-      logLevel,
-    );
+    ButtplugWasmClientConnector.wasmInstance.buttplug_activate_env_logger(logLevel);
   };
   public initialize = async (): Promise<void> => {};
   public connect = async (): Promise<void> => {
     await ButtplugWasmClientConnector.maybeLoadWasm();
-    this.client =
-      ButtplugWasmClientConnector.wasmInstance.buttplug_create_embedded_wasm_server(
-        (msgs) => {
-          this.emitMessage(msgs);
-        },
-        this.serverPtr,
-      );
+    this.client = ButtplugWasmClientConnector.wasmInstance.buttplug_create_embedded_wasm_server(
+      (msgs) => {
+        this.emitMessage(msgs);
+      },
+      this.serverPtr,
+    );
     this._connected = true;
   };
@@ -80,7 +74,7 @@ export class ButtplugWasmClientConnector
   public send = (msg: ButtplugMessage): void => {
     ButtplugWasmClientConnector.wasmInstance.buttplug_client_send_json_message(
       this.client,
-      new TextEncoder().encode('[' + JSON.stringify(msg) + ']'),
+      new TextEncoder().encode("[" + JSON.stringify(msg) + "]"),
       (output) => {
         this.emitMessage(output);
       },
@@ -90,6 +84,6 @@ export class ButtplugWasmClientConnector
   private emitMessage = (msg: Uint8Array) => {
     const str = new TextDecoder().decode(msg);
     const msgs: ButtplugMessage[] = JSON.parse(str);
-    this.emit('message', msgs);
+    this.emit("message", msgs);
   };
 }
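The `maybeLoadWasm` change above follows wasm-bindgen's `--target web` convention: the dynamically imported module exposes an init function as its default export, and it must be awaited before any other export is usable. A sketch of that lazy-singleton pattern, with a stub loader standing in for the real `../wasm/index.js` import (all names here are illustrative):

```typescript
// Stub module shape: `default` is the init() function, as with --target web.
type WasmModule = {
  default: () => Promise<void>;
  buttplug_activate_env_logger: (level: string) => void;
};

let initCalls = 0;
const loadStubModule = async (): Promise<WasmModule> => ({
  default: async () => {
    initCalls += 1; // stands in for wasm instantiation
  },
  buttplug_activate_env_logger: () => {},
});

let wasmInstance: WasmModule | undefined;

async function maybeLoadWasm(): Promise<WasmModule> {
  if (wasmInstance == undefined) {
    const wasmModule = await loadStubModule();
    await wasmModule.default(); // init() must run before using exports
    wasmInstance = wasmModule;
  }
  return wasmInstance;
}

// Two sequential awaits: the loader and init() run only once.
const demoResult = (async () => {
  await maybeLoadWasm();
  await maybeLoadWasm(); // second call hits the cached instance
  return initCalls;
})();
demoResult.then((n) => console.log(n)); // 1
```

Note the cached value is the module itself, so later calls skip both the import and the init.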


@@ -23,6 +23,7 @@ type FFICallback = js_sys::Function;
 type FFICallbackContext = u32;
 #[derive(Clone, Copy)]
+#[allow(dead_code)]
 pub struct FFICallbackContextWrapper(FFICallbackContext);
 unsafe impl Send for FFICallbackContextWrapper {
@@ -50,7 +51,7 @@ pub fn send_server_message(
     let buf = json.as_bytes();
     let this = JsValue::null();
     let uint8buf = unsafe { Uint8Array::new(&Uint8Array::view(buf)) };
-    callback.call1(&this, &JsValue::from(uint8buf));
+    let _ = callback.call1(&this, &JsValue::from(uint8buf));
   }
 }
@@ -119,7 +120,7 @@ pub fn buttplug_client_send_json_message(
     let buf = json.as_bytes();
     let this = JsValue::null();
     let uint8buf = unsafe { Uint8Array::new(&Uint8Array::view(buf)) };
-    callback.call1(&this, &JsValue::from(uint8buf));
+    let _ = callback.call1(&this, &JsValue::from(uint8buf));
   });
 }


@@ -6,10 +6,10 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-'use strict';
+"use strict";
-import { EventEmitter } from 'eventemitter3';
+import { EventEmitter } from "eventemitter3";
-import { ButtplugMessage } from '../core/Messages';
+import { type ButtplugMessage } from "../core/Messages";
 export class ButtplugBrowserWebsocketConnector extends EventEmitter {
   protected _ws: WebSocket | undefined;
@@ -26,18 +26,20 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
   public connect = async (): Promise<void> => {
     return new Promise<void>((resolve, reject) => {
       const ws = new (this._websocketConstructor ?? WebSocket)(this._url);
-      const onErrorCallback = (event: Event) => {reject(event)}
-      const onCloseCallback = (event: CloseEvent) => reject(event.reason)
-      ws.addEventListener('open', async () => {
+      const onErrorCallback = (event: Event) => {
+        reject(event);
+      };
+      const onCloseCallback = (event: CloseEvent) => reject(event.reason);
+      ws.addEventListener("open", async () => {
         this._ws = ws;
         try {
           await this.initialize();
-          this._ws.addEventListener('message', (msg) => {
+          this._ws.addEventListener("message", (msg) => {
             this.parseIncomingMessage(msg);
           });
-          this._ws.removeEventListener('close', onCloseCallback);
-          this._ws.removeEventListener('error', onErrorCallback);
-          this._ws.addEventListener('close', this.disconnect);
+          this._ws.removeEventListener("close", onCloseCallback);
+          this._ws.removeEventListener("error", onErrorCallback);
+          this._ws.addEventListener("close", this.disconnect);
           resolve();
         } catch (e) {
           reject(e);
@@ -47,8 +49,8 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
       // browsers usually only throw Error Code 1006. It's up to those using this
       // library to state what the problem might be.
-      ws.addEventListener('error', onErrorCallback)
-      ws.addEventListener('close', onCloseCallback);
+      ws.addEventListener("error", onErrorCallback);
+      ws.addEventListener("close", onCloseCallback);
     });
   };
@@ -58,14 +60,14 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
     }
     this._ws!.close();
     this._ws = undefined;
-    this.emit('disconnect');
+    this.emit("disconnect");
   };
   public sendMessage(msg: ButtplugMessage) {
     if (!this.Connected) {
-      throw new Error('ButtplugBrowserWebsocketConnector not connected');
+      throw new Error("ButtplugBrowserWebsocketConnector not connected");
     }
-    this._ws!.send('[' + JSON.stringify(msg) + ']');
+    this._ws!.send("[" + JSON.stringify(msg) + "]");
   }
   public initialize = async (): Promise<void> => {
@@ -73,9 +75,9 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
   };
   protected parseIncomingMessage(event: MessageEvent) {
-    if (typeof event.data === 'string') {
+    if (typeof event.data === "string") {
       const msgs: ButtplugMessage[] = JSON.parse(event.data);
-      this.emit('message', msgs);
+      this.emit("message", msgs);
     } else if (event.data instanceof Blob) {
       // No-op, we only use text message types.
     }
@@ -83,6 +85,6 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
   protected onReaderLoad(event: Event) {
     const msgs: ButtplugMessage[] = JSON.parse((event.target as FileReader).result as string);
-    this.emit('message', msgs);
+    this.emit("message", msgs);
   }
 }
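The `connect()` rewrite above preserves a subtle hand-off: the `error` and `close` listeners reject the pending promise only until the socket opens and initializes, after which they are removed and replaced with steady-state handlers. A self-contained sketch of that listener swap using a tiny stub socket (the stub and its `fire` method are illustrative, not part of the library):

```typescript
type Listener = (ev?: unknown) => void;

// Minimal add/remove/fire event target standing in for a WebSocket.
class StubSocket {
  private listeners = new Map<string, Set<Listener>>();
  addEventListener(type: string, fn: Listener) {
    if (!this.listeners.has(type)) this.listeners.set(type, new Set());
    this.listeners.get(type)!.add(fn);
  }
  removeEventListener(type: string, fn: Listener) {
    this.listeners.get(type)?.delete(fn);
  }
  fire(type: string, ev?: unknown) {
    for (const fn of [...(this.listeners.get(type) ?? [])]) fn(ev);
  }
}

function connect(ws: StubSocket): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    const onError: Listener = (ev) => reject(ev);
    ws.addEventListener("open", () => {
      // Connection established: stop rejecting on error/close.
      ws.removeEventListener("error", onError);
      resolve();
    });
    ws.addEventListener("error", onError);
  });
}

const sock = new StubSocket();
const pending = connect(sock);
sock.fire("open");
sock.fire("error", new Error("late")); // ignored: listener already removed
pending.then(() => console.log("connected"));
```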


@@ -6,8 +6,8 @@
  * @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
  */
-import * as Messages from '../core/Messages';
+import * as Messages from "../core/Messages";
-import { ButtplugError } from '../core/Exceptions';
+import { ButtplugError } from "../core/Exceptions";
 export class ButtplugMessageSorter {
   protected _counter = 1;
@@ -21,9 +21,7 @@ export class ButtplugMessageSorter {
   // One of the places we should actually return a promise, as we need to store
   // them while waiting for them to return across the line.
   // tslint:disable:promise-function-async
-  public PrepareOutgoingMessage(
-    msg: Messages.ButtplugMessage
-  ): Promise<Messages.ButtplugMessage> {
+  public PrepareOutgoingMessage(msg: Messages.ButtplugMessage): Promise<Messages.ButtplugMessage> {
     if (this._useCounter) {
       Messages.setMsgId(msg, this._counter);
       // Always increment last, otherwise we might lose sync
@@ -31,22 +29,18 @@ export class ButtplugMessageSorter {
     }
     let res;
     let rej;
-    const msgPromise = new Promise<Messages.ButtplugMessage>(
-      (resolve, reject) => {
-        res = resolve;
-        rej = reject;
-      }
-    );
+    const msgPromise = new Promise<Messages.ButtplugMessage>((resolve, reject) => {
+      res = resolve;
+      rej = reject;
+    });
     this._waitingMsgs.set(Messages.msgId(msg), [res, rej]);
     return msgPromise;
   }
-  public ParseIncomingMessages(
-    msgs: Messages.ButtplugMessage[]
-  ): Messages.ButtplugMessage[] {
+  public ParseIncomingMessages(msgs: Messages.ButtplugMessage[]): Messages.ButtplugMessage[] {
     const noMatch: Messages.ButtplugMessage[] = [];
     for (const x of msgs) {
-      let id = Messages.msgId(x);
+      const id = Messages.msgId(x);
       if (id !== Messages.SYSTEM_MESSAGE_ID && this._waitingMsgs.has(id)) {
         const [res, rej] = this._waitingMsgs.get(id)!;
         this._waitingMsgs.delete(id);
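The sorter correlates replies to callers by message Id: each outgoing message gets a fresh Id and a stored resolver pair, and each incoming message settles the promise waiting on its Id, while unmatched (e.g. system) messages fall through. A condensed, self-contained sketch of that mechanism (simplified types, not the library's real ones):

```typescript
type Msg = { Id: number; payload?: string };

class MiniSorter {
  private counter = 1;
  private waiting = new Map<number, [(m: Msg) => void, (e: Error) => void]>();

  prepareOutgoing(msg: Msg): Promise<Msg> {
    msg.Id = this.counter;
    this.counter += 1; // increment last, as the comment in the diff notes
    let res!: (m: Msg) => void;
    let rej!: (e: Error) => void;
    const p = new Promise<Msg>((resolve, reject) => {
      res = resolve;
      rej = reject;
    });
    this.waiting.set(msg.Id, [res, rej]);
    return p;
  }

  parseIncoming(msgs: Msg[]): Msg[] {
    const noMatch: Msg[] = [];
    for (const x of msgs) {
      const entry = this.waiting.get(x.Id);
      if (entry) {
        this.waiting.delete(x.Id);
        entry[0](x); // resolve the caller awaiting this Id
      } else {
        noMatch.push(x); // unsolicited/system messages fall through
      }
    }
    return noMatch;
  }
}

const sorter = new MiniSorter();
const reply = sorter.prepareOutgoing({ Id: 0 });
sorter.parseIncoming([{ Id: 1, payload: "Ok" }]);
reply.then((m) => console.log(m.payload)); // Ok
```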


@@ -1,3 +1,3 @@
 export function getRandomInt(max: number) {
   return Math.floor(Math.random() * Math.floor(max));
 }


@@ -184,6 +184,7 @@ impl HardwareSpecializer for WebBluetoothHardwareSpecializer {
 pub enum WebBluetoothEvent {
   // This is the only way we have to get our endpoints back to device creation
   // right now. My god this is a mess.
+  #[allow(dead_code)]
   Connected(Vec<Endpoint>),
   Disconnected,
 }
@@ -201,6 +202,7 @@ pub enum WebBluetoothDeviceCommand {
     HardwareSubscribeCmd,
     oneshot::Sender<Result<(), ButtplugDeviceError>>,
   ),
+  #[allow(dead_code)]
   Unsubscribe(
     HardwareUnsubscribeCmd,
     oneshot::Sender<Result<(), ButtplugDeviceError>>,
@@ -271,7 +273,7 @@ async fn run_webbluetooth_loop(
   //let web_btle_device = WebBluetoothDeviceImpl::new(device, char_map);
   info!("device created!");
   let endpoints = char_map.keys().into_iter().cloned().collect();
-  device_local_event_sender
+  let _ = device_local_event_sender
     .send(WebBluetoothEvent::Connected(endpoints))
     .await;
   while let Some(msg) = device_command_receiver.recv().await {
@@ -337,6 +339,7 @@ async fn run_webbluetooth_loop(
 #[derive(Debug)]
 pub struct WebBluetoothHardware {
   device_command_sender: mpsc::Sender<WebBluetoothDeviceCommand>,
+  #[allow(dead_code)]
   device_event_receiver: mpsc::Receiver<WebBluetoothEvent>,
   event_sender: broadcast::Sender<HardwareEvent>,
 }


@@ -1,11 +1,11 @@
 {
   "compilerOptions": {
     "target": "esnext",
     "module": "esnext",
     "outDir": "dist",
     "moduleResolution": "bundler",
     "esModuleInterop": true,
     "skipLibCheck": true
   },
   "include": ["src"]
 }


@@ -3,19 +3,19 @@ import path from "path";
 import wasm from "vite-plugin-wasm";
 export default defineConfig({
   plugins: [wasm()], // include wasm plugin
   build: {
     lib: {
       entry: path.resolve(__dirname, "src/index.ts"),
       name: "buttplug",
       fileName: "index",
       formats: ["es"], // this is important
     },
     minify: false, // for demo purposes
     target: "esnext", // this is important as well
     outDir: "dist",
     rollupOptions: {
       external: [/\.\/wasm\//, /\.\.\/wasm\//],
     },
   },
 });

packages/email/email.css (new file, 18 lines)

@@ -0,0 +1,18 @@
@import "@maizzle/tailwindcss";
@theme {
/* ── Design tokens — exact mirror of frontend app.css :root ── */
--color-background: oklch(0.98 0.01 320);
--color-foreground: oklch(0.08 0.02 280);
--color-card: oklch(0.99 0.005 320);
--color-card-foreground: oklch(0.08 0.02 280);
--color-muted: oklch(0.95 0.01 280);
--color-muted-foreground: oklch(0.4 0.02 280);
--color-border: oklch(0.85 0.02 280);
--color-primary: oklch(56.971% 0.27455 319.257);
--color-primary-foreground: oklch(0.98 0.01 320);
/* ── Font ── */
--font-sans:
"Noto Sans", -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Helvetica, Arial, sans-serif;
}

packages/email/package.json (new file)

@@ -0,0 +1,25 @@
{
"name": "@sexy.pivoine.art/email",
"version": "1.0.0",
"private": true,
"main": "./dist/index.js",
"types": "./dist/index.d.ts",
"exports": {
".": {
"require": "./dist/index.js",
"types": "./dist/index.d.ts"
}
},
"scripts": {
"build": "tsc",
"dev": "tsc --watch"
},
"dependencies": {
"@maizzle/framework": "6.0.0-15",
"@maizzle/tailwindcss": "^1.2.0"
},
"devDependencies": {
"@types/node": "^22.0.0",
"typescript": "^5.9.3"
}
}

packages/email/src/index.ts (new file)

@@ -0,0 +1,25 @@
import { renderTemplate } from "./render.js";
const BASE_URL = process.env.PUBLIC_URL ?? "https://sexy.pivoine.art";
export async function renderVerification(data: {
token: string;
}): Promise<{ subject: string; html: string }> {
return {
subject: "Verify your email address — sexy.pivoine.art",
html: await renderTemplate("verification", {
url: `${BASE_URL}/signup/verify?token=${data.token}`,
}),
};
}
export async function renderPasswordReset(data: {
token: string;
}): Promise<{ subject: string; html: string }> {
return {
subject: "Reset your password — sexy.pivoine.art",
html: await renderTemplate("password-reset", {
url: `${BASE_URL}/password/reset?token=${data.token}`,
}),
};
}
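Both helpers above differ only in subject line and link path. A sketch of the shared link construction, mirroring the `BASE_URL` fallback and paths from this file (the token value is illustrative):

```typescript
// Builds the links the templates interpolate as {{ url }}.
const BASE_URL = process.env.PUBLIC_URL ?? "https://sexy.pivoine.art";

function verificationUrl(token: string): string {
  return `${BASE_URL}/signup/verify?token=${token}`;
}

function passwordResetUrl(token: string): string {
  return `${BASE_URL}/password/reset?token=${token}`;
}

console.log(verificationUrl("abc123"));
// e.g. https://sexy.pivoine.art/signup/verify?token=abc123
```

If tokens could ever contain URL-unsafe characters, wrapping them in `encodeURIComponent` would be the safer choice.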

packages/email/src/render.ts (new file)

@@ -0,0 +1,42 @@
import { readFile } from "node:fs/promises";
import path from "node:path";
// At runtime (dist/render.js), __dirname is packages/email/dist/
const PKG_ROOT = path.join(__dirname, "..");
const TEMPLATES_ROOT = path.join(PKG_ROOT, "templates");
const CSS_PATH = path.join(PKG_ROOT, "email.css");
const BASE_URL = process.env.PUBLIC_URL ?? "https://sexy.pivoine.art";
export interface RenderOptions {
url: string;
[key: string]: unknown;
}
export async function renderTemplate(name: string, locals: RenderOptions): Promise<string> {
// Dynamic import: @maizzle/framework v6 is ESM-only
const { render } = await import("@maizzle/framework");
const html = await readFile(path.join(TEMPLATES_ROOT, `${name}.html`), "utf8");
const { html: rendered } = await render(html, {
components: {
root: TEMPLATES_ROOT,
folders: ["layouts"],
},
// Override PostCSS `from` so @import "@maizzle/tailwindcss" resolves
// from this package's node_modules (defu gives our value priority).
postcss: {
options: {
from: CSS_PATH,
},
},
locals: {
cssPath: CSS_PATH, // layout uses {{ cssPath }} in <link href="{{ cssPath }}" inline>
baseUrl: BASE_URL,
...locals,
},
});
return rendered;
}

packages/email/templates/layouts/main.html (new file)

@@ -0,0 +1,81 @@
<!doctype html>
<html
lang="en"
xmlns="http://www.w3.org/1999/xhtml"
xmlns:v="urn:schemas-microsoft-com:vml"
xmlns:o="urn:schemas-microsoft-com:office:office"
>
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="x-apple-disable-message-reformatting" />
<!--[if mso]>
<noscript
><xml
><o:OfficeDocumentSettings
><o:PixelsPerInch>96</o:PixelsPerInch></o:OfficeDocumentSettings
></xml
></noscript
>
<![endif]-->
<title>{{ page.title || 'sexy.pivoine.art' }}</title>
<!-- Noto Sans — progressive enhancement for clients that support web fonts -->
<style plain>
@import url("https://fonts.googleapis.com/css2?family=Noto+Sans:wght@400;600;700&display=swap");
</style>
<!-- Design tokens + Tailwind preset — path resolved by render.ts -->
<link rel="stylesheet" href="{{ cssPath }}" inline />
</head>
<body class="bg-background m-0 p-0 font-sans">
<!-- Preview text (hidden) -->
<if condition="page.previewText || previewText">
<div class="hidden max-h-0 overflow-hidden">
{{ page.previewText || previewText }}
&nbsp;&zwnj;&nbsp;&zwnj;&nbsp;&zwnj;&nbsp;&zwnj;&nbsp;&zwnj;&nbsp;&zwnj;
</div>
</if>
<div class="py-8 px-4">
<table
class="w-full max-w-[600px] mx-auto"
role="presentation"
cellpadding="0"
cellspacing="0"
border="0"
>
<!-- Brand header — uses --foreground as dark bg -->
<tr>
<td class="bg-foreground rounded-t-2xl px-8 py-6 text-center">
<a href="{{ baseUrl }}" style="text-decoration: none">
<span class="text-sm font-semibold tracking-[0.22em] uppercase text-background">
sexy<span class="text-primary">.</span>pivoine<span class="text-primary">.</span>art
</span>
</a>
</td>
</tr>
<!-- Card body -->
<tr>
<td class="bg-card px-8 py-10 text-[14px] text-card-foreground leading-relaxed">
<yield />
</td>
</tr>
<!-- Footer -->
<tr>
<td class="bg-muted border-t border-border rounded-b-2xl px-8 py-6 text-center">
<p class="text-[11px] text-muted-foreground m-0">
&copy; {{ new Date().getFullYear() }} sexy.pivoine.art &mdash; For adults only (18+)
</p>
<p class="text-[11px] text-muted-foreground mt-2 mb-0">
If you did not request this email, you can safely ignore it.
</p>
</td>
</tr>
</table>
</div>
</body>
</html>

packages/email/templates/password-reset.html (new file)

@@ -0,0 +1,41 @@
---
title: "Reset your password — sexy.pivoine.art"
previewText: "You requested a password reset. Use the link below to set a new one."
---
<x-main>
<h1 class="text-[22px] font-semibold text-foreground m-0 mb-2">Reset your password</h1>
<p class="text-muted-foreground m-0 mb-6">
We received a request to reset the password for your account. Click the button below to choose a
new one.
</p>
<!-- CTA button — inline style needed for Outlook -->
<table role="presentation" cellpadding="0" cellspacing="0" border="0" class="mb-6">
<tr>
<td class="rounded-lg" style="background: #b700d9">
<a
href="{{ url }}"
class="inline-block px-8 py-[14px] text-[14px] font-semibold text-primary-foreground no-underline rounded-lg"
style="background: #b700d9; color: #faf4fb"
>
Reset my password
</a>
</td>
</tr>
</table>
<p class="text-[13px] text-muted-foreground m-0 mb-6">
This link expires in <strong class="text-foreground">1 hour</strong>. If you did not request a
password reset, no action is needed — your account remains secure.
</p>
<hr class="border-0 border-t border-border my-6" />
<p class="text-[12px] text-muted-foreground m-0">
Button not working? Copy and paste this link into your browser:
</p>
<p class="text-[12px] m-0 mt-1">
<a href="{{ url }}" class="text-primary break-all" style="color: #b700d9"> {{ url }} </a>
</p>
</x-main>

packages/email/templates/verification.html (new file)

@@ -0,0 +1,40 @@
---
title: "Verify your email — sexy.pivoine.art"
previewText: "Almost there — confirm your email address to activate your account."
---
<x-main>
<h1 class="text-[22px] font-semibold text-foreground m-0 mb-2">Verify your email address</h1>
<p class="text-muted-foreground m-0 mb-6">
Thanks for signing up! Click the button below to confirm your email address and activate your
account.
</p>
<!-- CTA button — inline style needed for Outlook -->
<table role="presentation" cellpadding="0" cellspacing="0" border="0" class="mb-6">
<tr>
<td class="rounded-lg" style="background: #b700d9">
<a
href="{{ url }}"
class="inline-block px-8 py-[14px] text-[14px] font-semibold text-primary-foreground no-underline rounded-lg"
style="background: #b700d9; color: #faf4fb"
>
Verify my email
</a>
</td>
</tr>
</table>
<p class="text-[13px] text-muted-foreground m-0 mb-6">
This link expires in <strong class="text-foreground">24 hours</strong>.
</p>
<hr class="border-0 border-t border-border my-6" />
<p class="text-[12px] text-muted-foreground m-0">
Button not working? Copy and paste this link into your browser:
</p>
<p class="text-[12px] m-0 mt-1">
<a href="{{ url }}" class="text-primary break-all" style="color: #b700d9"> {{ url }} </a>
</p>
</x-main>

packages/email/tsconfig.json (new file)

@@ -0,0 +1,14 @@
{
"compilerOptions": {
"target": "ES2022",
"module": "Node16",
"moduleResolution": "Node16",
"lib": ["ES2022"],
"outDir": "dist",
"declaration": true,
"esModuleInterop": true,
"skipLibCheck": true,
"strict": true
},
"include": ["src"]
}


@@ -1,16 +1,16 @@
 {
   "$schema": "https://shadcn-svelte.com/schema.json",
   "tailwind": {
     "css": "src/app.css",
     "baseColor": "slate"
   },
   "aliases": {
     "components": "$lib/components",
     "utils": "$lib/utils",
     "ui": "$lib/components/ui",
     "hooks": "$lib/hooks",
     "lib": "$lib"
   },
   "typescript": true,
   "registry": "https://shadcn-svelte.com/registry"
 }


@@ -1,16 +1,16 @@
 {
   "$schema": "https://unpkg.com/jsrepo@2.4.9/schemas/project-config.json",
   "repos": ["@ieedan/shadcn-svelte-extras"],
   "includeTests": false,
   "includeDocs": false,
   "watermark": true,
   "formatter": "prettier",
   "configFiles": {},
   "paths": {
     "*": "$lib/blocks",
     "ui": "$lib/components/ui",
     "actions": "$lib/actions",
     "hooks": "$lib/hooks",
     "utils": "$lib/utils"
   }
 }


@@ -1,20 +1,21 @@
 {
   "name": "@sexy.pivoine.art/frontend",
   "version": "1.0.0",
+  "author": "valknarogg",
   "type": "module",
   "private": true,
   "scripts": {
     "dev": "vite",
     "build": "vite build",
     "preview": "vite preview",
-    "start": "node ./build"
+    "start": "node ./build",
+    "check": "svelte-check --tsconfig ./tsconfig.json --threshold warning"
   },
   "devDependencies": {
+    "@sexy.pivoine.art/buttplug": "workspace:*",
     "@iconify-json/ri": "^1.2.10",
     "@iconify/tailwind4": "^1.2.1",
-    "@internationalized/date": "^3.11.0",
+    "@internationalized/date": "^3.12.0",
-    "@lucide/svelte": "^0.577.0",
+    "@lucide/svelte": "^0.561.0",
     "@sveltejs/adapter-node": "^5.5.4",
     "@sveltejs/adapter-static": "^3.0.10",
     "@sveltejs/kit": "^2.53.4",
@@ -28,22 +29,22 @@
     "glob": "^13.0.6",
     "mode-watcher": "^1.1.0",
     "prettier-plugin-svelte": "^3.5.1",
-    "super-sitemap": "^1.0.7",
     "svelte": "^5.53.7",
+    "svelte-check": "^4.4.4",
     "svelte-sonner": "^1.0.8",
     "tailwind-merge": "^3.5.0",
     "tailwind-variants": "^3.2.2",
     "tailwindcss": "^4.2.1",
     "tw-animate-css": "^1.4.0",
     "typescript": "^5.9.3",
-    "vite": "^7.3.1",
+    "vite": "^7.3.1"
-    "vite-plugin-wasm": "3.5.0"
   },
   "dependencies": {
-    "@sexy.pivoine.art/buttplug": "workspace:*",
+    "@sexy.pivoine.art/types": "workspace:*",
     "graphql": "^16.11.0",
     "graphql-request": "^7.1.2",
     "javascript-time-ago": "^2.6.4",
+    "marked": "^17.0.4",
     "media-chrome": "^4.18.0",
     "svelte-i18n": "^4.0.1"
   }

View File

@@ -3,85 +3,94 @@
 @plugin "@iconify/tailwind4";
+@utility scrollbar-none {
+  scrollbar-width: none;
+  &::-webkit-scrollbar {
+    display: none;
+  }
+}
 @custom-variant dark (&:where(.dark, .dark *));
+@custom-variant hover (&:hover);
 @theme {
   --animate-vibrate: vibrate 0.3s linear infinite;
   --animate-fade-in: fadeIn 0.3s ease-out;
   --animate-slide-up: slideUp 0.4s cubic-bezier(0.4, 0, 0.2, 1);
   --animate-zoom-in: zoomIn 0.4s cubic-bezier(0.4, 0, 0.2, 1);
   --animate-pulse-glow: pulseGlow 2s infinite;
   @keyframes vibrate {
     0% {
       transform: translate(0);
     }
     20% {
       transform: translate(-2px, 2px);
     }
     40% {
       transform: translate(-2px, -2px);
     }
     60% {
       transform: translate(2px, 2px);
     }
     80% {
       transform: translate(2px, -2px);
     }
     100% {
       transform: translate(0);
     }
   }
   @keyframes fadeIn {
     0% {
       opacity: 0;
     }
     100% {
       opacity: 1;
     }
   }
   @keyframes slideUp {
     0% {
       opacity: 0;
       transform: translateY(30px) scale(0.95);
     }
     100% {
       opacity: 1;
       transform: translateY(0) scale(1);
     }
   }
   @keyframes zoomIn {
     0% {
       opacity: 0;
       transform: scale(0.9);
     }
     100% {
       opacity: 1;
       transform: scale(1);
     }
   }
   @keyframes pulseGlow {
     0%,
     100% {
-      boxShadow: 0 0 20px rgba(183, 0, 217, 0.3);
+      boxshadow: 0 0 20px rgba(183, 0, 217, 0.3);
     }
     50% {
-      boxShadow: 0 0 40px rgba(183, 0, 217, 0.6);
+      boxshadow: 0 0 40px rgba(183, 0, 217, 0.6);
     }
   }
 }
 /*
@@ -93,134 +102,159 @@
   color utility to any element that depends on these defaults.
 */
 @layer base {
   * {
     @supports (color: color-mix(in lab, red, red)) {
       outline-color: color-mix(in oklab, var(--ring) 50%, transparent);
     }
   }
   * {
     border-color: var(--border);
     outline-color: var(--ring);
+    scrollbar-width: thin;
+    scrollbar-color: color-mix(in oklab, var(--primary) 40%, transparent) transparent;
   }
+  *::-webkit-scrollbar {
+    width: 6px;
+    height: 6px;
+  }
+  *::-webkit-scrollbar-track {
+    background: transparent;
+  }
+  *::-webkit-scrollbar-thumb {
+    background-color: color-mix(in oklab, var(--primary) 40%, transparent);
+    border-radius: 9999px;
+  }
+  *::-webkit-scrollbar-thumb:hover {
+    background-color: color-mix(in oklab, var(--primary) 70%, transparent);
+  }
+  html {
+    scrollbar-width: thin;
+    scrollbar-color: color-mix(in oklab, var(--primary) 40%, transparent) transparent;
+  }
   .prose h2 {
     @apply text-2xl font-bold mt-8 mb-4 text-foreground;
   }
   .prose h3 {
     @apply text-xl font-semibold mt-6 mb-3 text-foreground;
   }
   .prose p {
     @apply mb-4 leading-relaxed;
   }
   .prose ul {
     @apply mb-4 pl-6;
   }
   .prose li {
     @apply mb-2;
   }
 }
 :root {
   --default-font-family: "Noto Sans", sans-serif;
   --background: oklch(0.98 0.01 320);
   --foreground: oklch(0.08 0.02 280);
   --muted: oklch(0.95 0.01 280);
   --muted-foreground: oklch(0.4 0.02 280);
   --popover: oklch(1 0 0);
   --popover-foreground: oklch(0.145 0 0);
   --card: oklch(0.99 0.005 320);
   --card-foreground: oklch(0.08 0.02 280);
   --border: oklch(0.85 0.02 280);
   --input: oklch(0.922 0 0);
   --primary: oklch(56.971% 0.27455 319.257);
   --primary-foreground: oklch(0.98 0.01 320);
   --secondary: oklch(0.92 0.02 260);
   --secondary-foreground: oklch(0.15 0.05 260);
   --accent: oklch(0.45 0.35 280);
   --accent-foreground: oklch(0.98 0.01 280);
   --destructive: oklch(0.577 0.245 27.325);
   --destructive-foreground: oklch(0.985 0 0);
   --ring: oklch(0.55 0.3 320);
   --sidebar: oklch(0.985 0 0);
   --sidebar-foreground: oklch(0.145 0 0);
   --sidebar-primary: oklch(0.205 0 0);
   --sidebar-primary-foreground: oklch(0.985 0 0);
   --sidebar-accent: oklch(0.97 0 0);
   --sidebar-accent-foreground: oklch(0.205 0 0);
   --sidebar-border: oklch(0.922 0 0);
   --sidebar-ring: oklch(0.708 0 0);
 }
 .dark {
   --background: oklch(0.08 0.02 280);
   --foreground: oklch(0.98 0.01 280);
   --muted: oklch(0.12 0.03 280);
   --muted-foreground: oklch(0.6 0.02 280);
   --popover: oklch(0.205 0 0);
   --popover-foreground: oklch(0.985 0 0);
   --card: oklch(0.1 0.02 280);
   --card-foreground: oklch(0.95 0.01 280);
   --border: oklch(0.2 0.05 280);
   --input: oklch(1 0 0 / 0.15);
-  --primary: oklch(0.65 0.25 320);
+  --primary: oklch(65.054% 0.25033 319.934);
   --primary-foreground: oklch(0.98 0.01 320);
   --secondary: oklch(0.15 0.05 260);
   --secondary-foreground: oklch(0.9 0.02 260);
   --accent: oklch(0.55 0.3 280);
   --accent-foreground: oklch(0.98 0.01 280);
   --destructive: oklch(0.704 0.191 22.216);
   --destructive-foreground: oklch(0.985 0 0);
   --ring: oklch(0.65 0.25 320);
   --sidebar: oklch(0.205 0 0);
   --sidebar-foreground: oklch(0.985 0 0);
   --sidebar-primary: oklch(0.488 0.243 264.376);
   --sidebar-primary-foreground: oklch(0.985 0 0);
   --sidebar-accent: oklch(0.269 0 0);
   --sidebar-accent-foreground: oklch(0.985 0 0);
   --sidebar-border: oklch(1 0 0 / 0.1);
   --sidebar-ring: oklch(0.556 0 0);
 }
 @theme inline {
   --color-background: var(--background);
   --color-foreground: var(--foreground);
   --color-card: var(--card);
   --color-card-foreground: var(--card-foreground);
   --color-popover: var(--popover);
   --color-popover-foreground: var(--popover-foreground);
   --color-primary: var(--primary);
   --color-primary-foreground: var(--primary-foreground);
   --color-secondary: var(--secondary);
   --color-secondary-foreground: var(--secondary-foreground);
   --color-muted: var(--muted);
   --color-muted-foreground: var(--muted-foreground);
   --color-accent: var(--accent);
   --color-accent-foreground: var(--accent-foreground);
   --color-destructive: var(--destructive);
   --color-destructive-foreground: var(--destructive-foreground);
   --color-border: var(--border);
   --color-input: var(--input);
   --color-ring: var(--ring);
   --color-chart-1: var(--chart-1);
   --color-chart-2: var(--chart-2);
   --color-chart-3: var(--chart-3);
   --color-chart-4: var(--chart-4);
   --color-chart-5: var(--chart-5);
   --color-sidebar: var(--sidebar);
   --color-sidebar-foreground: var(--sidebar-foreground);
   --color-sidebar-primary: var(--sidebar-primary);
   --color-sidebar-primary-foreground: var(--sidebar-primary-foreground);
   --color-sidebar-accent: var(--sidebar-accent);
   --color-sidebar-accent-foreground: var(--sidebar-accent-foreground);
   --color-sidebar-border: var(--sidebar-border);
   --color-sidebar-ring: var(--sidebar-ring);
   --font-sans: var(--font-sans);
   --font-mono: var(--font-mono);
   --font-serif: var(--font-serif);
 }

View File

@@ -4,22 +4,22 @@ import type { AuthStatus } from "$lib/types";
 // for information about these interfaces
 declare global {
   namespace App {
     // interface Error {}
     interface Locals {
       authStatus: AuthStatus;
       requestId: string;
     }
     // interface PageData {}
     // interface PageState {}
     // interface Platform {}
   }
   interface Window {
     sidebar: {
       addPanel: () => void;
     };
     opera: object;
   }
 }
 export {};

View File

@@ -1,24 +1,23 @@
 <!doctype html>
 <html lang="en">
   <head>
     <meta charset="utf-8" />
     <meta name="viewport" content="width=device-width, initial-scale=1" />
     <link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png" />
     <link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png" />
     <link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png" />
-    <link rel="preconnect" href="https://fonts.googleapis.com">
-    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
-    <link href="https://fonts.googleapis.com/css2?family=Dancing+Script:wght@400..700&family=Noto+Sans:ital,wght@0,100..900;1,100..900&display=swap" rel="stylesheet">
+    <link rel="preconnect" href="https://fonts.googleapis.com" />
+    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
+    <link
+      href="https://fonts.googleapis.com/css2?family=Noto+Sans:ital,wght@0,100..900;1,100..900&display=swap"
+      rel="stylesheet"
+    />
     <link rel="manifest" href="/site.webmanifest" />
     %sveltekit.head%
   </head>
   <body data-sveltekit-preload-data="hover" class="dark">
    <div style="display: contents">%sveltekit.body%</div>
   </body>
 </html>

View File

@@ -1,97 +1,98 @@
-import { isAuthenticated } from "$lib/services";
+import { redirect } from "@sveltejs/kit";
+import { isAuthenticated, UnauthorizedError } from "$lib/services";
 import { logger, generateRequestId } from "$lib/logger";
 import type { Handle } from "@sveltejs/kit";
-// Log startup info once
-let hasLoggedStartup = false;
-if (!hasLoggedStartup) {
-  logger.startup();
-  hasLoggedStartup = true;
-}
+// Log startup info once (module-level code runs exactly once on import)
+logger.startup();
 export const handle: Handle = async ({ event, resolve }) => {
   const { cookies, locals, url, request } = event;
   const startTime = Date.now();
   // Generate unique request ID
   const requestId = generateRequestId();
   // Add request ID to locals for access in other handlers
   locals.requestId = requestId;
   // Log incoming request
   logger.request(request.method, url.pathname, {
     requestId,
     context: {
-      userAgent: request.headers.get('user-agent')?.substring(0, 100),
-      referer: request.headers.get('referer'),
-      ip: request.headers.get('x-forwarded-for') || request.headers.get('x-real-ip'),
+      userAgent: request.headers.get("user-agent")?.substring(0, 100),
+      referer: request.headers.get("referer"),
+      ip: request.headers.get("x-forwarded-for") || request.headers.get("x-real-ip"),
     },
   });
   // Handle authentication
   const token = cookies.get("session_token");
   if (token) {
     try {
       locals.authStatus = await isAuthenticated(token);
       if (locals.authStatus.authenticated) {
-        logger.auth('Token validated', true, {
+        logger.auth("Token validated", true, {
           requestId,
           userId: locals.authStatus.user?.id,
           context: {
             email: locals.authStatus.user?.email,
             role: locals.authStatus.user?.role,
           },
         });
       } else {
-        logger.auth('Token invalid', false, { requestId });
+        logger.auth("Token invalid", false, { requestId });
       }
     } catch (error) {
-      logger.error('Authentication check failed', {
+      logger.error("Authentication check failed", {
        requestId,
        error: error instanceof Error ? error : new Error(String(error)),
       });
       locals.authStatus = { authenticated: false };
     }
   } else {
-    logger.debug('No session token found', { requestId });
+    logger.debug("No session token found", { requestId });
    locals.authStatus = { authenticated: false };
   }
   // Resolve the request
   let response: Response;
   try {
     response = await resolve(event, {
       filterSerializedResponseHeaders: (key) => {
         return key.toLowerCase() === "content-type";
       },
     });
   } catch (error) {
+    if (error instanceof UnauthorizedError) {
+      const loginUrl = `/login?redirect=${encodeURIComponent(url.pathname)}`;
+      throw redirect(303, loginUrl);
+    }
     const duration = Date.now() - startTime;
-    logger.error('Request handler error', {
+    logger.error("Request handler error", {
       requestId,
       method: request.method,
       path: url.pathname,
       duration,
       error: error instanceof Error ? error : new Error(String(error)),
     });
     throw error;
   }
   // Log response
   const duration = Date.now() - startTime;
   logger.response(request.method, url.pathname, response.status, duration, {
     requestId,
     userId: locals.authStatus.authenticated ? locals.authStatus.user?.id : undefined,
     context: {
-      cached: response.headers.get('x-sveltekit-page') === 'true',
+      cached: response.headers.get("x-sveltekit-page") === "true",
     },
   });
   // Add request ID to response headers (useful for debugging)
-  response.headers.set('x-request-id', requestId);
+  response.headers.set("x-request-id", requestId);
   return response;
 };

View File

@@ -11,7 +11,7 @@ export const getGraphQLClient = (fetchFn?: typeof globalThis.fetch) =>
   });
 export const getAssetUrl = (
-  id: string,
+  id: string | null | undefined,
   transform?: "mini" | "thumbnail" | "preview" | "medium" | "banner",
 ) => {
   if (!id) {
Some files were not shown because too many files have changed in this diff.