Compare commits

76 Commits

Author SHA1 Message Date
ac63e59906 style: remove card wrapper from error page
Some checks failed
Build and Push Backend Image / build (push) Failing after 27s
Build and Push Frontend Image / build (push) Successful in 5m7s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:28:11 +01:00
19d29cbfc6 fix: replace flyout profile card with logout slider, i18n auth errors
- Replace static account card in mobile flyout with swipe-to-logout widget
- Remove redundant logout button from flyout bottom
- Make LogoutButton full-width via class prop and dynamic maxSlide
- Extract clean GraphQL error messages instead of raw JSON in all auth forms
- Add i18n keys for known backend errors (invalid credentials, email taken, invalid token)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:26:14 +01:00
0ec27117ae style: streamline /me page header to match admin dashboard style
Replace large gradient title with simple text-2xl font-bold heading,
matching the header pattern used across admin pages.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:13:19 +01:00
ed9eb6ef22 style: fix admin table padding — edge-to-edge on mobile, no right pad on desktop
Mobile: remove horizontal padding so tables fill full width with top/bottom
borders only. Desktop: keep left padding, table extends to right edge.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:08:16 +01:00
609f116b5d feat: replace native date inputs with shadcn date picker
Add calendar + popover components and a custom DateTimePicker wrapper.
Video forms use date-only; article forms include a time picker.
Also add video player preview to the video edit form.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 17:03:35 +01:00
e943876e70 fix: prevent age verification dialog flicker on page load
Initialize isOpen as false and only open in onMount if not yet verified,
instead of opening immediately and closing after localStorage check.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 16:52:48 +01:00
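The start-closed pattern described in this commit can be sketched as a pure check (identifiers like `AGE_KEY` are illustrative assumptions, not the project's actual names):

```typescript
// Flicker-free gating sketch: the dialog's isOpen starts false and is only
// flipped on in onMount, after the client-side localStorage check — instead
// of rendering open by default and closing once the check completes.
const AGE_KEY = "age_verified"; // assumed key name

function shouldOpenOnMount(storage: Pick<Storage, "getItem">): boolean {
  // Called from onMount, i.e. client-only, so localStorage is safe to read.
  return storage.getItem(AGE_KEY) !== "true";
}
```

Because the initial server-rendered and first client frame both show the dialog closed, already-verified visitors never see it flash.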
7d373b3aa3 i18n: internationalize all admin pages
Add full i18n coverage for the admin section — locale keys, layout nav,
users, videos, and articles pages (list, new, edit).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 16:49:30 +01:00
95fd9f48fc refactor: align article author with VideoModel, streamline selects, fix flyout inert
- Remove ArticleAuthor type; article.author now reuses VideoModel (id, artist_name, slug, avatar)
- updateArticle accepts authorId; author selectable in admin article edit page
- Article edit: single Select with bind:value + $derived selectedAuthor display
- Video edit: replace pill toggles with Select type="multiple" bind:value for models
- Video table: replace inline badge spans with Badge component
- Magazine: display artist_name throughout, author bio links to model profile
- Fix flyout aria-hidden warning: replace with inert attribute

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 16:31:41 +01:00
670c18bcb7 feat: refactor role system to is_admin flag, add Badge component, fix native dialogs
- Separate admin identity from role: viewer|model + is_admin boolean flag
- DB migration 0001_is_admin: adds column, migrates former admin role users
- Update ACL helpers, auth session, GraphQL types and all resolvers
- Admin layout guard and header links check is_admin instead of role
- Admin users table: show Admin badge next to name, remove admin from role select
- Admin user edit page: is_admin checkbox toggle
- Install shadcn Badge component; use in admin users table
- Fix duplicate photo keys in adminGetUser resolver
- Replace confirm() in /me recordings with Dialog component

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 16:14:00 +01:00
9ef490c1e5 fix: make deleteRecording a hard delete instead of soft archive
Previously deleteRecording set status to "archived", leaving the row
in the DB and visible in queries without a status filter. Now it hard-
deletes the row. Also excludes archived recordings from the default
recordings query so any pre-existing archived rows no longer appear.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:45:59 +01:00
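The second half of this fix — excluding pre-existing archived rows from the default query — amounts to a status filter. A minimal sketch (types are illustrative, not the project's actual schema):

```typescript
// Rows soft-archived before deleteRecording became a hard delete must not
// surface in the default recordings listing.
type Recording = { id: string; status: "draft" | "published" | "archived" };

function visibleRecordings(rows: Recording[]): Recording[] {
  return rows.filter((r) => r.status !== "archived");
}
```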
434e926f77 style: use primary color for scrollbar thumbs
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m0s
40% opacity at rest, 70% on hover, adapts to light/dark theme.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:42:15 +01:00
7a9ce0c3b1 fix: explicitly style html root scrollbar for Firefox
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:39:44 +01:00
ff1e1f6679 feat: style scrollbars globally using theme colors
Thin scrollbars using --border for thumb and transparent track,
with --muted-foreground on hover. Uses both scrollbar-color (Firefox)
and ::-webkit-scrollbar (Chrome/Safari).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:38:02 +01:00
648123fab5 feat: mobile-optimize admin section
- Layout: sidebar hidden on mobile, replaced with horizontal top nav strip
- Tables: overflow-x-auto + hide secondary columns (email/category/dates/
  plays/likes) on small screens; show email inline under name on mobile
- Forms: grid-cols-2 → grid-cols-1 sm:grid-cols-2 on all admin forms
- Markdown editor: Write/Preview tab toggle on mobile, side-by-side on sm+
- Padding: p-3 sm:p-6 on all admin pages for tighter mobile layout

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 14:36:52 +01:00
a7fafaf7c5 refactor: replace native select with shadcn Select for user role in admin
All checks were successful
Build and Push Backend Image / build (push) Successful in 1m10s
Build and Push Frontend Image / build (push) Successful in 5m8s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:34:39 +01:00
b71d7dc559 refactor: remove duplicate utils/utils.ts, consolidate into utils.ts
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:31:35 +01:00
f764e27d59 fix: shrink flyout account card, remove online indicator
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:29:33 +01:00
d7eb2acc6c fix: match mobile flyout header height to main header (h-16)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:27:31 +01:00
fb38d6b9a9 fix: constrain admin layout to container width matching rest of site
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:22:15 +01:00
d021acaf0b feat: add admin user edit page with avatar, banner, and photo gallery
- Backend: adminGetUser query returns user + photos; adminUpdateUser now
  accepts avatarId/bannerId; new adminAddUserPhoto and adminRemoveUserPhoto
  mutations; AdminUserDetailType added to GraphQL schema
- Frontend: /admin/users/[id] page for editing name, avatar, banner, and
  managing the model photo gallery (upload multiple, delete individually)
- Admin users list: edit button per row linking to the detail page

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:18:43 +01:00
e06a1915f2 fix: remove backdrop-blur overlay causing blurry text site-wide
The full-screen glassmorphism overlay had backdrop-blur-[0.5px] which
triggered GPU compositing on the entire viewport, degrading subpixel
text rendering inconsistently. Also use globalThis.fetch (not SvelteKit
fetch) when forwarding session token in admin SSR calls to avoid header
stripping.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 13:11:54 +01:00
ebab3405b1 fix: forward session token in admin SSR load functions
Admin list queries (users, videos, articles) were using getGraphQLClient
without auth credentials, causing silent 403s on server-side loads. Now
extract session_token cookie and pass it to getAuthClient so the backend
sees the admin session on SSR requests.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 12:56:47 +01:00
ad7ceee5f8 fix: resolve lint errors from ACL/admin implementation
- Remove unused requireOwnerOrAdmin import from videos.ts
- Remove unused requireAuth import from users.ts
- Remove unused GraphQLError import from articles.ts
- Replace URLSearchParams with SvelteURLSearchParams in admin users page
- Apply prettier formatting to all changed files

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 12:35:11 +01:00
c1770ab9c9 feat: role-based ACL + admin management UI
Backend:
- Add acl.ts with requireAuth/requireRole/requireOwnerOrAdmin helpers
- Gate premium videos from unauthenticated users in videos query/resolver
- Fix updateVideoPlay to verify ownership before updating
- Add admin mutations: adminListUsers, adminUpdateUser, adminDeleteUser
- Add admin mutations: createVideo, updateVideo, deleteVideo, setVideoModels, adminListVideos
- Add admin mutations: createArticle, updateArticle, deleteArticle, adminListArticles
- Add deleteComment mutation (owner or admin only)
- Add AdminUserListType to GraphQL types
- Fix featured filter on articles query

Frontend:
- Install marked for markdown rendering
- Add /admin/* section with sidebar layout and admin-only guard
- Admin users page: paginated table with search, role filter, inline role change, delete
- Admin videos pages: list, create form, edit form with file upload and model assignment
- Admin articles pages: list, create form, edit form with split-pane markdown editor
- Add admin nav link in header (desktop + mobile) for admin users
- Render article content through marked in magazine detail page
- Add all admin GraphQL service functions to services.ts

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-06 12:31:33 +01:00
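The acl.ts helpers named above can be sketched roughly as follows — the real signatures are not shown in this log, so the types and names here are assumptions for illustration (note also that a later commit in this range replaces the admin role with an is_admin flag):

```typescript
// Hedged sketch of requireAuth / requireRole / requireOwnerOrAdmin.
class ForbiddenError extends Error {}

type Role = "viewer" | "model" | "admin";
interface Ctx { user?: { id: string; role: Role } }

function requireAuth(ctx: Ctx) {
  if (!ctx.user) throw new ForbiddenError("unauthenticated");
  return ctx.user;
}

function requireRole(ctx: Ctx, role: Role) {
  const user = requireAuth(ctx);
  if (user.role !== role) throw new ForbiddenError("forbidden");
  return user;
}

function requireOwnerOrAdmin(ctx: Ctx, ownerId: string) {
  // Owners may touch their own rows; admins may touch anything.
  const user = requireAuth(ctx);
  if (user.id !== ownerId && user.role !== "admin") throw new ForbiddenError("forbidden");
  return user;
}
```

Resolvers such as updateVideoPlay would call `requireOwnerOrAdmin(ctx, video.ownerId)` before mutating, which is the ownership check this commit adds.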
b200498a10 fix: correct CI badge URLs to use Gitea workflow badge format
All checks were successful
Build and Push Backend Image / build (push) Successful in 16s
Build and Push Frontend Image / build (push) Successful in 15s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 12:57:47 +01:00
1369d5c228 fix: copy packages/types into backend Docker build
All checks were successful
Build and Push Backend Image / build (push) Successful in 46s
Build and Push Frontend Image / build (push) Successful in 16s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 12:51:52 +01:00
e200514347 refactor: move flyout to left side, restore logo, remove close button
Some checks failed
Build and Push Backend Image / build (push) Failing after 47s
Build and Push Frontend Image / build (push) Successful in 5m13s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 12:34:57 +01:00
d7057c3681 refactor: improve mobile flyout header
- Replace inline mobile dropdown with sliding flyout panel from right
- Hide burger menu on lg breakpoint, desktop auth buttons use hidden lg:flex
- Add backdrop overlay with opacity transition
- Remove logo from flyout panel header
- Fix backdrop div accessibility with role="presentation"

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 11:44:23 +01:00
d820a8f6be chore: relative uploads dir 2026-03-05 11:05:30 +01:00
9bef2469d1 refactor: rename RecordedEvent fields to snake_case
deviceIndex → device_index
deviceName → device_name
actuatorIndex → actuator_index
actuatorType → actuator_type

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 11:04:36 +01:00
97269788ee feat: add shared @sexy.pivoine.art/types package and fix type safety across frontend/backend
- Create packages/types with shared TypeScript domain model interfaces (User, Video, Model, Article, Comment, Recording, etc.)
- Wire both frontend and backend packages to use @sexy.pivoine.art/types via workspace:*
- Update backend Pothos objectRef types to use shared interfaces instead of inline types
- Update frontend $lib/types.ts to re-export from shared package
- Fix all type errors introduced by more accurate nullable types (avatar/banner as string|null UUIDs, author nullable, events/device_info as object[])
- Add artist_name to comment user select in backend resolver
- Widen utility function signatures (getAssetUrl, getUserInitials, calcReadingTime) to accept null/undefined

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 11:01:11 +01:00
c6126c13e9 feat: add backend logger matching frontend text format
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 10:22:49 +01:00
fd4050a49f refactor: remove directus.ts shim, import directly from api
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 10:19:05 +01:00
efc7624ba3 style: apply prettier formatting to all files
All checks were successful
Build and Push Backend Image / build (push) Successful in 46s
Build and Push Frontend Image / build (push) Successful in 5m12s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 22:27:54 +01:00
18116072c9 feat: add formidable ESLint + Prettier linting setup
Some checks failed
Build and Push Backend Image / build (push) Successful in 47s
Build and Push Frontend Image / build (push) Has been cancelled
- Root-level eslint.config.js (flat config): typescript-eslint,
  eslint-plugin-svelte, eslint-config-prettier, @eslint/js
- Root-level prettier.config.js with prettier-plugin-svelte
- svelte-check added to frontend for Svelte/TS type checking
- lint, lint:fix, format, format:check, check scripts in root
  and both packages
- All 60 lint errors fixed across backend and frontend:
  - Consistent type imports
  - Removed unused imports/variables
  - Added keys to all {#each} blocks for Svelte performance
  - Replaced mutable Set/Map with SvelteSet/SvelteMap
  - Fixed useless assignments and empty catch blocks
- 64 remaining warnings are intentional any usages in the
  Pothos/Drizzle GraphQL resolver layer

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 22:24:55 +01:00
741e0c3387 docs: update README for custom backend replacing Directus
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 17s
Replace all Directus references with the new Fastify + GraphQL Yoga
stack, update CI/CD references to dev.pivoine.art Gitea Actions,
add DB schema overview, auth flow, image transform presets table,
and fix all links to use https and correct registry URLs.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 22:01:46 +01:00
662e3e8fe2 fix: model join date used join_date but API returns date_created
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m13s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:42:34 +01:00
fa159feffa fix: remove black border below video controls
Some checks failed
Build and Push Backend Image / build (push) Successful in 16s
Build and Push Frontend Image / build (push) Has been cancelled
- video: inline → block w-full (eliminates baseline descender gap)
- media-controller: fill parent container with absolute inset-0 w-full h-full

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:41:07 +01:00
124f0bfb22 fix: video src used movie.id but movie is already the UUID string
All checks were successful
Build and Push Backend Image / build (push) Successful in 17s
Build and Push Frontend Image / build (push) Successful in 4m18s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:27:06 +01:00
df89cc59f5 fix: use preview transform for home page video teasers
Some checks failed
Build and Push Backend Image / build (push) Has been cancelled
Build and Push Frontend Image / build (push) Has been cancelled
thumbnail (300x300 square) was double-cropping inside the wide h-48
container. preview (800px, aspect-ratio preserved) lets object-cover
do the only crop, matching the videos and model pages.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:24:59 +01:00
845e3df223 fix: image transforms — preserve aspect ratio, increase quality
Some checks failed
Build and Push Backend Image / build (push) Successful in 40s
Build and Push Frontend Image / build (push) Has been cancelled
- preview/medium use fit:inside (no forced crop, preserves aspect ratio)
- Only mini/thumbnail/banner force square/fixed crops
- Increase WebP quality 85 → 92
- Increase preview width 480 → 800, medium 960 → 1400

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 21:22:30 +01:00
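After this change, the preset table can be reconstructed as data (preset names and dimensions are taken from the commit messages; the option shape mirrors sharp's resize()/webp() API but is a sketch, not the project's code):

```typescript
// Image transform presets after the aspect-ratio/quality fix.
type Fit = "cover" | "inside";
interface Preset { width: number; height?: number; fit: Fit }

const PRESETS: Record<string, Preset> = {
  mini:      { width: 64,   height: 64,  fit: "cover"  }, // forced square
  thumbnail: { width: 200,  height: 200, fit: "cover"  }, // forced square
  preview:   { width: 800,  fit: "inside" },              // aspect preserved (was 480)
  medium:    { width: 1400, fit: "inside" },              // aspect preserved (was 960)
  banner:    { width: 1280, fit: "cover"  },              // fixed crop (height not stated in the log)
};

const WEBP_QUALITY = 92; // raised from 85

// With sharp installed this would be applied roughly as:
//   sharp(input).resize(PRESETS[key]).webp({ quality: WEBP_QUALITY })
```

`fit: "inside"` asks sharp to scale the image to fit within the box without cropping, which is why preview/medium no longer double-crop inside wide containers.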
05cb6a66e3 fix: image transforms via Sharp, model photos crash, video duration
All checks were successful
Build and Push Backend Image / build (push) Successful in 46s
Build and Push Frontend Image / build (push) Successful in 5m7s
- Backend: add Sharp image transform endpoint (/assets/:id?transform=X)
  with presets: mini(64), thumbnail(200), preview(480), medium(960), banner(1280)
  Transformed images are cached as webp next to originals
- Frontend: fix model photos crash (p.directus_files_id → p)
- Frontend: fix model banner URL (data.model.banner.id → data.model.banner)
- Frontend: fix video duration display (video.movie.duration → video.movie_file?.duration)
  across models/[slug], videos, videos/[slug], and home pages

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:56:33 +01:00
273aa42510 fix: serve assets via DB lookup to resolve file path correctly
All checks were successful
Build and Push Backend Image / build (push) Successful in 38s
Build and Push Frontend Image / build (push) Successful in 4m11s
Files are stored as <UPLOAD_DIR>/<id>/<filename>. The previous static
serving attempted to serve <UPLOAD_DIR>/<id> (a directory) which failed.
Custom /assets/:id route now looks up filename from DB and uses sendFile.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:40:22 +01:00
1e930baccb fix: resolve GraphQL request hang in Fastify integration
All checks were successful
Build and Push Backend Image / build (push) Successful in 39s
Build and Push Frontend Image / build (push) Successful in 4m7s
- Pass FastifyRequest/FastifyReply directly to yoga.handleNodeRequestAndResponse
  per the official graphql-yoga Fastify integration docs. Yoga v5 uses req.body
  (already parsed by Fastify) when available, avoiding the dead raw stream issue.
- Add proper TypeScript generics for server context including db and redis
- Wrap sendVerification/sendPasswordReset in try/catch so missing SMTP
  does not crash register/requestPasswordReset mutations
- Fix migrate.ts path resolution to work with both tsx (src/) and compiled (dist/)
- Expose postgres:5432 and redis:6379 ports in compose.yml for local dev

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:31:18 +01:00
012bb176d9 fix: convert Web API ReadableStream to Node.js Readable for Fastify
Some checks failed
Build and Push Backend Image / build (push) Failing after 26s
Build and Push Frontend Image / build (push) Successful in 4m17s
graphql-yoga's handleNodeRequestAndResponse returns a Response with a
Web API ReadableStream body. Fastify's reply.send() requires a Node.js
Readable stream, causing all GraphQL requests to hang indefinitely.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:11:08 +01:00
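The conversion this interim fix needed is exactly what Node's stdlib provides via `stream.Readable.fromWeb` (Node ≥ 17) — a sketch, with a small helper to drain the result (the commit above this one later superseded the approach by handing the body to Fastify differently):

```typescript
import { Readable } from "node:stream";

// graphql-yoga returns a WHATWG Response whose body is a Web-API
// ReadableStream; Fastify's reply.send() wants a Node.js Readable.
function toNodeStream(body: ReadableStream<Uint8Array>): Readable {
  return Readable.fromWeb(body as unknown as import("node:stream/web").ReadableStream);
}

// Drain a Node Readable into a string (for demonstration).
async function collect(stream: Readable): Promise<string> {
  const chunks: Buffer[] = [];
  for await (const chunk of stream) chunks.push(Buffer.from(chunk));
  return Buffer.concat(chunks).toString("utf8");
}
```

Without the conversion, `reply.send()` receives an object it cannot pipe, so the HTTP response is never written and every GraphQL request hangs.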
ed7ac0c573 fix: downgrade nanoid to v3 for CommonJS compatibility
All checks were successful
Build and Push Backend Image / build (push) Successful in 2m28s
Build and Push Frontend Image / build (push) Successful in 5m12s
nanoid v5 is ESM-only and cannot be require()'d in a CommonJS module.
v3 is the last version with native CJS support.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:43:44 +01:00
4565038be3 fix: cast recording duration float to integer in data migration
Some checks failed
Build and Push Backend Image / build (push) Successful in 39s
Build and Push Frontend Image / build (push) Has been cancelled
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:39:52 +01:00
fbafbeca5d fix: pass tags as native arrays not JSON strings in data migration
Some checks failed
Build and Push Backend Image / build (push) Successful in 38s
Build and Push Frontend Image / build (push) Has been cancelled
PostgreSQL text[] columns require native array values, not JSON strings.
Parse string tags from Directus and pass as JS arrays directly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:36:04 +01:00
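The parse step described above can be sketched as a small helper (the function name is an assumption; the pg/Drizzle drivers serialize a native JS array into PostgreSQL text[] syntax, which a JSON string would not satisfy):

```typescript
// Normalize Directus tag values — stored as JSON strings like '["a","b"]' —
// into native JS arrays before inserting into a text[] column.
function normalizeTags(raw: string | string[] | null): string[] {
  if (raw == null) return [];
  if (Array.isArray(raw)) return raw.map(String);
  try {
    const parsed = JSON.parse(raw);
    return Array.isArray(parsed) ? parsed.map(String) : [];
  } catch {
    return []; // malformed value: migrate as empty tag list
  }
}
```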
480369aa4e fix: correct column names in data migration script to match actual Directus schema
Some checks failed
Build and Push Backend Image / build (push) Successful in 37s
Build and Push Frontend Image / build (push) Has been cancelled
- directus_files: uploaded_on → date_created alias
- directus_users: join_date → date_created, remove email_notifications_key
- junction_directus_users_files: remove non-existent sort column
- sexy_videos: remove non-existent likes_count/plays_count (default 0)
- sexy_recordings: remove non-existent featured column (schema has default false)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:33:33 +01:00
ceb57ec1c4 fix: copy root node_modules to runner stage in backend Dockerfile
All checks were successful
Build and Push Backend Image / build (push) Successful in 30s
Build and Push Frontend Image / build (push) Successful in 15s
pnpm hoists workspace dependencies to the root node_modules.
Without copying it, modules like pg are not found at runtime.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:29:36 +01:00
4f8271217c fix: set CI=true for pnpm install in backend Dockerfile
All checks were successful
Build and Push Backend Image / build (push) Successful in 1m4s
Build and Push Frontend Image / build (push) Successful in 15s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:26:25 +01:00
046689e363 fix: set CI=true for pnpm install -rP in frontend Dockerfile
Some checks failed
Build and Push Backend Image / build (push) Failing after 28s
Build and Push Frontend Image / build (push) Successful in 4m50s
pnpm requires CI=true to allow non-interactive removal of node_modules
in CI environments without a TTY.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:19:45 +01:00
9ba71239b7 fix: use correct 'file' parameter in docker/build-push-action (not 'dockerfile')
Some checks failed
Build and Push Frontend Image / build (push) Failing after 4m12s
Build and Push Backend Image / build (push) Failing after 25s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:09:58 +01:00
757bbe9e3b fix: skip buttplug Rust build in backend Dockerfile via --ignore-scripts
Some checks failed
Build and Push Backend Image / build (push) Has been cancelled
Build and Push Frontend Image / build (push) Has been cancelled
Copy all workspace package.json files so pnpm can resolve the lockfile,
but install with --ignore-scripts to prevent buttplug's Rust/WASM build
from running. Only explicitly rebuild argon2 native bindings.
Also restore the missing migrations COPY line.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:08:48 +01:00
73f7a4f2f0 ci: replace combined workflow with separate frontend and backend workflow files
Some checks failed
Build and Push Frontend Image / build (push) Has been cancelled
Build and Push Backend Image / build (push) Has been cancelled
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 19:05:54 +01:00
3bd8d95576 ci: disable registry cache for backend build to fix poisoned buildcache
All checks were successful
Build and Push Docker Image to Gitea / build-frontend (push) Successful in 15s
Build and Push Docker Image to Gitea / build-backend (push) Successful in 4m44s
The backend buildcache was contaminated with frontend image layers, causing
the backend image to be built with the wrong content. Using no-cache forces
a fresh build until the cache can be reliably separated.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:57:39 +01:00
14e816241d ci: split frontend and backend into separate jobs to fix image tag mix-up
All checks were successful
Build and Push Docker Image to Gitea / build-frontend (push) Successful in 17s
Build and Push Docker Image to Gitea / build-backend (push) Successful in 16s
Both builds ran in the same job and shared the same docker buildx instance,
causing the backend tag to be pushed with the frontend image's content.
Separate jobs get isolated buildx instances and separate build caches.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:53:16 +01:00
4102f9990c fix: switch backend to CommonJS, generate Drizzle migrations, add migrate script
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 4m23s
- Remove "type": "module" and switch tsconfig to CommonJS/Node resolution
  to fix drizzle-kit ESM/CJS incompatibility
- Strip .js extensions from all backend TypeScript imports
- Fix gamification resolver: combine two .where() calls using and()
- Fix index.ts: wrap top-level awaits in async main(), fix Fastify+yoga
  request handling via handleNodeRequestAndResponse
- Generate initial Drizzle SQL migration (0000_pale_hellion.sql) for all
  15 tables
- Add src/scripts/migrate.ts: programmatic Drizzle migrator for production
- Copy migrations folder into Docker image (Dockerfile.backend)
- Add schema:migrate npm script

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:42:58 +01:00
2565e6c28b fix: resolve pnpm frozen-lockfile error and argon2 native build
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m19s
- Run pnpm install to update lockfile with packages/backend dependencies
- Add argon2 to root onlyBuiltDependencies (pnpm-workspace.yaml + package.json)
- Add explicit `pnpm rebuild argon2` in Dockerfile.backend to ensure native
  bindings compile regardless of pnpm v10 build approval state
- Remove pnpm.onlyBuiltDependencies from packages/backend/package.json
  (ineffective in workspace packages, warned by pnpm)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:19:52 +01:00
493ddd7e78 chore: remove packages/bundle (Directus extension)
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 1m26s
All custom logic (endpoints, hooks, gamification) has been ported to
packages/backend. The Directus bundle is no longer needed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:11:22 +01:00
33dd076a50 ci: add backend image build to docker workflow
Adds a second build+push step for the backend image (valknar/sexy-backend)
using Dockerfile.backend. Both images share the same tag strategy and
separate build caches. Summary step updated to show both images.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:10:06 +01:00
9d7afbe1b5 feat: replace Directus with custom Node.js GraphQL backend
Removes Directus 11 and replaces it with a lean, purpose-built backend:
- packages/backend/: Fastify v5 + GraphQL Yoga v5 + Pothos (code-first)
  with Drizzle ORM, Redis sessions (session_token cookie), argon2 auth,
  Nodemailer, fluent-ffmpeg, and full gamification system ported from bundle
- Frontend: @directus/sdk replaced by graphql-request v7; services.ts fully
  rewritten with identical signatures; directus.ts now re-exports from api.ts
- Cookie renamed directus_session_token → session_token
- Dev proxy target updated 8055 → 4000
- compose.yml: Directus service removed, backend service added (port 4000)
- Dockerfile.backend: new multi-stage image with ffmpeg
- Dockerfile: bundle build step and ffmpeg removed from frontend image
- data-migration.ts: one-time script to migrate all Directus/sexy_ tables

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 18:07:18 +01:00
de16b64255 fix: use env object from $env/dynamic/public instead of named imports
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m13s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 17:19:26 +01:00
1e69d0b158 fix: create env placeholder file inline in Dockerfile for dynamic public env
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 4m6s
.dockerignore excludes .env files so the previous COPY failed silently.
$env/dynamic/public requires variable names to be declared at build time
to generate named exports; empty placeholders satisfy this while actual
values still come from process.env at runtime.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 17:09:42 +01:00
865787fb45 feat: switch PUBLIC_API_URL and PUBLIC_URL to dynamic env for runtime configurability
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 3m3s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 17:01:37 +01:00
3915dbc115 fix: .env inclusion
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m31s
2026-03-04 16:47:54 +01:00
83ca9d4fb5 chore: update all dependencies to latest versions
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 46s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 16:43:56 +01:00
225b9d41f5 chore: clean up repo and fix docker compose configuration
- Remove outdated docs (COMPOSE.md, DOCKER.md, QUICKSTART.md, REBUILD_GUIDE.md)
- Remove build.sh, compose.production.yml, gamification-schema.sql, directus.yaml
- Simplify compose.yml for local dev (remove env var indirection)
- Add directus.yml schema snapshot and schema.sql from VPS
- Add schema:export and schema:import scripts to package.json
- Ignore .env files (vars set via compose environment)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 16:36:49 +01:00
ad83fb553a feat: switch to dynamic env for Umami to allow runtime configuration
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 4m41s
2026-02-21 11:47:32 +01:00
2be36a679d fix: resolve buttplug-wasm build error by using Vec and slices
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 5m26s
2026-02-21 11:33:12 +01:00
75d4b4227c chore: update buttplug dependencies to commit fad6c9d
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m10s
2026-02-21 11:29:36 +01:00
2277e4f686 fix: resolve buttplug-wasm build error by using as_ref() for JS arrays
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m7s
2026-02-21 11:13:53 +01:00
13c6977e59 feat: add PUBLIC_UMAMI_SCRIPT variable and integrate into layout
Some checks failed
Build and Push Docker Image to Gitea / build-and-push (push) Failing after 2m13s
2026-02-21 11:05:30 +01:00
c85fa7798e chore: remove Letterspace newsletter integration and all LETTERSPACE_* variables 2026-02-21 10:56:43 +01:00
ce30eca574 chore: new imprint address 2026-02-21 10:37:19 +01:00
6724afa939 fix: use correct Directus preset parameter for asset transforms
All checks were successful
Build and Push Docker Image to Gitea / build-and-push (push) Successful in 6m4s
Changed ?transform= to ?key= so Directus storage asset presets
(mini, thumbnail, preview, medium, banner) are actually applied.
Previously all images were served at full resolution.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 08:42:38 +01:00
290 changed files with 23704 additions and 20886 deletions

View File

@@ -0,0 +1,60 @@
name: Build and Push Backend Image
on:
push:
branches:
- main
- develop
tags:
- "v*.*.*"
pull_request:
branches:
- main
workflow_dispatch:
env:
REGISTRY: dev.pivoine.art
IMAGE_NAME: valknar/sexy-backend
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
platforms: linux/amd64
- name: Log in to Gitea Container Registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ gitea.actor }}
password: ${{ secrets.REGISTRY_TOKEN }}
- name: Extract metadata
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
type=raw,value=latest,enable={{is_default_branch}}
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=sha,prefix={{branch}}-
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
file: Dockerfile.backend
platforms: linux/amd64
push: ${{ gitea.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache
cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache,mode=max


@@ -0,0 +1,60 @@
name: Build and Push Frontend Image
on:
push:
branches:
- main
- develop
tags:
- "v*.*.*"
pull_request:
branches:
- main
workflow_dispatch:
env:
REGISTRY: dev.pivoine.art
IMAGE_NAME: valknar/sexy
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
platforms: linux/amd64
- name: Log in to Gitea Container Registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ gitea.actor }}
password: ${{ secrets.REGISTRY_TOKEN }}
- name: Extract metadata
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
type=raw,value=latest,enable={{is_default_branch}}
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=sha,prefix={{branch}}-
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
file: Dockerfile
platforms: linux/amd64
push: ${{ gitea.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache
cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache,mode=max


@@ -1,112 +0,0 @@
name: Build and Push Docker Image to Gitea
on:
push:
branches:
- main
- develop
tags:
- 'v*.*.*'
pull_request:
branches:
- main
workflow_dispatch:
inputs:
tag:
description: 'Custom tag for the image'
required: false
default: 'manual'
env:
REGISTRY: dev.pivoine.art
IMAGE_NAME: valknar/sexy
jobs:
build-and-push:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
platforms: linux/amd64
- name: Log in to Gitea Container Registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ gitea.actor }}
password: ${{ secrets.REGISTRY_TOKEN }}
- name: Extract metadata (tags, labels)
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
# Tag as 'latest' for main branch
type=raw,value=latest,enable={{is_default_branch}}
# Tag with branch name
type=ref,event=branch
# Tag with PR number
type=ref,event=pr
# Tag with git tag (semver)
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
# Tag with commit SHA
type=sha,prefix={{branch}}-
# Custom tag from workflow_dispatch
type=raw,value=${{ gitea.event.inputs.tag }},enable=${{ gitea.event_name == 'workflow_dispatch' }}
labels: |
org.opencontainers.image.title=sexy.pivoine.art
org.opencontainers.image.description=Adult content platform with SvelteKit, Directus, and hardware integration
org.opencontainers.image.vendor=valknar
org.opencontainers.image.source=https://dev.pivoine.art/${{ gitea.repository }}
- name: Build and push Docker image
uses: docker/build-push-action@v5
with:
context: .
platforms: linux/amd64
push: ${{ gitea.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache
cache-to: type=registry,ref=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:buildcache,mode=max
build-args: |
NODE_ENV=production
CI=true
- name: Generate image digest
if: gitea.event_name != 'pull_request'
run: |
echo "### Docker Image Published :rocket:" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Registry:** \`${{ env.REGISTRY }}\`" >> $GITEA_STEP_SUMMARY
echo "**Image:** \`${{ env.IMAGE_NAME }}\`" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Tags:**" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "${{ steps.meta.outputs.tags }}" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Pull command:**" >> $GITEA_STEP_SUMMARY
echo "\`\`\`bash" >> $GITEA_STEP_SUMMARY
echo "docker pull ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:latest" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
- name: PR Comment - Image built but not pushed
if: gitea.event_name == 'pull_request'
run: |
echo "### Docker Image Built Successfully :white_check_mark:" >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "Image was built successfully but **not pushed** (PR builds are not published)." >> $GITEA_STEP_SUMMARY
echo "" >> $GITEA_STEP_SUMMARY
echo "**Would be tagged as:**" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY
echo "${{ steps.meta.outputs.tags }}" >> $GITEA_STEP_SUMMARY
echo "\`\`\`" >> $GITEA_STEP_SUMMARY

.gitignore

@@ -3,5 +3,5 @@ dist/
target/
pkg/
.env.*
.claude/
.data/

.prettierignore

@@ -0,0 +1,6 @@
build/
.svelte-kit/
dist/
node_modules/
migrations/
pnpm-lock.yaml

CLAUDE.md

@@ -2,176 +2,93 @@
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
## Overview
This is a monorepo for an adult content platform built with SvelteKit, Directus CMS, and hardware integration via Buttplug.io. The project uses pnpm workspaces with three main packages.
## Prerequisites
1. Install Node.js 20.19.1
2. Enable corepack: `corepack enable`
3. Install dependencies: `pnpm install`
4. Install Rust toolchain and wasm-bindgen: `cargo install wasm-bindgen-cli`
## Project Structure
### Packages
- **`packages/frontend`**: SvelteKit application (main frontend)
- **`packages/bundle`**: Directus extension bundle (custom endpoints, hooks, themes)
- **`packages/buttplug`**: Hardware control library with TypeScript/WebAssembly bindings
### Frontend (SvelteKit + Tailwind CSS 4)
- **Framework**: SvelteKit 2 with adapter-node
- **Styling**: Tailwind CSS v4 via @tailwindcss/vite
- **UI Components**: bits-ui, custom components in `src/lib/components/ui/`
- **Backend**: Directus headless CMS
- **Routes**: File-based routing in `src/routes/`
- `+page.server.ts`: Server-side data loading
- `+layout.server.ts`: Layout data (authentication, etc.)
- **Authentication**: Session-based via Directus SDK (cookies)
- **API Proxy**: Dev server proxies `/api` to `http://localhost:8055` (Directus)
- **i18n**: svelte-i18n for internationalization
Key files:
- `src/lib/directus.ts`: Directus client configuration
- `src/lib/types.ts`: Shared TypeScript types
- `src/hooks.server.ts`: Server-side auth middleware
- `vite.config.ts`: Dev server on port 3000 with API proxy
### Bundle (Directus Extensions)
Custom Directus extensions providing:
- **Endpoint** (`src/endpoint/index.ts`): `/sexy/stats` endpoint for platform statistics
- **Hook** (`src/hook/index.ts`):
- Auto-generates slugs for users based on artist_name
- Processes uploaded videos with ffmpeg to extract duration
- **Theme** (`src/theme/index.ts`): Custom Directus admin theme
### Buttplug (Hardware Control)
Hybrid TypeScript/Rust package for intimate hardware control:
- **TypeScript**: Client library, connectors (WebSocket, Browser WebSocket)
- **Rust/WASM**: Core buttplug implementation compiled to WebAssembly
- Provides browser-based Bluetooth device control via WebBluetooth API
Key concepts:
- `ButtplugClient`: Main client interface
- `ButtplugClientDevice`: Device abstraction
- `ButtplugWasmClientConnector`: WASM-based connector
- Messages defined in `src/core/Messages.ts`
`sexy.pivoine.art` is a self-hosted adult content platform (18+) built as a pnpm monorepo with three packages: `frontend` (SvelteKit 5), `backend` (Fastify + GraphQL), and `buttplug` (hardware integration via WebBluetooth/WASM).
## Common Commands
### Development
Run from the repo root unless otherwise noted.
Start full development environment (data + Directus + frontend):
```bash
pnpm dev
# Development
pnpm dev:data # Start postgres & redis via Docker
pnpm dev:backend # Start backend on http://localhost:4000
pnpm dev # Start backend + frontend (frontend on :3000)
# Linting & Formatting
pnpm lint # ESLint across all packages
pnpm lint:fix # Auto-fix ESLint issues
pnpm format # Prettier format all files
pnpm format:check # Check formatting without changes
# Build
pnpm build:frontend # SvelteKit production build
pnpm build:backend # Compile backend TypeScript to dist/
# Database migrations (from packages/backend/)
pnpm migrate # Run pending Drizzle migrations
```
Individual services:
## Architecture
### Monorepo Layout
```
packages/
frontend/ # SvelteKit 2 + Svelte 5 + Tailwind CSS 4
backend/ # Fastify v5 + GraphQL Yoga v5 + Drizzle ORM
buttplug/ # TypeScript/Rust hybrid, compiles to WASM
```
### Backend (`packages/backend/src/`)
- **`index.ts`** — Fastify server entry: registers plugins (CORS, multipart, static), mounts GraphQL at `/graphql`, serves transformed assets at `/assets/:id`
- **`graphql/builder.ts`** — Pothos schema builder (code-first GraphQL)
- **`graphql/context.ts`** — Injects `currentUser` from Redis session into every request
- **`lib/auth.ts`** — Session management: `nanoid(32)` token stored in Redis with 24h TTL, set as httpOnly cookie
- **`db/schema/`** — Drizzle ORM table definitions (users, videos, files, comments, gamification, etc.)
- **`migrations/`** — SQL migration files managed by Drizzle Kit
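The session flow in `lib/auth.ts` and `graphql/context.ts` can be sketched as follows. This is a minimal stand-in, not the real implementation: a `Map` replaces Redis, and `crypto.randomBytes` replaces `nanoid(32)`; only the 32-char token length and the 24h TTL come from the description above.

```typescript
// Sketch of the session flow: token creation (lib/auth.ts) and
// currentUser resolution (graphql/context.ts). Store and helpers are stand-ins.
import { randomBytes } from "node:crypto";

const SESSION_TTL_MS = 24 * 60 * 60 * 1000; // mirrors the 24h Redis TTL

interface Session {
  userId: string;
  expiresAt: number;
}

const sessionStore = new Map<string, Session>(); // Redis stand-in

// Create a 32-char token and store it with a TTL.
export function createSession(userId: string, now = Date.now()): string {
  // 24 random bytes -> exactly 32 base64url characters
  const token = randomBytes(24).toString("base64url");
  sessionStore.set(token, { userId, expiresAt: now + SESSION_TTL_MS });
  return token;
}

// Resolve the user for a request, or null when missing/expired.
export function resolveUser(token: string, now = Date.now()): string | null {
  const session = sessionStore.get(token);
  if (!session || session.expiresAt <= now) return null;
  return session.userId;
}
```

In the real backend Redis evicts the key itself when the TTL elapses; the explicit `expiresAt` check here only emulates that behavior in memory.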
### Frontend (`packages/frontend/src/`)
- **`lib/api.ts`** — GraphQL client (graphql-request)
- **`lib/services.ts`** — All API calls (login, videos, comments, models, etc.)
- **`lib/types.ts`** — Shared TypeScript types
- **`hooks.server.ts`** — Auth guard: reads session cookie, fetches `me` query, redirects if needed
- **`routes/`** — SvelteKit file-based routing: `/`, `/login`, `/signup`, `/me`, `/models`, `/models/[slug]`, `/videos`, `/play/[slug]`, `/magazine`, `/leaderboard`
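The auth-guard behavior of `hooks.server.ts` described above can be sketched like this; the cookie name, protected-route list, and return shape are hypothetical stand-ins for SvelteKit's actual `handle`/`redirect` API, kept simple to show only the decision logic.

```typescript
// Sketch of the hooks.server.ts auth guard: read the session cookie,
// and redirect to /login when a protected route has no valid session.
interface RequestEvent {
  cookies: Map<string, string>; // stand-in for SvelteKit's cookies API
  url: { pathname: string };
}

// Hypothetical protected prefixes, for illustration only.
const PROTECTED_PREFIXES = ["/me", "/admin"];

export function guard(
  event: RequestEvent,
  isAuthenticated: (token: string | undefined) => boolean
): { redirect?: string } {
  const token = event.cookies.get("session"); // hypothetical cookie name
  const needsAuth = PROTECTED_PREFIXES.some((p) =>
    event.url.pathname.startsWith(p)
  );
  if (needsAuth && !isAuthenticated(token)) {
    return { redirect: "/login" };
  }
  return {}; // request proceeds normally
}
```

In the real hook, `isAuthenticated` corresponds to fetching the `me` query with the session cookie, and the redirect is issued via SvelteKit's `redirect()` helper.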
### Asset Pipeline
Backend serves images with server-side Sharp transforms, cached to disk as WebP. Presets: `mini` (80×80), `thumbnail` (300×300), `preview` (800px wide), `medium` (1400px wide), `banner` (1600×480 cropped).
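The preset table above can be expressed as a lookup like the following. The names and dimensions come from the text; the option shape (`fit: "cover"` for fixed crops, `"inside"` for width-bounded resizes) is an assumption about how the Sharp calls are parameterized, not the actual backend code.

```typescript
// Sketch of the asset-preset table described above.
// Dimensions are from the text; the option shape is an assumption.
type Preset = { width?: number; height?: number; fit: "cover" | "inside" };

const PRESETS: Record<string, Preset> = {
  mini: { width: 80, height: 80, fit: "cover" },
  thumbnail: { width: 300, height: 300, fit: "cover" },
  preview: { width: 800, fit: "inside" }, // 800px wide
  medium: { width: 1400, fit: "inside" }, // 1400px wide
  banner: { width: 1600, height: 480, fit: "cover" }, // cropped
};

// Resolve a preset name; null means serve the original file untransformed.
export function resolvePreset(name: string): Preset | null {
  return PRESETS[name] ?? null;
}
```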
### Gamification
Points + achievements system tracked in `user_points` and `user_stats` tables. Logic in `packages/backend/src/lib/gamification.ts` and the `gamification` resolver.
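A points ledger of the kind described above might look like this minimal sketch; the event names and point values are hypothetical, and the shape is a pure-function simplification of what `user_points`/`user_stats` track in the database.

```typescript
// Hypothetical sketch of a gamification points ledger.
// Event names and point values are illustrative, not the real schema.
const POINT_VALUES: Record<string, number> = {
  comment_posted: 5,
  video_watched: 1,
};

interface UserStats {
  points: number; // running total, like user_points
  events: Record<string, number>; // per-event counters, like user_stats
}

// Apply one event to a user's stats, returning the updated record.
export function awardPoints(stats: UserStats, event: string): UserStats {
  const value = POINT_VALUES[event] ?? 0; // unknown events award nothing
  return {
    points: stats.points + value,
    events: { ...stats.events, [event]: (stats.events[event] ?? 0) + 1 },
  };
}
```

Achievements would then be derived by checking thresholds against these counters after each update.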
## Code Style
- **TypeScript strict mode** in all packages
- **ESLint flat config** (`eslint.config.js` at root) — `any` is allowed but discouraged; enforces consistent type imports
- **Prettier**: 2-space indent, trailing commas, 100-char line width, Svelte plugin
- Migrations folder (`packages/backend/src/migrations/`) is excluded from lint
## Environment Variables (Backend)
| Variable | Purpose |
| --------------------------- | ---------------------------- |
| `DATABASE_URL` | PostgreSQL connection string |
| `REDIS_URL` | Redis connection string |
| `COOKIE_SECRET` | Session cookie signing |
| `CORS_ORIGIN` | Frontend origin URL |
| `UPLOAD_DIR` | File storage path |
| `SMTP_HOST/PORT/EMAIL_FROM` | Email (Nodemailer) |
## Docker
```bash
pnpm dev:data # Start Docker Compose data services
pnpm dev:directus # Start Directus in Docker
pnpm --filter @sexy.pivoine.art/frontend dev # Frontend dev server only
docker compose up -d # Start all services (postgres, redis, backend, frontend)
arty up -d <service> # Preferred way to manage containers in this project
```
### Building
Build all packages:
```bash
pnpm install # Ensure dependencies are installed first
```
Build specific packages:
```bash
pnpm build:frontend # Pulls git, installs, builds frontend
pnpm build:bundle # Pulls git, installs, builds Directus extensions
```
Individual package builds:
```bash
pnpm --filter @sexy.pivoine.art/frontend build
pnpm --filter @sexy.pivoine.art/bundle build
pnpm --filter @sexy.pivoine.art/buttplug build # TypeScript build
pnpm --filter @sexy.pivoine.art/buttplug build:wasm # Rust WASM build
```
### Production
Start production frontend server (local):
```bash
pnpm --filter @sexy.pivoine.art/frontend start
```
Docker Compose deployment (recommended for production):
```bash
# Local development (with Postgres, Redis, Directus)
docker-compose up -d
# Production (with Traefik, external DB, Redis)
docker-compose -f compose.production.yml --env-file .env.production up -d
```
See `COMPOSE.md` for Docker Compose guide and `DOCKER.md` for standalone Docker deployment.
## Architecture Notes
### Data Flow
1. **Frontend** → `/api/*` (proxied) → **Directus CMS**
2. Directus uses **bundle extensions** for custom logic (stats, video processing, user management)
3. Frontend uses **Directus SDK** with session authentication
4. Hardware control uses **buttplug package** (TypeScript → WASM → Bluetooth)
### Authentication
- Session tokens stored in `directus_session_token` cookie
- `hooks.server.ts` validates token on every request via `isAuthenticated()`
- User roles: Model, Viewer (checked via role or policy)
- `isModel()` helper in `src/lib/directus.ts` checks user permissions
### Content Types
Core types in `packages/frontend/src/lib/types.ts`:
- **User/CurrentUser**: User profiles with roles and policies
- **Video**: Videos with models, tags, premium flag
- **Model**: Creator profiles with photos and banner
- **Article**: Magazine/blog content
- **BluetoothDevice**: Hardware device state
### Docker Environment
Development uses Docker Compose in `../compose/` directory:
- `../compose/data`: Database/storage services
- `../compose/sexy`: Directus instance (uses `.env.local`)
### Asset URLs
Assets served via Directus with transforms:
```typescript
getAssetUrl(id, "thumbnail" | "preview" | "medium" | "banner")
// Returns: ${directusApiUrl}/assets/${id}?transform=...
```
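Per commit 6724afa939 in this compare, Directus applies stored asset presets via the `?key=` query parameter, not `?transform=` as shown above; with `?transform=` all images were served at full resolution. A corrected sketch of the URL builder (the `directusApiUrl` base is a placeholder):

```typescript
// getAssetUrl sketch using the ?key= parameter fixed in 6724afa939.
type AssetPreset = "mini" | "thumbnail" | "preview" | "medium" | "banner";

export function getAssetUrl(
  directusApiUrl: string,
  id: string,
  preset?: AssetPreset
): string {
  const base = `${directusApiUrl}/assets/${id}`;
  // With a preset key, Directus applies the stored transform preset;
  // without one, the original file is served.
  return preset ? `${base}?key=${preset}` : base;
}
```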
## Development Workflow
1. Ensure Docker services are running: `pnpm dev:data && pnpm dev:directus`
2. Start frontend dev server: `pnpm --filter @sexy.pivoine.art/frontend dev`
3. Access frontend at `http://localhost:3000`
4. Access Directus admin at `http://localhost:8055`
When modifying:
- **Frontend code**: Hot reload via Vite
- **Bundle extensions**: Rebuild with `pnpm --filter @sexy.pivoine.art/bundle build` and restart Directus
- **Buttplug library**: Rebuild TypeScript (`pnpm build`) and/or WASM (`pnpm build:wasm`)
## Important Notes
- This is a pnpm workspace; always use `pnpm` not `npm` or `yarn`
- Package manager is locked to `pnpm@10.17.0`
- Buttplug package requires Rust toolchain for WASM builds
- Frontend uses SvelteKit's adapter-node for production deployment
- All TypeScript packages use ES modules (`"type": "module"`)
Production images are built and pushed to `dev.pivoine.art` via Gitea Actions on push to `main`.


@@ -1,424 +0,0 @@
# Docker Compose Guide
This guide explains the Docker Compose setup for sexy.pivoine.art with local development and production configurations.
## Architecture Overview
The application uses a **multi-file compose setup** with two configurations:
1. **`compose.yml`** - Base configuration for local development
2. **`compose.production.yml`** - Production overrides with Traefik integration
### Service Architecture
```
┌─────────────────────────────────────────────────────────────┐
│ 🌐 Traefik Reverse Proxy (Production Only) │
│ ├─ HTTPS Termination │
│ ├─ Automatic Let's Encrypt │
│ └─ Routes traffic to frontend & Directus API │
├─────────────────────────────────────────────────────────────┤
│ 💄 Frontend (SvelteKit) │
│ ├─ Port 3000 (internal) │
│ ├─ Serves on https://sexy.pivoine.art │
│ └─ Proxies /api to Directus │
├─────────────────────────────────────────────────────────────┤
│ 🎭 Directus CMS │
│ ├─ Port 8055 (internal) │
│ ├─ Serves on https://sexy.pivoine.art/api │
│ ├─ Custom bundle extensions mounted │
│ └─ Uploads volume │
├─────────────────────────────────────────────────────────────┤
│ 🗄️ PostgreSQL (Local) / External (Production) │
│ └─ Database for Directus │
├─────────────────────────────────────────────────────────────┤
│ 💾 Redis (Local) / External (Production) │
│ └─ Cache & session storage │
└─────────────────────────────────────────────────────────────┘
```
## Local Development Setup
### Prerequisites
- Docker 20.10+
- Docker Compose 2.0+
### Quick Start
1. **Create environment file:**
```bash
cp .env.example .env
# Edit .env with your local settings (defaults work fine)
```
2. **Start all services:**
```bash
docker-compose up -d
```
3. **Access services:**
- Frontend: http://localhost:3000 (if enabled)
- Directus: http://localhost:8055
- Directus Admin: http://localhost:8055/admin
4. **View logs:**
```bash
docker-compose logs -f
```
5. **Stop services:**
```bash
docker-compose down
```
### Local Services
#### PostgreSQL
- **Image:** `postgres:16-alpine`
- **Port:** 5432 (internal only)
- **Volume:** `postgres-data`
- **Database:** `sexy`
#### Redis
- **Image:** `redis:7-alpine`
- **Port:** 6379 (internal only)
- **Volume:** `redis-data`
- **Persistence:** AOF enabled
#### Directus
- **Image:** `directus/directus:11`
- **Port:** 8055 (exposed)
- **Volumes:**
- `directus-uploads` - File uploads
- `./packages/bundle/dist` - Custom extensions
- **Features:**
- Auto-reload extensions
- WebSockets enabled
- CORS enabled for localhost
### Local Development Workflow
```bash
# Start infrastructure (Postgres, Redis, Directus)
docker-compose up -d
# Develop frontend locally with hot reload
cd packages/frontend
pnpm dev
# Build Directus bundle
pnpm --filter @sexy.pivoine.art/bundle build
# Restart Directus to load new bundle
docker-compose restart directus
```
## Production Deployment
### Prerequisites
- External PostgreSQL database
- External Redis instance
- Traefik reverse proxy configured
- External network: `compose_network`
### Setup
The production compose file now uses the `include` directive to automatically extend `compose.yml`, making deployment simpler.
1. **Create production environment file:**
```bash
cp .env.production.example .env.production
```
2. **Edit `.env.production` with your values:**
```bash
# Database (external)
CORE_DB_HOST=your-postgres-host
SEXY_DB_NAME=sexy_production
DB_USER=sexy
DB_PASSWORD=your-secure-password
# Redis (external)
CORE_REDIS_HOST=your-redis-host
# Directus
SEXY_DIRECTUS_SECRET=your-32-char-random-secret
ADMIN_PASSWORD=your-secure-admin-password
# Traefik
SEXY_TRAEFIK_HOST=sexy.pivoine.art
# Frontend
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
# Email (SMTP)
EMAIL_SMTP_HOST=smtp.your-provider.com
EMAIL_SMTP_USER=your-email@domain.com
EMAIL_SMTP_PASSWORD=your-smtp-password
```
3. **Deploy:**
```bash
# Simple deployment - compose.production.yml includes compose.yml automatically
docker-compose -f compose.production.yml --env-file .env.production up -d
# Or use the traditional multi-file approach (same result)
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production up -d
```
### Production Services
#### Directus
- **Image:** `directus/directus:11` (configurable)
- **Network:** `compose_network` (external)
- **Volumes:**
- `/var/www/sexy.pivoine.art/uploads` - Persistent uploads
- `/var/www/sexy.pivoine.art/packages/bundle/dist` - Extensions
- **Traefik routing:**
- Domain: `sexy.pivoine.art/api`
- Strips `/api` prefix before forwarding
- HTTPS with auto-certificates
#### Frontend
- **Image:** `ghcr.io/valknarxxx/sexy:latest` (from GHCR)
- **Network:** `compose_network` (external)
- **Volume:** `/var/www/sexy.pivoine.art` - Application code
- **Traefik routing:**
- Domain: `sexy.pivoine.art`
- HTTPS with auto-certificates
### Traefik Integration
Both services are configured with Traefik labels for automatic routing:
**Frontend:**
- HTTP → HTTPS redirect
- Routes `sexy.pivoine.art` to port 3000
- Gzip compression enabled
**Directus API:**
- HTTP → HTTPS redirect
- Routes `sexy.pivoine.art/api` to port 8055
- Strips `/api` prefix
- Gzip compression enabled
### Production Commands
```bash
# Deploy/update (simplified - uses include)
docker-compose -f compose.production.yml --env-file .env.production up -d
# View logs
docker-compose -f compose.production.yml logs -f
# Restart specific service
docker-compose -f compose.production.yml restart frontend
# Stop all services
docker-compose -f compose.production.yml down
# Update images
docker-compose -f compose.production.yml pull
docker-compose -f compose.production.yml up -d
```
## Environment Variables
### Local Development (`.env`)
| Variable | Default | Description |
|----------|---------|-------------|
| `DB_DATABASE` | `sexy` | Database name |
| `DB_USER` | `sexy` | Database user |
| `DB_PASSWORD` | `sexy` | Database password |
| `DIRECTUS_SECRET` | - | Secret for Directus (min 32 chars) |
| `ADMIN_EMAIL` | `admin@sexy.pivoine.art` | Admin email |
| `ADMIN_PASSWORD` | `admin` | Admin password |
| `CORS_ORIGIN` | `http://localhost:3000` | CORS allowed origins |
See `.env.example` for full list.
### Production (`.env.production`)
| Variable | Description | Required |
|----------|-------------|----------|
| `CORE_DB_HOST` | External PostgreSQL host | ✅ |
| `SEXY_DB_NAME` | Database name | ✅ |
| `DB_PASSWORD` | Database password | ✅ |
| `CORE_REDIS_HOST` | External Redis host | ✅ |
| `SEXY_DIRECTUS_SECRET` | Directus secret key | ✅ |
| `SEXY_TRAEFIK_HOST` | Domain name | ✅ |
| `EMAIL_SMTP_HOST` | SMTP server | ✅ |
| `EMAIL_SMTP_PASSWORD` | SMTP password | ✅ |
| `SEXY_FRONTEND_PUBLIC_API_URL` | Frontend API URL | ✅ |
| `SEXY_FRONTEND_PUBLIC_URL` | Frontend public URL | ✅ |
See `.env.production.example` for full list.
**Note:** All frontend-specific variables are prefixed with `SEXY_FRONTEND_` for clarity.
## Volumes
### Local Development
- `postgres-data` - PostgreSQL database
- `redis-data` - Redis persistence
- `directus-uploads` - Uploaded files
### Production
- `/var/www/sexy.pivoine.art/uploads` - Directus uploads
- `/var/www/sexy.pivoine.art` - Application code (frontend)
## Networks
### Local: `sexy-network`
- Bridge network
- Internal communication only
- Directus exposed on 8055
### Production: `compose_network`
- External network (pre-existing)
- Connects to Traefik
- No exposed ports (Traefik handles routing)
## Health Checks
All services include health checks:
**PostgreSQL:**
- Command: `pg_isready`
- Interval: 10s
**Redis:**
- Command: `redis-cli ping`
- Interval: 10s
**Directus:**
- Endpoint: `/server/health`
- Interval: 30s
**Frontend:**
- HTTP GET: `localhost:3000`
- Interval: 30s
## Troubleshooting
### Local Development
**Problem:** Directus won't start
```bash
# Check logs
docker-compose logs directus
# Common issues:
# 1. Database not ready - wait for postgres to be healthy
# 2. Wrong secret - check DIRECTUS_SECRET is at least 32 chars
```
**Problem:** Can't connect to database
```bash
# Check if postgres is running
docker-compose ps postgres
# Verify health
docker-compose exec postgres pg_isready -U sexy
```
**Problem:** Extensions not loading
```bash
# Rebuild bundle
pnpm --filter @sexy.pivoine.art/bundle build
# Verify volume mount
docker-compose exec directus ls -la /directus/extensions/
# Restart Directus
docker-compose restart directus
```
### Production
**Problem:** Services not accessible via domain
```bash
# Check Traefik labels
docker inspect sexy_frontend | grep traefik
# Verify compose_network exists
docker network ls | grep compose_network
# Check Traefik is running
docker ps | grep traefik
```
**Problem:** Can't connect to external database
```bash
# Test connection from Directus container
docker-compose exec directus sh
apk add postgresql-client
psql -h $CORE_DB_HOST -U $DB_USER -d $SEXY_DB_NAME
```
**Problem:** Frontend can't reach Directus API
```bash
# Check Directus is accessible
curl https://sexy.pivoine.art/api/server/health
# Verify CORS settings
# PUBLIC_API_URL should match the public Directus URL
```
## Migration from Old Setup
If migrating from `docker-compose.production.yml`:
1. **Rename environment variables** according to `.env.production.example`
2. **Update command** to use both compose files
3. **Verify Traefik labels** match your setup
4. **Test** with `docker-compose config` to see merged configuration
```bash
# Validate configuration
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production config
# Deploy
docker-compose -f compose.yml -f compose.production.yml --env-file .env.production up -d
```
## Best Practices
### Local Development
1. Use default credentials (they're fine for local)
2. Keep `EXTENSIONS_AUTO_RELOAD=true` for quick iteration
3. Run frontend via `pnpm dev` for hot reload
4. Restart Directus after bundle changes
### Production
1. Use strong passwords for database and admin
2. Set `EXTENSIONS_AUTO_RELOAD=false` for stability
3. Use GHCR images for frontend
4. Enable Gzip compression via Traefik
5. Monitor logs regularly
6. Keep backups of uploads and database
## See Also
- [DOCKER.md](DOCKER.md) - Docker image documentation
- [QUICKSTART.md](QUICKSTART.md) - Quick start guide
- [CLAUDE.md](CLAUDE.md) - Development guide

DOCKER.md

@@ -1,378 +0,0 @@
# Docker Deployment Guide
This guide covers building and deploying sexy.pivoine.art using Docker.
## Overview
The Dockerfile uses a multi-stage build process:
1. **Base stage**: Sets up Node.js and pnpm
2. **Builder stage**: Installs Rust, compiles WASM, builds all packages
3. **Runner stage**: Minimal production image with only runtime dependencies
## Prerequisites
- Docker 20.10+ with BuildKit support
- Docker Compose 2.0+ (optional, for orchestration)
## Building the Image
### Basic Build
```bash
docker build -t sexy.pivoine.art:latest .
```
### Build with Build Arguments
```bash
docker build \
--build-arg NODE_ENV=production \
-t sexy.pivoine.art:latest \
.
```
### Multi-platform Build (for ARM64 and AMD64)
```bash
docker buildx build \
--platform linux/amd64,linux/arm64 \
-t sexy.pivoine.art:latest \
--push \
.
```
## Running the Container
### Run with Environment Variables
```bash
docker run -d \
--name sexy-pivoine-frontend \
-p 3000:3000 \
-e PUBLIC_API_URL=https://api.pivoine.art \
-e PUBLIC_URL=https://sexy.pivoine.art \
-e PUBLIC_UMAMI_ID=your-umami-id \
-e LETTERSPACE_API_URL=https://api.letterspace.com/v1 \
-e LETTERSPACE_API_KEY=your-api-key \
-e LETTERSPACE_LIST_ID=your-list-id \
sexy.pivoine.art:latest
```
### Run with Environment File
```bash
# Create .env.production from template
cp .env.production.example .env.production
# Edit .env.production with your values
nano .env.production
# Run container
docker run -d \
--name sexy-pivoine-frontend \
-p 3000:3000 \
--env-file .env.production \
sexy.pivoine.art:latest
```
## Docker Compose Deployment
### Using docker-compose.production.yml
```bash
# 1. Create environment file
cp .env.production.example .env.production
# 2. Edit environment variables
nano .env.production
# 3. Build and start
docker-compose -f docker-compose.production.yml up -d --build
# 4. View logs
docker-compose -f docker-compose.production.yml logs -f frontend
# 5. Stop services
docker-compose -f docker-compose.production.yml down
```
### Scale the Application
```bash
docker-compose -f docker-compose.production.yml up -d --scale frontend=3
```
## Environment Variables Reference
### Required Variables
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_API_URL` | Directus API backend URL | `https://api.pivoine.art` |
| `PUBLIC_URL` | Frontend application URL | `https://sexy.pivoine.art` |
### Optional Variables
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_UMAMI_ID` | Umami analytics tracking ID | `abc123def-456` |
| `LETTERSPACE_API_URL` | Letterspace API endpoint | `https://api.letterspace.com/v1` |
| `LETTERSPACE_API_KEY` | Letterspace authentication key | `sk_live_...` |
| `LETTERSPACE_LIST_ID` | Mailing list identifier | `list_abc123` |
| `PORT` | Application port (inside container) | `3000` |
| `HOST` | Host binding | `0.0.0.0` |
| `NODE_ENV` | Node environment | `production` |
## Health Checks
The container includes a built-in health check that pings the HTTP server every 30 seconds:
```bash
# Check container health
docker inspect --format='{{.State.Health.Status}}' sexy-pivoine-frontend
# View health check logs
docker inspect --format='{{json .State.Health}}' sexy-pivoine-frontend | jq
```
## Logs and Debugging
### View Container Logs
```bash
# Follow logs
docker logs -f sexy-pivoine-frontend
# Last 100 lines
docker logs --tail 100 sexy-pivoine-frontend
# With timestamps
docker logs -f --timestamps sexy-pivoine-frontend
```
### Execute Commands in Running Container
```bash
# Open shell
docker exec -it sexy-pivoine-frontend sh
# Check Node.js version
docker exec sexy-pivoine-frontend node --version
# Check environment variables
docker exec sexy-pivoine-frontend env
```
### Debug Build Issues
```bash
# Build with no cache
docker build --no-cache -t sexy.pivoine.art:latest .
# Build specific stage for debugging
docker build --target builder -t sexy.pivoine.art:builder .
# Inspect builder stage
docker run -it --rm sexy.pivoine.art:builder sh
```
## Production Best Practices
### 1. Use Specific Tags
```bash
# Tag with version
docker build -t sexy.pivoine.art:1.0.0 .
docker tag sexy.pivoine.art:1.0.0 sexy.pivoine.art:latest
```
### 2. Image Scanning
```bash
# Scan for vulnerabilities (requires Docker Scout or Trivy)
docker scout cves sexy.pivoine.art:latest
# Or with Trivy
trivy image sexy.pivoine.art:latest
```
### 3. Resource Limits
```bash
docker run -d \
--name sexy-pivoine-frontend \
-p 3000:3000 \
--memory="2g" \
--cpus="2" \
--env-file .env.production \
sexy.pivoine.art:latest
```
### 4. Restart Policies
```bash
docker run -d \
--name sexy-pivoine-frontend \
--restart=unless-stopped \
-p 3000:3000 \
--env-file .env.production \
sexy.pivoine.art:latest
```
### 5. Use Docker Secrets (Docker Swarm)
```bash
# Create secrets
echo "your-api-key" | docker secret create letterspace_api_key -
# Deploy with secrets
docker service create \
--name sexy-pivoine-frontend \
--secret letterspace_api_key \
-p 3000:3000 \
sexy.pivoine.art:latest
```
## Optimization Tips
### Reduce Build Time
1. **Use BuildKit cache mounts** (already enabled in Dockerfile)
2. **Leverage layer caching** - structure Dockerfile to cache dependencies
3. **Use `.dockerignore`** - exclude unnecessary files from build context
### Reduce Image Size
Current optimizations:
- Multi-stage build (builder artifacts not in final image)
- Production-only dependencies (`pnpm install --prod`)
- Minimal base image (`node:20.19.1-slim`)
- Only necessary build artifacts copied to runner
Image size breakdown:
```bash
docker images sexy.pivoine.art:latest
```
## CI/CD Integration
### GitHub Actions (Automated)
This repository includes automated GitHub Actions workflows for building, scanning, and managing Docker images.
**Pre-configured workflows:**
- **Build & Push** (`.github/workflows/docker-build-push.yml`)
- Automatically builds and pushes to `ghcr.io/valknarxxx/sexy`
- Triggers on push to main/develop, version tags, and PRs
- Multi-platform builds (AMD64 + ARM64)
- Smart tagging: latest, branch names, semver, commit SHAs
- **Security Scan** (`.github/workflows/docker-scan.yml`)
- Daily vulnerability scans with Trivy
- Reports to GitHub Security tab
- Scans on every release
- **Cleanup** (`.github/workflows/cleanup-images.yml`)
- Weekly cleanup of old untagged images
- Keeps last 10 versions
**Using pre-built images:**
```bash
# Pull latest from GitHub Container Registry
docker pull ghcr.io/valknarxxx/sexy:latest
# Pull specific version
docker pull ghcr.io/valknarxxx/sexy:v1.0.0
# Run the image
docker run -d -p 3000:3000 --env-file .env.production ghcr.io/valknarxxx/sexy:latest
```
**Triggering builds:**
```bash
# Push to main → builds 'latest' tag
git push origin main
# Create version tag → builds semver tags
git tag v1.0.0 && git push origin v1.0.0
# Pull request → builds but doesn't push
```
See `.github/workflows/README.md` for detailed workflow documentation.
## Troubleshooting
### Build Fails at Rust Installation
**Problem**: Rust installation fails or times out
**Solution**:
- Check internet connectivity from the build environment
- Use a Rust mirror if building on a restricted network
- Increase the build timeout
### WASM Build Fails
**Problem**: `wasm-bindgen-cli` version mismatch
**Solution**:
```dockerfile
# In Dockerfile, pin wasm-bindgen-cli version
RUN cargo install wasm-bindgen-cli --version 0.2.103
```
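The pinned CLI version must match the `wasm-bindgen` crate version recorded in the Rust package's lockfile. One quick way to check — the lockfile path is an assumption, adjust to wherever the buttplug crate lives:

```bash
# Print the wasm-bindgen crate version from Cargo.lock, if present
lock=packages/buttplug/Cargo.lock
if [ -f "$lock" ]; then
  grep -A 1 'name = "wasm-bindgen"' "$lock" | grep '^version'
fi
```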
### Container Exits Immediately
**Problem**: Container starts then exits
**Solution**: Check logs and verify:
```bash
docker logs sexy-pivoine-frontend
# Verify build output exists
docker run -it --rm sexy.pivoine.art:latest ls -la packages/frontend/build
```
### Port Already in Use
**Problem**: Port 3000 already bound
**Solution**:
```bash
# Use a different host port
docker run -d -p 8080:3000 sexy.pivoine.art:latest
```
## Maintenance
### Clean Up
```bash
# Remove stopped containers
docker container prune
# Remove unused images
docker image prune -a
# Remove build cache
docker builder prune
# Complete cleanup (use with caution)
docker system prune -a --volumes
```
### Update Base Image
Regularly update the base Node.js image:
```bash
# Pull the pinned Node base image (keep in sync with the Dockerfile)
docker pull node:22.11.0-slim
# Rebuild, refreshing base layers
docker build --pull -t sexy.pivoine.art:latest .
```


@@ -14,9 +14,10 @@ WORKDIR /app
# Copy workspace configuration
COPY pnpm-workspace.yaml package.json pnpm-lock.yaml ./
# Copy .env to .env.production for proper svelte compiling
RUN mkdir -p ./packages/frontend
COPY packages/frontend/.env ./packages/frontend/.env.production
# Create env file with placeholder values so SvelteKit knows variable names at build time
# Actual values are injected at runtime via process.env (adapter-node)
RUN mkdir -p ./packages/frontend && \
printf 'PUBLIC_API_URL=\nPUBLIC_URL=\nPUBLIC_UMAMI_ID=\nPUBLIC_UMAMI_SCRIPT=\n' > ./packages/frontend/.env
# ============================================================================
# Builder stage - compile application with Rust/WASM support
@@ -63,11 +64,8 @@ RUN pnpm --filter @sexy.pivoine.art/buttplug build
# 3. Build frontend
RUN pnpm --filter @sexy.pivoine.art/frontend build
# 4. Build Directus bundle
RUN pnpm --filter @sexy.pivoine.art/bundle build
# Prune dev dependencies for production
RUN pnpm install -rP
RUN CI=true pnpm install -rP
# ============================================================================
# Runner stage - minimal production image
@@ -77,7 +75,6 @@ FROM node:22.11.0-slim AS runner
# Install dumb-init for proper signal handling
RUN apt-get update && apt-get install -y \
dumb-init \
ffmpeg \
&& rm -rf /var/lib/apt/lists/*
# Create non-root user
@@ -95,18 +92,13 @@ COPY --from=builder --chown=node:node /app/pnpm-lock.yaml ./pnpm-lock.yaml
COPY --from=builder --chown=node:node /app/pnpm-workspace.yaml ./pnpm-workspace.yaml
# Create package directories
RUN mkdir -p packages/frontend packages/bundle packages/buttplug
RUN mkdir -p packages/frontend packages/buttplug
# Copy frontend artifacts
COPY --from=builder --chown=node:node /app/packages/frontend/build ./packages/frontend/build
COPY --from=builder --chown=node:node /app/packages/frontend/node_modules ./packages/frontend/node_modules
COPY --from=builder --chown=node:node /app/packages/frontend/package.json ./packages/frontend/package.json
# Copy bundle artifacts
COPY --from=builder --chown=node:node /app/packages/bundle/dist ./packages/bundle/dist
COPY --from=builder --chown=node:node /app/packages/bundle/node_modules ./packages/bundle/node_modules
COPY --from=builder --chown=node:node /app/packages/bundle/package.json ./packages/bundle/package.json
# Copy buttplug artifacts
COPY --from=builder --chown=node:node /app/packages/buttplug/dist ./packages/buttplug/dist
COPY --from=builder --chown=node:node /app/packages/buttplug/node_modules ./packages/buttplug/node_modules
@@ -124,9 +116,7 @@ ENV NODE_ENV=production \
ENV PUBLIC_API_URL="" \
PUBLIC_URL="" \
PUBLIC_UMAMI_ID="" \
LETTERSPACE_API_URL="" \
LETTERSPACE_API_KEY="" \
LETTERSPACE_LIST_ID=""
PUBLIC_UMAMI_SCRIPT=""
# Expose application port
EXPOSE 3000

Dockerfile.backend Normal file

@@ -0,0 +1,73 @@
# syntax=docker/dockerfile:1
# ============================================================================
# Builder stage
# ============================================================================
FROM node:22.11.0-slim AS builder
RUN npm install -g corepack@latest && corepack enable
WORKDIR /app
# Copy all package manifests so pnpm can resolve the workspace lockfile,
# but use --ignore-scripts to skip buttplug's Rust/WASM build entirely.
COPY pnpm-workspace.yaml package.json pnpm-lock.yaml ./
COPY packages/backend/package.json ./packages/backend/package.json
COPY packages/frontend/package.json ./packages/frontend/package.json
COPY packages/buttplug/package.json ./packages/buttplug/package.json
COPY packages/types/package.json ./packages/types/package.json
RUN pnpm install --frozen-lockfile --filter @sexy.pivoine.art/backend --ignore-scripts
# Rebuild native bindings (argon2, sharp)
RUN pnpm rebuild argon2 sharp
COPY packages/types ./packages/types
COPY packages/backend ./packages/backend
RUN pnpm --filter @sexy.pivoine.art/backend build
RUN CI=true pnpm install --frozen-lockfile --filter @sexy.pivoine.art/backend --prod --ignore-scripts
RUN pnpm rebuild argon2 sharp
# ============================================================================
# Runner stage
# ============================================================================
FROM node:22.11.0-slim AS runner
RUN apt-get update && apt-get install -y \
dumb-init \
ffmpeg \
wget \
&& rm -rf /var/lib/apt/lists/*
RUN userdel -r node && \
groupadd -r -g 1000 node && \
useradd -r -u 1000 -g node -m -d /home/node -s /bin/bash node
WORKDIR /home/node/app
RUN mkdir -p packages/backend
COPY --from=builder --chown=node:node /app/node_modules ./node_modules
COPY --from=builder --chown=node:node /app/package.json ./package.json
COPY --from=builder --chown=node:node /app/packages/backend/dist ./packages/backend/dist
COPY --from=builder --chown=node:node /app/packages/backend/node_modules ./packages/backend/node_modules
COPY --from=builder --chown=node:node /app/packages/backend/package.json ./packages/backend/package.json
COPY --from=builder --chown=node:node /app/packages/backend/src/migrations ./packages/backend/migrations
RUN mkdir -p /data/uploads && chown node:node /data/uploads
USER node
ENV NODE_ENV=production \
PORT=4000
EXPOSE 4000
HEALTHCHECK --interval=30s --timeout=5s --start-period=20s --retries=3 \
CMD wget --no-verbose --tries=1 --spider http://localhost:4000/health
ENTRYPOINT ["dumb-init", "--"]
CMD ["node", "packages/backend/dist/index.js"]


@@ -1,334 +0,0 @@
# Quick Start Guide
Get sexy.pivoine.art running in under 5 minutes using pre-built Docker images.
## Prerequisites
- Docker 20.10+
- Docker Compose 2.0+ (optional)
## Option 1: Docker Run (Fastest)
### Step 1: Pull the Image
```bash
docker pull ghcr.io/valknarxxx/sexy:latest
```
### Step 2: Create Environment File
```bash
cat > .env.production << EOF
PUBLIC_API_URL=https://api.your-domain.com
PUBLIC_URL=https://your-domain.com
PUBLIC_UMAMI_ID=
LETTERSPACE_API_URL=
LETTERSPACE_API_KEY=
LETTERSPACE_LIST_ID=
EOF
```
### Step 3: Run the Container
```bash
docker run -d \
--name sexy-pivoine \
-p 3000:3000 \
--env-file .env.production \
--restart unless-stopped \
ghcr.io/valknarxxx/sexy:latest
```
### Step 4: Verify
```bash
# Check if running
docker ps | grep sexy-pivoine
# Check logs
docker logs -f sexy-pivoine
# Test the application
curl http://localhost:3000
```
Your application is now running at `http://localhost:3000` 🎉
## Option 2: Docker Compose (Recommended)
### Step 1: Download docker-compose.production.yml
```bash
curl -O https://raw.githubusercontent.com/valknarxxx/sexy/main/docker-compose.production.yml
```
Or if you have the repository:
```bash
cd /path/to/sexy.pivoine.art
```
### Step 2: Create Environment File
```bash
cp .env.production.example .env.production
nano .env.production # Edit with your values
```
### Step 3: Start Services
```bash
docker-compose -f docker-compose.production.yml up -d
```
### Step 4: Monitor
```bash
# View logs
docker-compose -f docker-compose.production.yml logs -f
# Check status
docker-compose -f docker-compose.production.yml ps
```
Your application is now running at `http://localhost:3000` 🎉
## Accessing Private Images
If the image is in a private registry:
### Step 1: Create GitHub Personal Access Token
1. Go to https://github.com/settings/tokens
2. Click "Generate new token (classic)"
3. Select scope: `read:packages`
4. Generate and copy the token
### Step 2: Login to GitHub Container Registry
```bash
echo YOUR_GITHUB_TOKEN | docker login ghcr.io -u YOUR_GITHUB_USERNAME --password-stdin
```
### Step 3: Pull and Run
Now you can pull private images:
```bash
docker pull ghcr.io/valknarxxx/sexy:latest
```
## Environment Variables
### Required
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_API_URL` | Directus API endpoint | `https://api.pivoine.art` |
| `PUBLIC_URL` | Frontend URL | `https://sexy.pivoine.art` |
### Optional
| Variable | Description | Example |
|----------|-------------|---------|
| `PUBLIC_UMAMI_ID` | Analytics tracking ID | `abc-123-def` |
| `LETTERSPACE_API_URL` | Newsletter API | `https://api.letterspace.com/v1` |
| `LETTERSPACE_API_KEY` | Newsletter API key | `sk_live_...` |
| `LETTERSPACE_LIST_ID` | Mailing list ID | `list_abc123` |
## Common Commands
### View Logs
```bash
# Follow logs (Docker Run)
docker logs -f sexy-pivoine
# Follow logs (Docker Compose)
docker-compose -f docker-compose.production.yml logs -f
```
### Restart Container
```bash
# Docker Run
docker restart sexy-pivoine
# Docker Compose
docker-compose -f docker-compose.production.yml restart
```
### Stop Container
```bash
# Docker Run
docker stop sexy-pivoine
# Docker Compose
docker-compose -f docker-compose.production.yml down
```
### Update to Latest Version
```bash
# Docker Run
docker pull ghcr.io/valknarxxx/sexy:latest
docker stop sexy-pivoine
docker rm sexy-pivoine
# Then re-run the docker run command from Step 3
# Docker Compose
docker-compose -f docker-compose.production.yml pull
docker-compose -f docker-compose.production.yml up -d
```
### Shell Access
```bash
# Docker Run
docker exec -it sexy-pivoine sh
# Docker Compose
docker-compose -f docker-compose.production.yml exec frontend sh
```
## Available Image Tags
| Tag | Description | Use Case |
|-----|-------------|----------|
| `latest` | Latest stable build from main | Production |
| `v1.0.0` | Specific version | Production (pinned) |
| `develop` | Latest from develop branch | Staging |
| `main-abc123` | Specific commit | Testing |
**Best Practice:** Use version tags in production for predictable deployments.
## Production Deployment
### 1. Use Version Tags
```bash
# Instead of :latest
docker pull ghcr.io/valknarxxx/sexy:v1.0.0
```
### 2. Add Resource Limits
```bash
docker run -d \
--name sexy-pivoine \
-p 3000:3000 \
--env-file .env.production \
--memory="2g" \
--cpus="2" \
--restart unless-stopped \
ghcr.io/valknarxxx/sexy:v1.0.0
```
### 3. Use a Reverse Proxy
Example with nginx:
```nginx
server {
listen 80;
server_name sexy.pivoine.art;
location / {
proxy_pass http://localhost:3000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
}
```
### 4. Enable HTTPS
Use Certbot or similar:
```bash
certbot --nginx -d sexy.pivoine.art
```
## Health Check
The container includes a built-in health check:
```bash
# Check container health
docker inspect --format='{{.State.Health.Status}}' sexy-pivoine
```
Possible statuses:
- `starting` - Container just started
- `healthy` - Application is responding
- `unhealthy` - Application is not responding
## Troubleshooting
### Container Exits Immediately
```bash
# Check logs
docker logs sexy-pivoine
# Common issues:
# - Missing environment variables
# - Port 3000 already in use
# - Invalid environment variable values
```
### Cannot Pull Image
```bash
# For private images, ensure you're logged in
docker login ghcr.io
# Check if image exists
docker pull ghcr.io/valknarxxx/sexy:latest
```
### Port Already in Use
```bash
# Use a different port
docker run -d -p 8080:3000 ghcr.io/valknarxxx/sexy:latest
# Or find what's using port 3000
lsof -i :3000
```
### Application Not Accessible
```bash
# Check if container is running
docker ps | grep sexy-pivoine
# Check logs
docker logs sexy-pivoine
# Verify port mapping
docker port sexy-pivoine
# Test from inside container
docker exec sexy-pivoine wget -O- http://localhost:3000
```
## Next Steps
- **Production setup:** See [DOCKER.md](DOCKER.md)
- **Development:** See [CLAUDE.md](CLAUDE.md)
- **CI/CD:** See [.github/workflows/README.md](.github/workflows/README.md)
## Support
- **Issues:** https://github.com/valknarxxx/sexy/issues
- **Discussions:** https://github.com/valknarxxx/sexy/discussions
- **Security:** Report privately via GitHub Security tab
## License
See [LICENSE](LICENSE) file for details.

README.md

@@ -4,7 +4,7 @@
![sexy lips tongue mouth american apparel moist lip gloss ](https://i.gifer.com/1pYe.gif)
*"Lust und Liebe gehören zusammen - wer das eine verteufelt, zerstört auch das andere."*
_"Lust und Liebe gehören zusammen - wer das eine verteufelt, zerstört auch das andere."_
**Beate Uhse**, Pionierin der sexuellen Befreiung ✈️
---
@@ -13,10 +13,10 @@
Built with passion, technology, and the fearless spirit of sexual empowerment
[![Build Status](https://img.shields.io/github/actions/workflow/status/valknarxxx/sexy.pivoine.art/docker-build-push.yml?style=for-the-badge&logo=docker&logoColor=white&color=FF69B4&labelColor=8B008B)](https://github.com/valknarxxx/sexy.pivoine.art/actions/workflows/docker-build-push.yml)
[![Security Scan](https://img.shields.io/github/actions/workflow/status/valknarxxx/sexy.pivoine.art/docker-scan.yml?style=for-the-badge&logo=security&logoColor=white&label=Security&color=DA70D6&labelColor=8B008B)](https://github.com/valknarxxx/sexy.pivoine.art/actions/workflows/docker-scan.yml)
[![Build Frontend](https://dev.pivoine.art/valknar/sexy/actions/workflows/docker-build-frontend.yml/badge.svg)](https://dev.pivoine.art/valknar/sexy/actions)
[![Build Backend](https://dev.pivoine.art/valknar/sexy/actions/workflows/docker-build-backend.yml/badge.svg)](https://dev.pivoine.art/valknar/sexy/actions)
[![License](https://img.shields.io/badge/License-For_Pleasure-FF1493?style=for-the-badge&logo=heart&logoColor=white&labelColor=8B008B)](LICENSE)
[![Made with Love](https://img.shields.io/badge/Made_with-💜_Love-FF69B4?style=for-the-badge&labelColor=8B008B)](http://sexy.pivoine.art)
[![Made with Love](https://img.shields.io/badge/Made_with-💜_Love-FF69B4?style=for-the-badge&labelColor=8B008B)](https://sexy.pivoine.art)
</div>
@@ -24,21 +24,23 @@ Built with passion, technology, and the fearless spirit of sexual empowerment
## 👅 What Is This Delicious Creation?
Welcome, dear pleasure-seeker! This is **sexy.pivoine.art** — a modern, sensual platform combining the elegance of **SvelteKit**, the power of **Directus CMS**, and the intimate connection of **Buttplug.io** hardware integration.
Welcome, dear pleasure-seeker! This is **sexy.pivoine.art** — a modern, sensual platform built from the ground up with full control over every intimate detail. A **SvelteKit** frontend caresses a purpose-built **Fastify + GraphQL** backend, while **Buttplug.io** hardware integration brings the experience into the physical world.
Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom to explore, create, and celebrate sexuality without shame. This platform is built for **models**, **creators**, and **connoisseurs** of adult content who deserve technology as sophisticated as their desires.
### ♉ Features That'll Make You Blush ♊
- 💖 **Sensual SvelteKit Frontend** with Tailwind CSS 4 styling
- 🗄️ **Headless CMS** powered by Directus for content liberation
- **Purpose-built GraphQL Backend** — lean, fast, no CMS overhead
- 🔐 **Session-based Auth** with Redis & Argon2 — discretion guaranteed
- 🖼️ **Smart Image Transforms** via Sharp (WebP, multiple presets, cached)
- 🎮 **Hardware Integration** via Buttplug.io (yes, really!)
- 🌐 **Multi-Platform Support** (AMD64 + ARM64) — pleasure everywhere
- 🔒 **Session-Based Authentication** — discretion guaranteed
- 📱 **Responsive Design** that looks sexy on any device
- 🌍 **Internationalization** — pleasure speaks all languages
- 🏆 **Gamification** — achievements, leaderboards, and reward points
- 💬 **Comments & Social** — build your community
- 📊 **Analytics Integration** (Umami) — know your admirers
- 📧 **Newsletter Integration** (Letterspace) — stay connected
- 🐳 **Self-hosted CI/CD** via Gitea Actions on `dev.pivoine.art`
<div align="center">
@@ -49,15 +51,21 @@ Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom
```
┌─────────────────────────────────────────────────────────────┐
│ 💋 Frontend Layer │
│ ├─ SvelteKit 2.0 → Smooth as silk │
│ ├─ SvelteKit 2 → Smooth as silk │
│ ├─ Tailwind CSS 4 → Styled to seduce │
│ ├─ bits-ui Components → Building blocks of pleasure │
│ ├─ graphql-request v7 → Whispering to the backend │
│ └─ Vite → Fast and furious │
├─────────────────────────────────────────────────────────────┤
│ 🍷 Backend Layer │
│ ├─ Directus CMS → Content with no limits
│ ├─ Custom Extensions → Bespoke pleasures
─ PostgreSQL → Data deep and secure
│ ├─ Fastify v5 → The fastest penetration
│ ├─ GraphQL Yoga v5 → Flexible positions
─ Pothos (code-first) → Schema with intention
│ ├─ Drizzle ORM → Data with grace │
│ ├─ PostgreSQL 16 → Deep and persistent │
│ ├─ Redis → Sessions that never forget │
│ ├─ Sharp → Images transformed beautifully │
│ └─ Argon2 → Passwords hashed with passion │
├─────────────────────────────────────────────────────────────┤
│ 🎀 Hardware Layer │
│ ├─ Buttplug.io → Real connections │
@@ -66,8 +74,8 @@ Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom
├─────────────────────────────────────────────────────────────┤
│ 🌸 DevOps Layer │
│ ├─ Docker → Containerized ecstasy │
│ ├─ GitHub Actions → Automated seduction
│ └─ GHCR → Images served hot
│ ├─ Gitea Actions → Self-hosted seduction │
│ └─ dev.pivoine.art → Our own pleasure palace
└─────────────────────────────────────────────────────────────┘
```
@@ -75,46 +83,49 @@ Like Beate Uhse breaking barriers in post-war Germany, we believe in the freedom
## 🔥 Quick Start — Get Intimate Fast
### 💕 Option 1: Using Docker (Recommended)
### 💕 Option 1: Using Docker Compose (Recommended)
```bash
# Pull the pleasure
docker pull ghcr.io/valknarxxx/sexy:latest
# Clone the repository
git clone https://dev.pivoine.art/valknar/sexy.git
cd sexy.pivoine.art
# Run with passion
docker run -d -p 3000:3000 \
-e PUBLIC_API_URL=https://api.your-domain.com \
-e PUBLIC_URL=https://your-domain.com \
ghcr.io/valknarxxx/sexy:latest
# Configure your secrets
cp .env.example .env
# Edit .env with your intimate details
# Awaken all services (postgres, redis, backend, frontend)
docker compose up -d
# Visit your creation at http://localhost:3000 💋
```
See [QUICKSTART.md](QUICKSTART.md) for the full seduction guide.
### 💜 Option 2: Local Development
**Prerequisites:**
1. Node.js 20.19.1 — *the foundation*
2. `corepack enable` — *unlock the tools*
3. `pnpm install` — *gather your ingredients*
4. Rust + `cargo install wasm-bindgen-cli` — *forge the connection*
1. Node.js 20.19.1 — _the foundation_
2. `corepack enable` — _unlock the tools_
3. `pnpm install` — _gather your ingredients_
4. PostgreSQL 16 + Redis — _the data lovers_
**Start your pleasure journey:**
```bash
# Awaken all services
pnpm dev
# Awaken data services
pnpm dev:data
# Or tease them one by one
pnpm dev:data # The foundation
pnpm dev:directus # The content
pnpm --filter @sexy.pivoine.art/frontend dev # The face
# Start the backend (port 4000)
pnpm dev:backend
# Start the frontend (port 3000, proxied to :4000)
pnpm --filter @sexy.pivoine.art/frontend dev
```
Visit `http://localhost:3000` and let the experience begin... 💋
GraphQL playground is available at `http://localhost:4000/graphql` — explore every query.
---
## 🌹 Project Structure
@@ -124,100 +135,116 @@ This monorepo contains three packages, each serving its purpose:
```
sexy.pivoine.art/
├─ 💄 packages/frontend/ → SvelteKit app (the seduction)
├─ 🎭 packages/bundle/ → Directus extensions (the power)
├─ ⚡ packages/backend/ → Fastify + GraphQL API (the engine)
└─ 🎮 packages/buttplug/ → Hardware control (the connection)
```
---
### 💄 Frontend (`packages/frontend/`)
## 📚 Documentation — Your Guide to Pleasure
SvelteKit 2 application with server-side rendering, i18n, and a clean component library.
Communicates with the backend exclusively via GraphQL using `graphql-request`.
Assets served via `/api/assets/:id?transform=<preset>` — no CDN, no Directus, just raw power.
<div align="center">
### ⚡ Backend (`packages/backend/`)
| Document | Purpose | Emoji |
|----------|---------|-------|
| [QUICKSTART.md](QUICKSTART.md) | Get wet... I mean, get started! | 💦 |
| [COMPOSE.md](COMPOSE.md) | Docker Compose setup guide | 🐳 |
| [DOCKER.md](DOCKER.md) | Standalone Docker deployment | 🐋 |
| [CLAUDE.md](CLAUDE.md) | Architecture & development | 🤖 |
| [.github/workflows/README.md](.github/workflows/README.md) | CI/CD workflows | ⚙️ |
Purpose-built Fastify v5 + GraphQL Yoga server. All business logic lives here:
auth, file uploads, video processing, comments, gamification, and analytics.
Files stored as `<UPLOAD_DIR>/<uuid>/<filename>` with on-demand WebP transforms cached on disk.
</div>
### 🎮 Buttplug (`packages/buttplug/`)
Hybrid TypeScript/Rust package for intimate hardware control via WebBluetooth.
Compiled to WebAssembly for browser-based Bluetooth device communication.
---
## 🎨 Building — Craft Your Masterpiece
## 🗃️ Database Schema
### Build All Packages
Built with Drizzle ORM — clean tables, no `directus_` prefix, full control:
```bash
# Prepare everything
pnpm install
# Build the WASM foundation
pnpm --filter @sexy.pivoine.art/buttplug build:wasm
# Build the packages
pnpm --filter @sexy.pivoine.art/buttplug build
pnpm --filter @sexy.pivoine.art/frontend build
pnpm --filter @sexy.pivoine.art/bundle build
```
### Build Docker Image
```bash
# Quick build
./build.sh
# Manual control
docker build -t sexy.pivoine.art:latest .
# Multi-platform pleasure
docker buildx build --platform linux/amd64,linux/arm64 -t sexy.pivoine.art:latest .
users → profiles, roles (model/viewer/admin), auth tokens
files → uploaded assets with metadata and duration
videos → content with model junctions, likes, plays
articles → magazine / editorial content
recordings → user-created content with play tracking
comments → threaded by collection + item_id
achievements → gamification goals
user_points → points ledger
user_stats → cached leaderboard data
```
---
## 🚀 Deployment — Share Your Creation
## 🔐 Authentication Flow
```
POST /graphql (login mutation)
→ verify argon2 password hash
→ nanoid(32) session token
→ SET session:<token> <user JSON> EX 86400 in Redis
→ set httpOnly cookie: session_token
→ return CurrentUser
Every request:
→ read session_token cookie
→ GET session:<token> from Redis
→ inject currentUser into GraphQL context
```
---
## 🖼️ Image Transforms
Assets are transformed on first request and cached as WebP:
| Preset | Size | Fit | Use |
| ----------- | ----------- | ------ | ---------------- |
| `mini` | 80×80 | cover | Avatars in lists |
| `thumbnail` | 300×300 | cover | Profile photos |
| `preview` | 800px wide | inside | Video teasers |
| `medium` | 1400px wide | inside | Full-size images |
| `banner` | 1600×480 | cover | Profile banners |
---
## 🚀 Deployment
### Production with Docker Compose
```bash
# Configure your secrets
cp .env.production.example .env.production
# Edit .env.production with your intimate details
cp .env.example .env.production
# Edit .env.production — set DB credentials, SMTP, cookie secret, CORS origin
# Deploy with grace (uses Traefik for routing)
docker-compose -f compose.production.yml --env-file .env.production up -d
# Deploy
docker compose --env-file .env.production up -d
```
### Production without Docker
Key environment variables for the backend:
```bash
# Build everything
pnpm build:frontend
# Start serving
pnpm --filter @sexy.pivoine.art/frontend start
```
```env
DATABASE_URL=postgresql://sexy:sexy@postgres:5432/sexy
REDIS_URL=redis://redis:6379
COOKIE_SECRET=your-very-secret-key
CORS_ORIGIN=https://sexy.pivoine.art
UPLOAD_DIR=/data/uploads
SMTP_HOST=your.smtp.host
SMTP_PORT=587
EMAIL_FROM=noreply@sexy.pivoine.art
PUBLIC_URL=https://sexy.pivoine.art
```
---
### 🎬 CI/CD — Self-Hosted Seduction
## 🌈 Environment Variables
Automated builds run on **[dev.pivoine.art](https://dev.pivoine.art/valknar/sexy)** via Gitea Actions:
### 💖 Required (The Essentials)
- ✅ Frontend image → `dev.pivoine.art/valknar/sexy:latest`
- ✅ Backend image → `dev.pivoine.art/valknar/sexy-backend:latest`
- ✅ Triggers on push to `main`, `develop`, or version tags (`v*.*.*`)
- ✅ Build cache via registry for fast successive builds
- `PUBLIC_API_URL` — Your Directus backend
- `PUBLIC_URL` — Your frontend domain
### 💜 Optional (The Extras)
- `PUBLIC_UMAMI_ID` — Analytics tracking
- `LETTERSPACE_API_URL` — Newsletter endpoint
- `LETTERSPACE_API_KEY` — Newsletter key
- `LETTERSPACE_LIST_ID` — Mailing list
See [.env.production.example](.env.production.example) for the full configuration.
Images are pulled on the production server via Watchtower or manual `docker compose pull && docker compose up -d`.
---
@@ -228,60 +255,54 @@ graph LR
A[💡 Idea] --> B[💻 Code]
B --> C[🧪 Test Locally]
C --> D[🌿 Feature Branch]
D --> E[📤 Push & PR]
E --> F{✅ CI Pass?}
D --> E[📤 Push to dev.pivoine.art]
E --> F{✅ Build Pass?}
F -->|Yes| G[🔀 Merge to Main]
F -->|No| B
G --> H[🚀 Auto Deploy]
H --> I[🏷️ Tag Release]
I --> J[🎉 Celebrate]
G --> H[🚀 Images Built & Pushed]
H --> I[🎉 Deploy to Production]
```
1. Create → `git checkout -b feature/my-sexy-feature`
2. Develop → Write beautiful code
3. Test → `pnpm dev`
4. Push → Create PR (triggers CI build)
5. Merge → Automatic deployment to production
3. Test → `pnpm dev:data && pnpm dev:backend && pnpm dev`
4. Push → `git push` to `dev.pivoine.art` (triggers CI build)
5. Merge → Images published, deploy to production
6. Release → `git tag v1.0.0 && git push origin v1.0.0`
---
## 🔐 Security — Protected Pleasure
## 🌈 Environment Variables
- 🛡️ Daily vulnerability scans with Trivy
- 🔒 Non-root Docker containers
- 📊 Security reports in GitHub Security tab
- 🤐 Confidential issue reporting available
### Backend (required)
*Report security concerns privately via GitHub Security.*
| Variable | Description |
| --------------- | ----------------------------- |
| `DATABASE_URL` | PostgreSQL connection string |
| `REDIS_URL` | Redis connection string |
| `COOKIE_SECRET` | Session cookie signing secret |
| `CORS_ORIGIN` | Allowed frontend origin |
| `UPLOAD_DIR` | Path for uploaded files |
---
### Backend (optional)
## 💝 Contributing — Join the Movement
| Variable | Default | Description |
| ------------ | ------- | ------------------------------ |
| `PORT` | `4000` | Backend listen port |
| `LOG_LEVEL` | `info` | Fastify log level |
| `SMTP_HOST` | — | Email server for auth flows |
| `SMTP_PORT` | `587` | Email server port |
| `EMAIL_FROM` | — | Sender address |
| `PUBLIC_URL` | — | Frontend URL (for email links) |
Like Beate Uhse fought for sexual liberation, we welcome contributors who believe in freedom, pleasure, and quality code.
### Frontend
1. **Fork** this repository
2. **Create** your feature branch
3. **Commit** your changes
4. **Push** to your branch
5. **Submit** a pull request
All contributors are bound by our code of conduct: **Respect, Consent, and Quality.**
---
## 🎯 CI/CD Pipeline — Automated Seduction
Our GitHub Actions workflows handle:
- ✅ Multi-platform Docker builds (AMD64 + ARM64)
- ✅ Automated publishing to GHCR
- ✅ Daily security vulnerability scans
- ✅ Weekly cleanup of old images
- ✅ Semantic versioning from git tags
**Images available at:** `ghcr.io/valknarxxx/sexy`
| Variable | Description |
| --------------------- | --------------------------------------------- |
| `PUBLIC_API_URL` | Backend URL (e.g. `http://sexy_backend:4000`) |
| `PUBLIC_URL` | Frontend public URL |
| `PUBLIC_UMAMI_ID` | Umami analytics site ID (optional) |
| `PUBLIC_UMAMI_SCRIPT` | Umami script URL (optional) |
---
@@ -291,20 +312,25 @@ Our GitHub Actions workflows handle:
### 🌸 Created with Love by 🌸
**[Palina](http://sexy.pivoine.art) & [Valknar](http://sexy.pivoine.art)**
**[Palina](https://sexy.pivoine.art) & [Valknar](https://sexy.pivoine.art)**
*Für die Mäuse...* 🐭💕
_Für die Mäuse..._ 🐭💕
---
### 🙏 Built With
| Technology | Purpose |
|------------|---------|
| [SvelteKit](https://kit.svelte.dev/) | Framework |
| [Directus](https://directus.io/) | CMS |
| --------------------------------------------------------- | -------------------- |
| [SvelteKit](https://kit.svelte.dev/) | Frontend framework |
| [Fastify](https://fastify.dev/) | HTTP server |
| [GraphQL Yoga](https://the-guild.dev/graphql/yoga-server) | GraphQL server |
| [Pothos](https://pothos-graphql.dev/) | Code-first schema |
| [Drizzle ORM](https://orm.drizzle.team/) | Database |
| [Sharp](https://sharp.pixelplumbing.com/) | Image transforms |
| [Buttplug.io](https://buttplug.io/) | Hardware |
| [bits-ui](https://www.bits-ui.com/) | Components |
| [bits-ui](https://www.bits-ui.com/) | UI components |
| [Gitea](https://dev.pivoine.art) | Self-hosted VCS & CI |
---
@@ -313,7 +339,7 @@ Our GitHub Actions workflows handle:
Pioneer of sexual liberation (1919-2001)
Pilot, Entrepreneur, Freedom Fighter
*"Eine Frau, die ihre Sexualität selbstbestimmt lebt, ist eine freie Frau."*
_"Eine Frau, die ihre Sexualität selbstbestimmt lebt, ist eine freie Frau."_
![Beate Uhse Quote](https://img.shields.io/badge/Beate_Uhse-Sexual_Liberation_Pioneer-FF1493?style=for-the-badge&logo=heart&logoColor=white&labelColor=8B008B)
@@ -334,9 +360,9 @@ Pilot, Entrepreneur, Freedom Fighter
<div align="center">
[![Issues](https://img.shields.io/badge/🐛_Issues-Report_Here-FF69B4?style=for-the-badge&labelColor=8B008B)](https://github.com/valknarxxx/sexy.pivoine.art/issues)
[![Discussions](https://img.shields.io/badge/💭_Discussions-Join_Here-DA70D6?style=for-the-badge&labelColor=8B008B)](https://github.com/valknarxxx/sexy.pivoine.art/discussions)
[![Website](https://img.shields.io/badge/🌐_Website-Visit_Here-FF1493?style=for-the-badge&labelColor=8B008B)](http://sexy.pivoine.art)
[![Repository](https://img.shields.io/badge/🐙_Repository-dev.pivoine.art-FF69B4?style=for-the-badge&labelColor=8B008B)](https://dev.pivoine.art/valknar/sexy)
[![Issues](https://img.shields.io/badge/🐛_Issues-Report_Here-DA70D6?style=for-the-badge&labelColor=8B008B)](https://dev.pivoine.art/valknar/sexy/issues)
[![Website](https://img.shields.io/badge/🌐_Website-Visit_Here-FF1493?style=for-the-badge&labelColor=8B008B)](https://sexy.pivoine.art)
</div>
@@ -355,8 +381,8 @@ Pilot, Entrepreneur, Freedom Fighter
╚═════╝ ╚══════╝╚═╝ ╚═╝ ╚═╝
</pre>
_Pleasure is a human right. Technology is freedom. Together, they are power._
**[sexy.pivoine.art](https://sexy.pivoine.art)** | © 2025 Palina & Valknar
</div>


@@ -1,265 +0,0 @@
# 🔄 Rebuild Guide - When You Need to Rebuild the Image
## Why Rebuild?
SvelteKit's `PUBLIC_*` environment variables are **baked into the JavaScript** at build time. You need to rebuild when:
1. ✅ Changing `PUBLIC_API_URL`
2. ✅ Changing `PUBLIC_URL`
3. ✅ Changing `PUBLIC_UMAMI_ID`
4. ✅ Changing any `LETTERSPACE_*` variables
No rebuild is needed for Directus env vars; those are read at runtime.
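The rule above can be sketched as a tiny classifier: build-time keys (`PUBLIC_*`, `LETTERSPACE_*`) force a rebuild, everything else is runtime. The `needs_rebuild` helper below is illustrative only, not part of this repo:

```shell
# Sketch: classify an env key as build-time (rebuild) or runtime (restart).
# needs_rebuild is a hypothetical helper, not shipped with the project.
needs_rebuild() {
  case "$1" in
    PUBLIC_*|LETTERSPACE_*) echo yes ;;  # baked into the JS bundle at build
    *) echo no ;;                        # read from the environment at start
  esac
}

needs_rebuild PUBLIC_API_URL   # yes
needs_rebuild DB_PASSWORD      # no
```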
## Quick Rebuild Process
### 1. Update Frontend Environment Variables
Edit the frontend `.env` file:
```bash
nano packages/frontend/.env
```
Set your production values:
```bash
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
PUBLIC_UMAMI_ID=your-umami-id
LETTERSPACE_API_URL=https://api.letterspace.com/v1
LETTERSPACE_API_KEY=your-key
LETTERSPACE_LIST_ID=your-list-id
```
### 2. Rebuild the Image
```bash
# From the project root
docker build -t ghcr.io/valknarxxx/sexy:latest -t sexy.pivoine.art:latest .
```
**Expected Time:** 30-45 minutes (first build), 10-15 minutes (cached rebuild)
### 3. Restart Services
```bash
# If using docker-compose
cd /home/valknar/Projects/docker-compose/sexy
docker compose down
docker compose up -d
# Or directly
docker stop sexy_frontend
docker rm sexy_frontend
docker compose up -d frontend
```
## Monitoring the Build
### Check Build Progress
```bash
# Watch build output
docker build -t ghcr.io/valknarxxx/sexy:latest .
# Build stages:
# 1. Base (~30s) - Node.js setup
# 2. Builder (~25-40min) - Rust + WASM + packages
# - Rust installation: ~2-3 min
# - wasm-bindgen-cli: ~10-15 min
# - WASM build: ~5-10 min
# - Package builds: ~5-10 min
# 3. Runner (~2min) - Final image assembly
```
### Verify Environment Variables in Built Image
```bash
# Check what PUBLIC_API_URL is baked in
docker run --rm ghcr.io/valknarxxx/sexy:latest sh -c \
"grep -r 'PUBLIC_API_URL' /home/node/app/packages/frontend/build/ | head -3"
# Should show: https://sexy.pivoine.art/api
```
## Push to GitHub Container Registry
After successful build:
```bash
# Login to GHCR (first time only)
echo $GITHUB_TOKEN | docker login ghcr.io -u valknarxxx --password-stdin
# Push the image
docker push ghcr.io/valknarxxx/sexy:latest
```
## Alternative: Build Arguments (Future Enhancement)
To avoid rebuilding for every env change, consider adding build arguments:
```dockerfile
# In Dockerfile, before building frontend:
ARG PUBLIC_API_URL=https://sexy.pivoine.art/api
ARG PUBLIC_URL=https://sexy.pivoine.art
ARG PUBLIC_UMAMI_ID=
# Create .env.production dynamically
RUN echo "PUBLIC_API_URL=${PUBLIC_API_URL}" > packages/frontend/.env.production && \
echo "PUBLIC_URL=${PUBLIC_URL}" >> packages/frontend/.env.production && \
echo "PUBLIC_UMAMI_ID=${PUBLIC_UMAMI_ID}" >> packages/frontend/.env.production
```
Then build with:
```bash
docker build \
--build-arg PUBLIC_API_URL=https://sexy.pivoine.art/api \
--build-arg PUBLIC_URL=https://sexy.pivoine.art \
-t ghcr.io/valknarxxx/sexy:latest .
```
## Troubleshooting
### Build Fails at Rust Installation
```bash
# Check network connectivity
ping -c 3 sh.rustup.rs
# Build with verbose output
docker build --progress=plain -t ghcr.io/valknarxxx/sexy:latest .
```
### Build Fails at WASM
```bash
# Check if wasm-bindgen-cli matches package.json version
docker run --rm rust:latest cargo install wasm-bindgen-cli --version 0.2.103
```
### Frontend Still Shows Wrong URL
```bash
# Verify .env file is correct
cat packages/frontend/.env
# Check if old image is cached
docker images | grep sexy
docker rmi ghcr.io/valknarxxx/sexy:old-tag
# Force rebuild without cache
docker build --no-cache -t ghcr.io/valknarxxx/sexy:latest .
```
### Container Starts But Can't Connect to API
1. Check Traefik routing:
```bash
docker logs traefik | grep sexy
```
2. Check if Directus is accessible:
```bash
curl -I https://sexy.pivoine.art/api/server/health
```
3. Check frontend logs:
```bash
docker logs sexy_frontend
```
## Development vs Production
### Development (Local)
- Use `pnpm dev` for hot reload
- No rebuild needed for code changes
- Env vars from `.env` or shell
### Production (Docker)
- Rebuild required for PUBLIC_* changes
- Changes baked into JavaScript
- Env vars from `packages/frontend/.env`
## Optimization Tips
### Speed Up Rebuilds
1. **Use BuildKit cache:**
```bash
export DOCKER_BUILDKIT=1
docker build --build-arg BUILDKIT_INLINE_CACHE=1 -t ghcr.io/valknarxxx/sexy:latest .
```
2. **Multi-stage caching:**
- Dockerfile already optimized with multi-stage build
- Dependencies cached separately from code
3. **Parallel builds:**
```bash
# Pin the classic builder to specific CPU cores (docker build has no --cpus
# flag; BuildKit schedules stages across available cores on its own)
docker build --cpuset-cpus 0-3 -t ghcr.io/valknarxxx/sexy:latest .
```
### Reduce Image Size
Current optimizations:
- ✅ Multi-stage build
- ✅ Production dependencies only
- ✅ Minimal base image
- ✅ No dev tools in final image
Expected sizes:
- Base: ~100MB
- Builder: ~2-3GB (not shipped)
- Runner: ~300-500MB (final)
## Automation
### GitHub Actions (Already Set Up)
The `.github/workflows/docker-build-push.yml` automatically:
1. Builds on push to main
2. Creates version tags
3. Pushes to GHCR
4. Caches layers for faster builds
**Trigger a rebuild:**
```bash
git tag v1.0.1
git push origin v1.0.1
```
### Local Build Script
Use the provided `build.sh`:
```bash
./build.sh -t v1.0.0 -p
```
## When NOT to Rebuild
You DON'T need to rebuild for:
- ❌ Directus configuration changes
- ❌ Database credentials
- ❌ Redis settings
- ❌ SMTP settings
- ❌ Session cookie settings
- ❌ Traefik labels
These are runtime environment variables and can be changed in docker-compose.
## Summary
| Change | Rebuild Needed | How to Apply |
|--------|----------------|--------------|
| `PUBLIC_API_URL` | ✅ Yes | Rebuild image |
| `PUBLIC_URL` | ✅ Yes | Rebuild image |
| `PUBLIC_UMAMI_ID` | ✅ Yes | Rebuild image |
| `LETTERSPACE_*` | ✅ Yes | Rebuild image |
| `SEXY_DIRECTUS_*` | ❌ No | Restart container |
| `DB_*` | ❌ No | Restart container |
| `EMAIL_*` | ❌ No | Restart container |
| Traefik labels | ❌ No | Restart container |
---
**Remember:** The key difference is **build-time** (compiled into JS) vs **runtime** (read from environment).

build.sh

@@ -1,130 +0,0 @@
#!/bin/bash
# Build script for sexy.pivoine.art Docker image
set -e # Exit on error
# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
# Default values
IMAGE_NAME="sexy.pivoine.art"
TAG="latest"
PUSH=false
PLATFORM=""
# Parse arguments
while [[ $# -gt 0 ]]; do
case $1 in
-t|--tag)
TAG="$2"
shift 2
;;
-n|--name)
IMAGE_NAME="$2"
shift 2
;;
-p|--push)
PUSH=true
shift
;;
--platform)
PLATFORM="$2"
shift 2
;;
-h|--help)
echo "Usage: $0 [OPTIONS]"
echo ""
echo "Options:"
echo " -t, --tag TAG Set image tag (default: latest)"
echo " -n, --name NAME Set image name (default: sexy.pivoine.art)"
echo " -p, --push Push image after build"
echo " --platform PLATFORM Build for specific platform (e.g., linux/amd64,linux/arm64)"
echo " -h, --help Show this help message"
echo ""
echo "Examples:"
echo " $0 # Build with defaults"
echo " $0 -t v1.0.0 # Build with version tag"
echo " $0 --platform linux/amd64,linux/arm64 -p # Multi-platform build and push"
exit 0
;;
*)
echo -e "${RED}Unknown option: $1${NC}"
exit 1
;;
esac
done
FULL_IMAGE="${IMAGE_NAME}:${TAG}"
echo -e "${GREEN}=== Building Docker Image ===${NC}"
echo "Image: ${FULL_IMAGE}"
echo "Platform: ${PLATFORM:-default}"
echo ""
# Check if Docker is running
if ! docker info > /dev/null 2>&1; then
echo -e "${RED}Error: Docker is not running${NC}"
exit 1
fi
# Build command
BUILD_CMD="docker build"
if [ -n "$PLATFORM" ]; then
# Multi-platform build requires buildx
echo -e "${YELLOW}Using buildx for multi-platform build${NC}"
BUILD_CMD="docker buildx build --platform ${PLATFORM}"
if [ "$PUSH" = true ]; then
BUILD_CMD="${BUILD_CMD} --push"
fi
else
# Regular build
if [ "$PUSH" = true ]; then
echo -e "${YELLOW}Note: --push only works with multi-platform builds. Use 'docker push' after build.${NC}"
fi
fi
# Execute build
echo -e "${GREEN}Building...${NC}"
# With `set -e`, a separate `$? -eq 0` check is unreachable on failure;
# test the command directly instead
if $BUILD_CMD -t "${FULL_IMAGE}" .; then
echo -e "${GREEN}✓ Build successful!${NC}"
echo "Image: ${FULL_IMAGE}"
# Show image size
if [ -z "$PLATFORM" ]; then
SIZE=$(docker images "${FULL_IMAGE}" --format "{{.Size}}")
echo "Size: ${SIZE}"
fi
# Push if requested and not multi-platform
if [ "$PUSH" = true ] && [ -z "$PLATFORM" ]; then
echo -e "${GREEN}Pushing image...${NC}"
if docker push "${FULL_IMAGE}"; then
echo -e "${GREEN}✓ Push successful!${NC}"
else
echo -e "${RED}✗ Push failed${NC}"
exit 1
fi
fi
echo ""
echo -e "${GREEN}Next steps:${NC}"
echo "1. Run locally:"
echo " docker run -d -p 3000:3000 --env-file .env.production ${FULL_IMAGE}"
echo ""
echo "2. Run with docker-compose:"
echo " docker-compose -f docker-compose.production.yml up -d"
echo ""
echo "3. View logs:"
echo " docker logs -f <container-name>"
else
echo -e "${RED}✗ Build failed${NC}"
exit 1
fi


@@ -1,130 +0,0 @@
include:
- compose.yml
# Production compose file - extends base compose.yml
# Usage: docker-compose -f compose.production.yml up -d
networks:
compose_network:
external: true
name: compose_network
services:
# Disable local postgres for production (use external DB)
postgres:
deploy:
replicas: 0
# Disable local redis for production (use external Redis)
redis:
deploy:
replicas: 0
# Override Directus for production
directus:
networks:
- compose_network
ports: [] # Remove exposed ports, use Traefik instead
# Override volumes for production paths
volumes:
- ${SEXY_DIRECTUS_UPLOADS:-./uploads}:/directus/uploads
- ${SEXY_DIRECTUS_BUNDLE:-./packages/bundle/dist}:/directus/extensions/sexy.pivoine.art
# Override environment for production settings
environment:
# Database (external)
DB_HOST: ${CORE_DB_HOST}
DB_PORT: ${CORE_DB_PORT:-5432}
DB_DATABASE: ${SEXY_DB_NAME}
DB_USER: ${DB_USER}
DB_PASSWORD: ${DB_PASSWORD}
# General
SECRET: ${SEXY_DIRECTUS_SECRET}
ADMIN_EMAIL: ${ADMIN_EMAIL}
ADMIN_PASSWORD: ${ADMIN_PASSWORD}
PUBLIC_URL: ${SEXY_PUBLIC_URL}
# Cache (external Redis)
REDIS: redis://${CORE_REDIS_HOST}:${CORE_REDIS_PORT:-6379}
# CORS
CORS_ORIGIN: ${SEXY_CORS_ORIGIN}
# Security (production settings)
SESSION_COOKIE_SECURE: ${SEXY_SESSION_COOKIE_SECURE:-true}
SESSION_COOKIE_SAME_SITE: ${SEXY_SESSION_COOKIE_SAME_SITE:-strict}
SESSION_COOKIE_DOMAIN: ${SEXY_SESSION_COOKIE_DOMAIN}
# Extensions
EXTENSIONS_AUTO_RELOAD: ${SEXY_EXTENSIONS_AUTO_RELOAD:-false}
# Email (production SMTP)
EMAIL_TRANSPORT: ${EMAIL_TRANSPORT:-smtp}
EMAIL_FROM: ${EMAIL_FROM}
EMAIL_SMTP_HOST: ${EMAIL_SMTP_HOST}
EMAIL_SMTP_PORT: ${EMAIL_SMTP_PORT:-587}
EMAIL_SMTP_USER: ${EMAIL_SMTP_USER}
EMAIL_SMTP_PASSWORD: ${EMAIL_SMTP_PASSWORD}
# User URLs
USER_REGISTER_URL_ALLOW_LIST: ${SEXY_USER_REGISTER_URL_ALLOW_LIST}
PASSWORD_RESET_URL_ALLOW_LIST: ${SEXY_PASSWORD_RESET_URL_ALLOW_LIST}
# Remove local dependencies
depends_on: []
labels:
# Traefik labels for reverse proxy
- 'traefik.enable=${SEXY_TRAEFIK_ENABLED:-true}'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-redirect-web-secure'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.rule=Host(`${SEXY_TRAEFIK_HOST}`) && PathPrefix(`/api`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web.entrypoints=web'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.rule=Host(`${SEXY_TRAEFIK_HOST}`) && PathPrefix(`/api`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure-compress.compress=true'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-strip.stripprefix.prefixes=/api'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-strip,${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure-compress'
- 'traefik.http.services.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-api-web-secure.loadbalancer.server.port=8055'
- 'traefik.docker.network=compose_network'
# Override Frontend for production
frontend:
networks:
- compose_network
ports: [] # Remove exposed ports, use Traefik instead
# Override environment for production
environment:
NODE_ENV: production
PUBLIC_API_URL: ${SEXY_FRONTEND_PUBLIC_API_URL}
PUBLIC_URL: ${SEXY_FRONTEND_PUBLIC_URL}
PUBLIC_UMAMI_ID: ${SEXY_FRONTEND_PUBLIC_UMAMI_ID:-}
LETTERSPACE_API_URL: ${SEXY_FRONTEND_LETTERSPACE_API_URL:-}
LETTERSPACE_API_KEY: ${SEXY_FRONTEND_LETTERSPACE_API_KEY:-}
LETTERSPACE_LIST_ID: ${SEXY_FRONTEND_LETTERSPACE_LIST_ID:-}
# Override volume for production path
volumes:
- ${SEXY_FRONTEND_PATH:-/var/www/sexy.pivoine.art}:/home/node/app
# Remove local dependency
depends_on: []
labels:
# Traefik labels for reverse proxy
- 'traefik.enable=${SEXY_TRAEFIK_ENABLED:-true}'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-redirect-web-secure.redirectscheme.scheme=https'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-redirect-web-secure'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.rule=Host(`${SEXY_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web.entrypoints=web'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.rule=Host(`${SEXY_TRAEFIK_HOST}`)'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.tls.certresolver=resolver'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.entrypoints=web-secure'
- 'traefik.http.middlewares.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure-compress.compress=true'
- 'traefik.http.routers.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.middlewares=${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure-compress'
- 'traefik.http.services.${SEXY_COMPOSE_PROJECT_NAME:-sexy}-frontend-web-secure.loadbalancer.server.port=3000'
- 'traefik.docker.network=compose_network'
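Both compose files lean on `${VAR:-default}` interpolation throughout; the fallback applies only when the variable is unset or empty, which is why the Traefik labels still resolve on a bare checkout:

```shell
# Compose borrows ${VAR:-default} from POSIX parameter expansion.
unset SEXY_TRAEFIK_ENABLED
echo "${SEXY_TRAEFIK_ENABLED:-true}"   # true  (unset, fallback used)
SEXY_TRAEFIK_ENABLED=false
echo "${SEXY_TRAEFIK_ENABLED:-true}"   # false (set, value wins)
```

Note that `${VAR-default}` (no colon) would instead keep an empty-but-set value.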


@@ -1,183 +1,91 @@
name: sexy
services:
# PostgreSQL Database (local only)
postgres:
image: postgres:16-alpine
container_name: sexy_postgres
restart: unless-stopped
networks:
- sexy-network
ports:
- "5432:5432"
volumes:
- postgres_data:/var/lib/postgresql/data
environment:
POSTGRES_DB: sexy
POSTGRES_USER: sexy
POSTGRES_PASSWORD: sexy
healthcheck:
test: ["CMD-SHELL", "pg_isready -U sexy"]
interval: 10s
timeout: 5s
retries: 5
# Redis Cache (local only)
redis:
image: redis:7-alpine
container_name: sexy_redis
restart: unless-stopped
networks:
- sexy-network
ports:
- "6379:6379"
volumes:
- redis_data:/data
command: redis-server --appendonly yes
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 5
backend:
build:
context: .
dockerfile: Dockerfile.backend
container_name: sexy_backend
restart: unless-stopped
networks:
- sexy-network
ports:
- "4000:4000"
volumes:
- uploads_data:/data/uploads
environment:
DATABASE_URL: postgresql://sexy:sexy@sexy_postgres:5432/sexy
REDIS_URL: redis://sexy_redis:6379
UPLOAD_DIR: /data/uploads
CORS_ORIGIN: http://localhost:3000
PORT: 4000
NODE_ENV: production
COOKIE_SECRET: change-me-in-production
SMTP_HOST: localhost
SMTP_PORT: 587
EMAIL_FROM: noreply@sexy.pivoine.art
PUBLIC_URL: http://localhost:3000
depends_on:
postgres:
condition: service_healthy
redis:
condition: service_healthy
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:4000/health"]
interval: 30s
timeout: 10s
retries: 3
start_period: 20s
frontend:
build:
context: .
dockerfile: Dockerfile
container_name: sexy_frontend
restart: unless-stopped
networks:
- sexy-network
ports:
- "3000:3000"
environment:
NODE_ENV: production
PORT: 3000
HOST: 0.0.0.0
PUBLIC_API_URL: http://sexy_backend:4000
PUBLIC_URL: http://localhost:3000
depends_on:
backend:
condition: service_healthy
healthcheck:
test: ["CMD", "node", "-e", "require('http').get('http://localhost:3000/', (r) => {process.exit(r.statusCode === 200 ? 0 : 1)})"]
interval: 30s
timeout: 3s
retries: 3
start_period: 40s
networks:
sexy-network:
driver: bridge
volumes:
uploads_data:
driver: local
postgres_data:
driver: local
redis_data:
driver: local

File diff suppressed because it is too large.

eslint.config.js

@@ -0,0 +1,59 @@
import js from "@eslint/js";
import ts from "typescript-eslint";
import svelte from "eslint-plugin-svelte";
import prettier from "eslint-config-prettier";
import globals from "globals";
export default ts.config(
js.configs.recommended,
...ts.configs.recommended,
...svelte.configs["flat/recommended"],
prettier,
...svelte.configs["flat/prettier"],
{
languageOptions: {
globals: {
...globals.browser,
...globals.node,
},
},
},
{
files: ["**/*.svelte"],
languageOptions: {
parserOptions: {
parser: ts.parser,
},
},
},
{
rules: {
// Allow unused vars prefixed with _ (common pattern for intentional ignores)
"@typescript-eslint/no-unused-vars": [
"error",
{ argsIgnorePattern: "^_", varsIgnorePattern: "^_" },
],
// Allow explicit any sparingly — we're adults here
"@typescript-eslint/no-explicit-any": "warn",
// Enforce consistent type imports
"@typescript-eslint/consistent-type-imports": [
"error",
{ prefer: "type-imports", fixStyle: "inline-type-imports" },
],
// This rule is meant for onNavigate() callbacks only; standard SvelteKit href/goto is fine
"svelte/no-navigation-without-resolve": "off",
// {@html} is used intentionally for trusted content (e.g. legal page)
"svelte/no-at-html-tags": "warn",
},
},
{
ignores: [
"**/build/",
"**/.svelte-kit/",
"**/dist/",
"**/node_modules/",
"**/migrations/",
"packages/buttplug/**",
],
},
);


@@ -1,177 +0,0 @@
-- Gamification System Schema for Sexy Recordings Platform
-- Created: 2025-10-28
-- Description: Recording-focused gamification with time-weighted scoring
-- ====================
-- Table: sexy_recording_plays
-- ====================
-- Tracks when users play recordings (similar to video plays)
CREATE TABLE IF NOT EXISTS sexy_recording_plays (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
recording_id UUID NOT NULL REFERENCES sexy_recordings(id) ON DELETE CASCADE,
duration_played INTEGER, -- Duration played in milliseconds
completed BOOLEAN DEFAULT FALSE, -- True if >= 90% watched
date_created TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
date_updated TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_recording_plays_user ON sexy_recording_plays(user_id);
CREATE INDEX IF NOT EXISTS idx_recording_plays_recording ON sexy_recording_plays(recording_id);
CREATE INDEX IF NOT EXISTS idx_recording_plays_date ON sexy_recording_plays(date_created);
COMMENT ON TABLE sexy_recording_plays IS 'Tracks user playback of recordings for analytics and gamification';
COMMENT ON COLUMN sexy_recording_plays.completed IS 'True if user watched at least 90% of the recording';
-- ====================
-- Table: sexy_user_points
-- ====================
-- Tracks individual point-earning actions with timestamps for time-weighted scoring
CREATE TABLE IF NOT EXISTS sexy_user_points (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
action VARCHAR(50) NOT NULL, -- e.g., "RECORDING_CREATE", "RECORDING_PLAY", "COMMENT_CREATE"
points INTEGER NOT NULL, -- Raw points earned
recording_id UUID REFERENCES sexy_recordings(id) ON DELETE SET NULL, -- Optional reference
date_created TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_user_points_user ON sexy_user_points(user_id);
CREATE INDEX IF NOT EXISTS idx_user_points_date ON sexy_user_points(date_created);
CREATE INDEX IF NOT EXISTS idx_user_points_action ON sexy_user_points(action);
COMMENT ON TABLE sexy_user_points IS 'Individual point-earning actions for gamification system';
COMMENT ON COLUMN sexy_user_points.action IS 'Type of action: RECORDING_CREATE, RECORDING_PLAY, RECORDING_COMPLETE, COMMENT_CREATE, RECORDING_FEATURED';
COMMENT ON COLUMN sexy_user_points.points IS 'Raw points before time-weighted decay calculation';
-- ====================
-- Table: sexy_achievements
-- ====================
-- Predefined achievement definitions
CREATE TABLE IF NOT EXISTS sexy_achievements (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
code VARCHAR(50) UNIQUE NOT NULL, -- Unique identifier (e.g., "first_recording", "recording_100")
name VARCHAR(255) NOT NULL, -- Display name
description TEXT, -- Achievement description
icon VARCHAR(255), -- Icon identifier or emoji
category VARCHAR(50) NOT NULL, -- e.g., "recordings", "playback", "social", "special"
required_count INTEGER, -- Number of actions needed to unlock
points_reward INTEGER DEFAULT 0, -- Bonus points awarded upon unlock
sort INTEGER DEFAULT 0, -- Display order
status VARCHAR(20) DEFAULT 'published' -- published, draft, archived
);
CREATE INDEX IF NOT EXISTS idx_achievements_category ON sexy_achievements(category);
CREATE INDEX IF NOT EXISTS idx_achievements_code ON sexy_achievements(code);
COMMENT ON TABLE sexy_achievements IS 'Predefined achievement definitions for gamification';
COMMENT ON COLUMN sexy_achievements.code IS 'Unique code used in backend logic (e.g., first_recording, play_100)';
COMMENT ON COLUMN sexy_achievements.category IS 'Achievement category: recordings, playback, social, special';
-- ====================
-- Table: sexy_user_achievements
-- ====================
-- Junction table tracking unlocked achievements per user
CREATE TABLE IF NOT EXISTS sexy_user_achievements (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
achievement_id UUID NOT NULL REFERENCES sexy_achievements(id) ON DELETE CASCADE,
progress INTEGER DEFAULT 0, -- Current progress toward unlocking
date_unlocked TIMESTAMP WITH TIME ZONE, -- NULL if not yet unlocked
UNIQUE(user_id, achievement_id)
);
CREATE INDEX IF NOT EXISTS idx_user_achievements_user ON sexy_user_achievements(user_id);
CREATE INDEX IF NOT EXISTS idx_user_achievements_achievement ON sexy_user_achievements(achievement_id);
CREATE INDEX IF NOT EXISTS idx_user_achievements_unlocked ON sexy_user_achievements(date_unlocked) WHERE date_unlocked IS NOT NULL;
COMMENT ON TABLE sexy_user_achievements IS 'Tracks which achievements users have unlocked';
COMMENT ON COLUMN sexy_user_achievements.progress IS 'Current progress (e.g., 7/10 recordings created)';
COMMENT ON COLUMN sexy_user_achievements.date_unlocked IS 'NULL if achievement not yet unlocked';
-- ====================
-- Table: sexy_user_stats
-- ====================
-- Cached aggregate statistics for efficient leaderboard queries
CREATE TABLE IF NOT EXISTS sexy_user_stats (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID UNIQUE NOT NULL REFERENCES directus_users(id) ON DELETE CASCADE,
total_raw_points INTEGER DEFAULT 0, -- Sum of all points (no decay)
total_weighted_points NUMERIC(10,2) DEFAULT 0, -- Time-weighted score for rankings
recordings_count INTEGER DEFAULT 0, -- Number of published recordings
playbacks_count INTEGER DEFAULT 0, -- Number of recordings played
comments_count INTEGER DEFAULT 0, -- Number of comments on recordings
achievements_count INTEGER DEFAULT 0, -- Number of unlocked achievements
last_updated TIMESTAMP WITH TIME ZONE DEFAULT NOW() -- Cache timestamp
);
CREATE INDEX IF NOT EXISTS idx_user_stats_weighted ON sexy_user_stats(total_weighted_points DESC);
CREATE INDEX IF NOT EXISTS idx_user_stats_user ON sexy_user_stats(user_id);
COMMENT ON TABLE sexy_user_stats IS 'Cached user statistics for fast leaderboard queries';
COMMENT ON COLUMN sexy_user_stats.total_raw_points IS 'Sum of all points without time decay';
COMMENT ON COLUMN sexy_user_stats.total_weighted_points IS 'Time-weighted score using exponential decay (λ=0.005)';
COMMENT ON COLUMN sexy_user_stats.last_updated IS 'Timestamp for cache invalidation';
-- ====================
-- Insert Initial Achievements
-- ====================
-- 🎬 Recordings (Creation)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_recording', 'First Recording', 'Create your first recording', '🎬', 'recordings', 1, 50, 1),
('recording_10', 'Recording Enthusiast', 'Create 10 recordings', '📹', 'recordings', 10, 100, 2),
('recording_50', 'Prolific Creator', 'Create 50 recordings', '🎥', 'recordings', 50, 500, 3),
('recording_100', 'Recording Master', 'Create 100 recordings', '🏆', 'recordings', 100, 1000, 4),
('featured_recording', 'Featured Creator', 'Get a recording featured', '', 'recordings', 1, 200, 5)
ON CONFLICT (code) DO NOTHING;
-- ▶️ Playback (Consumption)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_play', 'First Play', 'Play your first recording', '▶️', 'playback', 1, 25, 10),
('play_100', 'Active Player', 'Play 100 recordings', '🎮', 'playback', 100, 250, 11),
('play_500', 'Playback Enthusiast', 'Play 500 recordings', '🔥', 'playback', 500, 1000, 12),
('completionist_10', 'Completionist', 'Complete 10 recordings to 90%+', '', 'playback', 10, 100, 13),
('completionist_100', 'Super Completionist', 'Complete 100 recordings', '💯', 'playback', 100, 500, 14)
ON CONFLICT (code) DO NOTHING;
-- 💬 Social (Community)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('first_comment', 'First Comment', 'Leave your first comment', '💬', 'social', 1, 25, 20),
('comment_50', 'Conversationalist', 'Leave 50 comments', '💭', 'social', 50, 200, 21),
('comment_250', 'Community Voice', 'Leave 250 comments', '📣', 'social', 250, 750, 22)
ON CONFLICT (code) DO NOTHING;
-- ⭐ Special (Milestones)
INSERT INTO sexy_achievements (code, name, description, icon, category, required_count, points_reward, sort) VALUES
('early_adopter', 'Early Adopter', 'Join in the first month', '🚀', 'special', 1, 500, 30),
('one_year', 'One Year Anniversary', 'Be a member for 1 year', '🎂', 'special', 1, 1000, 31),
('balanced_creator', 'Balanced Creator', '50 recordings + 100 plays', '⚖️', 'special', 1, 500, 32),
('top_10_rank', 'Top 10 Leaderboard', 'Reach top 10 on leaderboard', '🏅', 'special', 1, 2000, 33)
ON CONFLICT (code) DO NOTHING;
-- ====================
-- Verification Queries
-- ====================
-- Count tables created
SELECT
'sexy_recording_plays' as table_name,
COUNT(*) as row_count
FROM sexy_recording_plays
UNION ALL
SELECT 'sexy_user_points', COUNT(*) FROM sexy_user_points
UNION ALL
SELECT 'sexy_achievements', COUNT(*) FROM sexy_achievements
UNION ALL
SELECT 'sexy_user_achievements', COUNT(*) FROM sexy_user_achievements
UNION ALL
SELECT 'sexy_user_stats', COUNT(*) FROM sexy_user_stats;
-- Show created achievements
SELECT
category,
COUNT(*) as achievement_count
FROM sexy_achievements
GROUP BY category
ORDER BY category;
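The schema comment above fixes the decay constant at λ = 0.005 per day. Assuming the usual exponential form `weighted = points * exp(-λ * age_days)` (the actual formula lives in the backend code, not in this schema), the decay works out as follows:

```shell
# Assumed formula: weighted = points * exp(-0.005 * age_in_days)
weighted() {
  awk -v p="$1" -v d="$2" 'BEGIN { printf "%.2f\n", p * exp(-0.005 * d) }'
}

weighted 100 0     # 100.00 (earned today, full value)
weighted 100 365   # 16.12  (a year old, about 16% remains)
```

So a year-old action retains roughly a sixth of its raw value, which is what keeps `total_weighted_points` rankings biased toward recent activity.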


@@ -2,21 +2,30 @@
"name": "sexy.pivoine.art",
"version": "1.0.0",
"description": "",
"main": "index.js",
"type": "module",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"build:frontend": "git pull && pnpm install && pnpm --filter @sexy.pivoine.art/frontend build",
"build:backend": "git pull && pnpm install && pnpm --filter @sexy.pivoine.art/backend build",
"dev:data": "docker compose up -d postgres redis",
"dev:backend": "pnpm --filter @sexy.pivoine.art/backend dev",
"dev": "pnpm dev:data && pnpm dev:backend & pnpm --filter @sexy.pivoine.art/frontend dev",
"lint": "eslint .",
"lint:fix": "eslint . --fix",
"format": "prettier --write .",
"format:check": "prettier --check .",
"check": "pnpm -r --filter=!sexy.pivoine.art check"
},
"keywords": [],
"author": "",
"license": "ISC",
"author": {
"name": "Valknar",
"email": "valknar@pivoine.art"
},
"license": "MIT",
"packageManager": "pnpm@10.19.0",
"pnpm": {
"onlyBuiltDependencies": [
"argon2",
"es5-ext",
"esbuild",
"svelte-preprocess",
@@ -26,5 +35,15 @@
"@tailwindcss/oxide",
"node-sass"
]
},
"devDependencies": {
"@eslint/js": "^10.0.1",
"eslint": "^10.0.2",
"eslint-config-prettier": "^10.1.8",
"eslint-plugin-svelte": "^3.15.0",
"globals": "^17.4.0",
"prettier": "^3.8.1",
"prettier-plugin-svelte": "^3.5.1",
"typescript-eslint": "^8.56.1"
}
}

View File

@@ -0,0 +1,10 @@
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  schema: "./src/db/schema/*.ts",
  out: "./src/migrations",
  dialect: "postgresql",
  dbCredentials: {
    url: process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy",
  },
});

View File

@@ -0,0 +1,50 @@
{
"name": "@sexy.pivoine.art/backend",
"version": "1.0.0",
"private": true,
"scripts": {
"dev": "UPLOAD_DIR=../../.data/uploads DATABASE_URL=postgresql://sexy:sexy@localhost:5432/sexy REDIS_URL=redis://localhost:6379 tsx watch src/index.ts",
"build": "tsc",
"start": "node dist/index.js",
"db:generate": "drizzle-kit generate",
"db:migrate": "drizzle-kit migrate",
"db:studio": "drizzle-kit studio",
"schema:migrate": "tsx src/scripts/migrate.ts",
"migrate": "tsx src/scripts/data-migration.ts",
"check": "tsc --noEmit"
},
"dependencies": {
"@sexy.pivoine.art/types": "workspace:*",
"@fastify/cookie": "^11.0.2",
"@fastify/cors": "^10.0.2",
"@fastify/multipart": "^9.0.3",
"@fastify/static": "^8.1.1",
"@pothos/core": "^4.4.0",
"@pothos/plugin-errors": "^4.2.0",
"argon2": "^0.43.0",
"drizzle-orm": "^0.44.1",
"fastify": "^5.4.0",
"fluent-ffmpeg": "^2.1.3",
"graphql": "^16.11.0",
"graphql-scalars": "^1.24.2",
"graphql-ws": "^6.0.4",
"graphql-yoga": "^5.13.4",
"ioredis": "^5.6.1",
"nanoid": "^3.3.11",
"nodemailer": "^7.0.3",
"pg": "^8.16.0",
"sharp": "^0.33.5",
"slugify": "^1.6.6",
"uuid": "^11.1.0"
},
"devDependencies": {
"@types/fluent-ffmpeg": "^2.1.27",
"@types/nodemailer": "^6.4.17",
"@types/pg": "^8.15.4",
"@types/sharp": "^0.32.0",
"@types/uuid": "^10.0.0",
"drizzle-kit": "^0.31.1",
"tsx": "^4.19.4",
"typescript": "^5.9.3"
}
}

View File

@@ -0,0 +1,11 @@
import { drizzle } from "drizzle-orm/node-postgres";
import { Pool } from "pg";
import * as schema from "./schema/index";

const pool = new Pool({
  connectionString: process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy",
  max: 20,
});

export const db = drizzle(pool, { schema });
export type DB = typeof db;
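
The pool falls back to a local development URL whenever `DATABASE_URL` is unset. The same pattern as a standalone helper (a sketch — `resolveDatabaseUrl` is illustrative and not part of the codebase, and the fallback is a dev credential, not production config):

```typescript
// Resolve the connection string from the environment, defaulting to the
// local dev database — mirrors the `process.env.DATABASE_URL || …` fallback above.
function resolveDatabaseUrl(env: Record<string, string | undefined>): string {
  return env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy";
}
```

Capping `max` at 20 bounds the number of simultaneous PostgreSQL connections the backend can hold open, which matters once GraphQL resolvers fan out into parallel queries.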

View File

@@ -0,0 +1,32 @@
import { pgTable, text, timestamp, boolean, index, uniqueIndex } from "drizzle-orm/pg-core";
import { users } from "./users";
import { files } from "./files";
export const articles = pgTable(
  "articles",
  {
    id: text("id")
      .primaryKey()
      .$defaultFn(() => crypto.randomUUID()),
    slug: text("slug").notNull(),
    title: text("title").notNull(),
    excerpt: text("excerpt"),
    content: text("content"),
    image: text("image").references(() => files.id, { onDelete: "set null" }),
    tags: text("tags").array().default([]),
    publish_date: timestamp("publish_date").notNull().defaultNow(),
    author: text("author").references(() => users.id, { onDelete: "set null" }),
    category: text("category"),
    featured: boolean("featured").default(false),
    date_created: timestamp("date_created").notNull().defaultNow(),
    date_updated: timestamp("date_updated"),
  },
  (t) => [
    uniqueIndex("articles_slug_idx").on(t.slug),
    index("articles_publish_date_idx").on(t.publish_date),
    index("articles_featured_idx").on(t.featured),
  ],
);

export type Article = typeof articles.$inferSelect;
export type NewArticle = typeof articles.$inferInsert;

View File

@@ -0,0 +1,24 @@
import { pgTable, text, timestamp, index, integer } from "drizzle-orm/pg-core";
import { users } from "./users";
export const comments = pgTable(
  "comments",
  {
    id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
    collection: text("collection").notNull(), // 'videos' | 'recordings'
    item_id: text("item_id").notNull(),
    comment: text("comment").notNull(),
    user_id: text("user_id")
      .notNull()
      .references(() => users.id, { onDelete: "cascade" }),
    date_created: timestamp("date_created").notNull().defaultNow(),
    date_updated: timestamp("date_updated"),
  },
  (t) => [
    index("comments_collection_item_idx").on(t.collection, t.item_id),
    index("comments_user_idx").on(t.user_id),
  ],
);

export type Comment = typeof comments.$inferSelect;
export type NewComment = typeof comments.$inferInsert;

View File

@@ -0,0 +1,22 @@
import { pgTable, text, timestamp, bigint, integer, index } from "drizzle-orm/pg-core";
export const files = pgTable(
  "files",
  {
    id: text("id")
      .primaryKey()
      .$defaultFn(() => crypto.randomUUID()),
    title: text("title"),
    description: text("description"),
    filename: text("filename").notNull(),
    mime_type: text("mime_type"),
    filesize: bigint("filesize", { mode: "number" }),
    duration: integer("duration"),
    uploaded_by: text("uploaded_by"),
    date_created: timestamp("date_created").notNull().defaultNow(),
  },
  (t) => [index("files_uploaded_by_idx").on(t.uploaded_by)],
);

export type File = typeof files.$inferSelect;
export type NewFile = typeof files.$inferInsert;

View File

@@ -0,0 +1,93 @@
import {
pgTable,
text,
timestamp,
integer,
real,
index,
pgEnum,
uniqueIndex,
} from "drizzle-orm/pg-core";
import { users } from "./users";
import { recordings } from "./recordings";
export const achievementStatusEnum = pgEnum("achievement_status", ["draft", "published"]);
export const achievements = pgTable(
"achievements",
{
id: text("id")
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
code: text("code").notNull(),
name: text("name").notNull(),
description: text("description"),
icon: text("icon"),
category: text("category"),
required_count: integer("required_count").notNull().default(1),
points_reward: integer("points_reward").notNull().default(0),
status: achievementStatusEnum("status").notNull().default("published"),
sort: integer("sort").default(0),
},
(t) => [uniqueIndex("achievements_code_idx").on(t.code)],
);
export const user_achievements = pgTable(
"user_achievements",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
achievement_id: text("achievement_id")
.notNull()
.references(() => achievements.id, { onDelete: "cascade" }),
progress: integer("progress").default(0),
date_unlocked: timestamp("date_unlocked"),
},
(t) => [
index("user_achievements_user_idx").on(t.user_id),
uniqueIndex("user_achievements_unique_idx").on(t.user_id, t.achievement_id),
],
);
export const user_points = pgTable(
"user_points",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
action: text("action").notNull(),
points: integer("points").notNull(),
recording_id: text("recording_id").references(() => recordings.id, {
onDelete: "set null",
}),
date_created: timestamp("date_created").notNull().defaultNow(),
},
(t) => [
index("user_points_user_idx").on(t.user_id),
index("user_points_date_idx").on(t.date_created),
],
);
export const user_stats = pgTable(
"user_stats",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
total_raw_points: integer("total_raw_points").default(0),
total_weighted_points: real("total_weighted_points").default(0),
recordings_count: integer("recordings_count").default(0),
playbacks_count: integer("playbacks_count").default(0),
comments_count: integer("comments_count").default(0),
achievements_count: integer("achievements_count").default(0),
last_updated: timestamp("last_updated").defaultNow(),
},
(t) => [uniqueIndex("user_stats_user_idx").on(t.user_id)],
);
export type Achievement = typeof achievements.$inferSelect;
export type UserStats = typeof user_stats.$inferSelect;

View File

@@ -0,0 +1,7 @@
export * from "./files";
export * from "./users";
export * from "./videos";
export * from "./articles";
export * from "./recordings";
export * from "./comments";
export * from "./gamification";

View File

@@ -0,0 +1,73 @@
import {
pgTable,
text,
timestamp,
boolean,
integer,
pgEnum,
index,
uniqueIndex,
jsonb,
} from "drizzle-orm/pg-core";
import { users } from "./users";
import { videos } from "./videos";
export const recordingStatusEnum = pgEnum("recording_status", ["draft", "published", "archived"]);
export const recordings = pgTable(
"recordings",
{
id: text("id")
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
title: text("title").notNull(),
description: text("description"),
slug: text("slug").notNull(),
duration: integer("duration").notNull(),
events: jsonb("events").$type<object[]>().default([]),
device_info: jsonb("device_info").$type<object[]>().default([]),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
status: recordingStatusEnum("status").notNull().default("draft"),
tags: text("tags").array().default([]),
linked_video: text("linked_video").references(() => videos.id, {
onDelete: "set null",
}),
featured: boolean("featured").default(false),
public: boolean("public").default(false),
original_recording_id: text("original_recording_id"),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
uniqueIndex("recordings_slug_idx").on(t.slug),
index("recordings_user_idx").on(t.user_id),
index("recordings_status_idx").on(t.status),
index("recordings_public_idx").on(t.public),
],
);
export const recording_plays = pgTable(
"recording_plays",
{
id: text("id")
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
recording_id: text("recording_id")
.notNull()
.references(() => recordings.id, { onDelete: "cascade" }),
user_id: text("user_id").references(() => users.id, { onDelete: "set null" }),
duration_played: integer("duration_played").default(0),
completed: boolean("completed").default(false),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
index("recording_plays_recording_idx").on(t.recording_id),
index("recording_plays_user_idx").on(t.user_id),
],
);
export type Recording = typeof recordings.$inferSelect;
export type NewRecording = typeof recordings.$inferInsert;

View File

@@ -0,0 +1,63 @@
import {
pgTable,
text,
timestamp,
pgEnum,
boolean,
index,
uniqueIndex,
integer,
} from "drizzle-orm/pg-core";
import { files } from "./files";
export const roleEnum = pgEnum("user_role", ["model", "viewer", "admin"]);
export const users = pgTable(
"users",
{
id: text("id")
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
email: text("email").notNull(),
password_hash: text("password_hash").notNull(),
first_name: text("first_name"),
last_name: text("last_name"),
artist_name: text("artist_name"),
slug: text("slug"),
description: text("description"),
tags: text("tags").array().default([]),
role: roleEnum("role").notNull().default("viewer"),
avatar: text("avatar").references(() => files.id, { onDelete: "set null" }),
banner: text("banner").references(() => files.id, { onDelete: "set null" }),
is_admin: boolean("is_admin").notNull().default(false),
email_verified: boolean("email_verified").notNull().default(false),
email_verify_token: text("email_verify_token"),
password_reset_token: text("password_reset_token"),
password_reset_expiry: timestamp("password_reset_expiry"),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
uniqueIndex("users_email_idx").on(t.email),
uniqueIndex("users_slug_idx").on(t.slug),
index("users_role_idx").on(t.role),
],
);
export const user_photos = pgTable(
"user_photos",
{
id: integer("id").primaryKey().generatedAlwaysAsIdentity(),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
file_id: text("file_id")
.notNull()
.references(() => files.id, { onDelete: "cascade" }),
sort: integer("sort").default(0),
},
(t) => [index("user_photos_user_idx").on(t.user_id)],
);
export type User = typeof users.$inferSelect;
export type NewUser = typeof users.$inferInsert;

View File

@@ -0,0 +1,96 @@
import {
pgTable,
text,
timestamp,
boolean,
integer,
index,
uniqueIndex,
primaryKey,
} from "drizzle-orm/pg-core";
import { users } from "./users";
import { files } from "./files";
export const videos = pgTable(
"videos",
{
id: text("id")
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
slug: text("slug").notNull(),
title: text("title").notNull(),
description: text("description"),
image: text("image").references(() => files.id, { onDelete: "set null" }),
movie: text("movie").references(() => files.id, { onDelete: "set null" }),
tags: text("tags").array().default([]),
upload_date: timestamp("upload_date").notNull().defaultNow(),
premium: boolean("premium").default(false),
featured: boolean("featured").default(false),
likes_count: integer("likes_count").default(0),
plays_count: integer("plays_count").default(0),
},
(t) => [
uniqueIndex("videos_slug_idx").on(t.slug),
index("videos_upload_date_idx").on(t.upload_date),
index("videos_featured_idx").on(t.featured),
],
);
export const video_models = pgTable(
"video_models",
{
video_id: text("video_id")
.notNull()
.references(() => videos.id, { onDelete: "cascade" }),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
},
(t) => [primaryKey({ columns: [t.video_id, t.user_id] })],
);
export const video_likes = pgTable(
"video_likes",
{
id: text("id")
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
video_id: text("video_id")
.notNull()
.references(() => videos.id, { onDelete: "cascade" }),
user_id: text("user_id")
.notNull()
.references(() => users.id, { onDelete: "cascade" }),
date_created: timestamp("date_created").notNull().defaultNow(),
},
(t) => [
index("video_likes_video_idx").on(t.video_id),
index("video_likes_user_idx").on(t.user_id),
],
);
export const video_plays = pgTable(
"video_plays",
{
id: text("id")
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
video_id: text("video_id")
.notNull()
.references(() => videos.id, { onDelete: "cascade" }),
user_id: text("user_id").references(() => users.id, { onDelete: "set null" }),
session_id: text("session_id"),
duration_watched: integer("duration_watched"),
completed: boolean("completed").default(false),
date_created: timestamp("date_created").notNull().defaultNow(),
date_updated: timestamp("date_updated"),
},
(t) => [
index("video_plays_video_idx").on(t.video_id),
index("video_plays_user_idx").on(t.user_id),
index("video_plays_date_idx").on(t.date_created),
],
);
export type Video = typeof videos.$inferSelect;
export type NewVideo = typeof videos.$inferInsert;

View File

@@ -0,0 +1,30 @@
import SchemaBuilder from "@pothos/core";
import ErrorsPlugin from "@pothos/plugin-errors";
import type { DB } from "../db/connection";
import type { SessionUser } from "../lib/auth";
import type Redis from "ioredis";
import { GraphQLDateTime, GraphQLJSON } from "graphql-scalars";

export type Context = {
  db: DB;
  redis: InstanceType<typeof Redis>;
  currentUser: SessionUser | null;
  request: Request;
  reply: unknown;
};

export const builder = new SchemaBuilder<{
  Context: Context;
  Scalars: {
    DateTime: { Input: Date; Output: Date };
    JSON: { Input: unknown; Output: unknown };
  };
}>({
  plugins: [ErrorsPlugin],
});

builder.addScalarType("DateTime", GraphQLDateTime, {});
builder.addScalarType("JSON", GraphQLJSON, {});

builder.queryType({});
builder.mutationType({});

View File

@@ -0,0 +1,37 @@
import type { YogaInitialContext } from "graphql-yoga";
import type { FastifyRequest, FastifyReply } from "fastify";
import type { Context } from "./builder";
import { getSession } from "../lib/auth";
import { db } from "../db/connection";
import { redis } from "../lib/auth";
type ServerContext = {
  req: FastifyRequest;
  reply: FastifyReply;
  db: typeof db;
  redis: typeof redis;
};

export async function buildContext(ctx: YogaInitialContext & ServerContext): Promise<Context> {
  const request = ctx.request;
  const cookieHeader = request.headers.get("cookie") || "";

  // Parse session_token from cookies
  const cookies = Object.fromEntries(
    cookieHeader.split(";").map((c) => {
      const [k, ...v] = c.trim().split("=");
      return [k.trim(), v.join("=")];
    }),
  );

  const token = cookies["session_token"];
  const currentUser = token ? await getSession(token) : null;

  return {
    db: ctx.db || db,
    redis: ctx.redis || redis,
    currentUser,
    request,
    reply: ctx.reply,
  };
}
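
The inline cookie parsing in `buildContext` (and duplicated in the `logout` resolver) can be factored into a pure helper. A sketch — the name `parseCookies` is illustrative; it mirrors the split-and-join logic, keeping values that themselves contain `=` intact, and adds a guard for empty segments that the inline version omits:

```typescript
// Parse a Cookie header into a name → value map.
// Split on ';', then split each pair on the FIRST '=' only, so values
// containing '=' (e.g. base64 padding) survive via v.join("=").
function parseCookies(cookieHeader: string): Record<string, string> {
  return Object.fromEntries(
    cookieHeader
      .split(";")
      .filter((c) => c.trim().length > 0) // tolerate empty header / trailing ';'
      .map((c) => {
        const [k, ...v] = c.trim().split("=");
        return [k.trim(), v.join("=")];
      }),
  );
}
```

For example, `parseCookies("session_token=abc; theme=dark")` yields `{ session_token: "abc", theme: "dark" }`.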

View File

@@ -0,0 +1,14 @@
import "./builder.js";
import "./types/index.js";
import "./resolvers/auth.js";
import "./resolvers/users.js";
import "./resolvers/videos.js";
import "./resolvers/models.js";
import "./resolvers/articles.js";
import "./resolvers/recordings.js";
import "./resolvers/comments.js";
import "./resolvers/gamification.js";
import "./resolvers/stats.js";
import { builder } from "./builder.js";
export const schema = builder.toSchema();

View File

@@ -0,0 +1,179 @@
import { builder } from "../builder";
import { ArticleType } from "../types/index";
import { articles, users } from "../../db/schema/index";
import { eq, and, lte, desc } from "drizzle-orm";
import { requireAdmin } from "../../lib/acl";
async function enrichArticle(db: any, article: any) {
let author = null;
if (article.author) {
const authorUser = await db
.select({
id: users.id,
artist_name: users.artist_name,
slug: users.slug,
avatar: users.avatar,
})
.from(users)
.where(eq(users.id, article.author))
.limit(1);
author = authorUser[0] || null;
}
return { ...article, author };
}
builder.queryField("articles", (t) =>
t.field({
type: [ArticleType],
args: {
featured: t.arg.boolean(),
limit: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
const dateFilter = lte(articles.publish_date, new Date());
const whereCondition =
args.featured !== null && args.featured !== undefined
? and(dateFilter, eq(articles.featured, args.featured))
: dateFilter;
let query = ctx.db
.select()
.from(articles)
.where(whereCondition)
.orderBy(desc(articles.publish_date));
if (args.limit) {
query = (query as any).limit(args.limit);
}
const articleList = await query;
return Promise.all(articleList.map((article: any) => enrichArticle(ctx.db, article)));
},
}),
);
builder.queryField("article", (t) =>
t.field({
type: ArticleType,
nullable: true,
args: {
slug: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const article = await ctx.db
.select()
.from(articles)
.where(and(eq(articles.slug, args.slug), lte(articles.publish_date, new Date())))
.limit(1);
if (!article[0]) return null;
return enrichArticle(ctx.db, article[0]);
},
}),
);
// ─── Admin queries & mutations ────────────────────────────────────────────────
builder.queryField("adminListArticles", (t) =>
t.field({
type: [ArticleType],
resolve: async (_root, _args, ctx) => {
requireAdmin(ctx);
const articleList = await ctx.db.select().from(articles).orderBy(desc(articles.publish_date));
return Promise.all(articleList.map((article: any) => enrichArticle(ctx.db, article)));
},
}),
);
builder.mutationField("createArticle", (t) =>
t.field({
type: ArticleType,
args: {
title: t.arg.string({ required: true }),
slug: t.arg.string({ required: true }),
excerpt: t.arg.string(),
content: t.arg.string(),
imageId: t.arg.string(),
tags: t.arg.stringList(),
category: t.arg.string(),
featured: t.arg.boolean(),
publishDate: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const inserted = await ctx.db
.insert(articles)
.values({
title: args.title,
slug: args.slug,
excerpt: args.excerpt || null,
content: args.content || null,
image: args.imageId || null,
tags: args.tags || [],
category: args.category || null,
featured: args.featured ?? false,
publish_date: args.publishDate ? new Date(args.publishDate) : new Date(),
author: ctx.currentUser!.id,
})
.returning();
return enrichArticle(ctx.db, inserted[0]);
},
}),
);
builder.mutationField("updateArticle", (t) =>
t.field({
type: ArticleType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
title: t.arg.string(),
slug: t.arg.string(),
excerpt: t.arg.string(),
content: t.arg.string(),
imageId: t.arg.string(),
authorId: t.arg.string(),
tags: t.arg.stringList(),
category: t.arg.string(),
featured: t.arg.boolean(),
publishDate: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const updates: Record<string, unknown> = { date_updated: new Date() };
if (args.title !== undefined && args.title !== null) updates.title = args.title;
if (args.slug !== undefined && args.slug !== null) updates.slug = args.slug;
if (args.excerpt !== undefined) updates.excerpt = args.excerpt;
if (args.content !== undefined) updates.content = args.content;
if (args.imageId !== undefined) updates.image = args.imageId;
if (args.authorId !== undefined) updates.author = args.authorId;
if (args.tags !== undefined && args.tags !== null) updates.tags = args.tags;
if (args.category !== undefined) updates.category = args.category;
if (args.featured !== undefined && args.featured !== null) updates.featured = args.featured;
if (args.publishDate !== undefined && args.publishDate !== null)
updates.publish_date = new Date(args.publishDate);
const updated = await ctx.db
.update(articles)
.set(updates as any)
.where(eq(articles.id, args.id))
.returning();
if (!updated[0]) return null;
return enrichArticle(ctx.db, updated[0]);
},
}),
);
builder.mutationField("deleteArticle", (t) =>
t.field({
type: "Boolean",
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
await ctx.db.delete(articles).where(eq(articles.id, args.id));
return true;
},
}),
);

View File

@@ -0,0 +1,235 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder";
import { CurrentUserType } from "../types/index";
import { users } from "../../db/schema/index";
import { eq } from "drizzle-orm";
import { hash, verify as verifyArgon } from "../../lib/argon";
import { setSession, deleteSession } from "../../lib/auth";
import { sendVerification, sendPasswordReset } from "../../lib/email";
import { slugify } from "../../lib/slugify";
import { nanoid } from "nanoid";
builder.mutationField("login", (t) =>
t.field({
type: CurrentUserType,
args: {
email: t.arg.string({ required: true }),
password: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.email, args.email.toLowerCase()))
.limit(1);
if (!user[0]) throw new GraphQLError("Invalid credentials");
const valid = await verifyArgon(user[0].password_hash, args.password);
if (!valid) throw new GraphQLError("Invalid credentials");
const token = nanoid(32);
const sessionUser = {
id: user[0].id,
email: user[0].email,
role: (user[0].role === "admin" ? "viewer" : user[0].role) as "model" | "viewer",
is_admin: user[0].is_admin,
first_name: user[0].first_name,
last_name: user[0].last_name,
artist_name: user[0].artist_name,
slug: user[0].slug,
avatar: user[0].avatar,
};
await setSession(token, sessionUser);
// Set session cookie
const isProduction = process.env.NODE_ENV === "production";
const cookieValue = `session_token=${token}; HttpOnly; Path=/; SameSite=Lax; Max-Age=86400${isProduction ? "; Secure" : ""}`;
(ctx.reply as any).header?.("Set-Cookie", cookieValue);
// For graphql-yoga response
if ((ctx as any).serverResponse) {
(ctx as any).serverResponse.setHeader("Set-Cookie", cookieValue);
}
return user[0];
},
}),
);
builder.mutationField("logout", (t) =>
t.field({
type: "Boolean",
resolve: async (_root, _args, ctx) => {
const cookieHeader = ctx.request.headers.get("cookie") || "";
const cookies = Object.fromEntries(
cookieHeader.split(";").map((c) => {
const [k, ...v] = c.trim().split("=");
return [k.trim(), v.join("=")];
}),
);
const token = cookies["session_token"];
if (token) {
await deleteSession(token);
}
// Clear cookie
const cookieValue = "session_token=; HttpOnly; Path=/; Max-Age=0";
(ctx.reply as any).header?.("Set-Cookie", cookieValue);
return true;
},
}),
);
builder.mutationField("register", (t) =>
t.field({
type: "Boolean",
args: {
email: t.arg.string({ required: true }),
password: t.arg.string({ required: true }),
firstName: t.arg.string({ required: true }),
lastName: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const existing = await ctx.db
.select({ id: users.id })
.from(users)
.where(eq(users.email, args.email.toLowerCase()))
.limit(1);
if (existing.length > 0) throw new GraphQLError("Email already registered");
const passwordHash = await hash(args.password);
const artistName = `${args.firstName} ${args.lastName}`;
const baseSlug = slugify(artistName);
const verifyToken = nanoid(32);
// Ensure unique slug
let slug = baseSlug;
let attempt = 0;
while (true) {
const existing = await ctx.db
.select({ id: users.id })
.from(users)
.where(eq(users.slug, slug))
.limit(1);
if (existing.length === 0) break;
attempt++;
slug = `${baseSlug}-${attempt}`;
}
await ctx.db.insert(users).values({
email: args.email.toLowerCase(),
password_hash: passwordHash,
first_name: args.firstName,
last_name: args.lastName,
artist_name: artistName,
slug,
role: "viewer",
email_verify_token: verifyToken,
email_verified: false,
});
try {
await sendVerification(args.email, verifyToken);
} catch (e) {
console.warn("Failed to send verification email:", (e as Error).message);
}
return true;
},
}),
);
builder.mutationField("verifyEmail", (t) =>
t.field({
type: "Boolean",
args: {
token: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.email_verify_token, args.token))
.limit(1);
if (!user[0]) throw new GraphQLError("Invalid verification token");
await ctx.db
.update(users)
.set({ email_verified: true, email_verify_token: null })
.where(eq(users.id, user[0].id));
return true;
},
}),
);
builder.mutationField("requestPasswordReset", (t) =>
t.field({
type: "Boolean",
args: {
email: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.email, args.email.toLowerCase()))
.limit(1);
// Always return true to prevent email enumeration
if (!user[0]) return true;
const token = nanoid(32);
const expiry = new Date(Date.now() + 60 * 60 * 1000); // 1 hour
await ctx.db
.update(users)
.set({ password_reset_token: token, password_reset_expiry: expiry })
.where(eq(users.id, user[0].id));
try {
await sendPasswordReset(args.email, token);
} catch (e) {
console.warn("Failed to send password reset email:", (e as Error).message);
}
return true;
},
}),
);
builder.mutationField("resetPassword", (t) =>
t.field({
type: "Boolean",
args: {
token: t.arg.string({ required: true }),
newPassword: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db
.select()
.from(users)
.where(eq(users.password_reset_token, args.token))
.limit(1);
if (!user[0]) throw new GraphQLError("Invalid or expired reset token");
if (user[0].password_reset_expiry && user[0].password_reset_expiry < new Date()) {
throw new GraphQLError("Reset token expired");
}
const passwordHash = await hash(args.newPassword);
await ctx.db
.update(users)
.set({
password_hash: passwordHash,
password_reset_token: null,
password_reset_expiry: null,
})
.where(eq(users.id, user[0].id));
return true;
},
}),
);
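
The unique-slug loop in `register` can be exercised in isolation. This sketch substitutes an in-memory `Set` for the `users` table existence check (the `Set` parameter and function name are illustrative, not part of the codebase):

```typescript
// Derive a unique slug by appending -1, -2, … until there is no collision.
// `taken` stands in for the per-attempt DB lookup in the register mutation.
function uniqueSlug(baseSlug: string, taken: Set<string>): string {
  let slug = baseSlug;
  let attempt = 0;
  while (taken.has(slug)) {
    attempt++;
    slug = `${baseSlug}-${attempt}`;
  }
  return slug;
}
```

Note the suffix always restarts from the base (`jane-doe-2`, never `jane-doe-1-1`), so collisions stay readable.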

View File

@@ -0,0 +1,97 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder";
import { CommentType } from "../types/index";
import { comments, users } from "../../db/schema/index";
import { eq, and, desc } from "drizzle-orm";
import { awardPoints, checkAchievements } from "../../lib/gamification";
import { requireOwnerOrAdmin } from "../../lib/acl";
builder.queryField("commentsForVideo", (t) =>
t.field({
type: [CommentType],
args: {
videoId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const commentList = await ctx.db
.select()
.from(comments)
.where(and(eq(comments.collection, "videos"), eq(comments.item_id, args.videoId)))
.orderBy(desc(comments.date_created));
return Promise.all(
commentList.map(async (c: any) => {
const user = await ctx.db
.select({
id: users.id,
first_name: users.first_name,
last_name: users.last_name,
artist_name: users.artist_name,
avatar: users.avatar,
})
.from(users)
.where(eq(users.id, c.user_id))
.limit(1);
return { ...c, user: user[0] || null };
}),
);
},
}),
);
builder.mutationField("createCommentForVideo", (t) =>
t.field({
type: CommentType,
args: {
videoId: t.arg.string({ required: true }),
comment: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const newComment = await ctx.db
.insert(comments)
.values({
collection: "videos",
item_id: args.videoId,
comment: args.comment,
user_id: ctx.currentUser.id,
})
.returning();
// Gamification
await awardPoints(ctx.db, ctx.currentUser.id, "COMMENT_CREATE");
await checkAchievements(ctx.db, ctx.currentUser.id, "social");
const user = await ctx.db
.select({
id: users.id,
first_name: users.first_name,
last_name: users.last_name,
artist_name: users.artist_name,
avatar: users.avatar,
})
.from(users)
.where(eq(users.id, ctx.currentUser.id))
.limit(1);
return { ...newComment[0], user: user[0] || null };
},
}),
);
builder.mutationField("deleteComment", (t) =>
t.field({
type: "Boolean",
args: {
id: t.arg.int({ required: true }),
},
resolve: async (_root, args, ctx) => {
const comment = await ctx.db.select().from(comments).where(eq(comments.id, args.id)).limit(1);
if (!comment[0]) throw new GraphQLError("Comment not found");
requireOwnerOrAdmin(ctx, comment[0].user_id);
await ctx.db.delete(comments).where(eq(comments.id, args.id));
return true;
},
}),
);
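
`commentsForVideo` issues one user query per comment (an N+1 pattern); a batched variant would fetch the distinct authors once and join in memory. A sketch with plain objects standing in for DB rows — all type and function names here are illustrative:

```typescript
type UserRow = { id: string; artist_name: string | null };
type CommentRow = { id: number; comment: string; user_id: string };

// Attach each comment's author from a single pre-fetched user list,
// instead of one SELECT per comment; missing authors become null,
// matching the resolver's `user[0] || null` behavior.
function attachUsers(commentList: CommentRow[], userRows: UserRow[]) {
  const byId = new Map(userRows.map((u) => [u.id, u] as const));
  return commentList.map((c) => ({ ...c, user: byId.get(c.user_id) ?? null }));
}
```

With Drizzle this would translate to one `inArray(users.id, distinctIds)` query before the map, but that wiring is left out here.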

View File

@@ -0,0 +1,125 @@
import { builder } from "../builder";
import { LeaderboardEntryType, UserGamificationType, AchievementType } from "../types/index";
import {
user_stats,
users,
user_achievements,
achievements,
user_points,
} from "../../db/schema/index";
import { eq, desc, gt, count, isNotNull, and } from "drizzle-orm";
builder.queryField("leaderboard", (t) =>
t.field({
type: [LeaderboardEntryType],
args: {
limit: t.arg.int(),
offset: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
const limit = Math.min(args.limit || 100, 500);
const offset = args.offset || 0;
const entries = await ctx.db
.select({
user_id: user_stats.user_id,
display_name: users.artist_name,
avatar: users.avatar,
total_weighted_points: user_stats.total_weighted_points,
total_raw_points: user_stats.total_raw_points,
recordings_count: user_stats.recordings_count,
playbacks_count: user_stats.playbacks_count,
achievements_count: user_stats.achievements_count,
})
.from(user_stats)
.leftJoin(users, eq(user_stats.user_id, users.id))
.orderBy(desc(user_stats.total_weighted_points))
.limit(limit)
.offset(offset);
return entries.map((e: any, i: number) => ({ ...e, rank: offset + i + 1 }));
},
}),
);
builder.queryField("userGamification", (t) =>
t.field({
type: UserGamificationType,
nullable: true,
args: {
userId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const stats = await ctx.db
.select()
.from(user_stats)
.where(eq(user_stats.user_id, args.userId))
.limit(1);
let rank = 1;
if (stats[0]) {
const rankResult = await ctx.db
.select({ count: count() })
.from(user_stats)
.where(gt(user_stats.total_weighted_points, stats[0].total_weighted_points || 0));
rank = (rankResult[0]?.count || 0) + 1;
}
const userAchievements = await ctx.db
.select({
id: achievements.id,
code: achievements.code,
name: achievements.name,
description: achievements.description,
icon: achievements.icon,
category: achievements.category,
date_unlocked: user_achievements.date_unlocked,
progress: user_achievements.progress,
required_count: achievements.required_count,
})
.from(user_achievements)
.leftJoin(achievements, eq(user_achievements.achievement_id, achievements.id))
.where(
and(
eq(user_achievements.user_id, args.userId),
isNotNull(user_achievements.date_unlocked),
),
)
.orderBy(desc(user_achievements.date_unlocked));
const recentPoints = await ctx.db
.select({
action: user_points.action,
points: user_points.points,
date_created: user_points.date_created,
recording_id: user_points.recording_id,
})
.from(user_points)
.where(eq(user_points.user_id, args.userId))
.orderBy(desc(user_points.date_created))
.limit(10);
return {
stats: stats[0] ? { ...stats[0], rank } : null,
achievements: userAchievements.map((a: any) => ({
...a,
date_unlocked: a.date_unlocked!,
})),
recent_points: recentPoints,
};
},
}),
);
builder.queryField("achievements", (t) =>
t.field({
type: [AchievementType],
resolve: async (_root, _args, ctx) => {
return ctx.db
.select()
.from(achievements)
.where(eq(achievements.status, "published"))
.orderBy(achievements.sort);
},
}),
);

@@ -0,0 +1,63 @@
import { builder } from "../builder";
import { ModelType } from "../types/index";
import { users, user_photos, files } from "../../db/schema/index";
import { eq, and, desc } from "drizzle-orm";
async function enrichModel(db: any, user: any) {
// Fetch photos
const photoRows = await db
.select({ id: files.id, filename: files.filename })
.from(user_photos)
.leftJoin(files, eq(user_photos.file_id, files.id))
.where(eq(user_photos.user_id, user.id))
.orderBy(user_photos.sort);
return {
...user,
// Drop rows whose file is missing (the leftJoin yields a null id).
photos: photoRows
.filter((p: any) => p.id)
.map((p: any) => ({ id: p.id, filename: p.filename })),
};
}
builder.queryField("models", (t) =>
t.field({
type: [ModelType],
args: {
featured: t.arg.boolean(),
limit: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
let query = ctx.db
.select()
.from(users)
.where(eq(users.role, "model"))
.orderBy(desc(users.date_created));
if (args.limit) {
query = (query as any).limit(args.limit);
}
const modelList = await query;
return Promise.all(modelList.map((m: any) => enrichModel(ctx.db, m)));
},
}),
);
builder.queryField("model", (t) =>
t.field({
type: ModelType,
nullable: true,
args: {
slug: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const model = await ctx.db
.select()
.from(users)
.where(and(eq(users.slug, args.slug), eq(users.role, "model")))
.limit(1);
if (!model[0]) return null;
return enrichModel(ctx.db, model[0]);
},
}),
);

@@ -0,0 +1,342 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder";
import { RecordingType } from "../types/index";
import { recordings, recording_plays } from "../../db/schema/index";
import { eq, and, desc, ne } from "drizzle-orm";
import { slugify } from "../../lib/slugify";
import { awardPoints, checkAchievements } from "../../lib/gamification";
builder.queryField("recordings", (t) =>
t.field({
type: [RecordingType],
args: {
status: t.arg.string(),
tags: t.arg.string(),
linkedVideoId: t.arg.string(),
limit: t.arg.int(),
page: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const conditions = [eq(recordings.user_id, ctx.currentUser.id)];
if (args.status) conditions.push(eq(recordings.status, args.status as any));
else conditions.push(ne(recordings.status, "archived" as any));
if (args.linkedVideoId) conditions.push(eq(recordings.linked_video, args.linkedVideoId));
const limit = args.limit || 50;
const page = args.page || 1;
const offset = (page - 1) * limit;
return ctx.db
.select()
.from(recordings)
.where(and(...conditions))
.orderBy(desc(recordings.date_created))
.limit(limit)
.offset(offset);
},
}),
);
builder.queryField("recording", (t) =>
t.field({
type: RecordingType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const recording = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.id))
.limit(1);
if (!recording[0]) return null;
if (recording[0].user_id !== ctx.currentUser.id && !recording[0].public) {
throw new GraphQLError("Forbidden");
}
return recording[0];
},
}),
);
builder.queryField("communityRecordings", (t) =>
t.field({
type: [RecordingType],
args: {
limit: t.arg.int(),
offset: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
return ctx.db
.select()
.from(recordings)
.where(and(eq(recordings.status, "published"), eq(recordings.public, true)))
.orderBy(desc(recordings.date_created))
.limit(args.limit || 50)
.offset(args.offset || 0);
},
}),
);
builder.mutationField("createRecording", (t) =>
t.field({
type: RecordingType,
args: {
title: t.arg.string({ required: true }),
description: t.arg.string(),
duration: t.arg.int({ required: true }),
events: t.arg({ type: "JSON", required: true }),
deviceInfo: t.arg({ type: "JSON", required: true }),
tags: t.arg.stringList(),
status: t.arg.string(),
linkedVideoId: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const slug = slugify(args.title);
const newRecording = await ctx.db
.insert(recordings)
.values({
title: args.title,
description: args.description || null,
slug,
duration: args.duration,
events: (args.events as object[]) || [],
device_info: (args.deviceInfo as object[]) || [],
user_id: ctx.currentUser.id,
tags: args.tags || [],
linked_video: args.linkedVideoId || null,
status: (args.status as any) || "draft",
public: false,
})
.returning();
const recording = newRecording[0];
// Gamification: award points if published
if (recording.status === "published") {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_CREATE", recording.id);
await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
}
return recording;
},
}),
);
builder.mutationField("updateRecording", (t) =>
t.field({
type: RecordingType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
title: t.arg.string(),
description: t.arg.string(),
tags: t.arg.stringList(),
status: t.arg.string(),
public: t.arg.boolean(),
linkedVideoId: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const existing = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.id))
.limit(1);
if (!existing[0]) throw new GraphQLError("Recording not found");
if (existing[0].user_id !== ctx.currentUser.id) throw new GraphQLError("Forbidden");
const updates: Record<string, unknown> = { date_updated: new Date() };
if (args.title !== null && args.title !== undefined) {
updates.title = args.title;
updates.slug = slugify(args.title);
}
if (args.description !== null && args.description !== undefined)
updates.description = args.description;
if (args.tags !== null && args.tags !== undefined) updates.tags = args.tags;
if (args.status !== null && args.status !== undefined) updates.status = args.status;
if (args.public !== null && args.public !== undefined) updates.public = args.public;
if (args.linkedVideoId !== null && args.linkedVideoId !== undefined)
updates.linked_video = args.linkedVideoId;
const updated = await ctx.db
.update(recordings)
.set(updates as any)
.where(eq(recordings.id, args.id))
.returning();
const recording = updated[0];
// Gamification: if newly published
if (args.status === "published" && existing[0].status !== "published") {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_CREATE", recording.id);
await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
}
if (args.status === "published" && recording.featured && !existing[0].featured) {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_FEATURED", recording.id);
await checkAchievements(ctx.db, ctx.currentUser.id, "recordings");
}
return recording;
},
}),
);
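The repeated `!== null && !== undefined` guards in `updateRecording` (and the other update mutations) implement a partial-update pattern: nullable GraphQL args arrive as `undefined` (omitted) or `null` (explicit null), and both are treated as "leave the column unchanged". The guard in isolation, as a minimal standalone sketch (field names illustrative):

```typescript
// Builds an update object containing only the args the client actually sent.
// Both undefined (arg omitted) and null (explicit null) mean "do not change",
// matching the guards used in the resolvers above.
function buildUpdates(args: Record<string, unknown>): Record<string, unknown> {
  const updates: Record<string, unknown> = { date_updated: new Date() };
  for (const [key, value] of Object.entries(args)) {
    if (value !== null && value !== undefined) updates[key] = value;
  }
  return updates;
}
```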
builder.mutationField("deleteRecording", (t) =>
t.field({
type: "Boolean",
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const existing = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.id))
.limit(1);
if (!existing[0]) throw new GraphQLError("Recording not found");
if (existing[0].user_id !== ctx.currentUser.id) throw new GraphQLError("Forbidden");
await ctx.db.delete(recordings).where(eq(recordings.id, args.id));
return true;
},
}),
);
builder.mutationField("duplicateRecording", (t) =>
t.field({
type: RecordingType,
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const original = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.id))
.limit(1);
if (!original[0]) throw new GraphQLError("Recording not found");
if (original[0].status !== "published" || !original[0].public) {
throw new GraphQLError("Recording is not publicly shared");
}
const slug = `${slugify(original[0].title)}-copy-${Date.now()}`;
const duplicated = await ctx.db
.insert(recordings)
.values({
title: `${original[0].title} (Copy)`,
description: original[0].description,
slug,
duration: original[0].duration,
events: original[0].events || [],
device_info: original[0].device_info || [],
user_id: ctx.currentUser.id,
tags: original[0].tags || [],
status: "draft",
public: false,
original_recording_id: original[0].id,
})
.returning();
return duplicated[0];
},
}),
);
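`duplicateRecording` derives a unique slug by appending `-copy-<timestamp>` to the slugified title, so repeated duplication never collides. A sketch with a hypothetical minimal `slugify` (the project's real implementation lives in `../../lib/slugify` and may differ, e.g. in unicode handling):

```typescript
// Hypothetical minimal slugify; the project's real one is in lib/slugify.
function slugify(title: string): string {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}

// Copy slugs get a timestamp suffix so duplicates stay unique.
function copySlug(title: string, now: number = Date.now()): string {
  return `${slugify(title)}-copy-${now}`;
}
```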
builder.mutationField("recordRecordingPlay", (t) =>
t.field({
type: "JSON",
args: {
recordingId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const recording = await ctx.db
.select()
.from(recordings)
.where(eq(recordings.id, args.recordingId))
.limit(1);
if (!recording[0]) throw new GraphQLError("Recording not found");
const play = await ctx.db
.insert(recording_plays)
.values({
recording_id: args.recordingId,
user_id: ctx.currentUser?.id || null,
duration_played: 0,
completed: false,
})
.returning({ id: recording_plays.id });
// Gamification
if (ctx.currentUser && recording[0].user_id !== ctx.currentUser.id) {
await awardPoints(ctx.db, ctx.currentUser.id, "RECORDING_PLAY", args.recordingId);
await checkAchievements(ctx.db, ctx.currentUser.id, "playback");
}
return { success: true, play_id: play[0].id };
},
}),
);
builder.mutationField("updateRecordingPlay", (t) =>
t.field({
type: "Boolean",
args: {
playId: t.arg.string({ required: true }),
durationPlayed: t.arg.int({ required: true }),
completed: t.arg.boolean({ required: true }),
},
resolve: async (_root, args, ctx) => {
const existing = await ctx.db
.select()
.from(recording_plays)
.where(eq(recording_plays.id, args.playId))
.limit(1);
if (!existing[0]) throw new GraphQLError("Play record not found");
const wasCompleted = existing[0].completed;
await ctx.db
.update(recording_plays)
.set({
duration_played: args.durationPlayed,
completed: args.completed,
date_updated: new Date(),
})
.where(eq(recording_plays.id, args.playId));
if (args.completed && !wasCompleted && ctx.currentUser) {
await awardPoints(
ctx.db,
ctx.currentUser.id,
"RECORDING_COMPLETE",
existing[0].recording_id,
);
await checkAchievements(ctx.db, ctx.currentUser.id, "playback");
}
return true;
},
}),
);

@@ -0,0 +1,27 @@
import { builder } from "../builder";
import { StatsType } from "../types/index";
import { users, videos } from "../../db/schema/index";
import { eq, count } from "drizzle-orm";
builder.queryField("stats", (t) =>
t.field({
type: StatsType,
resolve: async (_root, _args, ctx) => {
const modelsCount = await ctx.db
.select({ count: count() })
.from(users)
.where(eq(users.role, "model"));
const viewersCount = await ctx.db
.select({ count: count() })
.from(users)
.where(eq(users.role, "viewer"));
const videosCount = await ctx.db.select({ count: count() }).from(videos);
return {
models_count: modelsCount[0]?.count || 0,
viewers_count: viewersCount[0]?.count || 0,
videos_count: videosCount[0]?.count || 0,
};
},
}),
);

@@ -0,0 +1,232 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder";
import { CurrentUserType, UserType, AdminUserListType, AdminUserDetailType } from "../types/index";
import { users, user_photos, files } from "../../db/schema/index";
import { eq, ilike, or, count, and } from "drizzle-orm";
import { requireAdmin } from "../../lib/acl";
builder.queryField("me", (t) =>
t.field({
type: CurrentUserType,
nullable: true,
resolve: async (_root, _args, ctx) => {
if (!ctx.currentUser) return null;
const user = await ctx.db
.select()
.from(users)
.where(eq(users.id, ctx.currentUser.id))
.limit(1);
return user[0] || null;
},
}),
);
builder.queryField("userProfile", (t) =>
t.field({
type: UserType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const user = await ctx.db.select().from(users).where(eq(users.id, args.id)).limit(1);
return user[0] || null;
},
}),
);
builder.mutationField("updateProfile", (t) =>
t.field({
type: CurrentUserType,
nullable: true,
args: {
firstName: t.arg.string(),
lastName: t.arg.string(),
artistName: t.arg.string(),
description: t.arg.string(),
tags: t.arg.stringList(),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const updates: Record<string, unknown> = { date_updated: new Date() };
if (args.firstName !== undefined && args.firstName !== null)
updates.first_name = args.firstName;
if (args.lastName !== undefined && args.lastName !== null) updates.last_name = args.lastName;
if (args.artistName !== undefined && args.artistName !== null)
updates.artist_name = args.artistName;
if (args.description !== undefined && args.description !== null)
updates.description = args.description;
if (args.tags !== undefined && args.tags !== null) updates.tags = args.tags;
await ctx.db
.update(users)
.set(updates as any)
.where(eq(users.id, ctx.currentUser.id));
const updated = await ctx.db
.select()
.from(users)
.where(eq(users.id, ctx.currentUser.id))
.limit(1);
return updated[0] || null;
},
}),
);
// ─── Admin queries & mutations ────────────────────────────────────────────────
builder.queryField("adminListUsers", (t) =>
t.field({
type: AdminUserListType,
args: {
role: t.arg.string(),
search: t.arg.string(),
limit: t.arg.int(),
offset: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const limit = args.limit ?? 50;
const offset = args.offset ?? 0;
let query = ctx.db.select().from(users);
let countQuery = ctx.db.select({ total: count() }).from(users);
const conditions: any[] = [];
if (args.role) {
conditions.push(eq(users.role, args.role as any));
}
if (args.search) {
const pattern = `%${args.search}%`;
conditions.push(or(ilike(users.email, pattern), ilike(users.artist_name, pattern)));
}
if (conditions.length > 0) {
const where = conditions.length === 1 ? conditions[0] : and(...conditions);
query = (query as any).where(where);
countQuery = (countQuery as any).where(where);
}
const [items, totalRows] = await Promise.all([
(query as any).limit(limit).offset(offset),
countQuery,
]);
return { items, total: totalRows[0]?.total ?? 0 };
},
}),
);
builder.mutationField("adminUpdateUser", (t) =>
t.field({
type: UserType,
nullable: true,
args: {
userId: t.arg.string({ required: true }),
role: t.arg.string(),
isAdmin: t.arg.boolean(),
firstName: t.arg.string(),
lastName: t.arg.string(),
artistName: t.arg.string(),
avatarId: t.arg.string(),
bannerId: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const updates: Record<string, unknown> = { date_updated: new Date() };
if (args.role !== undefined && args.role !== null) updates.role = args.role as any;
if (args.isAdmin !== undefined && args.isAdmin !== null) updates.is_admin = args.isAdmin;
if (args.firstName !== undefined && args.firstName !== null)
updates.first_name = args.firstName;
if (args.lastName !== undefined && args.lastName !== null) updates.last_name = args.lastName;
if (args.artistName !== undefined && args.artistName !== null)
updates.artist_name = args.artistName;
if (args.avatarId !== undefined && args.avatarId !== null) updates.avatar = args.avatarId;
if (args.bannerId !== undefined && args.bannerId !== null) updates.banner = args.bannerId;
const updated = await ctx.db
.update(users)
.set(updates as any)
.where(eq(users.id, args.userId))
.returning();
return updated[0] || null;
},
}),
);
builder.mutationField("adminDeleteUser", (t) =>
t.field({
type: "Boolean",
args: {
userId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
if (args.userId === ctx.currentUser!.id) throw new GraphQLError("Cannot delete yourself");
await ctx.db.delete(users).where(eq(users.id, args.userId));
return true;
},
}),
);
builder.queryField("adminGetUser", (t) =>
t.field({
type: AdminUserDetailType,
nullable: true,
args: {
userId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const user = await ctx.db.select().from(users).where(eq(users.id, args.userId)).limit(1);
if (!user[0]) return null;
const photoRows = await ctx.db
.select({ id: files.id, filename: files.filename })
.from(user_photos)
.leftJoin(files, eq(user_photos.file_id, files.id))
.where(eq(user_photos.user_id, args.userId))
.orderBy(user_photos.sort);
const seen = new Set<string>();
const photos = photoRows
.filter((p: any) => p.id && !seen.has(p.id) && seen.add(p.id))
.map((p: any) => ({ id: p.id, filename: p.filename }));
return { ...user[0], photos };
},
}),
);
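The photo `filter` in `adminGetUser` relies on `Set.prototype.add` returning the set itself (which is truthy), so `!seen.has(id) && seen.add(id)` admits a row exactly once and records its id as a side effect. The idiom isolated for clarity:

```typescript
// First-occurrence dedup by id, as used for adminGetUser's photo list.
// Set.add returns the Set (truthy), so the && chain keeps a row exactly
// once while recording its id as a side effect.
function dedupeById<T extends { id: string | null }>(rows: T[]): T[] {
  const seen = new Set<string>();
  return rows.filter(
    (row) => row.id !== null && !seen.has(row.id) && Boolean(seen.add(row.id)),
  );
}
```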
builder.mutationField("adminAddUserPhoto", (t) =>
t.field({
type: "Boolean",
args: {
userId: t.arg.string({ required: true }),
fileId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
await ctx.db.insert(user_photos).values({ user_id: args.userId, file_id: args.fileId });
return true;
},
}),
);
builder.mutationField("adminRemoveUserPhoto", (t) =>
t.field({
type: "Boolean",
args: {
userId: t.arg.string({ required: true }),
fileId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
await ctx.db
.delete(user_photos)
.where(and(eq(user_photos.user_id, args.userId), eq(user_photos.file_id, args.fileId)));
return true;
},
}),
);

@@ -0,0 +1,553 @@
import { GraphQLError } from "graphql";
import { builder } from "../builder";
import {
VideoType,
VideoLikeResponseType,
VideoPlayResponseType,
VideoLikeStatusType,
} from "../types/index";
import {
videos,
video_models,
video_likes,
video_plays,
users,
files,
} from "../../db/schema/index";
import { eq, and, lte, desc, inArray, count } from "drizzle-orm";
import { requireAdmin } from "../../lib/acl";
async function enrichVideo(db: any, video: any) {
// Fetch models
const modelRows = await db
.select({
id: users.id,
artist_name: users.artist_name,
slug: users.slug,
avatar: users.avatar,
})
.from(video_models)
.leftJoin(users, eq(video_models.user_id, users.id))
.where(eq(video_models.video_id, video.id));
// Fetch movie file
let movieFile = null;
if (video.movie) {
const mf = await db.select().from(files).where(eq(files.id, video.movie)).limit(1);
movieFile = mf[0] || null;
}
// Count likes
const likesCount = await db
.select({ count: count() })
.from(video_likes)
.where(eq(video_likes.video_id, video.id));
const playsCount = await db
.select({ count: count() })
.from(video_plays)
.where(eq(video_plays.video_id, video.id));
return {
...video,
models: modelRows,
movie_file: movieFile,
likes_count: likesCount[0]?.count || 0,
plays_count: playsCount[0]?.count || 0,
};
}
builder.queryField("videos", (t) =>
t.field({
type: [VideoType],
args: {
modelId: t.arg.string(),
featured: t.arg.boolean(),
limit: t.arg.int(),
},
resolve: async (_root, args, ctx) => {
// Unauthenticated users cannot see premium videos
const premiumFilter = !ctx.currentUser ? eq(videos.premium, false) : undefined;
// Accumulate filters instead of rebuilding the query per argument, so
// modelId and featured combine instead of the last branch overwriting
// the previous one's where clause.
const conditions = [lte(videos.upload_date, new Date())];
if (premiumFilter) conditions.push(premiumFilter);
if (args.modelId) {
const videoIds = await ctx.db
.select({ video_id: video_models.video_id })
.from(video_models)
.where(eq(video_models.user_id, args.modelId));
if (videoIds.length === 0) return [];
conditions.push(
inArray(
videos.id,
videoIds.map((v: any) => v.video_id),
),
);
}
if (args.featured !== null && args.featured !== undefined) {
conditions.push(eq(videos.featured, args.featured));
}
let query = ctx.db
.select()
.from(videos)
.where(and(...conditions))
.orderBy(desc(videos.upload_date));
if (args.limit) {
query = (query as any).limit(args.limit);
}
const videoList = await query;
return Promise.all(videoList.map((v: any) => enrichVideo(ctx.db, v)));
},
}),
);
builder.queryField("video", (t) =>
t.field({
type: VideoType,
nullable: true,
args: {
slug: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
const video = await ctx.db
.select()
.from(videos)
.where(and(eq(videos.slug, args.slug), lte(videos.upload_date, new Date())))
.limit(1);
if (!video[0]) return null;
if (video[0].premium && !ctx.currentUser) {
throw new GraphQLError("Unauthorized");
}
return enrichVideo(ctx.db, video[0]);
},
}),
);
builder.queryField("videoLikeStatus", (t) =>
t.field({
type: VideoLikeStatusType,
args: {
videoId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) return { liked: false };
const existing = await ctx.db
.select()
.from(video_likes)
.where(
and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)),
)
.limit(1);
return { liked: existing.length > 0 };
},
}),
);
builder.mutationField("likeVideo", (t) =>
t.field({
type: VideoLikeResponseType,
args: {
videoId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const existing = await ctx.db
.select()
.from(video_likes)
.where(
and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)),
)
.limit(1);
if (existing.length > 0) throw new GraphQLError("Already liked");
await ctx.db.insert(video_likes).values({
video_id: args.videoId,
user_id: ctx.currentUser.id,
});
const likesCount = await ctx.db
.select({ count: count() })
.from(video_likes)
.where(eq(video_likes.video_id, args.videoId));
// Sync the denormalized counter from the likes table rather than doing a
// read-modify-write increment, which could drift under concurrent likes.
await ctx.db
.update(videos)
.set({ likes_count: likesCount[0]?.count || 1 })
.where(eq(videos.id, args.videoId));
return { liked: true, likes_count: likesCount[0]?.count || 1 };
},
}),
);
builder.mutationField("unlikeVideo", (t) =>
t.field({
type: VideoLikeResponseType,
args: {
videoId: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
const existing = await ctx.db
.select()
.from(video_likes)
.where(
and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)),
)
.limit(1);
if (existing.length === 0) throw new GraphQLError("Not liked");
await ctx.db
.delete(video_likes)
.where(
and(eq(video_likes.video_id, args.videoId), eq(video_likes.user_id, ctx.currentUser.id)),
);
const likesCount = await ctx.db
.select({ count: count() })
.from(video_likes)
.where(eq(video_likes.video_id, args.videoId));
// Sync the denormalized counter from the likes table rather than doing a
// read-modify-write decrement, which could drift under concurrent unlikes.
await ctx.db
.update(videos)
.set({ likes_count: likesCount[0]?.count || 0 })
.where(eq(videos.id, args.videoId));
return { liked: false, likes_count: likesCount[0]?.count || 0 };
},
}),
);
builder.mutationField("recordVideoPlay", (t) =>
t.field({
type: VideoPlayResponseType,
args: {
videoId: t.arg.string({ required: true }),
sessionId: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
const play = await ctx.db
.insert(video_plays)
.values({
video_id: args.videoId,
user_id: ctx.currentUser?.id || null,
session_id: args.sessionId || null,
})
.returning({ id: video_plays.id });
const playsCount = await ctx.db
.select({ count: count() })
.from(video_plays)
.where(eq(video_plays.video_id, args.videoId));
await ctx.db
.update(videos)
.set({ plays_count: playsCount[0]?.count || 0 })
.where(eq(videos.id, args.videoId));
return {
success: true,
play_id: play[0].id,
plays_count: playsCount[0]?.count || 0,
};
},
}),
);
builder.mutationField("updateVideoPlay", (t) =>
t.field({
type: "Boolean",
args: {
videoId: t.arg.string({ required: true }),
playId: t.arg.string({ required: true }),
durationWatched: t.arg.int({ required: true }),
completed: t.arg.boolean({ required: true }),
},
resolve: async (_root, args, ctx) => {
const play = await ctx.db
.select()
.from(video_plays)
.where(eq(video_plays.id, args.playId))
.limit(1);
if (!play[0]) return false;
// If play belongs to a user, verify ownership
if (play[0].user_id && (!ctx.currentUser || play[0].user_id !== ctx.currentUser.id)) {
throw new GraphQLError("Forbidden");
}
await ctx.db
.update(video_plays)
.set({
duration_watched: args.durationWatched,
completed: args.completed,
date_updated: new Date(),
})
.where(eq(video_plays.id, args.playId));
return true;
},
}),
);
builder.queryField("analytics", (t) =>
t.field({
type: "JSON",
nullable: true,
resolve: async (_root, _args, ctx) => {
if (!ctx.currentUser || ctx.currentUser.role !== "model") {
throw new GraphQLError("Unauthorized");
}
const userId = ctx.currentUser.id;
// Get all videos by this user (via video_models)
const modelVideoIds = await ctx.db
.select({ video_id: video_models.video_id })
.from(video_models)
.where(eq(video_models.user_id, userId));
if (modelVideoIds.length === 0) {
return {
total_videos: 0,
total_likes: 0,
total_plays: 0,
plays_by_date: {},
likes_by_date: {},
videos: [],
};
}
const videoIds = modelVideoIds.map((v: any) => v.video_id);
const videoList = await ctx.db.select().from(videos).where(inArray(videos.id, videoIds));
const plays = await ctx.db
.select()
.from(video_plays)
.where(inArray(video_plays.video_id, videoIds));
const likes = await ctx.db
.select()
.from(video_likes)
.where(inArray(video_likes.video_id, videoIds));
const totalLikes = videoList.reduce((sum, v) => sum + (v.likes_count || 0), 0);
const totalPlays = videoList.reduce((sum, v) => sum + (v.plays_count || 0), 0);
const playsByDate = plays.reduce((acc: any, play) => {
const date = new Date(play.date_created).toISOString().split("T")[0];
if (!acc[date]) acc[date] = 0;
acc[date]++;
return acc;
}, {});
const likesByDate = likes.reduce((acc: any, like) => {
const date = new Date(like.date_created).toISOString().split("T")[0];
if (!acc[date]) acc[date] = 0;
acc[date]++;
return acc;
}, {});
const videoAnalytics = videoList.map((video) => {
const vPlays = plays.filter((p) => p.video_id === video.id);
const completedPlays = vPlays.filter((p) => p.completed).length;
const avgWatchTime =
vPlays.length > 0
? vPlays.reduce((sum, p) => sum + (p.duration_watched || 0), 0) / vPlays.length
: 0;
return {
id: video.id,
title: video.title,
slug: video.slug,
upload_date: video.upload_date,
likes: video.likes_count || 0,
plays: video.plays_count || 0,
completed_plays: completedPlays,
// Use the fetched plays as the denominator so numerator and denominator
// come from the same dataset and the rate cannot exceed 100%.
completion_rate: vPlays.length ? (completedPlays / vPlays.length) * 100 : 0,
avg_watch_time: Math.round(avgWatchTime),
};
});
return {
total_videos: videoList.length,
total_likes: totalLikes,
total_plays: totalPlays,
plays_by_date: playsByDate,
likes_by_date: likesByDate,
videos: videoAnalytics,
};
},
}),
);
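The analytics resolver buckets plays and likes by calendar day with a reduce over `date_created`, keying on the UTC date part of the ISO string. The grouping step in isolation (row shape illustrative):

```typescript
// Groups rows into { "YYYY-MM-DD": count } buckets, as the analytics
// resolver does for plays_by_date and likes_by_date. Days are UTC,
// since toISOString always renders in UTC.
function countByDay(rows: { date_created: string | Date }[]): Record<string, number> {
  return rows.reduce<Record<string, number>>((acc, row) => {
    const day = new Date(row.date_created).toISOString().split("T")[0];
    acc[day] = (acc[day] || 0) + 1;
    return acc;
  }, {});
}
```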
// ─── Admin queries & mutations ────────────────────────────────────────────────
builder.queryField("adminListVideos", (t) =>
t.field({
type: [VideoType],
resolve: async (_root, _args, ctx) => {
requireAdmin(ctx);
const rows = await ctx.db.select().from(videos).orderBy(desc(videos.upload_date));
return Promise.all(rows.map((v: any) => enrichVideo(ctx.db, v)));
},
}),
);
builder.mutationField("createVideo", (t) =>
t.field({
type: VideoType,
args: {
title: t.arg.string({ required: true }),
slug: t.arg.string({ required: true }),
description: t.arg.string(),
imageId: t.arg.string(),
movieId: t.arg.string(),
tags: t.arg.stringList(),
premium: t.arg.boolean(),
featured: t.arg.boolean(),
uploadDate: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const inserted = await ctx.db
.insert(videos)
.values({
title: args.title,
slug: args.slug,
description: args.description || null,
image: args.imageId || null,
movie: args.movieId || null,
tags: args.tags || [],
premium: args.premium ?? false,
featured: args.featured ?? false,
upload_date: args.uploadDate ? new Date(args.uploadDate) : new Date(),
})
.returning();
return enrichVideo(ctx.db, inserted[0]);
},
}),
);
builder.mutationField("updateVideo", (t) =>
t.field({
type: VideoType,
nullable: true,
args: {
id: t.arg.string({ required: true }),
title: t.arg.string(),
slug: t.arg.string(),
description: t.arg.string(),
imageId: t.arg.string(),
movieId: t.arg.string(),
tags: t.arg.stringList(),
premium: t.arg.boolean(),
featured: t.arg.boolean(),
uploadDate: t.arg.string(),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
const updates: Record<string, unknown> = {};
if (args.title !== undefined && args.title !== null) updates.title = args.title;
if (args.slug !== undefined && args.slug !== null) updates.slug = args.slug;
if (args.description !== undefined) updates.description = args.description;
if (args.imageId !== undefined) updates.image = args.imageId;
if (args.movieId !== undefined) updates.movie = args.movieId;
if (args.tags !== undefined && args.tags !== null) updates.tags = args.tags;
if (args.premium !== undefined && args.premium !== null) updates.premium = args.premium;
if (args.featured !== undefined && args.featured !== null) updates.featured = args.featured;
if (args.uploadDate !== undefined && args.uploadDate !== null)
updates.upload_date = new Date(args.uploadDate);
const updated = await ctx.db
.update(videos)
.set(updates as any)
.where(eq(videos.id, args.id))
.returning();
if (!updated[0]) return null;
return enrichVideo(ctx.db, updated[0]);
},
}),
);
builder.mutationField("deleteVideo", (t) =>
t.field({
type: "Boolean",
args: {
id: t.arg.string({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
await ctx.db.delete(videos).where(eq(videos.id, args.id));
return true;
},
}),
);
builder.mutationField("setVideoModels", (t) =>
t.field({
type: "Boolean",
args: {
videoId: t.arg.string({ required: true }),
userIds: t.arg.stringList({ required: true }),
},
resolve: async (_root, args, ctx) => {
requireAdmin(ctx);
await ctx.db.delete(video_models).where(eq(video_models.video_id, args.videoId));
if (args.userIds.length > 0) {
await ctx.db.insert(video_models).values(
args.userIds.map((userId) => ({
video_id: args.videoId,
user_id: userId,
})),
);
}
return true;
},
}),
);

@@ -0,0 +1,361 @@
import type {
MediaFile,
User,
VideoModel,
VideoFile,
Video,
ModelPhoto,
Model,
Article,
CommentUser,
Comment,
Stats,
Recording,
VideoLikeStatus,
VideoLikeResponse,
VideoPlayResponse,
VideoAnalytics,
Analytics,
LeaderboardEntry,
UserStats,
UserAchievement,
RecentPoint,
UserGamification,
Achievement,
} from "@sexy.pivoine.art/types";
type AdminUserDetail = User & { photos: ModelPhoto[] };
import { builder } from "../builder";
export const FileType = builder.objectRef<MediaFile>("File").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
filename: t.exposeString("filename"),
mime_type: t.exposeString("mime_type", { nullable: true }),
filesize: t.exposeFloat("filesize", { nullable: true }),
duration: t.exposeInt("duration", { nullable: true }),
uploaded_by: t.exposeString("uploaded_by", { nullable: true }),
date_created: t.expose("date_created", { type: "DateTime" }),
}),
});
export const UserType = builder.objectRef<User>("User").implement({
fields: (t) => ({
id: t.exposeString("id"),
email: t.exposeString("email"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
role: t.exposeString("role"),
is_admin: t.exposeBoolean("is_admin"),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
email_verified: t.exposeBoolean("email_verified"),
date_created: t.expose("date_created", { type: "DateTime" }),
}),
});
// CurrentUser is the same shape as User
export const CurrentUserType = builder.objectRef<User>("CurrentUser").implement({
fields: (t) => ({
id: t.exposeString("id"),
email: t.exposeString("email"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
role: t.exposeString("role"),
is_admin: t.exposeBoolean("is_admin"),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
email_verified: t.exposeBoolean("email_verified"),
date_created: t.expose("date_created", { type: "DateTime" }),
}),
});
export const VideoModelType = builder.objectRef<VideoModel>("VideoModel").implement({
fields: (t) => ({
id: t.exposeString("id"),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
}),
});
export const VideoFileType = builder.objectRef<VideoFile>("VideoFile").implement({
fields: (t) => ({
id: t.exposeString("id"),
filename: t.exposeString("filename"),
mime_type: t.exposeString("mime_type", { nullable: true }),
duration: t.exposeInt("duration", { nullable: true }),
}),
});
export const VideoType = builder.objectRef<Video>("Video").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug"),
title: t.exposeString("title"),
description: t.exposeString("description", { nullable: true }),
image: t.exposeString("image", { nullable: true }),
movie: t.exposeString("movie", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
upload_date: t.expose("upload_date", { type: "DateTime" }),
premium: t.exposeBoolean("premium", { nullable: true }),
featured: t.exposeBoolean("featured", { nullable: true }),
likes_count: t.exposeInt("likes_count", { nullable: true }),
plays_count: t.exposeInt("plays_count", { nullable: true }),
models: t.expose("models", { type: [VideoModelType], nullable: true }),
movie_file: t.expose("movie_file", { type: VideoFileType, nullable: true }),
}),
});
export const ModelPhotoType = builder.objectRef<ModelPhoto>("ModelPhoto").implement({
fields: (t) => ({
id: t.exposeString("id"),
filename: t.exposeString("filename"),
}),
});
export const ModelType = builder.objectRef<Model>("Model").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
date_created: t.expose("date_created", { type: "DateTime" }),
photos: t.expose("photos", { type: [ModelPhotoType], nullable: true }),
}),
});
export const ArticleType = builder.objectRef<Article>("Article").implement({
fields: (t) => ({
id: t.exposeString("id"),
slug: t.exposeString("slug"),
title: t.exposeString("title"),
excerpt: t.exposeString("excerpt", { nullable: true }),
content: t.exposeString("content", { nullable: true }),
image: t.exposeString("image", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
publish_date: t.expose("publish_date", { type: "DateTime" }),
category: t.exposeString("category", { nullable: true }),
featured: t.exposeBoolean("featured", { nullable: true }),
author: t.expose("author", { type: VideoModelType, nullable: true }),
}),
});
export const CommentUserType = builder.objectRef<CommentUser>("CommentUser").implement({
fields: (t) => ({
id: t.exposeString("id"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
}),
});
export const CommentType = builder.objectRef<Comment>("Comment").implement({
fields: (t) => ({
id: t.exposeInt("id"),
collection: t.exposeString("collection"),
item_id: t.exposeString("item_id"),
comment: t.exposeString("comment"),
user_id: t.exposeString("user_id"),
date_created: t.expose("date_created", { type: "DateTime" }),
user: t.expose("user", { type: CommentUserType, nullable: true }),
}),
});
export const StatsType = builder.objectRef<Stats>("Stats").implement({
fields: (t) => ({
videos_count: t.exposeInt("videos_count"),
models_count: t.exposeInt("models_count"),
viewers_count: t.exposeInt("viewers_count"),
}),
});
export const RecordingType = builder.objectRef<Recording>("Recording").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title"),
description: t.exposeString("description", { nullable: true }),
slug: t.exposeString("slug"),
duration: t.exposeInt("duration"),
events: t.expose("events", { type: "JSON", nullable: true }),
device_info: t.expose("device_info", { type: "JSON", nullable: true }),
user_id: t.exposeString("user_id"),
status: t.exposeString("status"),
tags: t.exposeStringList("tags", { nullable: true }),
linked_video: t.exposeString("linked_video", { nullable: true }),
featured: t.exposeBoolean("featured", { nullable: true }),
public: t.exposeBoolean("public", { nullable: true }),
date_created: t.expose("date_created", { type: "DateTime" }),
date_updated: t.expose("date_updated", { type: "DateTime", nullable: true }),
}),
});
export const VideoLikeResponseType = builder
.objectRef<VideoLikeResponse>("VideoLikeResponse")
.implement({
fields: (t) => ({
liked: t.exposeBoolean("liked"),
likes_count: t.exposeInt("likes_count"),
}),
});
export const VideoPlayResponseType = builder
.objectRef<VideoPlayResponse>("VideoPlayResponse")
.implement({
fields: (t) => ({
success: t.exposeBoolean("success"),
play_id: t.exposeString("play_id"),
plays_count: t.exposeInt("plays_count"),
}),
});
export const VideoLikeStatusType = builder.objectRef<VideoLikeStatus>("VideoLikeStatus").implement({
fields: (t) => ({
liked: t.exposeBoolean("liked"),
}),
});
export const VideoAnalyticsType = builder.objectRef<VideoAnalytics>("VideoAnalytics").implement({
fields: (t) => ({
id: t.exposeString("id"),
title: t.exposeString("title"),
slug: t.exposeString("slug"),
upload_date: t.expose("upload_date", { type: "DateTime" }),
likes: t.exposeInt("likes"),
plays: t.exposeInt("plays"),
completed_plays: t.exposeInt("completed_plays"),
completion_rate: t.exposeFloat("completion_rate"),
avg_watch_time: t.exposeInt("avg_watch_time"),
}),
});
export const AnalyticsType = builder.objectRef<Analytics>("Analytics").implement({
fields: (t) => ({
total_videos: t.exposeInt("total_videos"),
total_likes: t.exposeInt("total_likes"),
total_plays: t.exposeInt("total_plays"),
plays_by_date: t.expose("plays_by_date", { type: "JSON" }),
likes_by_date: t.expose("likes_by_date", { type: "JSON" }),
videos: t.expose("videos", { type: [VideoAnalyticsType] }),
}),
});
export const LeaderboardEntryType = builder
.objectRef<LeaderboardEntry>("LeaderboardEntry")
.implement({
fields: (t) => ({
user_id: t.exposeString("user_id"),
display_name: t.exposeString("display_name", { nullable: true }),
avatar: t.exposeString("avatar", { nullable: true }),
total_weighted_points: t.exposeFloat("total_weighted_points", { nullable: true }),
total_raw_points: t.exposeInt("total_raw_points", { nullable: true }),
recordings_count: t.exposeInt("recordings_count", { nullable: true }),
playbacks_count: t.exposeInt("playbacks_count", { nullable: true }),
achievements_count: t.exposeInt("achievements_count", { nullable: true }),
rank: t.exposeInt("rank"),
}),
});
export const UserStatsType = builder.objectRef<UserStats>("UserStats").implement({
fields: (t) => ({
user_id: t.exposeString("user_id"),
total_raw_points: t.exposeInt("total_raw_points", { nullable: true }),
total_weighted_points: t.exposeFloat("total_weighted_points", { nullable: true }),
recordings_count: t.exposeInt("recordings_count", { nullable: true }),
playbacks_count: t.exposeInt("playbacks_count", { nullable: true }),
comments_count: t.exposeInt("comments_count", { nullable: true }),
achievements_count: t.exposeInt("achievements_count", { nullable: true }),
rank: t.exposeInt("rank"),
}),
});
export const UserAchievementType = builder.objectRef<UserAchievement>("UserAchievement").implement({
fields: (t) => ({
id: t.exposeString("id"),
code: t.exposeString("code"),
name: t.exposeString("name"),
description: t.exposeString("description", { nullable: true }),
icon: t.exposeString("icon", { nullable: true }),
category: t.exposeString("category", { nullable: true }),
date_unlocked: t.expose("date_unlocked", { type: "DateTime" }),
progress: t.exposeInt("progress", { nullable: true }),
required_count: t.exposeInt("required_count"),
}),
});
export const RecentPointType = builder.objectRef<RecentPoint>("RecentPoint").implement({
fields: (t) => ({
action: t.exposeString("action"),
points: t.exposeInt("points"),
date_created: t.expose("date_created", { type: "DateTime" }),
recording_id: t.exposeString("recording_id", { nullable: true }),
}),
});
export const UserGamificationType = builder
.objectRef<UserGamification>("UserGamification")
.implement({
fields: (t) => ({
stats: t.expose("stats", { type: UserStatsType, nullable: true }),
achievements: t.expose("achievements", { type: [UserAchievementType] }),
recent_points: t.expose("recent_points", { type: [RecentPointType] }),
}),
});
export const AchievementType = builder.objectRef<Achievement>("Achievement").implement({
fields: (t) => ({
id: t.exposeString("id"),
code: t.exposeString("code"),
name: t.exposeString("name"),
description: t.exposeString("description", { nullable: true }),
icon: t.exposeString("icon", { nullable: true }),
category: t.exposeString("category", { nullable: true }),
required_count: t.exposeInt("required_count"),
points_reward: t.exposeInt("points_reward"),
}),
});
export const AdminUserListType = builder
.objectRef<{ items: User[]; total: number }>("AdminUserList")
.implement({
fields: (t) => ({
items: t.expose("items", { type: [UserType] }),
total: t.exposeInt("total"),
}),
});
export const AdminUserDetailType = builder
.objectRef<AdminUserDetail>("AdminUserDetail")
.implement({
fields: (t) => ({
id: t.exposeString("id"),
email: t.exposeString("email"),
first_name: t.exposeString("first_name", { nullable: true }),
last_name: t.exposeString("last_name", { nullable: true }),
artist_name: t.exposeString("artist_name", { nullable: true }),
slug: t.exposeString("slug", { nullable: true }),
description: t.exposeString("description", { nullable: true }),
tags: t.exposeStringList("tags", { nullable: true }),
role: t.exposeString("role"),
is_admin: t.exposeBoolean("is_admin"),
avatar: t.exposeString("avatar", { nullable: true }),
banner: t.exposeString("banner", { nullable: true }),
email_verified: t.exposeBoolean("email_verified"),
date_created: t.expose("date_created", { type: "DateTime" }),
photos: t.expose("photos", { type: [ModelPhotoType] }),
}),
});


@@ -0,0 +1,140 @@
import Fastify, { type FastifyRequest, type FastifyReply } from "fastify";
import fastifyCookie from "@fastify/cookie";
import fastifyCors from "@fastify/cors";
import fastifyMultipart from "@fastify/multipart";
import fastifyStatic from "@fastify/static";
import { createYoga } from "graphql-yoga";
import { eq } from "drizzle-orm";
import { files } from "./db/schema/index";
import path from "path";
import { existsSync } from "fs";
import sharp from "sharp";
import { schema } from "./graphql/index";
import { buildContext } from "./graphql/context";
import { db } from "./db/connection";
import { redis } from "./lib/auth";
import { logger } from "./lib/logger";
const PORT = parseInt(process.env.PORT || "4000");
const UPLOAD_DIR = process.env.UPLOAD_DIR || "/data/uploads";
const CORS_ORIGIN = process.env.CORS_ORIGIN || "http://localhost:3000";
async function main() {
const fastify = Fastify({ loggerInstance: logger });
await fastify.register(fastifyCookie, {
secret: process.env.COOKIE_SECRET || "change-me-in-production",
});
await fastify.register(fastifyCors, {
origin: CORS_ORIGIN,
credentials: true,
methods: ["GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS"],
});
await fastify.register(fastifyMultipart, {
limits: {
fileSize: 5 * 1024 * 1024 * 1024, // 5 GB
},
});
// fastify-static provides reply.sendFile(); files are stored as <UPLOAD_DIR>/<id>/<filename>
await fastify.register(fastifyStatic, {
root: path.resolve(UPLOAD_DIR),
prefix: "/assets/",
serve: false, // disable auto-serving; we use a custom route below
decorateReply: true,
});
const yoga = createYoga<{
req: FastifyRequest;
reply: FastifyReply;
db: typeof db;
redis: typeof redis;
}>({
schema,
context: buildContext,
graphqlEndpoint: "/graphql",
healthCheckEndpoint: "/health",
logging: {
debug: (...args) => args.forEach((arg) => fastify.log.debug(arg)),
info: (...args) => args.forEach((arg) => fastify.log.info(arg)),
warn: (...args) => args.forEach((arg) => fastify.log.warn(arg)),
error: (...args) => args.forEach((arg) => fastify.log.error(arg)),
},
});
fastify.route({
url: "/graphql",
method: ["GET", "POST", "OPTIONS"],
handler: async (req, reply) => {
// yoga returns a Fetch API Response; copy it onto the Fastify reply
const response = await yoga.handleNodeRequestAndResponse(req, reply, { req, reply, db, redis });
response.headers.forEach((value, key) => reply.header(key, value));
reply.status(response.status);
reply.send(response.body);
return reply;
},
});
// Transform presets — only banner/thumbnail force a crop; others preserve aspect ratio
const TRANSFORMS: Record<string, { width: number; height?: number; fit?: "cover" | "inside" }> = {
mini: { width: 80, height: 80, fit: "cover" },
thumbnail: { width: 300, height: 300, fit: "cover" },
preview: { width: 800, fit: "inside" },
medium: { width: 1400, fit: "inside" },
banner: { width: 1600, height: 480, fit: "cover" },
};
// Serve uploaded files: GET /assets/:id?transform=<preset>
// Files are stored as <UPLOAD_DIR>/<id>/<filename> — look up filename in DB
fastify.get("/assets/:id", async (request, reply) => {
const { id } = request.params as { id: string };
const { transform } = request.query as { transform?: string };
const result = await db
.select({ filename: files.filename, mime_type: files.mime_type })
.from(files)
.where(eq(files.id, id))
.limit(1);
if (!result[0]) return reply.status(404).send({ error: "File not found" });
const { filename, mime_type } = result[0];
reply.header("Cache-Control", "public, max-age=31536000, immutable");
const preset = transform ? TRANSFORMS[transform] : null;
if (preset && mime_type?.startsWith("image/")) {
const cacheFile = path.join(UPLOAD_DIR, id, `${transform}.webp`);
if (!existsSync(cacheFile)) {
const originalPath = path.join(UPLOAD_DIR, id, filename);
await sharp(originalPath)
.resize({
width: preset.width,
height: preset.height,
fit: preset.fit ?? "inside",
withoutEnlargement: true,
})
.webp({ quality: 92 })
.toFile(cacheFile);
}
reply.header("Content-Type", "image/webp");
return reply.sendFile(path.join(id, `${transform}.webp`));
}
reply.header("Content-Type", mime_type ?? "application/octet-stream");
return reply.sendFile(path.join(id, filename));
});
fastify.get("/health", async (_request, reply) => {
return reply.send({ status: "ok", timestamp: new Date().toISOString() });
});
try {
await fastify.listen({ port: PORT, host: "0.0.0.0" });
fastify.log.info(`Backend running at http://0.0.0.0:${PORT}`);
fastify.log.info(`GraphQL at http://0.0.0.0:${PORT}/graphql`);
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
}
main().catch((err) => {
console.error("Fatal error:", err);
process.exit(1);
});
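As a rough client-side sketch of how the `/assets/:id` route and its transform presets above would be consumed (the base URL and file id are placeholders, and `assetUrl` is a hypothetical helper, not part of the server code):

```typescript
// Hypothetical URL builder for the asset route above.
// Preset names mirror the TRANSFORMS map: mini, thumbnail, preview, medium, banner.
type TransformPreset = "mini" | "thumbnail" | "preview" | "medium" | "banner";

function assetUrl(baseUrl: string, fileId: string, transform?: TransformPreset): string {
  const url = new URL(`/assets/${fileId}`, baseUrl);
  if (transform) url.searchParams.set("transform", transform);
  return url.toString();
}
```

Omitting `transform` serves the original file; an unknown preset name would also fall through to the original on the server side.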


@@ -0,0 +1,18 @@
import { GraphQLError } from "graphql";
import type { Context } from "../graphql/builder";
export function requireAuth(ctx: Context): void {
if (!ctx.currentUser) throw new GraphQLError("Unauthorized");
}
export function requireAdmin(ctx: Context): void {
requireAuth(ctx);
if (!ctx.currentUser!.is_admin) throw new GraphQLError("Forbidden");
}
export function requireOwnerOrAdmin(ctx: Context, ownerId: string): void {
requireAuth(ctx);
if (ctx.currentUser!.id !== ownerId && !ctx.currentUser!.is_admin) {
throw new GraphQLError("Forbidden");
}
}


@@ -0,0 +1,9 @@
import argon2 from "argon2";
export async function hash(password: string): Promise<string> {
return argon2.hash(password);
}
export async function verify(hash: string, password: string): Promise<boolean> {
return argon2.verify(hash, password);
}


@@ -0,0 +1,29 @@
import Redis from "ioredis";
export type SessionUser = {
id: string;
email: string;
role: "model" | "viewer";
is_admin: boolean;
first_name: string | null;
last_name: string | null;
artist_name: string | null;
slug: string | null;
avatar: string | null;
};
export const redis = new Redis(process.env.REDIS_URL || "redis://localhost:6379");
export async function setSession(token: string, user: SessionUser): Promise<void> {
await redis.set(`session:${token}`, JSON.stringify(user), "EX", 86400); // 24-hour TTL
}
export async function getSession(token: string): Promise<SessionUser | null> {
const data = await redis.get(`session:${token}`);
if (!data) return null;
return JSON.parse(data) as SessionUser;
}
export async function deleteSession(token: string): Promise<void> {
await redis.del(`session:${token}`);
}


@@ -0,0 +1,34 @@
import nodemailer from "nodemailer";
const transporter = nodemailer.createTransport({
host: process.env.SMTP_HOST || "localhost",
port: parseInt(process.env.SMTP_PORT || "587"),
secure: process.env.SMTP_SECURE === "true",
auth: process.env.SMTP_USER
? {
user: process.env.SMTP_USER,
pass: process.env.SMTP_PASS,
}
: undefined,
});
const FROM = process.env.EMAIL_FROM || "noreply@sexy.pivoine.art";
const BASE_URL = process.env.PUBLIC_URL || "http://localhost:3000";
export async function sendVerification(email: string, token: string): Promise<void> {
await transporter.sendMail({
from: FROM,
to: email,
subject: "Verify your email",
html: `<p>Click <a href="${BASE_URL}/signup/verify?token=${token}">here</a> to verify your email.</p>`,
});
}
export async function sendPasswordReset(email: string, token: string): Promise<void> {
await transporter.sendMail({
from: FROM,
to: email,
subject: "Reset your password",
html: `<p>Click <a href="${BASE_URL}/password/reset?token=${token}">here</a> to reset your password.</p>`,
});
}


@@ -0,0 +1,10 @@
import ffmpeg from "fluent-ffmpeg";
export function extractDuration(filePath: string): Promise<number> {
return new Promise((resolve, reject) => {
ffmpeg.ffprobe(filePath, (err, metadata) => {
if (err) return reject(err);
resolve(Math.round(metadata.format.duration || 0));
});
});
}


@@ -0,0 +1,323 @@
import { eq, sql, and, gt, isNotNull, count, sum } from "drizzle-orm";
import type { DB } from "../db/connection";
import {
user_points,
user_stats,
recordings,
recording_plays,
comments,
user_achievements,
achievements,
users,
} from "../db/schema/index";
export const POINT_VALUES = {
RECORDING_CREATE: 50,
RECORDING_PLAY: 10,
RECORDING_COMPLETE: 5,
COMMENT_CREATE: 5,
RECORDING_FEATURED: 100,
} as const;
const DECAY_LAMBDA = 0.005;
export async function awardPoints(
db: DB,
userId: string,
action: keyof typeof POINT_VALUES,
recordingId?: string,
): Promise<void> {
const points = POINT_VALUES[action];
await db.insert(user_points).values({
user_id: userId,
action,
points,
recording_id: recordingId || null,
date_created: new Date(),
});
await updateUserStats(db, userId);
}
export async function calculateWeightedScore(db: DB, userId: string): Promise<number> {
const now = new Date();
const result = await db.execute(sql`
SELECT SUM(
points * EXP(-${DECAY_LAMBDA} * EXTRACT(EPOCH FROM (${now}::timestamptz - date_created)) / 86400)
) as weighted_score
FROM user_points
WHERE user_id = ${userId}
`);
return parseFloat((result.rows[0] as any)?.weighted_score || "0");
}
export async function updateUserStats(db: DB, userId: string): Promise<void> {
const now = new Date();
const rawPointsResult = await db
.select({ total: sum(user_points.points) })
.from(user_points)
.where(eq(user_points.user_id, userId));
const totalRawPoints = parseInt(String(rawPointsResult[0]?.total || "0"));
const totalWeightedPoints = await calculateWeightedScore(db, userId);
const recordingsResult = await db
.select({ count: count() })
.from(recordings)
.where(and(eq(recordings.user_id, userId), eq(recordings.status, "published")));
const recordingsCount = recordingsResult[0]?.count || 0;
// Get playbacks count (excluding own recordings)
const ownRecordingIds = await db
.select({ id: recordings.id })
.from(recordings)
.where(eq(recordings.user_id, userId));
const ownIds = ownRecordingIds.map((r) => r.id);
let playbacksCount: number;
if (ownIds.length > 0) {
const playbacksResult = await db.execute(sql`
SELECT COUNT(*) as count FROM recording_plays
WHERE user_id = ${userId}
AND recording_id NOT IN (${sql.join(
ownIds.map((id) => sql`${id}`),
sql`, `,
)})
`);
playbacksCount = parseInt((playbacksResult.rows[0] as any)?.count || "0");
} else {
const playbacksResult = await db
.select({ count: count() })
.from(recording_plays)
.where(eq(recording_plays.user_id, userId));
playbacksCount = playbacksResult[0]?.count || 0;
}
const commentsResult = await db
.select({ count: count() })
.from(comments)
.where(and(eq(comments.user_id, userId), eq(comments.collection, "recordings")));
const commentsCount = commentsResult[0]?.count || 0;
const achievementsResult = await db
.select({ count: count() })
.from(user_achievements)
.where(and(eq(user_achievements.user_id, userId), isNotNull(user_achievements.date_unlocked)));
const achievementsCount = achievementsResult[0]?.count || 0;
const existing = await db
.select()
.from(user_stats)
.where(eq(user_stats.user_id, userId))
.limit(1);
if (existing.length > 0) {
await db
.update(user_stats)
.set({
total_raw_points: totalRawPoints,
total_weighted_points: totalWeightedPoints,
recordings_count: recordingsCount,
playbacks_count: playbacksCount,
comments_count: commentsCount,
achievements_count: achievementsCount,
last_updated: now,
})
.where(eq(user_stats.user_id, userId));
} else {
await db.insert(user_stats).values({
user_id: userId,
total_raw_points: totalRawPoints,
total_weighted_points: totalWeightedPoints,
recordings_count: recordingsCount,
playbacks_count: playbacksCount,
comments_count: commentsCount,
achievements_count: achievementsCount,
last_updated: now,
});
}
}
export async function checkAchievements(db: DB, userId: string, category?: string): Promise<void> {
const whereClause = category
? and(eq(achievements.status, "published"), eq(achievements.category, category))
: eq(achievements.status, "published");
const achievementsList = await db.select().from(achievements).where(whereClause);
for (const achievement of achievementsList) {
const progress = await getAchievementProgress(db, userId, achievement);
const existing = await db
.select()
.from(user_achievements)
.where(
and(
eq(user_achievements.user_id, userId),
eq(user_achievements.achievement_id, achievement.id),
),
)
.limit(1);
const isUnlocked = progress >= achievement.required_count;
const wasUnlocked = existing[0]?.date_unlocked != null; // != null: a missing row also counts as "not yet unlocked"
if (existing.length > 0) {
await db
.update(user_achievements)
.set({
progress,
date_unlocked: isUnlocked ? existing[0].date_unlocked || new Date() : null, // preserve first unlock date; re-lock if progress falls below threshold
})
.where(
and(
eq(user_achievements.user_id, userId),
eq(user_achievements.achievement_id, achievement.id),
),
);
} else {
await db.insert(user_achievements).values({
user_id: userId,
achievement_id: achievement.id,
progress,
date_unlocked: isUnlocked ? new Date() : null,
});
}
if (isUnlocked && !wasUnlocked && achievement.points_reward > 0) {
await db.insert(user_points).values({
user_id: userId,
action: `ACHIEVEMENT_${achievement.code}`,
points: achievement.points_reward,
recording_id: null,
date_created: new Date(),
});
await updateUserStats(db, userId);
}
}
}
async function getAchievementProgress(
db: DB,
userId: string,
achievement: typeof achievements.$inferSelect,
): Promise<number> {
const { code } = achievement;
if (["first_recording", "recording_10", "recording_50", "recording_100"].includes(code)) {
const result = await db
.select({ count: count() })
.from(recordings)
.where(and(eq(recordings.user_id, userId), eq(recordings.status, "published")));
return result[0]?.count || 0;
}
if (code === "featured_recording") {
const result = await db
.select({ count: count() })
.from(recordings)
.where(
and(
eq(recordings.user_id, userId),
eq(recordings.status, "published"),
eq(recordings.featured, true),
),
);
return result[0]?.count || 0;
}
if (["first_play", "play_100", "play_500"].includes(code)) {
const result = await db.execute(sql`
SELECT COUNT(*) as count
FROM recording_plays rp
LEFT JOIN recordings r ON rp.recording_id = r.id
WHERE rp.user_id = ${userId}
AND r.user_id != ${userId}
`);
return parseInt((result.rows[0] as any)?.count || "0");
}
if (["completionist_10", "completionist_100"].includes(code)) {
const result = await db
.select({ count: count() })
.from(recording_plays)
.where(and(eq(recording_plays.user_id, userId), eq(recording_plays.completed, true)));
return result[0]?.count || 0;
}
if (["first_comment", "comment_50", "comment_250"].includes(code)) {
const result = await db
.select({ count: count() })
.from(comments)
.where(and(eq(comments.user_id, userId), eq(comments.collection, "recordings")));
return result[0]?.count || 0;
}
if (code === "early_adopter") {
const user = await db.select().from(users).where(eq(users.id, userId)).limit(1);
if (user[0]) {
const joinDate = new Date(user[0].date_created);
const platformLaunch = new Date("2025-01-01");
const oneMonthAfterLaunch = new Date(platformLaunch);
oneMonthAfterLaunch.setMonth(oneMonthAfterLaunch.getMonth() + 1);
return joinDate <= oneMonthAfterLaunch ? 1 : 0;
}
}
if (code === "one_year") {
const user = await db.select().from(users).where(eq(users.id, userId)).limit(1);
if (user[0]) {
const joinDate = new Date(user[0].date_created);
const oneYearAgo = new Date();
oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);
return joinDate <= oneYearAgo ? 1 : 0;
}
}
if (code === "balanced_creator") {
const recordingsResult = await db
.select({ count: count() })
.from(recordings)
.where(and(eq(recordings.user_id, userId), eq(recordings.status, "published")));
const playsResult = await db.execute(sql`
SELECT COUNT(*) as count FROM recording_plays rp
LEFT JOIN recordings r ON rp.recording_id = r.id
WHERE rp.user_id = ${userId} AND r.user_id != ${userId}
`);
const rc = recordingsResult[0]?.count || 0;
const pc = parseInt((playsResult.rows[0] as any)?.count || "0");
return rc >= 50 && pc >= 100 ? 1 : 0;
}
if (code === "top_10_rank") {
const userStat = await db
.select()
.from(user_stats)
.where(eq(user_stats.user_id, userId))
.limit(1);
if (!userStat[0]) return 0;
const rankResult = await db
.select({ count: count() })
.from(user_stats)
.where(gt(user_stats.total_weighted_points, userStat[0].total_weighted_points || 0));
const userRank = (rankResult[0]?.count || 0) + 1;
return userRank <= 10 ? 1 : 0;
}
return 0;
}
export async function recalculateAllWeightedScores(db: DB): Promise<void> {
const allUsers = await db.select({ user_id: user_stats.user_id }).from(user_stats);
for (const u of allUsers) {
await updateUserStats(db, u.user_id);
}
}
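The weighted score above decays each point entry exponentially at λ = 0.005 per day of age. A standalone sketch of the same weighting (pure TypeScript, no database; `LAMBDA` mirrors `DECAY_LAMBDA` and `weightedScore` is an illustrative name, not part of this module):

```typescript
// Sketch of the decay weighting used by calculateWeightedScore above:
// a point earned `ageDays` ago contributes points * exp(-LAMBDA * ageDays).
const LAMBDA = 0.005; // mirrors DECAY_LAMBDA

function weightedScore(entries: { points: number; ageDays: number }[]): number {
  return entries.reduce(
    (total, e) => total + e.points * Math.exp(-LAMBDA * e.ageDays),
    0,
  );
}
```

Points from today count fully; the same points from a year ago are weighted by exp(-1.825) ≈ 0.16, so sustained activity outranks a one-time burst.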


@@ -0,0 +1,101 @@
type LogLevel = "trace" | "debug" | "info" | "warn" | "error" | "fatal";
const LEVEL_VALUES: Record<LogLevel, number> = {
trace: 10,
debug: 20,
info: 30,
warn: 40,
error: 50,
fatal: 60,
};
function createLogger(bindings: Record<string, unknown> = {}, initialLevel: LogLevel = "info") {
let currentLevel = initialLevel;
function shouldLog(level: LogLevel): boolean {
return LEVEL_VALUES[level] >= LEVEL_VALUES[currentLevel];
}
function formatMessage(level: LogLevel, arg: unknown, msg?: string): string {
const timestamp = new Date().toISOString();
let message: string;
const meta: Record<string, unknown> = { ...bindings };
if (typeof arg === "string") {
message = arg;
} else if (arg !== null && typeof arg === "object") {
// Pino-style: log(obj, msg?) — strip internal pino keys
const {
msg: m,
level: _l,
time: _t,
pid: _p,
hostname: _h,
req: _req,
res: _res,
reqId,
...rest
} = arg as Record<string, unknown>;
message = msg || (typeof m === "string" ? m : "");
if (reqId) meta.reqId = reqId;
Object.assign(meta, rest);
} else {
message = String(arg ?? "");
}
const parts = [`[${timestamp}]`, `[${level.toUpperCase()}]`];
if (meta.reqId) parts.push(`[${meta.reqId}]`);
parts.push(message);
let result = parts.join(" ");
const metaEntries = Object.entries(meta).filter(([k]) => k !== "reqId");
if (metaEntries.length > 0) {
result += " " + JSON.stringify(Object.fromEntries(metaEntries));
}
return result;
}
function write(level: LogLevel, arg: unknown, msg?: string) {
if (!shouldLog(level)) return;
const formatted = formatMessage(level, arg, msg);
switch (level) {
case "trace":
case "debug":
console.debug(formatted);
break;
case "info":
console.info(formatted);
break;
case "warn":
console.warn(formatted);
break;
case "error":
case "fatal":
console.error(formatted);
break;
}
}
return {
get level() {
return currentLevel;
},
set level(l: string) {
currentLevel = l as LogLevel;
},
trace: (arg: unknown, msg?: string) => write("trace", arg, msg),
debug: (arg: unknown, msg?: string) => write("debug", arg, msg),
info: (arg: unknown, msg?: string) => write("info", arg, msg),
warn: (arg: unknown, msg?: string) => write("warn", arg, msg),
error: (arg: unknown, msg?: string) => write("error", arg, msg),
fatal: (arg: unknown, msg?: string) => write("fatal", arg, msg),
silent: () => {},
child: (newBindings: Record<string, unknown>) =>
createLogger({ ...bindings, ...newBindings }, currentLevel),
};
}
export const logger = createLogger({}, (process.env.LOG_LEVEL as LogLevel) || "info");
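The level gate in `shouldLog` above reduces to a numeric comparison against the pino-style level table; as a standalone sketch (function and table names here are illustrative):

```typescript
// Minimal sketch of the numeric level gate used by the logger above:
// a record is emitted only if its level value is >= the configured level.
const LEVELS: Record<string, number> = {
  trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60,
};

function shouldLog(level: string, current: string): boolean {
  return LEVELS[level] >= LEVELS[current];
}
```

So with the default `info` level, `debug` records are dropped while `warn` and above pass through.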


@@ -0,0 +1,5 @@
import slugifyLib from "slugify";
export function slugify(text: string): string {
return slugifyLib(text, { lower: true, strict: true });
}


@@ -0,0 +1,233 @@
CREATE TYPE "public"."achievement_status" AS ENUM('draft', 'published');--> statement-breakpoint
CREATE TYPE "public"."user_role" AS ENUM('model', 'viewer', 'admin');--> statement-breakpoint
CREATE TYPE "public"."recording_status" AS ENUM('draft', 'published', 'archived');--> statement-breakpoint
CREATE TABLE "articles" (
"id" text PRIMARY KEY NOT NULL,
"slug" text NOT NULL,
"title" text NOT NULL,
"excerpt" text,
"content" text,
"image" text,
"tags" text[] DEFAULT '{}',
"publish_date" timestamp DEFAULT now() NOT NULL,
"author" text,
"category" text,
"featured" boolean DEFAULT false,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "comments" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "comments_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"collection" text NOT NULL,
"item_id" text NOT NULL,
"comment" text NOT NULL,
"user_id" text NOT NULL,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "files" (
"id" text PRIMARY KEY NOT NULL,
"title" text,
"description" text,
"filename" text NOT NULL,
"mime_type" text,
"filesize" bigint,
"duration" integer,
"uploaded_by" text,
"date_created" timestamp DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "achievements" (
"id" text PRIMARY KEY NOT NULL,
"code" text NOT NULL,
"name" text NOT NULL,
"description" text,
"icon" text,
"category" text,
"required_count" integer DEFAULT 1 NOT NULL,
"points_reward" integer DEFAULT 0 NOT NULL,
"status" "achievement_status" DEFAULT 'published' NOT NULL,
"sort" integer DEFAULT 0
);
--> statement-breakpoint
CREATE TABLE "user_achievements" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "user_achievements_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"user_id" text NOT NULL,
"achievement_id" text NOT NULL,
"progress" integer DEFAULT 0,
"date_unlocked" timestamp
);
--> statement-breakpoint
CREATE TABLE "user_points" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "user_points_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"user_id" text NOT NULL,
"action" text NOT NULL,
"points" integer NOT NULL,
"recording_id" text,
"date_created" timestamp DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "user_stats" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "user_stats_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"user_id" text NOT NULL,
"total_raw_points" integer DEFAULT 0,
"total_weighted_points" real DEFAULT 0,
"recordings_count" integer DEFAULT 0,
"playbacks_count" integer DEFAULT 0,
"comments_count" integer DEFAULT 0,
"achievements_count" integer DEFAULT 0,
"last_updated" timestamp DEFAULT now()
);
--> statement-breakpoint
CREATE TABLE "user_photos" (
"id" integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY (sequence name "user_photos_id_seq" INCREMENT BY 1 MINVALUE 1 MAXVALUE 2147483647 START WITH 1 CACHE 1),
"user_id" text NOT NULL,
"file_id" text NOT NULL,
"sort" integer DEFAULT 0
);
--> statement-breakpoint
CREATE TABLE "users" (
"id" text PRIMARY KEY NOT NULL,
"email" text NOT NULL,
"password_hash" text NOT NULL,
"first_name" text,
"last_name" text,
"artist_name" text,
"slug" text,
"description" text,
"tags" text[] DEFAULT '{}',
"role" "user_role" DEFAULT 'viewer' NOT NULL,
"avatar" text,
"banner" text,
"email_verified" boolean DEFAULT false NOT NULL,
"email_verify_token" text,
"password_reset_token" text,
"password_reset_expiry" timestamp,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "video_likes" (
"id" text PRIMARY KEY NOT NULL,
"video_id" text NOT NULL,
"user_id" text NOT NULL,
"date_created" timestamp DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "video_models" (
"video_id" text NOT NULL,
"user_id" text NOT NULL,
CONSTRAINT "video_models_video_id_user_id_pk" PRIMARY KEY("video_id","user_id")
);
--> statement-breakpoint
CREATE TABLE "video_plays" (
"id" text PRIMARY KEY NOT NULL,
"video_id" text NOT NULL,
"user_id" text,
"session_id" text,
"duration_watched" integer,
"completed" boolean DEFAULT false,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "videos" (
"id" text PRIMARY KEY NOT NULL,
"slug" text NOT NULL,
"title" text NOT NULL,
"description" text,
"image" text,
"movie" text,
"tags" text[] DEFAULT '{}',
"upload_date" timestamp DEFAULT now() NOT NULL,
"premium" boolean DEFAULT false,
"featured" boolean DEFAULT false,
"likes_count" integer DEFAULT 0,
"plays_count" integer DEFAULT 0
);
--> statement-breakpoint
CREATE TABLE "recording_plays" (
"id" text PRIMARY KEY NOT NULL,
"recording_id" text NOT NULL,
"user_id" text,
"duration_played" integer DEFAULT 0,
"completed" boolean DEFAULT false,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
CREATE TABLE "recordings" (
"id" text PRIMARY KEY NOT NULL,
"title" text NOT NULL,
"description" text,
"slug" text NOT NULL,
"duration" integer NOT NULL,
"events" jsonb DEFAULT '[]'::jsonb,
"device_info" jsonb DEFAULT '[]'::jsonb,
"user_id" text NOT NULL,
"status" "recording_status" DEFAULT 'draft' NOT NULL,
"tags" text[] DEFAULT '{}',
"linked_video" text,
"featured" boolean DEFAULT false,
"public" boolean DEFAULT false,
"original_recording_id" text,
"date_created" timestamp DEFAULT now() NOT NULL,
"date_updated" timestamp
);
--> statement-breakpoint
ALTER TABLE "articles" ADD CONSTRAINT "articles_image_files_id_fk" FOREIGN KEY ("image") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "articles" ADD CONSTRAINT "articles_author_users_id_fk" FOREIGN KEY ("author") REFERENCES "public"."users"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "comments" ADD CONSTRAINT "comments_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_achievements" ADD CONSTRAINT "user_achievements_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_achievements" ADD CONSTRAINT "user_achievements_achievement_id_achievements_id_fk" FOREIGN KEY ("achievement_id") REFERENCES "public"."achievements"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_points" ADD CONSTRAINT "user_points_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_points" ADD CONSTRAINT "user_points_recording_id_recordings_id_fk" FOREIGN KEY ("recording_id") REFERENCES "public"."recordings"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_stats" ADD CONSTRAINT "user_stats_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_photos" ADD CONSTRAINT "user_photos_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_photos" ADD CONSTRAINT "user_photos_file_id_files_id_fk" FOREIGN KEY ("file_id") REFERENCES "public"."files"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "users" ADD CONSTRAINT "users_avatar_files_id_fk" FOREIGN KEY ("avatar") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "users" ADD CONSTRAINT "users_banner_files_id_fk" FOREIGN KEY ("banner") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_likes" ADD CONSTRAINT "video_likes_video_id_videos_id_fk" FOREIGN KEY ("video_id") REFERENCES "public"."videos"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_likes" ADD CONSTRAINT "video_likes_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_models" ADD CONSTRAINT "video_models_video_id_videos_id_fk" FOREIGN KEY ("video_id") REFERENCES "public"."videos"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_models" ADD CONSTRAINT "video_models_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_plays" ADD CONSTRAINT "video_plays_video_id_videos_id_fk" FOREIGN KEY ("video_id") REFERENCES "public"."videos"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "video_plays" ADD CONSTRAINT "video_plays_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "videos" ADD CONSTRAINT "videos_image_files_id_fk" FOREIGN KEY ("image") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "videos" ADD CONSTRAINT "videos_movie_files_id_fk" FOREIGN KEY ("movie") REFERENCES "public"."files"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "recording_plays" ADD CONSTRAINT "recording_plays_recording_id_recordings_id_fk" FOREIGN KEY ("recording_id") REFERENCES "public"."recordings"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "recording_plays" ADD CONSTRAINT "recording_plays_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "recordings" ADD CONSTRAINT "recordings_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "recordings" ADD CONSTRAINT "recordings_linked_video_videos_id_fk" FOREIGN KEY ("linked_video") REFERENCES "public"."videos"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
CREATE UNIQUE INDEX "articles_slug_idx" ON "articles" USING btree ("slug");--> statement-breakpoint
CREATE INDEX "articles_publish_date_idx" ON "articles" USING btree ("publish_date");--> statement-breakpoint
CREATE INDEX "articles_featured_idx" ON "articles" USING btree ("featured");--> statement-breakpoint
CREATE INDEX "comments_collection_item_idx" ON "comments" USING btree ("collection","item_id");--> statement-breakpoint
CREATE INDEX "comments_user_idx" ON "comments" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "files_uploaded_by_idx" ON "files" USING btree ("uploaded_by");--> statement-breakpoint
CREATE UNIQUE INDEX "achievements_code_idx" ON "achievements" USING btree ("code");--> statement-breakpoint
CREATE INDEX "user_achievements_user_idx" ON "user_achievements" USING btree ("user_id");--> statement-breakpoint
CREATE UNIQUE INDEX "user_achievements_unique_idx" ON "user_achievements" USING btree ("user_id","achievement_id");--> statement-breakpoint
CREATE INDEX "user_points_user_idx" ON "user_points" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "user_points_date_idx" ON "user_points" USING btree ("date_created");--> statement-breakpoint
CREATE UNIQUE INDEX "user_stats_user_idx" ON "user_stats" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "user_photos_user_idx" ON "user_photos" USING btree ("user_id");--> statement-breakpoint
CREATE UNIQUE INDEX "users_email_idx" ON "users" USING btree ("email");--> statement-breakpoint
CREATE UNIQUE INDEX "users_slug_idx" ON "users" USING btree ("slug");--> statement-breakpoint
CREATE INDEX "users_role_idx" ON "users" USING btree ("role");--> statement-breakpoint
CREATE INDEX "video_likes_video_idx" ON "video_likes" USING btree ("video_id");--> statement-breakpoint
CREATE INDEX "video_likes_user_idx" ON "video_likes" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "video_plays_video_idx" ON "video_plays" USING btree ("video_id");--> statement-breakpoint
CREATE INDEX "video_plays_user_idx" ON "video_plays" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "video_plays_date_idx" ON "video_plays" USING btree ("date_created");--> statement-breakpoint
CREATE UNIQUE INDEX "videos_slug_idx" ON "videos" USING btree ("slug");--> statement-breakpoint
CREATE INDEX "videos_upload_date_idx" ON "videos" USING btree ("upload_date");--> statement-breakpoint
CREATE INDEX "videos_featured_idx" ON "videos" USING btree ("featured");--> statement-breakpoint
CREATE INDEX "recording_plays_recording_idx" ON "recording_plays" USING btree ("recording_id");--> statement-breakpoint
CREATE INDEX "recording_plays_user_idx" ON "recording_plays" USING btree ("user_id");--> statement-breakpoint
CREATE UNIQUE INDEX "recordings_slug_idx" ON "recordings" USING btree ("slug");--> statement-breakpoint
CREATE INDEX "recordings_user_idx" ON "recordings" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "recordings_status_idx" ON "recordings" USING btree ("status");--> statement-breakpoint
CREATE INDEX "recordings_public_idx" ON "recordings" USING btree ("public");


@@ -0,0 +1,3 @@
ALTER TABLE "users" ADD COLUMN "is_admin" boolean NOT NULL DEFAULT false;--> statement-breakpoint
UPDATE "users" SET "is_admin" = true WHERE "role" = 'admin';--> statement-breakpoint
UPDATE "users" SET "role" = 'viewer' WHERE "role" = 'admin';

File diff suppressed because it is too large


@@ -0,0 +1,20 @@
{
"version": "7",
"dialect": "postgresql",
"entries": [
{
"idx": 0,
"version": "7",
"when": 1772645674513,
"tag": "0000_pale_hellion",
"breakpoints": true
},
{
"idx": 1,
"version": "7",
"when": 1772645674514,
"tag": "0001_is_admin",
"breakpoints": true
}
]
}


@@ -0,0 +1,565 @@
/**
* Data Migration: Directus → Custom Backend
*
* Migrates data from Directus tables to the new schema.
* Run with: tsx src/scripts/data-migration.ts
*
* Environment variables:
* DATABASE_URL - PostgreSQL connection (same DB)
* OLD_UPLOAD_DIR - Path to Directus uploads (e.g. /old-uploads)
* NEW_UPLOAD_DIR - Path to new upload dir (e.g. /data/uploads)
*/
import { Pool } from "pg";
import fs from "fs";
import path from "path";
const DATABASE_URL = process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy";
const OLD_UPLOAD_DIR = process.env.OLD_UPLOAD_DIR || "/old-uploads";
const NEW_UPLOAD_DIR = process.env.NEW_UPLOAD_DIR || "/data/uploads";
const pool = new Pool({ connectionString: DATABASE_URL });
async function query(sql: string, params: unknown[] = []) {
const client = await pool.connect();
try {
return await client.query(sql, params);
} finally {
client.release();
}
}
function copyFile(src: string, dest: string) {
const dir = path.dirname(dest);
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
}
if (fs.existsSync(src)) {
fs.copyFileSync(src, dest);
return true;
}
return false;
}
async function migrateFiles() {
console.log("📁 Migrating files...");
const { rows } = await query(
`SELECT id, title, description, filename_disk, type, filesize, duration, uploaded_by, uploaded_on as date_created
FROM directus_files`,
);
let migrated = 0;
let skipped = 0;
for (const file of rows) {
// Check if already migrated
const existing = await query("SELECT id FROM files WHERE id = $1", [file.id]);
if (existing.rows.length > 0) {
skipped++;
continue;
}
await query(
`INSERT INTO files (id, title, description, filename, mime_type, filesize, duration, uploaded_by, date_created)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)
ON CONFLICT (id) DO NOTHING`,
[
file.id,
file.title,
file.description,
file.filename_disk || `${file.id}`,
file.type,
file.filesize,
file.duration,
file.uploaded_by,
file.date_created,
],
);
// Copy file to new location
const srcPath = path.join(OLD_UPLOAD_DIR, file.filename_disk || "");
const destPath = path.join(NEW_UPLOAD_DIR, file.id, file.filename_disk || `${file.id}`);
const copied = copyFile(srcPath, destPath);
if (!copied) {
console.warn(` ⚠️ File not found on disk: ${file.filename_disk}`);
}
migrated++;
}
console.log(` ✅ Files: ${migrated} migrated, ${skipped} already existed`);
}
async function migrateUsers() {
console.log("👥 Migrating users...");
const { rows } = await query(
`SELECT u.id, u.email, u.password, u.first_name, u.last_name,
u.description, u.avatar, u.join_date as date_created,
u.artist_name, u.slug,
r.name as role_name
FROM directus_users u
LEFT JOIN directus_roles r ON u.role = r.id
WHERE u.status = 'active'`,
);
let migrated = 0;
for (const user of rows) {
const existing = await query("SELECT id FROM users WHERE id = $1", [user.id]);
if (existing.rows.length > 0) {
migrated++;
continue;
}
const role =
user.role_name === "Model"
? "model"
: user.role_name === "Administrator"
? "admin"
: "viewer";
// Fetch tags from custom user fields if they exist
let tags: string[] = [];
try {
const tagsRes = await query("SELECT tags FROM directus_users WHERE id = $1", [user.id]);
if (tagsRes.rows[0]?.tags) {
tags = Array.isArray(tagsRes.rows[0].tags)
? tagsRes.rows[0].tags
: JSON.parse(String(tagsRes.rows[0].tags || "[]"));
}
} catch {
/* tags column may not exist on older Directus installs */
}
await query(
`INSERT INTO users (id, email, password_hash, first_name, last_name, artist_name, slug,
description, tags, role, avatar, email_verified, date_created)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)
ON CONFLICT (id) DO NOTHING`,
[
user.id,
user.email,
user.password || "MIGRATED_NO_PASSWORD",
user.first_name,
user.last_name,
user.artist_name,
user.slug,
user.description,
tags,
role,
user.avatar,
true,
user.date_created,
],
);
migrated++;
}
console.log(` ✅ Users: ${migrated} migrated`);
}
async function migrateUserPhotos() {
console.log("🖼️ Migrating user photos...");
const { rows } = await query(
`SELECT directus_users_id as user_id, directus_files_id as file_id
FROM junction_directus_users_files`,
);
let migrated = 0;
for (const row of rows) {
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
const fileExists = await query("SELECT id FROM files WHERE id = $1", [row.file_id]);
if (!userExists.rows.length || !fileExists.rows.length) continue;
await query(
`INSERT INTO user_photos (user_id, file_id, sort) VALUES ($1, $2, $3)
ON CONFLICT DO NOTHING`,
[row.user_id, row.file_id, 0],
);
migrated++;
}
console.log(` ✅ User photos: ${migrated} migrated`);
}
async function migrateArticles() {
console.log("📰 Migrating articles...");
const { rows } = await query(
`SELECT id, slug, title, excerpt, content, image, tags, publish_date,
author, category, featured, date_created, date_updated
FROM sexy_articles`,
);
let migrated = 0;
for (const article of rows) {
await query(
`INSERT INTO articles (id, slug, title, excerpt, content, image, tags, publish_date,
author, category, featured, date_created, date_updated)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)
ON CONFLICT (id) DO NOTHING`,
[
article.id,
article.slug,
article.title,
article.excerpt,
article.content,
article.image,
Array.isArray(article.tags) ? article.tags : JSON.parse(String(article.tags || "[]")),
article.publish_date,
article.author,
article.category,
article.featured,
article.date_created,
article.date_updated,
],
);
migrated++;
}
console.log(` ✅ Articles: ${migrated} migrated`);
}
async function migrateVideos() {
console.log("🎬 Migrating videos...");
const { rows } = await query(
`SELECT id, slug, title, description, image, movie, tags, upload_date,
premium, featured
FROM sexy_videos`,
);
let migrated = 0;
for (const video of rows) {
await query(
`INSERT INTO videos (id, slug, title, description, image, movie, tags, upload_date,
premium, featured, likes_count, plays_count)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12)
ON CONFLICT (id) DO NOTHING`,
[
video.id,
video.slug,
video.title,
video.description,
video.image,
video.movie,
Array.isArray(video.tags) ? video.tags : JSON.parse(String(video.tags || "[]")),
video.upload_date,
video.premium,
video.featured,
0,
0,
],
);
migrated++;
}
console.log(` ✅ Videos: ${migrated} migrated`);
}
async function migrateVideoModels() {
console.log("🔗 Migrating video models...");
const { rows } = await query(
`SELECT sexy_videos_id as video_id, directus_users_id as user_id
FROM sexy_videos_models`,
);
let migrated = 0;
for (const row of rows) {
const videoExists = await query("SELECT id FROM videos WHERE id = $1", [row.video_id]);
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
if (!videoExists.rows.length || !userExists.rows.length) continue;
await query(
`INSERT INTO video_models (video_id, user_id) VALUES ($1, $2) ON CONFLICT DO NOTHING`,
[row.video_id, row.user_id],
);
migrated++;
}
console.log(` ✅ Video models: ${migrated} migrated`);
}
async function migrateVideoLikes() {
console.log("❤️ Migrating video likes...");
const { rows } = await query(`SELECT id, video_id, user_id, date_created FROM sexy_video_likes`);
let migrated = 0;
for (const row of rows) {
// Guard against likes pointing at users or videos that were not migrated (FK would reject the insert)
const videoExists = await query("SELECT id FROM videos WHERE id = $1", [row.video_id]);
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
if (!videoExists.rows.length || !userExists.rows.length) continue;
await query(
`INSERT INTO video_likes (id, video_id, user_id, date_created) VALUES ($1, $2, $3, $4)
ON CONFLICT (id) DO NOTHING`,
[row.id, row.video_id, row.user_id, row.date_created],
);
migrated++;
}
console.log(` ✅ Video likes: ${migrated} migrated`);
}
async function migrateVideoPlays() {
console.log("▶️ Migrating video plays...");
const { rows } = await query(
`SELECT id, video_id, user_id, session_id, duration_watched, completed, date_created
FROM sexy_video_plays`,
);
let migrated = 0;
for (const row of rows) {
await query(
`INSERT INTO video_plays (id, video_id, user_id, session_id, duration_watched, completed, date_created)
VALUES ($1, $2, $3, $4, $5, $6, $7)
ON CONFLICT (id) DO NOTHING`,
[
row.id,
row.video_id,
row.user_id,
row.session_id,
row.duration_watched,
row.completed,
row.date_created,
],
);
migrated++;
}
console.log(` ✅ Video plays: ${migrated} migrated`);
}
async function migrateRecordings() {
console.log("🎙️ Migrating recordings...");
const { rows } = await query(
`SELECT id, title, description, slug, duration, events, device_info,
user_created as user_id, status, tags, linked_video, public,
original_recording_id, date_created, date_updated
FROM sexy_recordings`,
);
let migrated = 0;
for (const recording of rows) {
await query(
`INSERT INTO recordings (id, title, description, slug, duration, events, device_info,
user_id, status, tags, linked_video, public,
original_recording_id, date_created, date_updated)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15)
ON CONFLICT (id) DO NOTHING`,
[
recording.id,
recording.title,
recording.description,
recording.slug,
recording.duration != null ? Math.round(Number(recording.duration)) : 0, // duration is NOT NULL in the new schema
typeof recording.events === "string" ? recording.events : JSON.stringify(recording.events),
typeof recording.device_info === "string"
? recording.device_info
: JSON.stringify(recording.device_info),
recording.user_id,
recording.status,
Array.isArray(recording.tags) ? recording.tags : JSON.parse(String(recording.tags || "[]")),
recording.linked_video,
recording.public,
recording.original_recording_id,
recording.date_created,
recording.date_updated,
],
);
migrated++;
}
console.log(` ✅ Recordings: ${migrated} migrated`);
}
async function migrateRecordingPlays() {
console.log("▶️ Migrating recording plays...");
const { rows } = await query(
`SELECT id, user_id, recording_id, duration_played, completed, date_created
FROM sexy_recording_plays`,
);
let migrated = 0;
for (const row of rows) {
await query(
`INSERT INTO recording_plays (id, recording_id, user_id, duration_played, completed, date_created)
VALUES ($1, $2, $3, $4, $5, $6)
ON CONFLICT (id) DO NOTHING`,
[row.id, row.recording_id, row.user_id, row.duration_played, row.completed, row.date_created],
);
migrated++;
}
console.log(` ✅ Recording plays: ${migrated} migrated`);
}
async function migrateComments() {
console.log("💬 Migrating comments...");
const { rows } = await query(
`SELECT id, collection, item, comment, user_created as user_id, date_created
FROM directus_comments
WHERE collection IN ('sexy_videos', 'sexy_recordings')`,
);
let migrated = 0;
for (const row of rows) {
// Skip comments whose author was not migrated (user_id FK would reject the insert)
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
if (!userExists.rows.length) continue;
// Map collection names
const collection = row.collection === "sexy_videos" ? "videos" : "recordings";
await query(
`INSERT INTO comments (collection, item_id, comment, user_id, date_created)
VALUES ($1, $2, $3, $4, $5)`,
[collection, row.item, row.comment, row.user_id, row.date_created],
);
migrated++;
}
console.log(` ✅ Comments: ${migrated} migrated`);
}
async function migrateAchievements() {
console.log("🏆 Migrating achievements...");
const { rows } = await query(
`SELECT id, code, name, description, icon, category, required_count, points_reward, status, sort
FROM sexy_achievements`,
);
let migrated = 0;
for (const row of rows) {
await query(
`INSERT INTO achievements (id, code, name, description, icon, category, required_count, points_reward, status, sort)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10)
ON CONFLICT (id) DO NOTHING`,
[
row.id,
row.code,
row.name,
row.description,
row.icon,
row.category,
row.required_count,
row.points_reward,
row.status,
row.sort,
],
);
migrated++;
}
console.log(` ✅ Achievements: ${migrated} migrated`);
}
async function migrateUserAchievements() {
console.log("🎖️ Migrating user achievements...");
const { rows } = await query(
`SELECT user_id, achievement_id, progress, date_unlocked FROM sexy_user_achievements`,
);
let migrated = 0;
for (const row of rows) {
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
const achievementExists = await query("SELECT id FROM achievements WHERE id = $1", [
row.achievement_id,
]);
if (!userExists.rows.length || !achievementExists.rows.length) continue;
await query(
`INSERT INTO user_achievements (user_id, achievement_id, progress, date_unlocked)
VALUES ($1, $2, $3, $4)
ON CONFLICT (user_id, achievement_id) DO NOTHING`,
[row.user_id, row.achievement_id, row.progress, row.date_unlocked],
);
migrated++;
}
console.log(` ✅ User achievements: ${migrated} migrated`);
}
async function migrateUserPoints() {
console.log("💎 Migrating user points...");
const { rows } = await query(
`SELECT user_id, action, points, recording_id, date_created FROM sexy_user_points`,
);
let migrated = 0;
for (const row of rows) {
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
if (!userExists.rows.length) continue;
await query(
`INSERT INTO user_points (user_id, action, points, recording_id, date_created)
VALUES ($1, $2, $3, $4, $5)`,
[row.user_id, row.action, row.points, row.recording_id, row.date_created],
);
migrated++;
}
console.log(` ✅ User points: ${migrated} migrated`);
}
async function migrateUserStats() {
console.log("📊 Migrating user stats...");
const { rows } = await query(
`SELECT user_id, total_raw_points, total_weighted_points, recordings_count,
playbacks_count, comments_count, achievements_count, last_updated
FROM sexy_user_stats`,
);
let migrated = 0;
for (const row of rows) {
const userExists = await query("SELECT id FROM users WHERE id = $1", [row.user_id]);
if (!userExists.rows.length) continue;
await query(
`INSERT INTO user_stats (user_id, total_raw_points, total_weighted_points, recordings_count,
playbacks_count, comments_count, achievements_count, last_updated)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
ON CONFLICT (user_id) DO NOTHING`,
[
row.user_id,
row.total_raw_points,
row.total_weighted_points,
row.recordings_count,
row.playbacks_count,
row.comments_count,
row.achievements_count,
row.last_updated,
],
);
migrated++;
}
console.log(` ✅ User stats: ${migrated} migrated`);
}
async function main() {
console.log("🚀 Starting data migration from Directus to custom backend...\n");
try {
// Verify connection
await query("SELECT 1");
console.log("✅ Database connected\n");
// Migration order respects FK dependencies
await migrateFiles();
await migrateUsers();
await migrateUserPhotos();
await migrateArticles();
await migrateVideos();
await migrateVideoModels();
await migrateVideoLikes();
await migrateVideoPlays();
await migrateRecordings();
await migrateRecordingPlays();
await migrateComments();
await migrateAchievements();
await migrateUserAchievements();
await migrateUserPoints();
await migrateUserStats();
console.log("\n🎉 Migration complete!");
} catch (error) {
console.error("❌ Migration failed:", error);
process.exit(1);
} finally {
await pool.end();
}
}
main();


@@ -0,0 +1,27 @@
import { Pool } from "pg";
import { drizzle } from "drizzle-orm/node-postgres";
import { migrate } from "drizzle-orm/node-postgres/migrator";
import path from "path";
const pool = new Pool({
connectionString: process.env.DATABASE_URL || "postgresql://sexy:sexy@localhost:5432/sexy",
});
const db = drizzle(pool);
async function main() {
console.log("Running schema migrations...");
// In dev (tsx): __dirname = src/scripts → migrations are at src/migrations
// In prod (node dist): __dirname = dist/scripts → migrations are at ../../migrations (package root)
const migrationsFolder = __dirname.includes("/src/")
? path.join(__dirname, "../migrations")
: path.join(__dirname, "../../migrations");
await migrate(db, { migrationsFolder });
console.log("Schema migrations complete.");
await pool.end();
}
main().catch((err) => {
console.error("Migration failed:", err);
process.exit(1);
});


@@ -0,0 +1,20 @@
{
"compilerOptions": {
"target": "ES2022",
"module": "CommonJS",
"moduleResolution": "Node",
"lib": ["ES2022"],
"outDir": "./dist",
"rootDir": "./src",
"strict": true,
"skipLibCheck": true,
"esModuleInterop": true,
"experimentalDecorators": true,
"emitDecoratorMetadata": true,
"declaration": true,
"declarationMap": true,
"sourceMap": true
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist"]
}


@@ -1,54 +0,0 @@
{
"name": "@sexy.pivoine.art/bundle",
"description": "Please enter a description for your extension",
"icon": "extension",
"version": "1.0.0",
"keywords": [
"directus",
"directus-extension",
"directus-extension-bundle"
],
"type": "module",
"files": [
"dist"
],
"directus:extension": {
"type": "bundle",
"path": {
"app": "dist/app.js",
"api": "dist/api.js"
},
"entries": [
{
"name": "endpoint",
"type": "endpoint",
"source": "src/endpoint"
},
{
"name": "hook",
"type": "hook",
"source": "src/hook"
},
{
"name": "theme",
"type": "theme",
"source": "src/theme"
}
],
"host": "^11.11.0"
},
"scripts": {
"build": "directus-extension build",
"dev": "directus-extension build -w --no-minify",
"link": "directus-extension link",
"validate": "directus-extension validate",
"add": "directus-extension add"
},
"devDependencies": {
"@directus/extensions-sdk": "16.0.2"
},
"dependencies": {
"@sindresorhus/slugify": "^3.0.0",
"fluent-ffmpeg": "^2.1.3"
}
}


@@ -1,336 +0,0 @@
/**
* Gamification Helper Functions
* Handles points, achievements, and user stats for recording-focused gamification system
*/
import type { Knex } from "knex";
/**
* Point values for different actions
*/
export const POINT_VALUES = {
RECORDING_CREATE: 50,
RECORDING_PLAY: 10,
RECORDING_COMPLETE: 5,
COMMENT_CREATE: 5,
RECORDING_FEATURED: 100,
} as const;
/**
* Time decay constant for weighted scoring
* λ = 0.005 means ~14% decay per month
*/
const DECAY_LAMBDA = 0.005;
/**
* Award points to a user for a specific action
*/
export async function awardPoints(
database: Knex,
userId: string,
action: keyof typeof POINT_VALUES,
recordingId?: string,
): Promise<void> {
const points = POINT_VALUES[action];
await database("sexy_user_points").insert({
user_id: userId,
action,
points,
recording_id: recordingId || null,
date_created: new Date(),
});
// Update cached stats
await updateUserStats(database, userId);
}
/**
* Calculate time-weighted score using exponential decay
* Score = Σ (points × e^(-λ × age_in_days))
*/
export async function calculateWeightedScore(
database: Knex,
userId: string,
): Promise<number> {
const now = new Date();
const result = await database("sexy_user_points")
.where({ user_id: userId })
.select(
database.raw(`
SUM(
points * EXP(-${DECAY_LAMBDA} * EXTRACT(EPOCH FROM (? - date_created)) / 86400)
) as weighted_score
`, [now]),
);
return Number(result[0]?.weighted_score) || 0; // pg returns numeric aggregates as strings
}
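A quick numeric check of the decay comment above (`retainAfterDays` is illustrative only): with λ = 0.005 per day, a point retains e^(−0.005·30) ≈ 0.861 of its value after 30 days, i.e. roughly 14% decay per month.

```typescript
// Fraction of a point's value retained after `days` days under exponential decay.
const LAMBDA = 0.005;
const retainAfterDays = (days: number): number => Math.exp(-LAMBDA * days);

console.log(retainAfterDays(30).toFixed(3)); // → "0.861" (~14% monthly decay)
```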
/**
* Update or create user stats cache
*/
export async function updateUserStats(database: Knex, userId: string): Promise<void> {
const now = new Date();
// Calculate raw points
const rawPointsResult = await database("sexy_user_points")
.where({ user_id: userId })
.sum("points as total");
const totalRawPoints = rawPointsResult[0]?.total || 0;
// Calculate weighted points
const totalWeightedPoints = await calculateWeightedScore(database, userId);
// Get recordings count
const recordingsResult = await database("sexy_recordings")
.where({ user_created: userId, status: "published" })
.count("* as count");
const recordingsCount = recordingsResult[0]?.count || 0;
// Get playbacks count (excluding own recordings)
const playbacksResult = await database("sexy_recording_plays")
.where({ user_id: userId })
.whereNotIn("recording_id", function () {
this.select("id").from("sexy_recordings").where("user_created", userId);
})
.count("* as count");
const playbacksCount = playbacksResult[0]?.count || 0;
// Get comments count (on recordings only)
const commentsResult = await database("comments")
.where({ user_created: userId, collection: "sexy_recordings" })
.count("* as count");
const commentsCount = commentsResult[0]?.count || 0;
// Get achievements count
const achievementsResult = await database("sexy_user_achievements")
.where({ user_id: userId })
.whereNotNull("date_unlocked")
.count("* as count");
const achievementsCount = achievementsResult[0]?.count || 0;
// Upsert stats
const existing = await database("sexy_user_stats")
.where({ user_id: userId })
.first();
if (existing) {
await database("sexy_user_stats")
.where({ user_id: userId })
.update({
total_raw_points: totalRawPoints,
total_weighted_points: totalWeightedPoints,
recordings_count: recordingsCount,
playbacks_count: playbacksCount,
comments_count: commentsCount,
achievements_count: achievementsCount,
last_updated: now,
});
} else {
await database("sexy_user_stats").insert({
user_id: userId,
total_raw_points: totalRawPoints,
total_weighted_points: totalWeightedPoints,
recordings_count: recordingsCount,
playbacks_count: playbacksCount,
comments_count: commentsCount,
achievements_count: achievementsCount,
last_updated: now,
});
}
}
/**
* Check and update achievement progress for a user
*/
export async function checkAchievements(
database: Knex,
userId: string,
category?: string,
): Promise<void> {
// Get all achievements (optionally filtered by category)
let achievementsQuery = database("sexy_achievements")
.where({ status: "published" });
if (category) {
achievementsQuery = achievementsQuery.where({ category });
}
const achievements = await achievementsQuery;
for (const achievement of achievements) {
const progress = await getAchievementProgress(database, userId, achievement);
// Check if already unlocked
const existing = await database("sexy_user_achievements")
.where({ user_id: userId, achievement_id: achievement.id })
.first();
const isUnlocked = progress >= achievement.required_count;
const wasUnlocked = existing != null && existing.date_unlocked !== null;
if (existing) {
// Update progress
await database("sexy_user_achievements")
.where({ user_id: userId, achievement_id: achievement.id })
.update({
progress,
date_unlocked: isUnlocked ? (existing.date_unlocked || new Date()) : null,
});
} else {
// Insert new progress
await database("sexy_user_achievements").insert({
user_id: userId,
achievement_id: achievement.id,
progress,
date_unlocked: isUnlocked ? new Date() : null,
});
}
// Award bonus points if newly unlocked
if (isUnlocked && !wasUnlocked && achievement.points_reward > 0) {
await database("sexy_user_points").insert({
user_id: userId,
action: `ACHIEVEMENT_${achievement.code}`,
points: achievement.points_reward,
recording_id: null,
date_created: new Date(),
});
// Refresh stats after awarding bonus
await updateUserStats(database, userId);
}
}
}
/**
* Get progress for a specific achievement
*/
async function getAchievementProgress(
database: Knex,
userId: string,
achievement: any,
): Promise<number> {
const { code } = achievement;
// Recordings achievements
if (code === "first_recording" || code === "recording_10" || code === "recording_50" || code === "recording_100") {
const result = await database("sexy_recordings")
.where({ user_created: userId, status: "published" })
.count("* as count");
return result[0]?.count || 0;
}
// Featured recording
if (code === "featured_recording") {
const result = await database("sexy_recordings")
.where({ user_created: userId, status: "published", featured: true })
.count("* as count");
return result[0]?.count || 0;
}
// Playback achievements (excluding own recordings)
if (code === "first_play" || code === "play_100" || code === "play_500") {
const result = await database("sexy_recording_plays as rp")
.leftJoin("sexy_recordings as r", "rp.recording_id", "r.id")
.where({ "rp.user_id": userId })
.where("r.user_created", "!=", userId)
.count("* as count");
return result[0]?.count || 0;
}
// Completionist achievements
if (code === "completionist_10" || code === "completionist_100") {
const result = await database("sexy_recording_plays")
.where({ user_id: userId, completed: true })
.count("* as count");
return result[0]?.count || 0;
}
// Social achievements
if (code === "first_comment" || code === "comment_50" || code === "comment_250") {
const result = await database("comments")
.where({ user_created: userId, collection: "sexy_recordings" })
.count("* as count");
return result[0]?.count || 0;
}
// Special: Early adopter (joined in first month)
if (code === "early_adopter") {
const user = await database("directus_users")
.where({ id: userId })
.first();
if (user) {
const joinDate = new Date(user.date_created);
const platformLaunch = new Date("2025-01-01"); // Adjust to actual launch date
const oneMonthAfterLaunch = new Date(platformLaunch);
oneMonthAfterLaunch.setMonth(oneMonthAfterLaunch.getMonth() + 1);
return joinDate <= oneMonthAfterLaunch ? 1 : 0;
}
}
// Special: One year anniversary
if (code === "one_year") {
const user = await database("directus_users")
.where({ id: userId })
.first();
if (user) {
const joinDate = new Date(user.date_created);
const oneYearAgo = new Date();
oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);
return joinDate <= oneYearAgo ? 1 : 0;
}
}
// Special: Balanced creator (50 recordings + 100 plays)
if (code === "balanced_creator") {
const recordings = await database("sexy_recordings")
.where({ user_created: userId, status: "published" })
.count("* as count");
const plays = await database("sexy_recording_plays as rp")
.leftJoin("sexy_recordings as r", "rp.recording_id", "r.id")
.where({ "rp.user_id": userId })
.where("r.user_created", "!=", userId)
.count("* as count");
const recordingsCount = recordings[0]?.count || 0;
const playsCount = plays[0]?.count || 0;
return (recordingsCount >= 50 && playsCount >= 100) ? 1 : 0;
}
// Special: Top 10 rank
if (code === "top_10_rank") {
const userStats = await database("sexy_user_stats")
.where({ user_id: userId })
.first();
if (!userStats) return 0;
const rank = await database("sexy_user_stats")
.where("total_weighted_points", ">", userStats.total_weighted_points)
.count("* as count");
const userRank = (rank[0]?.count || 0) + 1;
return userRank <= 10 ? 1 : 0;
}
return 0;
}
/**
* Recalculate all weighted scores (for cron job)
*/
export async function recalculateAllWeightedScores(database: Knex): Promise<void> {
const users = await database("sexy_user_stats").select("user_id");
for (const user of users) {
await updateUserStats(database, user.user_id);
}
}
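The exponential time decay used in `calculateWeightedScore` can be checked in isolation. The sketch below is a hypothetical standalone illustration (the `decayFactor` helper is not part of the extension); it shows that λ = 0.005 per day yields roughly 14% decay over a 30-day month, so recent activity dominates the leaderboard:

```typescript
// Standalone sketch of the exponential time decay from calculateWeightedScore.
// decayFactor is a hypothetical helper for illustration only.
const DECAY_LAMBDA = 0.005; // per day, matching the extension's constant

function decayFactor(ageInDays: number): number {
  // e^(-λ × age_in_days), mirroring the SQL expression
  // EXP(-λ * EXTRACT(EPOCH FROM (now - date_created)) / 86400)
  return Math.exp(-DECAY_LAMBDA * ageInDays);
}

// A 50-point RECORDING_CREATE entry from 30 days ago:
const weighted = 50 * decayFactor(30);
console.log(decayFactor(30).toFixed(3)); // ≈ 0.861, i.e. ~14% decay per month
console.log(weighted.toFixed(1));
```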

File diff suppressed because it is too large


@@ -1,145 +0,0 @@
import { createRequire } from "module";
global.require = createRequire(import.meta.url);
import { defineHook } from "@directus/extensions-sdk";
import slugify from "@sindresorhus/slugify";
import ffmpeg from "fluent-ffmpeg";
import { awardPoints, checkAchievements } from "../endpoint/gamification.js";
async function processVideo(
meta,
{ schema, accountability },
services,
logger,
) {
const { FilesService } = services;
const itemId = meta.key;
const videoPath = `/directus/uploads/${meta.payload.filename_disk}`; // Adjust path as needed
const videoService = new FilesService({ schema, accountability });
try {
const durationInSeconds = await new Promise((resolve, reject) => {
ffmpeg.ffprobe(videoPath, function (err, metadata) {
if (err) {
reject(err);
return;
}
resolve(parseInt(metadata.format.duration));
});
});
// Update the item with the duration
await videoService.updateOne(itemId, { duration: durationInSeconds });
logger.info(`Video ${itemId} duration updated to ${durationInSeconds}`);
} catch (error) {
logger.error(`Error processing video ${itemId}:`, error);
}
}
export default defineHook(async ({ filter, action }, { services, logger, database, getSchema }) => {
action("files.upload", async (meta, context) => {
await processVideo(meta, context, services, logger);
});
filter(
"users.create",
(payload: {
first_name: string;
last_name: string;
artist_name: string;
slug: string;
}) => {
const artist_name = `${payload.first_name}-${new Date().getTime()}`;
const slug = slugify(artist_name);
const join_date = new Date();
return { ...payload, artist_name, slug, join_date };
},
);
filter(
"users.update",
(payload: {
first_name: string;
last_name: string;
artist_name: string;
slug: string;
}) => {
if (payload.artist_name) {
const slug = slugify(payload.artist_name);
return { ...payload, slug };
}
return payload;
},
);
// =========================================
// GAMIFICATION HOOKS
// =========================================
// Hook: Award points when recording is published
action("items.create", async (meta, { collection, accountability }) => {
if (collection === "sexy_recordings") {
const { payload, key } = meta;
// Award points if recording is published
if (payload.status === "published" && accountability?.user) {
try {
await awardPoints(database, accountability.user, "RECORDING_CREATE", key);
await checkAchievements(database, accountability.user, "recordings");
logger.info(`Awarded RECORDING_CREATE points to user ${accountability.user}`);
} catch (error) {
logger.error("Failed to award recording creation points:", error);
}
}
}
});
// Hook: Award points when recording status changes to published or featured
action("items.update", async (meta, { collection, accountability, schema }) => {
if (collection === "sexy_recordings") {
const { payload, keys } = meta;
try {
const { ItemsService } = services;
const recordingsService = new ItemsService("sexy_recordings", {
schema: await getSchema(),
});
for (const key of keys) {
const recording = await recordingsService.readOne(key);
// Award points if status changed from non-published to published
if (payload.status === "published" && recording.status !== "published" && recording.user_created) {
await awardPoints(database, recording.user_created, "RECORDING_CREATE", key);
await checkAchievements(database, recording.user_created, "recordings");
logger.info(`Awarded RECORDING_CREATE points to user ${recording.user_created}`);
}
// Award bonus points if recording becomes featured
if (payload.featured === true && !recording.featured && recording.user_created) {
await awardPoints(database, recording.user_created, "RECORDING_FEATURED", key);
await checkAchievements(database, recording.user_created, "recordings");
logger.info(`Awarded RECORDING_FEATURED points to user ${recording.user_created}`);
}
}
} catch (error) {
logger.error("Failed to award recording update points:", error);
}
}
});
// Hook: Award points when user creates a comment on a recording
action("comments.create", async (meta, { accountability }) => {
if (!accountability?.user) return;
try {
const { payload } = meta;
// Check if comment is on a recording
if (payload.collection === "sexy_recordings") {
await awardPoints(database, accountability.user, "COMMENT_CREATE");
await checkAchievements(database, accountability.user, "social");
logger.info(`Awarded COMMENT_CREATE points to user ${accountability.user}`);
}
} catch (error) {
logger.error("Failed to award comment points:", error);
}
});
});


@@ -1,130 +0,0 @@
import { defineTheme } from "@directus/extensions-sdk";
import "./style.css";
export default defineTheme({
id: "@sexy.pivoine.art/theme",
name: "Sexy.Art Dark",
appearance: "dark",
rules: {
borderRadius: "6px",
borderWidth: "2px",
foreground: "#c9d1d9",
foregroundSubdued: "#666672",
foregroundAccent: "#f0f6fc",
background: "#0D1117",
backgroundNormal: "#21262E",
backgroundAccent: "#30363D",
backgroundSubdued: "#161B22",
borderColor: "#21262E",
borderColorAccent: "#30363D",
borderColorSubdued: "#161B22",
primary: "#ce47eb",
secondary: "#613dff",
success: "#87ff66",
warning: "#ffbf66",
danger: "#ff6467",
navigation: {
background: "#21262E",
backgroundAccent: "#30363D",
borderWidth: "0px",
borderColor: "transparent",
project: {
background: "#30363D",
borderWidth: "0px",
borderColor: "transparent",
},
modules: {
borderWidth: "0px",
borderColor: "transparent",
button: {
foregroundHover: "#fff",
background: "transparent",
backgroundHover: "transparent",
backgroundActive: "#21262E",
},
},
list: {
background: "transparent",
backgroundHover: "#30363D",
backgroundActive: "#30363D",
divider: {
borderColor: "#30363D",
},
},
},
header: {
borderWidth: "0px",
borderColor: "transparent",
boxShadow: "0 4px 7px -4px black",
},
form: {
columnGap: "32px",
rowGap: "40px",
field: {
label: {
fontWeight: "600",
},
input: {
borderColor: "#21262E",
borderColorHover: "#30363D",
boxShadow: "none",
boxShadowHover: "none",
height: "60px",
padding: "16px",
},
},
},
sidebar: {
background: "#21262E",
borderWidth: "0px",
borderColor: "transparent",
section: {
toggle: {
background: "#30363D",
borderWidth: "0px",
borderColor: "transparent",
},
form: {
field: {
input: {
height: "52px",
padding: "12px",
},
},
},
},
},
public: {
art: {
background: "#21262E",
speed: "1",
},
},
popover: {
menu: {
background: "#30363D",
boxShadow: "0px 0px 6px 0px black",
},
},
banner: {
background: "#161B22",
padding: "40px",
avatar: {
background: "#fff",
borderRadius: "50%",
},
headline: {
foreground: "#fff",
},
title: {
foreground: "#fff",
},
subtitle: {
foreground: "#969696",
},
art: {
foreground: "#21262E",
},
},
},
});


@@ -1,29 +0,0 @@
{
"compilerOptions": {
"target": "ES2022",
"lib": ["ES2022", "DOM"],
"module": "ES2022",
"moduleResolution": "node",
"strict": false,
"noFallthroughCasesInSwitch": true,
"esModuleInterop": true,
"noImplicitAny": false,
"noImplicitThis": true,
"noImplicitReturns": true,
"noUnusedLocals": true,
"noUncheckedIndexedAccess": true,
"noUnusedParameters": true,
"alwaysStrict": true,
"strictNullChecks": true,
"strictFunctionTypes": true,
"strictBindCallApply": true,
"strictPropertyInitialization": true,
"resolveJsonModule": false,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"allowSyntheticDefaultImports": true,
"isolatedModules": true,
"allowJs": true
},
"include": ["./src/**/*.ts"]
}


@@ -177,7 +177,7 @@ checksum = "46c5e41b57b8bba42a04676d81cb89e9ee8e859a1a66f80a5a72e1cb76b34d43"
[[package]]
name = "buttplug_core"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"async-stream",
"cfg-if",
@@ -203,7 +203,7 @@ dependencies = [
[[package]]
name = "buttplug_server"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"aes",
"async-trait",
@@ -243,8 +243,8 @@ dependencies = [
[[package]]
name = "buttplug_server_device_config"
version = "10.0.0"
source = "git+https://github.com/valknarthing/buttplug.git#c569409c51ad15f343c3f97a57711cdaa358f2ea"
version = "10.0.1"
source = "git+https://github.com/valknarthing/buttplug.git?rev=fad6c9d#fad6c9d97895218b01ceb55fd4a872a89043194a"
dependencies = [
"buttplug_core",
"dashmap",
@@ -913,9 +913,9 @@ checksum = "4a5f13b858c8d314ee3e8f639011f7ccefe71f97f96e50151fb991f267928e2c"
[[package]]
name = "js-sys"
version = "0.3.80"
version = "0.3.87"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "852f13bec5eba4ba9afbeb93fd7c13fe56147f055939ae21c43a29a0ecb2702e"
checksum = "93f0862381daaec758576dcc22eb7bbf4d7efd67328553f3b45a412a51a3fb21"
dependencies = [
"once_cell",
"wasm-bindgen",
@@ -1860,9 +1860,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab10a69fbd0a177f5f649ad4d8d3305499c42bab9aef2f7ff592d0ec8f833819"
checksum = "1de241cdc66a9d91bd84f097039eb140cdc6eec47e0cdbaf9d932a1dd6c35866"
dependencies = [
"cfg-if",
"once_cell",
@@ -1873,27 +1873,14 @@ dependencies = [
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-backend"
version = "0.2.103"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0bb702423545a6007bbc368fde243ba47ca275e549c8a28617f56f6ba53b1d1c"
dependencies = [
"bumpalo",
"log",
"proc-macro2",
"quote",
"syn",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-futures"
version = "0.4.53"
version = "0.4.60"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a0b221ff421256839509adbb55998214a70d829d3a28c69b4a6672e9d2a42f67"
checksum = "a42e96ea38f49b191e08a1bab66c7ffdba24b06f9995b39a9dd60222e5b6f1da"
dependencies = [
"cfg-if",
"futures-util",
"js-sys",
"once_cell",
"wasm-bindgen",
@@ -1902,9 +1889,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fc65f4f411d91494355917b605e1480033152658d71f722a90647f56a70c88a0"
checksum = "e12fdf6649048f2e3de6d7d5ff3ced779cdedee0e0baffd7dff5cdfa3abc8a52"
dependencies = [
"quote",
"wasm-bindgen-macro-support",
@@ -1912,22 +1899,22 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro-support"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ffc003a991398a8ee604a401e194b6b3a39677b3173d6e74495eb51b82e99a32"
checksum = "0e63d1795c565ac3462334c1e396fd46dbf481c40f51f5072c310717bc4fb309"
dependencies = [
"bumpalo",
"proc-macro2",
"quote",
"syn",
"wasm-bindgen-backend",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-shared"
version = "0.2.103"
version = "0.2.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "293c37f4efa430ca14db3721dfbe48d8c33308096bd44d80ebaa775ab71ba1cf"
checksum = "e9f9cdac23a5ce71f6bf9f8824898a501e511892791ea2a0c6b8568c68b9cb53"
dependencies = [
"unicode-ident",
]
@@ -1948,9 +1935,9 @@ dependencies = [
[[package]]
name = "web-sys"
version = "0.3.80"
version = "0.3.87"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fbe734895e869dc429d78c4b433f8d17d95f8d05317440b4fad5ab2d33e596dc"
checksum = "f2c7c5718134e770ee62af3b6b4a84518ec10101aad610c024b64d6ff29bb1ff"
dependencies = [
"js-sys",
"wasm-bindgen",


@@ -16,15 +16,15 @@ name = "buttplug_wasm"
path = "src/lib.rs"
[dependencies]
buttplug_core = { git = "https://github.com/valknarthing/buttplug.git", default-features = false, features = ["wasm"] }
buttplug_server = { git = "https://github.com/valknarthing/buttplug.git", default-features = false, features = ["wasm"] }
buttplug_server_device_config = { git = "https://github.com/valknarthing/buttplug.git" }
js-sys = "0.3.80"
buttplug_core = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d", default-features = false, features = ["wasm"] }
buttplug_server = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d", default-features = false, features = ["wasm"] }
buttplug_server_device_config = { git = "https://github.com/valknarthing/buttplug.git", rev = "fad6c9d" }
js-sys = "0.3.87"
tracing-wasm = "0.2.1"
log-panics = { version = "2.1.0", features = ["with-backtrace"] }
console_error_panic_hook = "0.1.7"
wasmtimer = "0.4.3"
wasm-bindgen = { version = "0.2.103", features = ["serde-serialize"] }
wasm-bindgen = { version = "0.2.110", features = ["serde-serialize"] }
tokio = { version = "1.47.1", features = ["sync", "macros", "io-util"] }
tokio-stream = "0.1.17"
tracing = "0.1.41"
@@ -33,12 +33,12 @@ tracing-subscriber = { version = "0.3.20", features = ["json"] }
futures = "0.3.31"
futures-util = "0.3.31"
async-trait = "0.1.89"
wasm-bindgen-futures = "0.4.53"
wasm-bindgen-futures = "0.4.60"
getrandom = { version = "0.3", features = ["wasm_js"] }
parking_lot = { version = "0.11.1", features = ["wasm-bindgen"]}
[dependencies.web-sys]
version = "0.3.80"
version = "0.3.87"
# path = "../../wasm-bindgen/crates/web-sys"
#git = "https://github.com/rustwasm/wasm-bindgen"
features = [


@@ -13,13 +13,13 @@
"build:wasm": "wasm-pack build --out-dir wasm --out-name index --target bundler --release"
},
"dependencies": {
"eventemitter3": "^5.0.1",
"typescript": "^5.9.2",
"vite": "^7.1.4",
"eventemitter3": "^5.0.4",
"typescript": "^5.9.3",
"vite": "^7.3.1",
"vite-plugin-wasm": "3.5.0",
"ws": "^8.18.3"
"ws": "^8.19.0"
},
"devDependencies": {
"wasm-pack": "^0.13.1"
"wasm-pack": "^0.14.0"
}
}


@@ -6,11 +6,11 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
'use strict';
"use strict";
import { IButtplugClientConnector } from './IButtplugClientConnector';
import { ButtplugMessage } from '../core/Messages';
import { ButtplugBrowserWebsocketConnector } from '../utils/ButtplugBrowserWebsocketConnector';
import { IButtplugClientConnector } from "./IButtplugClientConnector";
import { ButtplugMessage } from "../core/Messages";
import { ButtplugBrowserWebsocketConnector } from "../utils/ButtplugBrowserWebsocketConnector";
export class ButtplugBrowserWebsocketClientConnector
extends ButtplugBrowserWebsocketConnector
@@ -18,7 +18,7 @@ export class ButtplugBrowserWebsocketClientConnector
{
public send = (msg: ButtplugMessage): void => {
if (!this.Connected) {
throw new Error('ButtplugClient not connected');
throw new Error("ButtplugClient not connected");
}
this.sendMessage(msg);
};


@@ -6,20 +6,16 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
'use strict';
"use strict";
import { ButtplugLogger } from '../core/Logging';
import { EventEmitter } from 'eventemitter3';
import { ButtplugClientDevice } from './ButtplugClientDevice';
import { IButtplugClientConnector } from './IButtplugClientConnector';
import { ButtplugMessageSorter } from '../utils/ButtplugMessageSorter';
import * as Messages from '../core/Messages';
import {
ButtplugError,
ButtplugInitError,
ButtplugMessageError,
} from '../core/Exceptions';
import { ButtplugClientConnectorException } from './ButtplugClientConnectorException';
import { ButtplugLogger } from "../core/Logging";
import { EventEmitter } from "eventemitter3";
import { ButtplugClientDevice } from "./ButtplugClientDevice";
import { IButtplugClientConnector } from "./IButtplugClientConnector";
import { ButtplugMessageSorter } from "../utils/ButtplugMessageSorter";
import * as Messages from "../core/Messages";
import { ButtplugError, ButtplugInitError, ButtplugMessageError } from "../core/Exceptions";
import { ButtplugClientConnectorException } from "./ButtplugClientConnectorException";
export class ButtplugClient extends EventEmitter {
protected _pingTimer: NodeJS.Timeout | null = null;
@@ -30,7 +26,7 @@ export class ButtplugClient extends EventEmitter {
protected _isScanning = false;
private _sorter: ButtplugMessageSorter = new ButtplugMessageSorter(true);
constructor(clientName = 'Generic Buttplug Client') {
constructor(clientName = "Generic Buttplug Client") {
super();
this._clientName = clientName;
this._logger.Debug(`ButtplugClient: Client ${clientName} created.`);
@@ -52,18 +48,16 @@ export class ButtplugClient extends EventEmitter {
}
public connect = async (connector: IButtplugClientConnector) => {
this._logger.Info(
`ButtplugClient: Connecting using ${connector.constructor.name}`
);
this._logger.Info(`ButtplugClient: Connecting using ${connector.constructor.name}`);
await connector.connect();
this._connector = connector;
this._connector.addListener('message', this.parseMessages);
this._connector.addListener('disconnect', this.disconnectHandler);
this._connector.addListener("message", this.parseMessages);
this._connector.addListener("disconnect", this.disconnectHandler);
await this.initializeConnection();
};
public disconnect = async () => {
this._logger.Debug('ButtplugClient: Disconnect called');
this._logger.Debug("ButtplugClient: Disconnect called");
this._devices.clear();
this.checkConnector();
await this.shutdownConnection();
@@ -71,25 +65,33 @@ export class ButtplugClient extends EventEmitter {
};
public startScanning = async () => {
this._logger.Debug('ButtplugClient: StartScanning called');
this._logger.Debug("ButtplugClient: StartScanning called");
this._isScanning = true;
await this.sendMsgExpectOk({ StartScanning: { Id: 1 } });
};
public stopScanning = async () => {
this._logger.Debug('ButtplugClient: StopScanning called');
this._logger.Debug("ButtplugClient: StopScanning called");
this._isScanning = false;
await this.sendMsgExpectOk({ StopScanning: { Id: 1 } });
};
public stopAllDevices = async () => {
this._logger.Debug('ButtplugClient: StopAllDevices');
await this.sendMsgExpectOk({ StopCmd: { Id: 1, DeviceIndex: undefined, FeatureIndex: undefined, Inputs: true, Outputs: true } });
this._logger.Debug("ButtplugClient: StopAllDevices");
await this.sendMsgExpectOk({
StopCmd: {
Id: 1,
DeviceIndex: undefined,
FeatureIndex: undefined,
Inputs: true,
Outputs: true,
},
});
};
protected disconnectHandler = () => {
this._logger.Info('ButtplugClient: Disconnect event received.');
this.emit('disconnect');
this._logger.Info("ButtplugClient: Disconnect event received.");
this.emit("disconnect");
};
protected parseMessages = (msgs: Messages.ButtplugMessage[]) => {
@@ -100,10 +102,10 @@ export class ButtplugClient extends EventEmitter {
break;
} else if (x.ScanningFinished !== undefined) {
this._isScanning = false;
this.emit('scanningfinished', x);
this.emit("scanningfinished", x);
} else if (x.InputReading !== undefined) {
// TODO this should be emitted from the device or feature, not the client
this.emit('inputreading', x);
this.emit("inputreading", x);
} else {
console.log(`Unhandled message: ${x}`);
}
@@ -112,21 +114,17 @@ export class ButtplugClient extends EventEmitter {
protected initializeConnection = async (): Promise<boolean> => {
this.checkConnector();
const msg = await this.sendMessage(
{
const msg = await this.sendMessage({
RequestServerInfo: {
ClientName: this._clientName,
Id: 1,
ProtocolVersionMajor: Messages.MESSAGE_SPEC_VERSION_MAJOR,
ProtocolVersionMinor: Messages.MESSAGE_SPEC_VERSION_MINOR
}
}
);
ProtocolVersionMinor: Messages.MESSAGE_SPEC_VERSION_MINOR,
},
});
if (msg.ServerInfo !== undefined) {
const serverinfo = msg as Messages.ServerInfo;
this._logger.Info(
`ButtplugClient: Connected to Server ${serverinfo.ServerName}`
);
this._logger.Info(`ButtplugClient: Connected to Server ${serverinfo.ServerName}`);
// TODO: maybe store server name, do something with message template version?
const ping = serverinfo.MaxPingTime;
// If the server version is lower than the client version, the server will disconnect here.
@@ -153,22 +151,19 @@ export class ButtplugClient extends EventEmitter {
throw ButtplugError.LogAndError(
ButtplugInitError,
this._logger,
`Cannot connect to server. ${err.ErrorMessage}`
`Cannot connect to server. ${err.ErrorMessage}`,
);
}
return false;
}
};
private parseDeviceList = (list: Messages.DeviceList) => {
for (let [_, d] of Object.entries(list.Devices)) {
if (!this._devices.has(d.DeviceIndex)) {
const device = ButtplugClientDevice.fromMsg(
d,
this.sendMessageClosure
);
const device = ButtplugClientDevice.fromMsg(d, this.sendMessageClosure);
this._logger.Debug(`ButtplugClient: Adding Device: ${device}`);
this._devices.set(d.DeviceIndex, device);
this.emit('deviceadded', device);
this.emit("deviceadded", device);
} else {
this._logger.Debug(`ButtplugClient: Device already added: ${d}`);
}
@@ -176,19 +171,17 @@ export class ButtplugClient extends EventEmitter {
for (let [index, device] of this._devices.entries()) {
if (!list.Devices.hasOwnProperty(index.toString())) {
this._devices.delete(index);
this.emit('deviceremoved', device);
}
this.emit("deviceremoved", device);
}
}
};
protected requestDeviceList = async () => {
this.checkConnector();
this._logger.Debug('ButtplugClient: ReceiveDeviceList called');
const response = (await this.sendMessage(
{
RequestDeviceList: { Id: 1 }
}
));
this._logger.Debug("ButtplugClient: ReceiveDeviceList called");
const response = await this.sendMessage({
RequestDeviceList: { Id: 1 },
});
this.parseDeviceList(response.DeviceList!);
};
@@ -200,9 +193,7 @@ export class ButtplugClient extends EventEmitter {
}
};
protected async sendMessage(
msg: Messages.ButtplugMessage
): Promise<Messages.ButtplugMessage> {
protected async sendMessage(msg: Messages.ButtplugMessage): Promise<Messages.ButtplugMessage> {
this.checkConnector();
const p = this._sorter.PrepareOutgoingMessage(msg);
await this._connector!.send(msg);
@@ -211,15 +202,11 @@ export class ButtplugClient extends EventEmitter {
protected checkConnector() {
if (!this.connected) {
throw new ButtplugClientConnectorException(
'ButtplugClient not connected'
);
throw new ButtplugClientConnectorException("ButtplugClient not connected");
}
}
protected sendMsgExpectOk = async (
msg: Messages.ButtplugMessage
): Promise<void> => {
protected sendMsgExpectOk = async (msg: Messages.ButtplugMessage): Promise<void> => {
const response = await this.sendMessage(msg);
if (response.Ok !== undefined) {
return;
@@ -229,13 +216,13 @@ export class ButtplugClient extends EventEmitter {
throw ButtplugError.LogAndError(
ButtplugMessageError,
this._logger,
`Message ${response} not handled by SendMsgExpectOk`
`Message ${response} not handled by SendMsgExpectOk`,
);
}
};
protected sendMessageClosure = async (
msg: Messages.ButtplugMessage
msg: Messages.ButtplugMessage,
): Promise<Messages.ButtplugMessage> => {
return await this.sendMessage(msg);
};


@@ -6,8 +6,8 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
import { ButtplugError } from '../core/Exceptions';
import * as Messages from '../core/Messages';
import { ButtplugError } from "../core/Exceptions";
import * as Messages from "../core/Messages";
export class ButtplugClientConnectorException extends ButtplugError {
public constructor(message: string) {


@@ -6,22 +6,17 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
'use strict';
import * as Messages from '../core/Messages';
import {
ButtplugDeviceError,
ButtplugError,
ButtplugMessageError,
} from '../core/Exceptions';
import { EventEmitter } from 'eventemitter3';
import { ButtplugClientDeviceFeature } from './ButtplugClientDeviceFeature';
import { DeviceOutputCommand } from './ButtplugClientDeviceCommand';
"use strict";
import * as Messages from "../core/Messages";
import { ButtplugDeviceError, ButtplugError, ButtplugMessageError } from "../core/Exceptions";
import { EventEmitter } from "eventemitter3";
import { ButtplugClientDeviceFeature } from "./ButtplugClientDeviceFeature";
import { DeviceOutputCommand } from "./ButtplugClientDeviceCommand";
/**
* Represents an abstract device, capable of taking certain kinds of messages.
*/
export class ButtplugClientDevice extends EventEmitter {
private _features: Map<number, ButtplugClientDeviceFeature>;
/**
@@ -58,9 +53,7 @@ export class ButtplugClientDevice extends EventEmitter {
public static fromMsg(
msg: Messages.DeviceInfo,
sendClosure: (
msg: Messages.ButtplugMessage
) => Promise<Messages.ButtplugMessage>
sendClosure: (msg: Messages.ButtplugMessage) => Promise<Messages.ButtplugMessage>,
): ButtplugClientDevice {
return new ButtplugClientDevice(msg, sendClosure);
}
@@ -72,25 +65,29 @@ export class ButtplugClientDevice extends EventEmitter {
*/
private constructor(
private _deviceInfo: Messages.DeviceInfo,
private _sendClosure: (
msg: Messages.ButtplugMessage
) => Promise<Messages.ButtplugMessage>
private _sendClosure: (msg: Messages.ButtplugMessage) => Promise<Messages.ButtplugMessage>,
) {
super();
this._features = new Map(Object.entries(_deviceInfo.DeviceFeatures).map(([index, v]) => [parseInt(index), new ButtplugClientDeviceFeature(_deviceInfo.DeviceIndex, _deviceInfo.DeviceName, v, _sendClosure)]));
this._features = new Map(
Object.entries(_deviceInfo.DeviceFeatures).map(([index, v]) => [
parseInt(index),
new ButtplugClientDeviceFeature(
_deviceInfo.DeviceIndex,
_deviceInfo.DeviceName,
v,
_sendClosure,
),
]),
);
}
public async send(
msg: Messages.ButtplugMessage
): Promise<Messages.ButtplugMessage> {
public async send(msg: Messages.ButtplugMessage): Promise<Messages.ButtplugMessage> {
// Assume we're getting the closure from ButtplugClient, which does all of
// the index/existence/connection/message checks for us.
return await this._sendClosure(msg);
}
protected sendMsgExpectOk = async (
msg: Messages.ButtplugMessage
): Promise<void> => {
protected sendMsgExpectOk = async (msg: Messages.ButtplugMessage): Promise<void> => {
const response = await this.send(msg);
if (response.Ok !== undefined) {
return;
@@ -109,19 +106,36 @@ export class ButtplugClientDevice extends EventEmitter {
protected isOutputValid(featureIndex: number, type: Messages.OutputType) {
if (!this._deviceInfo.DeviceFeatures.hasOwnProperty(featureIndex.toString())) {
throw new ButtplugDeviceError(`Feature index ${featureIndex} does not exist for device ${this.name}`);
throw new ButtplugDeviceError(
`Feature index ${featureIndex} does not exist for device ${this.name}`,
);
}
if (this._deviceInfo.DeviceFeatures[featureIndex.toString()].Outputs !== undefined && !this._deviceInfo.DeviceFeatures[featureIndex.toString()].Outputs.hasOwnProperty(type)) {
throw new ButtplugDeviceError(`Feature index ${featureIndex} does not support type ${type} for device ${this.name}`);
if (
this._deviceInfo.DeviceFeatures[featureIndex.toString()].Outputs !== undefined &&
!this._deviceInfo.DeviceFeatures[featureIndex.toString()].Outputs.hasOwnProperty(type)
) {
throw new ButtplugDeviceError(
`Feature index ${featureIndex} does not support type ${type} for device ${this.name}`,
);
}
}
public hasOutput(type: Messages.OutputType): boolean {
return this._features.values().filter((f) => f.hasOutput(type)).toArray().length > 0;
return (
this._features
.values()
.filter((f) => f.hasOutput(type))
.toArray().length > 0
);
}
public hasInput(type: Messages.InputType): boolean {
return this._features.values().filter((f) => f.hasInput(type)).toArray().length > 0;
return (
this._features
.values()
.filter((f) => f.hasInput(type))
.toArray().length > 0
);
}
public async runOutput(cmd: DeviceOutputCommand): Promise<void> {
@@ -138,7 +152,15 @@ export class ButtplugClientDevice extends EventEmitter {
}
public async stop(): Promise<void> {
await this.sendMsgExpectOk({StopCmd: { Id: 1, DeviceIndex: this.index, FeatureIndex: undefined, Inputs: true, Outputs: true}});
await this.sendMsgExpectOk({
StopCmd: {
Id: 1,
DeviceIndex: this.index,
FeatureIndex: undefined,
Inputs: true,
Outputs: true,
},
});
}
public async battery(): Promise<number> {
@@ -160,6 +182,6 @@ export class ButtplugClientDevice extends EventEmitter {
}
public emitDisconnected() {
this.emit('deviceremoved');
this.emit("deviceremoved");
}
}
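The reformatted `stop()` above sends a `StopCmd` covering every feature on the device (`FeatureIndex: undefined`, both `Inputs` and `Outputs` true). A small sketch of building that message shape — field names are taken directly from the diff, but treat the exact wire format as an assumption, not a spec:

```typescript
// Shape of the stop message as it appears in the stop() call above.
interface StopCmdMessage {
  StopCmd: {
    Id: number;
    DeviceIndex: number;
    FeatureIndex?: number; // undefined = apply to all features
    Inputs: boolean;
    Outputs: boolean;
  };
}

// Build a device-wide stop command for a given device index.
function buildStopCmd(deviceIndex: number): StopCmdMessage {
  return {
    StopCmd: {
      Id: 1,
      DeviceIndex: deviceIndex,
      FeatureIndex: undefined,
      Inputs: true,
      Outputs: true,
    },
  };
}
```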

View File

@@ -14,7 +14,7 @@ class PercentOrSteps {
}
public static createSteps(s: number): PercentOrSteps {
let v = new PercentOrSteps;
let v = new PercentOrSteps();
v._steps = s;
return v;
}
@@ -24,7 +24,7 @@ class PercentOrSteps {
throw new ButtplugDeviceError(`Percent value ${p} is not in the range 0.0 <= x <= 1.0`);
}
let v = new PercentOrSteps;
let v = new PercentOrSteps();
v._percent = p;
return v;
}
@@ -35,8 +35,7 @@ export class DeviceOutputCommand {
private _outputType: OutputType,
private _value: PercentOrSteps,
private _duration?: number,
)
{}
) {}
public get outputType() {
return this._outputType;
@@ -52,26 +51,36 @@ export class DeviceOutputCommand {
}
export class DeviceOutputValueConstructor {
public constructor(
private _outputType: OutputType)
{}
public constructor(private _outputType: OutputType) {}
public steps(steps: number): DeviceOutputCommand {
return new DeviceOutputCommand(this._outputType, PercentOrSteps.createSteps(steps), undefined);
}
public percent(percent: number): DeviceOutputCommand {
return new DeviceOutputCommand(this._outputType, PercentOrSteps.createPercent(percent), undefined);
return new DeviceOutputCommand(
this._outputType,
PercentOrSteps.createPercent(percent),
undefined,
);
}
}
export class DeviceOutputPositionWithDurationConstructor {
public steps(steps: number, duration: number): DeviceOutputCommand {
return new DeviceOutputCommand(OutputType.Position, PercentOrSteps.createSteps(steps), duration);
return new DeviceOutputCommand(
OutputType.Position,
PercentOrSteps.createSteps(steps),
duration,
);
}
public percent(percent: number, duration: number): DeviceOutputCommand {
return new DeviceOutputCommand(OutputType.HwPositionWithDuration, PercentOrSteps.createPercent(percent), duration);
return new DeviceOutputCommand(
OutputType.HwPositionWithDuration,
PercentOrSteps.createPercent(percent),
duration,
);
}
}
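The `PercentOrSteps` changes above also fix a subtle bug: `new PercentOrSteps` without parentheses parses, but the explicit `new PercentOrSteps()` is the idiomatic form. The class itself is a small validated-union pattern — a value that is either a raw step count or a 0.0–1.0 percentage, range-checked at construction. A self-contained sketch of that pattern (simplified from the diff, so the accessor is hypothetical):

```typescript
// Either a raw step count or a 0.0–1.0 percentage, validated when created.
class PercentOrSteps {
  private _steps?: number;
  private _percent?: number;

  static createSteps(s: number): PercentOrSteps {
    const v = new PercentOrSteps();
    v._steps = s;
    return v;
  }

  static createPercent(p: number): PercentOrSteps {
    // Reject out-of-range percentages up front, as the diff does.
    if (p < 0 || p > 1) {
      throw new Error(`Percent value ${p} is not in the range 0.0 <= x <= 1.0`);
    }
    const v = new PercentOrSteps();
    v._percent = p;
    return v;
  }

  // Hypothetical accessor for the sketch; the real class keeps both fields private.
  get value(): number {
    return this._steps ?? this._percent ?? 0;
  }
}
```

Private constructors plus static factories keep invalid states unrepresentable: callers cannot construct an unvalidated instance.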

View File

@@ -3,23 +3,18 @@ import * as Messages from "../core/Messages";
import { DeviceOutputCommand } from "./ButtplugClientDeviceCommand";
export class ButtplugClientDeviceFeature {
constructor(
private _deviceIndex: number,
private _deviceName: string,
private _feature: Messages.DeviceFeature,
private _sendClosure: (
msg: Messages.ButtplugMessage
) => Promise<Messages.ButtplugMessage>) {
}
private _sendClosure: (msg: Messages.ButtplugMessage) => Promise<Messages.ButtplugMessage>,
) {}
protected send = async (msg: Messages.ButtplugMessage): Promise<Messages.ButtplugMessage> => {
return await this._sendClosure(msg);
}
};
protected sendMsgExpectOk = async (
msg: Messages.ButtplugMessage
): Promise<void> => {
protected sendMsgExpectOk = async (msg: Messages.ButtplugMessage): Promise<void> => {
const response = await this.send(msg);
if (response.Ok !== undefined) {
return;
@@ -32,13 +27,17 @@ export class ButtplugClientDeviceFeature {
protected isOutputValid(type: Messages.OutputType) {
if (this._feature.Output !== undefined && !this._feature.Output.hasOwnProperty(type)) {
throw new ButtplugDeviceError(`Feature index ${this._feature.FeatureIndex} does not support type ${type} for device ${this._deviceName}`);
throw new ButtplugDeviceError(
`Feature index ${this._feature.FeatureIndex} does not support type ${type} for device ${this._deviceName}`,
);
}
}
protected isInputValid(type: Messages.InputType) {
if (this._feature.Input !== undefined && !this._feature.Input.hasOwnProperty(type)) {
throw new ButtplugDeviceError(`Feature index ${this._feature.FeatureIndex} does not support type ${type} for device ${this._deviceName}`);
throw new ButtplugDeviceError(
`Feature index ${this._feature.FeatureIndex} does not support type ${type} for device ${this._deviceName}`,
);
}
}
@@ -74,8 +73,8 @@ export class ButtplugClientDeviceFeature {
Id: 1,
DeviceIndex: this._deviceIndex,
FeatureIndex: this._feature.FeatureIndex,
Command: outCommand
}
Command: outCommand,
},
};
await this.sendMsgExpectOk(cmd);
}
@@ -124,20 +123,29 @@ export class ButtplugClientDeviceFeature {
return false;
}
public async runOutput(cmd: DeviceOutputCommand): Promise<void> {
if (this._feature.Output !== undefined && this._feature.Output.hasOwnProperty(cmd.outputType.toString())) {
if (
this._feature.Output !== undefined &&
this._feature.Output.hasOwnProperty(cmd.outputType.toString())
) {
return this.sendOutputCmd(cmd);
}
throw new ButtplugDeviceError(`Output type ${cmd.outputType} not supported by feature.`);
}
public async runInput(inputType: Messages.InputType, inputCommand: Messages.InputCommandType): Promise<Messages.InputReading | undefined> {
public async runInput(
inputType: Messages.InputType,
inputCommand: Messages.InputCommandType,
): Promise<Messages.InputReading | undefined> {
// Make sure the requested feature is valid
this.isInputValid(inputType);
let inputAttributes = this._feature.Input[inputType];
console.log(this._feature.Input);
if ((inputCommand === Messages.InputCommandType.Unsubscribe && !inputAttributes.Command.includes(Messages.InputCommandType.Subscribe)) && !inputAttributes.Command.includes(inputCommand)) {
if (
inputCommand === Messages.InputCommandType.Unsubscribe &&
!inputAttributes.Command.includes(Messages.InputCommandType.Subscribe) &&
!inputAttributes.Command.includes(inputCommand)
) {
throw new ButtplugDeviceError(`${inputType} does not support command ${inputCommand}`);
}
@@ -148,7 +156,7 @@ export class ButtplugClientDeviceFeature {
FeatureIndex: this._feature.FeatureIndex,
Type: inputType,
Command: inputCommand,
}
},
};
if (inputCommand == Messages.InputCommandType.Read) {
const response = await this.send(cmd);

View File

@@ -6,12 +6,11 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
'use strict';
"use strict";
import { ButtplugBrowserWebsocketClientConnector } from './ButtplugBrowserWebsocketClientConnector';
import { WebSocket as NodeWebSocket } from 'ws';
import { ButtplugBrowserWebsocketClientConnector } from "./ButtplugBrowserWebsocketClientConnector";
import { WebSocket as NodeWebSocket } from "ws";
export class ButtplugNodeWebsocketClientConnector extends ButtplugBrowserWebsocketClientConnector {
protected _websocketConstructor =
NodeWebSocket as unknown as typeof WebSocket;
protected _websocketConstructor = NodeWebSocket as unknown as typeof WebSocket;
}

View File

@@ -6,8 +6,8 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
import { ButtplugMessage } from '../core/Messages';
import { EventEmitter } from 'eventemitter3';
import { ButtplugMessage } from "../core/Messages";
import { EventEmitter } from "eventemitter3";
export interface IButtplugClientConnector extends EventEmitter {
connect: () => Promise<void>;

View File

@@ -6,8 +6,8 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
import * as Messages from './Messages';
import { ButtplugLogger } from './Logging';
import * as Messages from "./Messages";
import { ButtplugLogger } from "./Logging";
export class ButtplugError extends Error {
public get ErrorClass(): Messages.ErrorClass {
@@ -27,16 +27,16 @@ export class ButtplugError extends Error {
Error: {
Id: this.Id,
ErrorCode: this.ErrorClass,
ErrorMessage: this.message
}
}
ErrorMessage: this.message,
},
};
}
public static LogAndError<T extends ButtplugError>(
constructor: new (str: string, num: number) => T,
logger: ButtplugLogger,
message: string,
id: number = Messages.SYSTEM_MESSAGE_ID
id: number = Messages.SYSTEM_MESSAGE_ID,
): T {
logger.Error(message);
return new constructor(message, id);
@@ -67,7 +67,7 @@ export class ButtplugError extends Error {
message: string,
errorClass: Messages.ErrorClass,
id: number = Messages.SYSTEM_MESSAGE_ID,
inner?: Error
inner?: Error,
) {
super(message);
this.errorClass = errorClass;

View File

@@ -6,7 +6,7 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
import { EventEmitter } from 'eventemitter3';
import { EventEmitter } from "eventemitter3";
export enum ButtplugLogLevel {
Off,
@@ -69,9 +69,7 @@ export class LogMessage {
* Returns a formatted string with timestamp, level, and message.
*/
public get FormattedMessage() {
return `${ButtplugLogLevel[this.logLevel]} : ${this.timestamp} : ${
this.logMessage
}`;
return `${ButtplugLogLevel[this.logLevel]} : ${this.timestamp} : ${this.logMessage}`;
}
}
@@ -176,10 +174,7 @@ export class ButtplugLogger extends EventEmitter {
*/
protected AddLogMessage(msg: string, level: ButtplugLogLevel) {
// If nothing wants the log message we have, ignore it.
if (
level > this.maximumEventLogLevel &&
level > this.maximumConsoleLogLevel
) {
if (level > this.maximumEventLogLevel && level > this.maximumConsoleLogLevel) {
return;
}
const logMsg = new LogMessage(msg, level);
@@ -191,7 +186,7 @@ export class ButtplugLogger extends EventEmitter {
console.log(logMsg.FormattedMessage);
}
if (level <= this.maximumEventLogLevel) {
this.emit('log', logMsg);
this.emit("log", logMsg);
}
}
}

View File

@@ -7,9 +7,9 @@
*/
// tslint:disable:max-classes-per-file
'use strict';
"use strict";
import { ButtplugMessageError } from './Exceptions';
import { ButtplugMessageError } from "./Exceptions";
export const SYSTEM_MESSAGE_ID = 0;
export const DEFAULT_MESSAGE_ID = 1;
@@ -132,34 +132,34 @@ export interface DeviceList {
}
export enum OutputType {
Unknown = 'Unknown',
Vibrate = 'Vibrate',
Rotate = 'Rotate',
Oscillate = 'Oscillate',
Constrict = 'Constrict',
Inflate = 'Inflate',
Position = 'Position',
HwPositionWithDuration = 'HwPositionWithDuration',
Temperature = 'Temperature',
Spray = 'Spray',
Led = 'Led',
Unknown = "Unknown",
Vibrate = "Vibrate",
Rotate = "Rotate",
Oscillate = "Oscillate",
Constrict = "Constrict",
Inflate = "Inflate",
Position = "Position",
HwPositionWithDuration = "HwPositionWithDuration",
Temperature = "Temperature",
Spray = "Spray",
Led = "Led",
}
export enum InputType {
Unknown = 'Unknown',
Battery = 'Battery',
RSSI = 'RSSI',
Button = 'Button',
Pressure = 'Pressure',
Unknown = "Unknown",
Battery = "Battery",
RSSI = "RSSI",
Button = "Button",
Pressure = "Pressure",
// Temperature,
// Accelerometer,
// Gyro,
}
export enum InputCommandType {
Read = 'Read',
Subscribe = 'Subscribe',
Unsubscribe = 'Unsubscribe',
Read = "Read",
Subscribe = "Subscribe",
Unsubscribe = "Unsubscribe",
}
export interface DeviceFeatureInput {

View File

@@ -6,27 +6,24 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
import { ButtplugMessage } from './core/Messages';
import { IButtplugClientConnector } from './client/IButtplugClientConnector';
import { EventEmitter } from 'eventemitter3';
import { ButtplugMessage } from "./core/Messages";
import { IButtplugClientConnector } from "./client/IButtplugClientConnector";
import { EventEmitter } from "eventemitter3";
export * from './client/ButtplugClient';
export * from './client/ButtplugClientDevice';
export * from './client/ButtplugBrowserWebsocketClientConnector';
export * from './client/ButtplugNodeWebsocketClientConnector';
export * from './client/ButtplugClientConnectorException';
export * from './utils/ButtplugMessageSorter';
export * from './client/ButtplugClientDeviceCommand';
export * from './client/ButtplugClientDeviceFeature';
export * from './client/IButtplugClientConnector';
export * from './core/Messages';
export * from './core/Logging';
export * from './core/Exceptions';
export * from "./client/ButtplugClient";
export * from "./client/ButtplugClientDevice";
export * from "./client/ButtplugBrowserWebsocketClientConnector";
export * from "./client/ButtplugNodeWebsocketClientConnector";
export * from "./client/ButtplugClientConnectorException";
export * from "./utils/ButtplugMessageSorter";
export * from "./client/ButtplugClientDeviceCommand";
export * from "./client/ButtplugClientDeviceFeature";
export * from "./client/IButtplugClientConnector";
export * from "./core/Messages";
export * from "./core/Logging";
export * from "./core/Exceptions";
export class ButtplugWasmClientConnector
extends EventEmitter
implements IButtplugClientConnector
{
export class ButtplugWasmClientConnector extends EventEmitter implements IButtplugClientConnector {
private static _loggingActivated = false;
private static wasmInstance;
private _connected: boolean = false;
@@ -43,30 +40,25 @@ export class ButtplugWasmClientConnector
private static maybeLoadWasm = async () => {
if (ButtplugWasmClientConnector.wasmInstance == undefined) {
ButtplugWasmClientConnector.wasmInstance = await import(
'../wasm/index.js'
);
ButtplugWasmClientConnector.wasmInstance = await import("../wasm/index.js");
}
};
public static activateLogging = async (logLevel: string = 'debug') => {
public static activateLogging = async (logLevel: string = "debug") => {
await ButtplugWasmClientConnector.maybeLoadWasm();
if (this._loggingActivated) {
console.log('Logging already activated, ignoring.');
console.log("Logging already activated, ignoring.");
return;
}
console.log('Turning on logging.');
ButtplugWasmClientConnector.wasmInstance.buttplug_activate_env_logger(
logLevel,
);
console.log("Turning on logging.");
ButtplugWasmClientConnector.wasmInstance.buttplug_activate_env_logger(logLevel);
};
public initialize = async (): Promise<void> => {};
public connect = async (): Promise<void> => {
await ButtplugWasmClientConnector.maybeLoadWasm();
this.client =
ButtplugWasmClientConnector.wasmInstance.buttplug_create_embedded_wasm_server(
this.client = ButtplugWasmClientConnector.wasmInstance.buttplug_create_embedded_wasm_server(
(msgs) => {
this.emitMessage(msgs);
},
@@ -80,7 +72,7 @@ export class ButtplugWasmClientConnector
public send = (msg: ButtplugMessage): void => {
ButtplugWasmClientConnector.wasmInstance.buttplug_client_send_json_message(
this.client,
new TextEncoder().encode('[' + JSON.stringify(msg) + ']'),
new TextEncoder().encode("[" + JSON.stringify(msg) + "]"),
(output) => {
this.emitMessage(output);
},
@@ -90,6 +82,6 @@ export class ButtplugWasmClientConnector
private emitMessage = (msg: Uint8Array) => {
const str = new TextDecoder().decode(msg);
const msgs: ButtplugMessage[] = JSON.parse(str);
this.emit('message', msgs);
this.emit("message", msgs);
};
}

View File

@@ -6,10 +6,10 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
'use strict';
"use strict";
import { EventEmitter } from 'eventemitter3';
import { ButtplugMessage } from '../core/Messages';
import { EventEmitter } from "eventemitter3";
import { ButtplugMessage } from "../core/Messages";
export class ButtplugBrowserWebsocketConnector extends EventEmitter {
protected _ws: WebSocket | undefined;
@@ -26,18 +26,20 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
public connect = async (): Promise<void> => {
return new Promise<void>((resolve, reject) => {
const ws = new (this._websocketConstructor ?? WebSocket)(this._url);
const onErrorCallback = (event: Event) => {reject(event)}
const onCloseCallback = (event: CloseEvent) => reject(event.reason)
ws.addEventListener('open', async () => {
const onErrorCallback = (event: Event) => {
reject(event);
};
const onCloseCallback = (event: CloseEvent) => reject(event.reason);
ws.addEventListener("open", async () => {
this._ws = ws;
try {
await this.initialize();
this._ws.addEventListener('message', (msg) => {
this._ws.addEventListener("message", (msg) => {
this.parseIncomingMessage(msg);
});
this._ws.removeEventListener('close', onCloseCallback);
this._ws.removeEventListener('error', onErrorCallback);
this._ws.addEventListener('close', this.disconnect);
this._ws.removeEventListener("close", onCloseCallback);
this._ws.removeEventListener("error", onErrorCallback);
this._ws.addEventListener("close", this.disconnect);
resolve();
} catch (e) {
reject(e);
@@ -47,8 +49,8 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
// browsers usually only throw Error Code 1006. It's up to those using this
// library to state what the problem might be.
ws.addEventListener('error', onErrorCallback)
ws.addEventListener('close', onCloseCallback);
ws.addEventListener("error", onErrorCallback);
ws.addEventListener("close", onCloseCallback);
});
};
@@ -58,14 +60,14 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
}
this._ws!.close();
this._ws = undefined;
this.emit('disconnect');
this.emit("disconnect");
};
public sendMessage(msg: ButtplugMessage) {
if (!this.Connected) {
throw new Error('ButtplugBrowserWebsocketConnector not connected');
throw new Error("ButtplugBrowserWebsocketConnector not connected");
}
this._ws!.send('[' + JSON.stringify(msg) + ']');
this._ws!.send("[" + JSON.stringify(msg) + "]");
}
public initialize = async (): Promise<void> => {
@@ -73,9 +75,9 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
};
protected parseIncomingMessage(event: MessageEvent) {
if (typeof event.data === 'string') {
if (typeof event.data === "string") {
const msgs: ButtplugMessage[] = JSON.parse(event.data);
this.emit('message', msgs);
this.emit("message", msgs);
} else if (event.data instanceof Blob) {
// No-op, we only use text message types.
}
@@ -83,6 +85,6 @@ export class ButtplugBrowserWebsocketConnector extends EventEmitter {
protected onReaderLoad(event: Event) {
const msgs: ButtplugMessage[] = JSON.parse((event.target as FileReader).result as string);
this.emit('message', msgs);
this.emit("message", msgs);
}
}
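The connector's `connect()` above uses a listener-swap handshake: temporary `error`/`close` handlers reject the pending promise, and once `open` fires and initialization succeeds they are removed and replaced with the long-lived handlers. A runnable sketch of just that swap, using a tiny fake socket so no network is needed (the `FakeSocket` class and its `count` helper are inventions for this sketch):

```typescript
type Listener = (ev?: unknown) => void;

// Minimal stand-in for a WebSocket's event interface.
class FakeSocket {
  private listeners = new Map<string, Set<Listener>>();
  addEventListener(type: string, fn: Listener) {
    if (!this.listeners.has(type)) this.listeners.set(type, new Set());
    this.listeners.get(type)!.add(fn);
  }
  removeEventListener(type: string, fn: Listener) {
    this.listeners.get(type)?.delete(fn);
  }
  emit(type: string, ev?: unknown) {
    for (const fn of this.listeners.get(type) ?? []) fn(ev);
  }
  count(type: string): number {
    return this.listeners.get(type)?.size ?? 0;
  }
}

function connect(ws: FakeSocket): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    // Handshake-only handlers: any failure before "open" rejects the promise.
    const onError = (ev?: unknown) => reject(new Error(String(ev)));
    const onClose = (ev?: unknown) => reject(new Error(String(ev)));
    ws.addEventListener("open", () => {
      // Connection is up: drop the handshake handlers so a later close or
      // error no longer rejects an already-settled promise.
      ws.removeEventListener("error", onError);
      ws.removeEventListener("close", onClose);
      resolve();
    });
    ws.addEventListener("error", onError);
    ws.addEventListener("close", onClose);
  });
}
```

Without the removal step, a disconnect long after a successful connect would call `reject` on a settled promise — harmless but misleading; the swap keeps handshake failures and runtime disconnects on separate paths.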

View File

@@ -6,8 +6,8 @@
* @copyright Copyright (c) Nonpolynomial Labs LLC. All rights reserved.
*/
import * as Messages from '../core/Messages';
import { ButtplugError } from '../core/Exceptions';
import * as Messages from "../core/Messages";
import { ButtplugError } from "../core/Exceptions";
export class ButtplugMessageSorter {
protected _counter = 1;
@@ -21,9 +21,7 @@ export class ButtplugMessageSorter {
// One of the places we should actually return a promise, as we need to store
// them while waiting for them to return across the line.
// tslint:disable:promise-function-async
public PrepareOutgoingMessage(
msg: Messages.ButtplugMessage
): Promise<Messages.ButtplugMessage> {
public PrepareOutgoingMessage(msg: Messages.ButtplugMessage): Promise<Messages.ButtplugMessage> {
if (this._useCounter) {
Messages.setMsgId(msg, this._counter);
// Always increment last, otherwise we might lose sync
@@ -31,19 +29,15 @@ export class ButtplugMessageSorter {
}
let res;
let rej;
const msgPromise = new Promise<Messages.ButtplugMessage>(
(resolve, reject) => {
const msgPromise = new Promise<Messages.ButtplugMessage>((resolve, reject) => {
res = resolve;
rej = reject;
}
);
});
this._waitingMsgs.set(Messages.msgId(msg), [res, rej]);
return msgPromise;
}
public ParseIncomingMessages(
msgs: Messages.ButtplugMessage[]
): Messages.ButtplugMessage[] {
public ParseIncomingMessages(msgs: Messages.ButtplugMessage[]): Messages.ButtplugMessage[] {
const noMatch: Messages.ButtplugMessage[] = [];
for (const x of msgs) {
let id = Messages.msgId(x);
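The sorter reformatted above implements request/response matching over a message bus: each outgoing message is stamped with an incrementing ID and its promise callbacks are parked in a map until the matching reply arrives. A stripped-down sketch of that bookkeeping (the `Msg` shape is hypothetical, not the real `ButtplugMessage`):

```typescript
type Msg = { Id: number; Ok?: boolean; Error?: string };

class MessageSorter {
  private counter = 1;
  private waiting = new Map<number, [(m: Msg) => void, (e: Error) => void]>();

  // Stamp the message with the next ID and remember its promise callbacks.
  prepareOutgoing(msg: Msg): Promise<Msg> {
    msg.Id = this.counter++;
    return new Promise<Msg>((resolve, reject) => {
      this.waiting.set(msg.Id, [resolve, reject]);
    });
  }

  // Settle pending promises for replies we recognize; return the rest
  // (e.g. server-initiated events) for the caller to handle.
  parseIncoming(msgs: Msg[]): Msg[] {
    const noMatch: Msg[] = [];
    for (const m of msgs) {
      const entry = this.waiting.get(m.Id);
      if (entry === undefined) {
        noMatch.push(m);
        continue;
      }
      this.waiting.delete(m.Id);
      const [resolve, reject] = entry;
      if (m.Error !== undefined) reject(new Error(m.Error));
      else resolve(m);
    }
    return noMatch;
  }
}
```

Incrementing the counter only after stamping (as the original comment insists) keeps the ID sequence in sync with the map even if stamping throws.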

View File

@@ -7,7 +7,6 @@ use buttplug_server::device::hardware::communication::{
HardwareCommunicationManagerEvent,
};
use futures::future;
use js_sys::Array;
use tokio::sync::mpsc::Sender;
use wasm_bindgen::prelude::*;
use wasm_bindgen_futures::{spawn_local, JsFuture};
@@ -63,8 +62,8 @@ impl HardwareCommunicationManager for WebBluetoothCommunicationManager {
// way for anyone to add device configurations through FFI yet anyways.
let config_manager = create_test_dcm(false);
let options = web_sys::RequestDeviceOptions::new();
let filters = Array::new();
let optional_services = Array::new();
let mut filters = Vec::new();
let mut optional_services = Vec::new();
for vals in config_manager.base_communication_specifiers().iter() {
for config in vals.1.iter() {
if let ProtocolCommunicationSpecifier::BluetoothLE(btle) = &config {
@@ -77,16 +76,16 @@ impl HardwareCommunicationManager for WebBluetoothCommunicationManager {
} else {
filter.set_name(&name);
}
filters.push(&filter.into());
filters.push(filter);
}
for (service, _) in btle.services() {
optional_services.push(&service.to_string().into());
optional_services.push(js_sys::JsString::from(service.to_string()));
}
}
}
}
options.set_filters(&filters.into());
options.set_optional_services(&optional_services.into());
options.set_filters(&filters);
options.set_optional_services(&optional_services);
let nav = web_sys::window().unwrap().navigator();
//nav.bluetooth().get_availability();
//JsFuture::from(nav.bluetooth().request_device()).await;

View File

@@ -1,6 +1,4 @@
PUBLIC_API_URL=https://sexy.pivoine.art/api
PUBLIC_URL=https://sexy.pivoine.art
PUBLIC_API_URL=http://localhost:3000/api
PUBLIC_URL=http://localhost:3000
PUBLIC_UMAMI_ID=
LETTERSPACE_API_URL=
LETTERSPACE_API_KEY=
LETTERSPACE_LIST_ID=
PUBLIC_UMAMI_SCRIPT=

View File

@@ -8,42 +8,47 @@
"dev": "vite",
"build": "vite build",
"preview": "vite preview",
"start": "node ./build"
"start": "node ./build",
"check": "svelte-check --tsconfig ./tsconfig.json --threshold warning"
},
"devDependencies": {
"@iconify-json/ri": "^1.2.5",
"@iconify/tailwind4": "^1.0.6",
"@internationalized/date": "^3.8.2",
"@lucide/svelte": "^0.544.0",
"@sveltejs/adapter-node": "^5.3.1",
"@sveltejs/adapter-static": "^3.0.9",
"@sveltejs/kit": "^2.37.0",
"@sveltejs/vite-plugin-svelte": "^6.1.4",
"@tailwindcss/forms": "^0.5.9",
"@tailwindcss/typography": "^0.5.15",
"@tailwindcss/vite": "^4.0.0",
"@tsconfig/svelte": "^5.0.5",
"bits-ui": "2.11.0",
"@iconify-json/ri": "^1.2.10",
"@iconify/tailwind4": "^1.2.1",
"@internationalized/date": "^3.11.0",
"@lucide/svelte": "^0.561.0",
"@sveltejs/adapter-node": "^5.5.4",
"@sveltejs/adapter-static": "^3.0.10",
"@sveltejs/kit": "^2.53.4",
"@sveltejs/vite-plugin-svelte": "^6.2.4",
"@tailwindcss/forms": "^0.5.11",
"@tailwindcss/typography": "^0.5.19",
"@tailwindcss/vite": "^4.2.1",
"@tsconfig/svelte": "^5.0.8",
"bits-ui": "2.16.2",
"clsx": "^2.1.1",
"glob": "^11.0.3",
"glob": "^13.0.6",
"mode-watcher": "^1.1.0",
"prettier-plugin-svelte": "^3.4.0",
"super-sitemap": "^1.0.5",
"svelte": "^5.38.6",
"svelte-sonner": "^1.0.5",
"tailwind-merge": "^3.3.1",
"tailwind-variants": "^1.0.0",
"tailwindcss": "^4.0.0",
"tw-animate-css": "^1.3.8",
"typescript": "^5.9.2",
"vite": "^7.1.4",
"prettier-plugin-svelte": "^3.5.1",
"super-sitemap": "^1.0.7",
"svelte": "^5.53.7",
"svelte-check": "^4.4.4",
"svelte-sonner": "^1.0.8",
"tailwind-merge": "^3.5.0",
"tailwind-variants": "^3.2.2",
"tailwindcss": "^4.2.1",
"tw-animate-css": "^1.4.0",
"typescript": "^5.9.3",
"vite": "^7.3.1",
"vite-plugin-wasm": "3.5.0"
},
"dependencies": {
"@directus/sdk": "^20.0.3",
"@sexy.pivoine.art/buttplug": "workspace:*",
"javascript-time-ago": "^2.5.11",
"media-chrome": "^4.13.1",
"@sexy.pivoine.art/types": "workspace:*",
"graphql": "^16.11.0",
"graphql-request": "^7.1.2",
"javascript-time-ago": "^2.6.4",
"marked": "^17.0.4",
"media-chrome": "^4.18.0",
"svelte-i18n": "^4.0.1"
}
}

View File

@@ -5,6 +5,8 @@
@custom-variant dark (&:where(.dark, .dark *));
@custom-variant hover (&:hover);
@theme {
--animate-vibrate: vibrate 0.3s linear infinite;
--animate-fade-in: fadeIn 0.3s ease-out;
@@ -75,11 +77,11 @@
@keyframes pulseGlow {
0%,
100% {
boxShadow: 0 0 20px rgba(183, 0, 217, 0.3);
boxshadow: 0 0 20px rgba(183, 0, 217, 0.3);
}
50% {
boxShadow: 0 0 40px rgba(183, 0, 217, 0.6);
boxshadow: 0 0 40px rgba(183, 0, 217, 0.6);
}
}
}
@@ -102,6 +104,31 @@
* {
border-color: var(--border);
outline-color: var(--ring);
scrollbar-width: thin;
scrollbar-color: color-mix(in oklab, var(--primary) 40%, transparent) transparent;
}
*::-webkit-scrollbar {
width: 6px;
height: 6px;
}
*::-webkit-scrollbar-track {
background: transparent;
}
*::-webkit-scrollbar-thumb {
background-color: color-mix(in oklab, var(--primary) 40%, transparent);
border-radius: 9999px;
}
*::-webkit-scrollbar-thumb:hover {
background-color: color-mix(in oklab, var(--primary) 70%, transparent);
}
html {
scrollbar-width: thin;
scrollbar-color: color-mix(in oklab, var(--primary) 40%, transparent) transparent;
}
.prose h2 {

View File

@@ -1,24 +1,23 @@
<!doctype html>
<html lang="en">
<head>
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png" />
<link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png" />
<link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png" />
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Dancing+Script:wght@400..700&family=Noto+Sans:ital,wght@0,100..900;1,100..900&display=swap" rel="stylesheet">
<link rel="preconnect" href="https://fonts.googleapis.com" />
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
<link
href="https://fonts.googleapis.com/css2?family=Dancing+Script:wght@400..700&family=Noto+Sans:ital,wght@0,100..900;1,100..900&display=swap"
rel="stylesheet"
/>
<link rel="manifest" href="/site.webmanifest" />
%sveltekit.head%
</head>
</head>
<body data-sveltekit-preload-data="hover" class="dark">
<body data-sveltekit-preload-data="hover" class="dark">
<div style="display: contents">%sveltekit.body%</div>
</body>
</body>
</html>

View File

@@ -2,12 +2,8 @@ import { isAuthenticated } from "$lib/services";
import { logger, generateRequestId } from "$lib/logger";
import type { Handle } from "@sveltejs/kit";
// Log startup info once
let hasLoggedStartup = false;
if (!hasLoggedStartup) {
logger.startup();
hasLoggedStartup = true;
}
// Log startup info once (module-level code runs exactly once on import)
logger.startup();
export const handle: Handle = async ({ event, resolve }) => {
const { cookies, locals, url, request } = event;
@@ -23,40 +19,40 @@ export const handle: Handle = async ({ event, resolve }) => {
logger.request(request.method, url.pathname, {
requestId,
context: {
userAgent: request.headers.get('user-agent')?.substring(0, 100),
referer: request.headers.get('referer'),
ip: request.headers.get('x-forwarded-for') || request.headers.get('x-real-ip'),
userAgent: request.headers.get("user-agent")?.substring(0, 100),
referer: request.headers.get("referer"),
ip: request.headers.get("x-forwarded-for") || request.headers.get("x-real-ip"),
},
});
// Handle authentication
const token = cookies.get("directus_session_token");
const token = cookies.get("session_token");
if (token) {
try {
locals.authStatus = await isAuthenticated(token);
if (locals.authStatus.authenticated) {
logger.auth('Token validated', true, {
logger.auth("Token validated", true, {
requestId,
userId: locals.authStatus.user?.id,
context: {
email: locals.authStatus.user?.email,
role: locals.authStatus.user?.role?.name,
role: locals.authStatus.user?.role,
},
});
} else {
logger.auth('Token invalid', false, { requestId });
logger.auth("Token invalid", false, { requestId });
}
} catch (error) {
logger.error('Authentication check failed', {
logger.error("Authentication check failed", {
requestId,
error: error instanceof Error ? error : new Error(String(error)),
});
locals.authStatus = { authenticated: false };
}
} else {
logger.debug('No session token found', { requestId });
logger.debug("No session token found", { requestId });
locals.authStatus = { authenticated: false };
}
@@ -70,7 +66,7 @@ export const handle: Handle = async ({ event, resolve }) => {
});
} catch (error) {
const duration = Date.now() - startTime;
logger.error('Request handler error', {
logger.error("Request handler error", {
requestId,
method: request.method,
path: url.pathname,
@@ -86,12 +82,12 @@ export const handle: Handle = async ({ event, resolve }) => {
requestId,
userId: locals.authStatus.authenticated ? locals.authStatus.user?.id : undefined,
context: {
cached: response.headers.get('x-sveltekit-page') === 'true',
cached: response.headers.get("x-sveltekit-page") === "true",
},
});
// Add request ID to response headers (useful for debugging)
response.headers.set('x-request-id', requestId);
response.headers.set("x-request-id", requestId);
return response;
};

View File

@@ -0,0 +1,25 @@
import { GraphQLClient } from "graphql-request";
import { env } from "$env/dynamic/public";
import type { CurrentUser } from "./types";
export const apiUrl = env.PUBLIC_API_URL || "http://localhost:3000/api";
export const getGraphQLClient = (fetchFn?: typeof globalThis.fetch) =>
new GraphQLClient(`${apiUrl}/graphql`, {
credentials: "include",
fetch: fetchFn || globalThis.fetch,
});
export const getAssetUrl = (
id: string | null | undefined,
transform?: "mini" | "thumbnail" | "preview" | "medium" | "banner",
) => {
if (!id) {
return null;
}
return `${apiUrl}/assets/${id}${transform ? "?transform=" + transform : ""}`;
};
export const isModel = (user: CurrentUser) => {
return user.role === "model";
};
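The new client module above is mostly string plumbing; its URL-building behaviour can be checked in isolation. A minimal standalone sketch of `getAssetUrl` (with `apiUrl` pinned to the default value, since `$env/dynamic/public` only resolves inside a SvelteKit build):

```typescript
// Standalone restatement of getAssetUrl from the module above.
// apiUrl is hardcoded here for illustration; the real module reads
// it from PUBLIC_API_URL with this same fallback.
const apiUrl = "http://localhost:3000/api";

const getAssetUrl = (
  id: string | null | undefined,
  transform?: "mini" | "thumbnail" | "preview" | "medium" | "banner",
): string | null => {
  if (!id) {
    return null; // missing asset id → no URL, caller renders a fallback
  }
  return `${apiUrl}/assets/${id}${transform ? "?transform=" + transform : ""}`;
};

console.log(getAssetUrl("abc123", "mini"));
// → "http://localhost:3000/api/assets/abc123?transform=mini"
console.log(getAssetUrl(null)); // → null
```

Note that `null` and `undefined` (and `""`) all short-circuit to `null`, which is why call sites can pass `authStatus.user!.avatar` directly without their own guard.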


@@ -1,31 +1,30 @@
<script lang="ts">
import { _ } from "svelte-i18n";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
} from "$lib/components/ui/dialog";
import { Button } from "$lib/components/ui/button";
import { Separator } from "$lib/components/ui/separator";
import { onMount } from "svelte";
const AGE_VERIFICATION_KEY = "age-verified";
-let isOpen = true;
+let isOpen = $state(false);
function handleAgeConfirmation() {
localStorage.setItem(AGE_VERIFICATION_KEY, "true");
isOpen = false;
}
-onMount(() => {
-const storedVerification = localStorage.getItem(AGE_VERIFICATION_KEY);
-if (storedVerification === "true") {
-isOpen = false;
-}
-});
+onMount(() => {
+if (localStorage.getItem(AGE_VERIFICATION_KEY) !== "true") {
+isOpen = true;
+}
+});
</script>
<Dialog bind:open={isOpen}>
@@ -40,9 +39,7 @@ onMount(() => {
<div
class="w-10 h-10 shrink-0 grow-0 rounded-full bg-gradient-to-br from-primary to-purple-600 flex items-center justify-center"
>
-<span class="text-primary-foreground text-sm"
->{$_("age_verification_dialog.age")}</span
->
+<span class="text-primary-foreground text-sm">{$_("age_verification_dialog.age")}</span>
</div>
<div class="">
<DialogTitle class="text-left text-xl font-semibold text-primary-foreground"
@@ -63,12 +60,7 @@ onMount(() => {
<Button variant="destructive" href={$_("age_verification_dialog.exit_url")} size="sm">
{$_("age_verification_dialog.exit")}
</Button>
-<Button
-variant="default"
-size="sm"
-onclick={handleAgeConfirmation}
-class="cursor-pointer"
->
+<Button variant="default" size="sm" onclick={handleAgeConfirmation} class="cursor-pointer">
<span class="icon-[ri--check-line]"></span>
{$_("age_verification_dialog.confirm")}
</Button>


@@ -1,12 +1,8 @@
<script lang="ts">
const { isMobileMenuOpen = $bindable(), label, onclick } = $props();
</script>
-<button
-class="block rounded-full cursor-pointer"
-onclick={onclick}
-aria-label={label}
->
+<button class="block rounded-full cursor-pointer" {onclick} aria-label={label}>
<div
class="relative flex overflow-hidden items-center justify-center rounded-full w-[50px] h-[50px] transform transition-all duration-200 shadow-md opacity-90 translate-x-3"
>
@@ -14,23 +10,23 @@ const { isMobileMenuOpen = $bindable(), label, onclick } = $props();
class="flex flex-col justify-between w-[16px] h-[10px] transform transition-all duration-300 origin-center overflow-hidden"
>
<div
-class={`bg-white h-[2px] w-7 transform transition-all duration-300 origin-left ${isMobileMenuOpen ? 'translate-x-10' : ''}`}
+class={`bg-white h-[2px] w-7 transform transition-all duration-300 origin-left ${isMobileMenuOpen ? "translate-x-10" : ""}`}
></div>
<div
-class={`bg-white h-[2px] w-7 rounded transform transition-all duration-300 delay-75 ${isMobileMenuOpen ? 'translate-x-10' : ''}`}
+class={`bg-white h-[2px] w-7 rounded transform transition-all duration-300 delay-75 ${isMobileMenuOpen ? "translate-x-10" : ""}`}
></div>
<div
-class={`bg-white h-[2px] w-7 transform transition-all duration-300 origin-left delay-150 ${isMobileMenuOpen ? 'translate-x-10' : ''}`}
+class={`bg-white h-[2px] w-7 transform transition-all duration-300 origin-left delay-150 ${isMobileMenuOpen ? "translate-x-10" : ""}`}
></div>
<div
-class={`absolute items-center justify-between transform transition-all duration-500 top-6.5 -translate-x-10 flex w-0 ${isMobileMenuOpen ? 'translate-x-0 w-12' : ''}`}
+class={`absolute items-center justify-between transform transition-all duration-500 top-6.5 -translate-x-10 flex w-0 ${isMobileMenuOpen ? "translate-x-0 w-12" : ""}`}
>
<div
-class={`absolute bg-white h-[2px] w-4 transform transition-all duration-500 rotate-0 delay-300 ${isMobileMenuOpen ? 'rotate-45' : ''}`}
+class={`absolute bg-white h-[2px] w-4 transform transition-all duration-500 rotate-0 delay-300 ${isMobileMenuOpen ? "rotate-45" : ""}`}
></div>
<div
-class={`absolute bg-white h-[2px] w-4 transform transition-all duration-500 -rotate-0 delay-300 ${isMobileMenuOpen ? '-rotate-45' : ''}`}
+class={`absolute bg-white h-[2px] w-4 transform transition-all duration-500 -rotate-0 delay-300 ${isMobileMenuOpen ? "-rotate-45" : ""}`}
></div>
></div>
</div>
</div>


@@ -1,46 +1,46 @@
<script lang="ts">
import { cn } from "$lib/utils";
import { Slider } from "$lib/components/ui/slider";
import { Label } from "$lib/components/ui/label";
import { Card, CardContent, CardHeader } from "$lib/components/ui/card";
import type { BluetoothDevice } from "$lib/types";
import { _ } from "svelte-i18n";
interface Props {
device: BluetoothDevice;
onChange: (scalarIndex: number, val: number) => void;
onStop: () => void;
}
let { device, onChange, onStop }: Props = $props();
function getBatteryColor(level: number) {
if (!device.hasBattery) {
return "text-gray-400";
}
if (level > 60) return "text-green-400";
if (level > 30) return "text-yellow-400";
return "text-red-400";
}
function getBatteryBgColor(level: number) {
if (!device.hasBattery) {
return "bg-gray-400/20";
}
if (level > 60) return "bg-green-400/20";
if (level > 30) return "bg-yellow-400/20";
return "bg-red-400/20";
}
function getScalarAnimations() {
return device.actuators
.filter((a) => a.value > 0)
.map((a) => `animate-${a.outputType.toLowerCase()}`);
}
function isActive() {
return device.actuators.some((a) => a.value > 0);
}
</script>
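The battery-colour logic in the script above depends only on `device.hasBattery` and the level thresholds. A framework-free sketch of the same thresholds (with `hasBattery` passed as a parameter here rather than read from the component's `device`, an adjustment for illustration):

```typescript
// Restatement of getBatteryColor from the component above.
// hasBattery is an explicit parameter instead of closing over `device`.
function batteryColor(hasBattery: boolean, level: number): string {
  if (!hasBattery) return "text-gray-400"; // device reports no battery
  if (level > 60) return "text-green-400"; // healthy
  if (level > 30) return "text-yellow-400"; // getting low
  return "text-red-400"; // critical
}

console.log(batteryColor(true, 75)); // → "text-green-400"
```

`getBatteryBgColor` mirrors the same three thresholds with `/20`-opacity background classes, so the dot and its halo always agree.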
<Card
@@ -52,12 +52,12 @@ function isActive() {
<div
class="p-2 rounded-lg bg-gradient-to-br from-primary/20 to-accent/20 border border-primary/30 flex shrink-0 grow-0"
>
-<span class={cn([...getScalarAnimations(), "icon-[ri--rocket-line] w-5 h-5 text-primary"])}></span>
+<span
+class={cn([...getScalarAnimations(), "icon-[ri--rocket-line] w-5 h-5 text-primary"])}
+></span>
</div>
<div>
-<h3
-class={`font-semibold text-card-foreground group-hover:text-primary transition-colors`}
->
+<h3 class="font-semibold text-card-foreground group-hover:text-primary transition-colors">
{device.name}
</h3>
<!-- <p class="text-sm text-muted-foreground">
@@ -65,27 +65,20 @@ function isActive() {
</p> -->
</div>
</div>
-<button class={`${isActive() ? "cursor-pointer" : ""} flex items-center gap-2`} onclick={() => isActive() && onStop()}>
+<button
+class={`${isActive() ? "cursor-pointer" : ""} flex items-center gap-2`}
+onclick={() => isActive() && onStop()}
+>
<div class="relative">
<div
class="w-2 h-2 rounded-full {isActive()
? 'bg-green-400'
: 'bg-red-400'}"
></div>
<div class="w-2 h-2 rounded-full {isActive() ? 'bg-green-400' : 'bg-red-400'}"></div>
{#if isActive()}
<div
class="absolute inset-0 w-2 h-2 rounded-full bg-green-400 animate-ping opacity-75"
></div>
{/if}
</div>
-<span
-class="text-xs font-medium {isActive()
-? 'text-green-400'
-: 'text-red-400'}"
->
-{isActive()
-? $_("device_card.active")
-: $_("device_card.paused")}
+<span class="text-xs font-medium {isActive() ? 'text-green-400' : 'text-red-400'}">
+{isActive() ? $_("device_card.active") : $_("device_card.paused")}
</span>
</button>
</div>
@@ -108,9 +101,7 @@ function isActive() {
<div class="flex items-center justify-between">
<div class="flex items-center gap-2">
<span
-class="icon-[ri--battery-2-charge-line] w-4 h-4 {getBatteryColor(
-device.batteryLevel,
-)}"
+class="icon-[ri--battery-2-charge-line] w-4 h-4 {getBatteryColor(device.batteryLevel)}"
></span>
<span class="text-sm text-muted-foreground">{$_("device_card.battery")}</span>
</div>
@@ -139,12 +130,10 @@ function isActive() {
</div> -->
<!-- Action Button -->
-{#each device.actuators as actuator, idx}
+{#each device.actuators as actuator, idx (idx)}
<div class="space-y-2">
<Label for={`device-${device.info.index}-${actuator.featureIndex}-${actuator.outputType}`}
->{$_(
-`device_card.actuator_types.${actuator.outputType.toLowerCase()}`,
-)}</Label
+>{$_(`device_card.actuator_types.${actuator.outputType.toLowerCase()}`)}</Label
>
<Slider
id={`device-${device.info.index}-${actuator.featureIndex}-${actuator.outputType}`}


@@ -1,7 +1,6 @@
<script lang="ts">
-import { _ } from "svelte-i18n";
-import PeonyIcon from "$lib/components/icon/peony-icon.svelte";
-import Logo from "../logo/logo.svelte";
+import { _ } from "svelte-i18n";
+import Logo from "../logo/logo.svelte";
</script>
<footer


@@ -1,120 +0,0 @@
<div class="w-full h-auto">
<svg
version="1.0"
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 1280.000000 904.000000"
stroke-width="5"
stroke="#ce47eb"
preserveAspectRatio="xMidYMid meet"
>
<metadata>
Created by potrace 1.15, written by Peter Selinger 2001-2017
</metadata>
<g transform="translate(0.000000,904.000000) scale(0.100000,-0.100000)">
<path
d="M7930 7043 c-73 -10 -95 -18 -134 -51 -25 -20 -66 -53 -91 -72 -26
-19 -69 -66 -96 -104 -116 -164 -130 -314 -59 -664 32 -164 36 -217 18 -256
-13 -30 -14 -30 -140 -52 -75 -12 -105 -13 -129 -5 -18 6 -59 11 -93 11 -123
-1 -213 -66 -379 -275 -245 -308 -501 -567 -686 -693 l-92 -64 -82 7 c-53 5
-88 13 -100 23 -21 18 -66 20 -167 7 -73 -9 -124 -31 -159 -69 -22 -23 -23
-31 -18 -94 6 -58 4 -71 -11 -84 -44 -40 -203 -119 -295 -149 -56 -18 -144
-50 -195 -71 -50 -21 -138 -51 -195 -67 -232 -65 -369 -131 -595 -284 -182
-124 -172 -123 -208 -27 -23 60 -39 81 -189 245 -279 305 -319 354 -368 458
-46 94 -47 98 -32 127 8 16 15 36 15 43 0 8 14 41 30 72 17 31 30 63 30 70 0
7 7 18 15 25 8 7 15 26 15 42 0 42 15 65 49 71 17 4 37 17 46 30 14 23 14 30
-9 101 -28 88 -21 130 22 141 20 5 23 10 18 31 -4 13 -1 34 5 46 13 25 33 239
31 336 0 42 -8 78 -23 108 -31 65 -121 158 -209 217 -41 28 -77 55 -79 60 -2
5 -17 24 -33 43 -23 26 -48 39 -111 58 -183 55 -239 61 -361 36 -156 -33 -333
-185 -425 -368 -72 -143 -93 -280 -96 -622 -2 -240 -5 -288 -24 -379 -12 -57
-30 -120 -40 -140 -11 -20 -61 -84 -113 -142 -52 -58 -105 -121 -118 -140 -13
-19 -45 -58 -72 -88 -93 -106 -127 -193 -237 -616 -33 -127 -67 -251 -76 -275
-9 -25 -48 -153 -86 -285 -78 -264 -163 -502 -334 -935 -135 -340 -194 -526
-290 -910 -20 -80 -47 -180 -61 -223 -13 -43 -24 -92 -24 -109 0 -42 -43 -79
-132 -112 -56 -20 -108 -52 -213 -132 -77 -58 -162 -117 -190 -131 -85 -43
-107 -75 -62 -89 12 -3 30 -15 40 -25 10 -11 30 -19 45 -19 29 0 146 52 175
77 9 9 19 14 22 12 2 -3 -21 -24 -51 -47 -55 -43 -63 -59 -42 -80 30 -30 130
5 198 69 54 52 127 109 139 109 20 0 11 -27 -25 -80 -38 -56 -38 -74 0 -91 33
-16 67 7 135 89 31 37 70 71 95 84 l42 20 82 -21 c45 -11 95 -21 111 -21 17 0
50 -11 75 -25 58 -32 136 -35 166 -5 35 35 26 57 -40 90 -59 30 -156 132 -186
195 -30 63 -31 124 -3 258 43 213 95 336 279 657 126 219 231 423 267 520 14
36 40 128 58 205 19 77 50 185 69 240 55 159 182 450 195 447 7 -1 9 7 5 23
-10 38 0 30 37 -30 42 -69 60 -53 28 27 -36 92 -39 98 -34 98 3 0 14 -18 25
-41 14 -26 26 -39 35 -35 9 3 28 -22 59 -81 65 -121 162 -266 237 -353 35 -41
174 -196 309 -345 359 -394 379 -421 409 -549 25 -103 90 -214 169 -287 74
-67 203 -135 332 -173 110 -33 472 -112 575 -125 325 -44 688 -30 1453 54 172
19 352 35 400 35 112 1 156 11 272 66 139 66 171 103 171 197 0 64 -11 95 -52
141 -17 20 -30 38 -28 39 2 1 13 7 24 13 11 6 21 23 23 38 2 14 12 31 23 36
12 7 19 21 19 38 0 19 7 30 23 37 14 6 23 21 25 39 2 16 10 36 18 44 10 9 13
24 9 41 -4 20 -1 28 16 36 58 26 47 86 -21 106 -38 12 -40 14 -40 51 0 51 -18
82 -82 145 -73 70 -132 105 -358 213 -547 260 -919 419 -1210 517 -13 5 -13 6
0 10 8 3 22 13 30 22 23 26 363 124 434 125 l60 1 21 -85 c29 -118 59 -175
129 -245 118 -117 234 -156 461 -158 171 -1 271 17 445 80 268 96 361 157 602
396 93 92 171 159 246 209 155 105 513 381 595 458 131 122 189 224 277 485
109 325 149 342 163 70 9 -163 30 -242 143 -531 53 -137 98 -258 101 -270 3
-14 -5 -28 -29 -46 -18 -14 -94 -80 -168 -147 -137 -123 -261 -216 -306 -227
-17 -4 -46 4 -92 27 -60 29 -80 34 -192 41 -69 4 -144 11 -166 14 -103 15
-115 -61 -15 -95 19 -6 46 -11 61 -11 44 0 91 -20 88 -38 -2 -8 -15 -24 -30
-35 -22 -17 -30 -18 -42 -7 -21 16 -46 6 -46 -19 0 -25 -29 -35 -110 -35 -57
-1 -65 -3 -68 -21 -4 -29 44 -54 120 -62 35 -3 66 -12 71 -19 4 -7 31 -25 59
-39 41 -21 60 -24 93 -19 25 3 45 2 49 -4 3 -5 34 -9 69 -7 52 1 72 7 108 32
58 40 97 59 135 66 32 6 462 230 516 269 18 12 33 17 35 12 2 -6 30 -62 62
-126 l58 -116 -3 -112 c-2 -61 -6 -115 -9 -119 -2 -5 -100 -8 -217 -8 -221 0
-452 -23 -868 -88 -85 -13 -225 -33 -310 -45 -189 -26 -314 -52 -440 -92 -203
-65 -284 -132 -304 -254 -15 -90 30 -173 137 -251 28 -20 113 -85 187 -142 74
-58 171 -129 215 -158 105 -71 324 -181 563 -283 106 -45 194 -86 197 -90 9
-14 -260 -265 -361 -337 -100 -71 -130 -102 -188 -193 -16 -24 -53 -73 -82
-107 -30 -35 -67 -89 -83 -121 -20 -41 -63 -92 -135 -163 -86 -87 -106 -112
-112 -144 -4 -22 -15 -53 -26 -70 -23 -38 -23 -73 -1 -105 39 -56 94 -81 132
-60 18 9 21 8 21 -9 0 -33 11 -51 41 -67 20 -10 35 -12 46 -5 13 7 21 3 36
-15 11 -14 29 -24 44 -24 15 0 34 -7 44 -16 9 -8 27 -16 40 -16 13 -1 33 -8
44 -15 11 -7 29 -13 40 -13 50 0 129 132 140 232 21 203 78 389 136 444 17 16
51 56 74 89 89 124 200 212 433 343 l142 81 14 -27 c16 -32 36 -151 36 -220 0
-35 6 -54 21 -71 43 -46 143 -68 168 -37 6 8 14 37 18 65 5 46 11 56 47 85 23
18 61 44 86 58 91 53 151 145 153 234 0 38 -5 50 -33 79 -19 19 -53 42 -77 51
-24 9 -43 19 -43 23 0 3 28 24 62 46 81 52 213 178 298 284 63 79 75 89 148
122 l80 37 32 -49 c79 -122 233 -192 370 -170 222 37 395 196 428 396 18 107
35 427 30 560 -9 217 -63 344 -223 514 -52 56 -95 106 -95 111 0 5 4 12 10 15
55 34 235 523 290 785 10 52 28 118 39 145 10 28 29 103 41 169 27 142 24 271
-7 352 -28 72 -115 215 -185 303 -65 82 -118 184 -125 241 -11 82 59 182 93
135 9 -12 17 -14 31 -7 10 6 25 7 33 2 8 -4 27 -6 41 -3 28 5 44 45 33 80 -5
15 -4 15 4 4 12 -17 17 -6 76 144 39 99 43 100 22 10 -8 -33 -13 -62 -10 -64
10 -10 65 154 83 249 6 30 16 80 22 110 19 85 16 216 -5 278 -11 32 -22 50
-29 45 -7 -4 -8 0 -3 13 4 10 4 15 0 12 -6 -7 -89 109 -89 124 0 4 -6 13 -14
20 -10 10 -12 10 -7 1 14 -24 -10 -13 -40 19 -16 17 -23 27 -15 23 9 -5 12 -4
8 2 -11 18 -131 71 -188 82 -50 11 -127 14 -259 12 -25 -1 -57 -7 -72 -15 -17
-9 -28 -11 -28 -4 0 6 -9 8 -22 3 -13 -4 -31 -7 -41 -6 -9 0 -15 -4 -12 -9 3
-6 0 -7 -8 -4 -20 7 -127 -84 -176 -149 -43 -57 -111 -185 -111 -208 0 -19
-55 -135 -69 -143 -6 -4 -11 -12 -11 -18 0 -19 29 13 66 73 19 33 37 59 40 59
10 0 -65 -126 -103 -173 -30 -36 -39 -53 -30 -59 9 -6 9 -8 0 -8 -9 0 -10 -7
-2 -27 6 -16 10 -29 10 -30 -1 -11 23 -63 29 -63 4 0 20 10 36 22 30 24 26 14
-13 -39 -13 -18 -20 -33 -14 -33 19 0 74 65 97 115 13 27 24 43 24 34 0 -25
-21 -81 -42 -111 -23 -34 -23 -46 0 -25 18 16 19 14 21 -70 3 -183 25 -289 76
-381 26 -46 33 -96 15 -107 -6 -3 -86 -17 -178 -30 -240 -35 -301 -61 -360
-152 -62 -96 -73 -147 -83 -378 -9 -214 -20 -312 -32 -285 -20 45 -77 356 -91
492 -18 174 -34 243 -72 325 -58 121 -120 163 -243 163 -63 0 -80 3 -85 16
-11 29 -6 103 13 196 43 209 51 282 51 479 -1 301 -22 464 -76 571 -32 64
-132 168 -191 200 -79 43 -224 72 -303 61z m2438 -421 c18 -14 38 -35 44 -46
9 -16 -39 22 -102 82 -11 11 27 -13 58 -36z m142 -188 c17 -52 7 -51 -11 1 -9
25 -13 42 -8 40 4 -3 13 -21 19 -41z m-1000 -42 c0 -5 -7 -17 -15 -28 -14 -18
-14 -17 -4 9 12 27 19 34 19 19z m1037 -14 c-3 -8 -6 -5 -6 6 -1 11 2 17 5 13
3 -3 4 -12 1 -19z m10 -40 c-3 -8 -6 -5 -6 6 -1 11 2 17 5 13 3 -3 4 -12 1
-19z m-53 -327 c-4 -23 -9 -40 -11 -37 -3 3 -2 23 2 46 4 23 9 39 11 37 3 -2
2 -23 -2 -46z m-17 -73 c-3 -8 -6 -5 -6 6 -1 11 2 17 5 13 3 -3 4 -12 1 -19z
m-3487 -790 c-17 -35 -55 -110 -84 -168 -29 -58 -72 -163 -96 -235 -45 -134
-64 -175 -84 -175 -6 1 -23 18 -38 40 -31 44 -71 60 -155 60 -29 0 -53 3 -52
8 0 4 63 59 141 122 182 149 293 258 347 343 24 37 45 67 47 67 3 0 -10 -28
-26 -62z m-4768 -415 c-37 -46 -160 -176 -140 -148 21 29 160 185 165 185 3 0
-9 -17 -25 -37z m38 -52 c-11 -21 -30 -37 -30 -25 0 8 30 44 37 44 2 0 -1 -9
-7 -19z m1692 -588 c22 -30 39 -56 36 -58 -5 -5 -107 115 -122 143 -15 28 42
-29 86 -85z m-100 -108 c6 -11 -13 3 -42 30 -28 28 -56 59 -62 70 -6 11 13 -2
42 -30 28 -27 56 -59 62 -70z m1587 -1 c29 -6 22 -10 -71 -40 -57 -19 -128
-41 -158 -49 -58 -15 -288 -41 -296 -33 -2 3 23 19 56 37 45 24 98 40 208 61
153 29 208 34 261 24z m-860 -1488 c150 -59 299 -94 495 -114 l68 -7 -42 -27
-42 -28 -111 20 c-62 11 -196 28 -300 38 -103 10 -189 21 -192 23 -2 3 -1 21
4 40 5 19 12 46 15 62 4 15 9 27 13 27 3 0 45 -15 92 -34z m3893 -371 l37 -6
-55 -72 c-31 -40 -59 -72 -62 -73 -4 -1 -51 44 -104 100 l-97 101 122 -22 c67
-13 139 -25 159 -28z"
/>
</g>
</svg>
</div>


@@ -1,52 +1,48 @@
<script lang="ts">
-import { _ } from "svelte-i18n";
-import { page } from "$app/state";
-import PeonyIcon from "$lib/components/icon/peony-icon.svelte";
-import { Button } from "$lib/components/ui/button";
-import type { AuthStatus } from "$lib/types";
-import { logout } from "$lib/services";
-import { goto } from "$app/navigation";
-import { getAssetUrl, isModel } from "$lib/directus";
-import LogoutButton from "../logout-button/logout-button.svelte";
-import Separator from "../ui/separator/separator.svelte";
-import { Avatar, AvatarFallback, AvatarImage } from "$lib/components/ui/avatar";
-import { getUserInitials } from "$lib/utils";
-import BurgerMenuButton from "../burger-menu-button/burger-menu-button.svelte";
-import Girls from "../girls/girls.svelte";
-import Logo from "../logo/logo.svelte";
+import { _ } from "svelte-i18n";
+import { page } from "$app/state";
+import { Button } from "$lib/components/ui/button";
+import type { AuthStatus } from "$lib/types";
+import { logout } from "$lib/services";
+import { goto } from "$app/navigation";
+import { getAssetUrl } from "$lib/api";
+import LogoutButton from "../logout-button/logout-button.svelte";
+import Separator from "../ui/separator/separator.svelte";
+import BurgerMenuButton from "../burger-menu-button/burger-menu-button.svelte";
+import Logo from "../logo/logo.svelte";
interface Props {
authStatus: AuthStatus;
}
let { authStatus }: Props = $props();
let isMobileMenuOpen = $state(false);
const navLinks = [
{ name: $_("header.home"), href: "/" },
{ name: $_("header.models"), href: "/models" },
{ name: $_("header.videos"), href: "/videos" },
{ name: $_("header.magazine"), href: "/magazine" },
{ name: $_("header.about"), href: "/about" },
];
async function handleLogout() {
closeMenu();
await logout();
goto("/login", { invalidateAll: true });
}
function closeMenu() {
isMobileMenuOpen = false;
}
function isActiveLink(link: any) {
return (
(page.url.pathname === "/" && link === navLinks[0]) ||
(page.url.pathname.startsWith(link.href) && link !== navLinks[0])
);
}
</script>
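The `isActiveLink` check above has a small asymmetry worth noting: the home link matches only on an exact `"/"` pathname, while every other link matches by prefix (so `/models/jane` keeps Models highlighted). A standalone sketch of the same logic, with `pathname` passed as a parameter instead of read from `page.url` in `$app/state`:

```typescript
// Restatement of isActiveLink from the component above.
// navLinks is abbreviated; only the first entry being "/" matters.
type NavLink = { name: string; href: string };

const navLinks: NavLink[] = [
  { name: "Home", href: "/" },
  { name: "Models", href: "/models" },
];

function isActiveLink(pathname: string, link: NavLink): boolean {
  return (
    // home: exact match only — otherwise "/" would prefix-match everything
    (pathname === "/" && link === navLinks[0]) ||
    // everything else: prefix match, explicitly excluding the home link
    (pathname.startsWith(link.href) && link !== navLinks[0])
  );
}
```

Without the `link !== navLinks[0]` guard, `"/anything".startsWith("/")` would mark Home active on every page.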
<header
@@ -64,331 +60,266 @@ function isActiveLink(link: any) {
<!-- Desktop Navigation -->
<nav class="hidden w-full lg:flex items-center justify-center gap-8">
-{#each navLinks as link}
+{#each navLinks as link (link.href)}
<a
href={link.href}
class={`text-sm hover:text-foreground transition-colors duration-200 font-medium relative group ${
-isActiveLink(link) ? 'text-foreground' : 'text-foreground/85'
+isActiveLink(link) ? "text-foreground" : "text-foreground/85"
}`}
>
{link.name}
<span
-class={`absolute -bottom-1 left-0 w-0 h-0.5 bg-gradient-to-r from-primary to-accent transition-all duration-300 ${isActiveLink(link) ? 'w-full' : 'group-hover:w-full'}`}
+class={`absolute -bottom-1 left-0 w-0 h-0.5 bg-gradient-to-r from-primary to-accent transition-all duration-300 ${isActiveLink(link) ? "w-full" : "group-hover:w-full"}`}
></span>
</a>
{/each}
</nav>
-<!-- Desktop Login Button -->
+<!-- Desktop Auth Actions -->
{#if authStatus.authenticated}
-<div class="w-full flex items-center justify-end">
+<div class="w-full hidden lg:flex items-center justify-end">
<div class="flex items-center gap-2 rounded-full bg-muted/30 p-1">
<!-- Notifications -->
<!-- <Button variant="ghost" size="sm" class="relative h-9 w-9 rounded-full p-0 hover:bg-background/80">
<BellIcon class="h-4 w-4" />
<Badge class="absolute -right-1 -top-1 h-5 w-5 rounded-full bg-gradient-to-r from-primary to-accent p-0 text-xs text-primary-foreground">3</Badge>
<span class="sr-only">Notifications</span>
</Button> -->
<!-- <Separator orientation="vertical" class="mx-1 h-6 bg-border/50" /> -->
<!-- User Actions -->
<Button
variant="link"
size="icon"
-class={`hidden sm:flex h-9 w-9 rounded-full p-0 relative text-foreground/80 group ${isActiveLink({ href: '/me' }) ? 'text-foreground' : 'hover:text-foreground'}`}
+class={`h-9 w-9 rounded-full p-0 relative text-foreground/80 group ${isActiveLink({ href: "/me" }) ? "text-foreground" : "hover:text-foreground"}`}
href="/me"
-title={$_('header.dashboard')}
+title={$_("header.dashboard")}
>
<span class="icon-[ri--dashboard-2-line] h-4 w-4"></span>
<span
-class={`absolute -bottom-1 left-0 w-0 h-0.5 bg-gradient-to-r from-primary to-accent transition-all duration-300 ${isActiveLink({ href: '/me' }) ? 'w-full' : 'group-hover:w-full'}`}
+class={`absolute -bottom-1 left-0 w-0 h-0.5 bg-gradient-to-r from-primary to-accent transition-all duration-300 ${isActiveLink({ href: "/me" }) ? "w-full" : "group-hover:w-full"}`}
></span>
-<span class="sr-only">{$_('header.dashboard')}</span>
+<span class="sr-only">{$_("header.dashboard")}</span>
</Button>
<Button
variant="link"
size="icon"
-class={`hidden sm:flex h-9 w-9 rounded-full p-0 relative text-foreground/80 group ${isActiveLink({ href: '/play' }) ? 'text-foreground' : 'hover:text-foreground'}`}
+class={`h-9 w-9 rounded-full p-0 relative text-foreground/80 group ${isActiveLink({ href: "/play" }) ? "text-foreground" : "hover:text-foreground"}`}
href="/play"
-title={$_('header.play')}
+title={$_("header.play")}
>
<span class="icon-[ri--rocket-line] h-4 w-4"></span>
<span
-class={`absolute -bottom-1 left-0 w-0 h-0.5 bg-gradient-to-r from-primary to-accent transition-all duration-300 ${isActiveLink({ href: '/play' }) ? 'w-full' : 'group-hover:w-full'}`}
+class={`absolute -bottom-1 left-0 w-0 h-0.5 bg-gradient-to-r from-primary to-accent transition-all duration-300 ${isActiveLink({ href: "/play" }) ? "w-full" : "group-hover:w-full"}`}
></span>
-<span class="sr-only">{$_('header.play')}</span>
+<span class="sr-only">{$_("header.play")}</span>
</Button>
<Separator orientation="vertical" class="hidden md:flex mx-1 h-6 bg-border/50" />
{#if authStatus.user?.is_admin}
<Button
variant="link"
size="icon"
class={`h-9 w-9 rounded-full p-0 relative text-foreground/80 group ${isActiveLink({ href: "/admin" }) ? "text-foreground" : "hover:text-foreground"}`}
href="/admin/users"
title="Admin"
>
<span class="icon-[ri--settings-3-line] h-4 w-4"></span>
<span
class={`absolute -bottom-1 left-0 w-0 h-0.5 bg-gradient-to-r from-primary to-accent transition-all duration-300 ${isActiveLink({ href: "/admin" }) ? "w-full" : "group-hover:w-full"}`}
></span>
<span class="sr-only">Admin</span>
</Button>
{/if}
<!-- Slide Logout Button -->
<Separator orientation="vertical" class="mx-1 h-6 bg-border/50" />
<LogoutButton
user={{
-name: authStatus.user!.artist_name || authStatus.user!.email.split('@')[0] || 'User',
-avatar: getAssetUrl(authStatus.user!.avatar?.id, 'mini')!,
-email: authStatus.user!.email
+name:
+authStatus.user!.artist_name || authStatus.user!.email.split("@")[0] || "User",
+avatar: getAssetUrl(authStatus.user!.avatar, "mini")!,
+email: authStatus.user!.email,
}}
onLogout={handleLogout}
/>
</div>
</div>
{:else}
-<div class="flex w-full items-center justify-end gap-4">
-<Button variant="outline" class="font-medium" href="/login"
->{$_('header.login')}</Button
->
+<div class="hidden lg:flex w-full items-center justify-end gap-4">
+<Button variant="outline" class="font-medium" href="/login">{$_("header.login")}</Button>
<Button
href="/signup"
class="bg-gradient-to-r from-primary to-accent hover:from-primary/90 hover:to-accent/90 font-medium"
->{$_('header.signup')}</Button
+>{$_("header.signup")}</Button
>
</div>
{/if}
<!-- Burger button — mobile/tablet only -->
<div class="lg:hidden ml-auto">
<BurgerMenuButton
-label={$_('header.navigation')}
+label={$_("header.navigation")}
bind:isMobileMenuOpen
onclick={() => (isMobileMenuOpen = !isMobileMenuOpen)}
/>
</div>
</div>
-<!-- Mobile Navigation -->
-<div
-class={`border-t border-border/20 bg-background/95 bg-gradient-to-br from-primary to-accent backdrop-blur-xl max-h-[calc(100vh-4rem)] overflow-y-auto shadow-xl/30 transition-all duration-250 ${isMobileMenuOpen ? 'opacity-100' : 'opacity-0'}`}
->
-{#if isMobileMenuOpen}
-<div class="container mx-auto grid grid-cols-1 lg:grid-cols-3">
-<div class="hidden lg:flex col-span-2">
-<Girls />
-</div>
-<div class="py-6 px-4 space-y-6 lg:col-start-3 border-t border-border/20 bg-background/95 ">
-<!-- User Profile Card -->
</header>
<!-- Backdrop -->
<!-- svelte-ignore a11y_click_events_have_key_events a11y_no_static_element_interactions a11y_no_noninteractive_element_interactions -->
<div
role="presentation"
class={`fixed inset-0 z-40 bg-black/60 backdrop-blur-sm transition-opacity duration-300 lg:hidden ${isMobileMenuOpen ? "opacity-100 pointer-events-auto" : "opacity-0 pointer-events-none"}`}
onclick={closeMenu}
></div>
<!-- Flyout panel -->
<div
class={`fixed inset-y-0 left-0 z-50 w-80 max-w-[85vw] bg-card/95 backdrop-blur-xl shadow-2xl shadow-primary/20 border-r border-border/30 transform transition-transform duration-300 ease-in-out lg:hidden overflow-y-auto flex flex-col ${isMobileMenuOpen ? "translate-x-0" : "-translate-x-full"}`}
inert={!isMobileMenuOpen || undefined}
>
<!-- Panel header -->
<div class="flex items-center px-5 h-16 shrink-0 border-b border-border/30">
<Logo hideName={true} />
</div>
<div class="flex-1 py-6 px-5 space-y-6">
<!-- User logout slider -->
{#if authStatus.authenticated}
-<div
-class="relative overflow-hidden rounded-2xl border border-border/50 bg-gradient-to-br from-card to-card/50 p-4 backdrop-blur-sm"
->
-<div
-class="absolute inset-0 bg-gradient-to-br from-primary/5 to-accent/5"
-></div>
-<div class="relative flex items-center gap-4">
-<Avatar class="h-14 w-14 ring-2 ring-primary/30">
-<AvatarImage
-src={getAssetUrl(authStatus.user!.avatar?.id, 'mini')}
-alt={authStatus.user!.artist_name}
-/>
-<AvatarFallback
-class="bg-gradient-to-br from-primary to-accent text-primary-foreground font-semibold"
->
-{getUserInitials(authStatus.user!.artist_name)}
-</AvatarFallback>
-</Avatar>
-<div class="flex flex-1 flex-col gap-1">
-<p class="text-base font-semibold text-foreground">
-{authStatus.user!.artist_name}
-</p>
-<p class="text-sm text-muted-foreground">
-{authStatus.user!.email}
-</p>
-<div class="flex items-center gap-2 mt-1">
-<div class="h-2 w-2 rounded-full bg-green-500"></div>
-<span class="text-xs text-muted-foreground">Online</span>
-</div>
-</div>
-<!-- Notifications Badge -->
-<!-- <Button
-variant="ghost"
-size="sm"
-class="relative h-10 w-10 rounded-full p-0"
->
-<BellIcon class="h-4 w-4" />
-<Badge
-class="absolute -right-1 -top-1 h-5 w-5 rounded-full bg-gradient-to-r from-primary to-accent p-0 text-xs text-primary-foreground"
->3</Badge
->
-</Button> -->
-</div>
-</div>
+<LogoutButton
+user={{
+name: authStatus.user!.artist_name || authStatus.user!.email.split("@")[0] || "User",
+avatar: getAssetUrl(authStatus.user!.avatar, "mini")!,
+email: authStatus.user!.email,
+}}
+onLogout={handleLogout}
+class="w-full"
+/>
{/if}
-<!-- Navigation Cards -->
-<div class="space-y-3">
-<h3
-class="px-1 text-xs font-semibold text-muted-foreground uppercase tracking-wider"
->
-{$_('header.navigation')}
+<!-- Navigation -->
+<div class="space-y-2">
+<h3 class="px-1 text-xs font-semibold text-muted-foreground uppercase tracking-wider">
+{$_("header.navigation")}
</h3>
-<div class="grid gap-2">
-{#each navLinks as link}
+<div class="grid gap-1.5">
+{#each navLinks as link (link.href)}
<a
href={link.href}
-class="flex items-center justify-between rounded-xl border border-border/50 bg-card/50 p-4 backdrop-blur-sm transition-all hover:bg-card hover:border-primary/20 {isActiveLink(
-link
-)
-? 'border-primary/30 bg-primary/5'
-: ''}"
-onclick={() => (isMobileMenuOpen = false)}
+class={`flex items-center justify-between rounded-xl border px-4 py-3 transition-all duration-200 hover:border-primary/30 hover:bg-primary/5 ${
+isActiveLink(link)
+? "border-primary/40 bg-primary/8 text-foreground"
+: "border-border/40 bg-card/50 text-foreground/85"
+}`}
+onclick={closeMenu}
>
-<span class="font-medium text-foreground">{link.name}</span>
-<div class="flex items-center gap-2">
-<!-- {#if isActiveLink(link)}
-<div class="h-2 w-2 rounded-full bg-primary"></div>
-{/if} -->
-<span
-class="icon-[ri--arrow-drop-right-line] h-6 w-6 text-muted-foreground"
-></span>
-</div>
+<span class="font-medium text-sm">{link.name}</span>
+<span class="icon-[ri--arrow-right-s-line] h-4 w-4 text-muted-foreground"></span>
</a>
{/each}
</div>
</div>
-<!-- Account Actions -->
-<div class="space-y-3">
-<h3
-class="px-1 text-xs font-semibold text-muted-foreground uppercase tracking-wider"
->
-{$_('header.account')}
+<!-- Account -->
+<div class="space-y-2">
+<h3 class="px-1 text-xs font-semibold text-muted-foreground uppercase tracking-wider">
+{$_("header.account")}
</h3>
-<div class="grid gap-2">
+<div class="grid gap-1.5">
{#if authStatus.authenticated}
<a
-class={`flex items-center gap-4 rounded-xl border border-border/50 bg-card/50 p-4 text-left backdrop-blur-sm transition-all group hover:bg-card hover:border-primary/20 ${isActiveLink({ href: '/me' }) ? 'border-primary/30 bg-primary/5' : ''}`}
+class={`flex items-center gap-3 rounded-xl border px-4 py-3 transition-all duration-200 group hover:border-primary/30 hover:bg-primary/5 ${isActiveLink({ href: "/me" }) ? "border-primary/40 bg-primary/8" : "border-border/40 bg-card/50"}`}
href="/me"
onclick={closeMenu}
>
<div
-class={`flex h-10 w-10 items-center justify-center rounded-xl bg-gradient-to-br from-muted to-muted/50 transition-all group-hover:bg-card group-hover:from-primary/10 group-hover:to-accent/10`}
+class="flex h-8 w-8 items-center justify-center rounded-lg bg-muted/60 group-hover:bg-primary/10 transition-colors"
>
<span
-class="icon-[ri--dashboard-2-line] h-4 w-4 text-muted-foreground group-hover:text-foreground transition-colors"
+class="icon-[ri--dashboard-2-line] h-4 w-4 text-muted-foreground group-hover:text-primary transition-colors"
></span>
</div>
-<div class="flex flex-1 flex-col gap-1">
-<div class="flex items-center gap-2">
-<span class="font-medium text-foreground"
->{$_('header.dashboard')}</span
->
-<span class="text-sm text-muted-foreground"
->{$_('header.dashboard_hint')}</span
->
-</div>
-<span
-class="icon-[ri--arrow-drop-right-line] h-6 w-6 text-muted-foreground transition-all"
-></span>
+<div class="flex flex-1 flex-col gap-0.5">
+<span class="text-sm font-medium text-foreground">{$_("header.dashboard")}</span>
+<span class="text-xs text-muted-foreground">{$_("header.dashboard_hint")}</span>
+</div>
+<span class="icon-[ri--arrow-right-s-line] h-4 w-4 text-muted-foreground"></span>
</a>
<a
-class={`flex items-center gap-4 rounded-xl border border-border/50 bg-card/50 p-4 text-left backdrop-blur-sm transition-all group hover:bg-card hover:border-primary/20 ${isActiveLink({ href: '/play' }) ? 'border-primary/30 bg-primary/5' : ''}`}
+class={`flex items-center gap-3 rounded-xl border px-4 py-3 transition-all duration-200 group hover:border-primary/30 hover:bg-primary/5 ${isActiveLink({ href: "/play" }) ? "border-primary/40 bg-primary/8" : "border-border/40 bg-card/50"}`}
href="/play"
onclick={closeMenu}
>
<div
-class={`flex h-10 w-10 items-center justify-center rounded-xl bg-gradient-to-br from-muted to-muted/50 transition-all group-hover:bg-card group-hover:from-primary/10 group-hover:to-accent/10`}
+class="flex h-8 w-8 items-center justify-center rounded-lg bg-muted/60 group-hover:bg-primary/10 transition-colors"
>
<span
class="icon-[ri--rocket-line] h-4 w-4 text-muted-foreground group-hover:text-primary transition-colors"
></span>
</div>
<div class="flex flex-1 flex-col gap-0.5">
<span class="text-sm font-medium text-foreground">{$_("header.play")}</span>
<span class="text-xs text-muted-foreground">{$_("header.play_hint")}</span>
</div>
<span class="icon-[ri--arrow-right-s-line] h-4 w-4 text-muted-foreground"></span>
</a>
{#if authStatus.user?.is_admin}
<a
class={`flex items-center gap-3 rounded-xl border px-4 py-3 transition-all duration-200 group hover:border-primary/30 hover:bg-primary/5 ${isActiveLink({ href: "/admin" }) ? "border-primary/40 bg-primary/8" : "border-border/40 bg-card/50"}`}
href="/admin/users"
onclick={closeMenu}
>
<div
class="flex h-8 w-8 items-center justify-center rounded-lg bg-muted/60 group-hover:bg-primary/10 transition-colors"
>
<span
class="icon-[ri--settings-3-line] h-4 w-4 text-muted-foreground group-hover:text-primary transition-colors"
></span>
</div>
<div class="flex flex-1 flex-col gap-0.5">
<span class="text-sm font-medium text-foreground">Admin</span>
<span class="text-xs text-muted-foreground">Manage content</span>
</div>
<span class="icon-[ri--arrow-right-s-line] h-4 w-4 text-muted-foreground"></span>
</a>
{/if}
{:else}
<a
class={`flex items-center gap-3 rounded-xl border px-4 py-3 transition-all duration-200 group hover:border-primary/30 hover:bg-primary/5 ${isActiveLink({ href: "/login" }) ? "border-primary/40 bg-primary/8" : "border-border/40 bg-card/50"}`}
href="/login"
onclick={closeMenu}
>
<div
class="flex h-8 w-8 items-center justify-center rounded-lg bg-muted/60 group-hover:bg-primary/10 transition-colors"
>
<span
class="icon-[ri--login-circle-line] h-4 w-4 text-muted-foreground group-hover:text-primary transition-colors"
></span>
</div>
<div class="flex flex-1 flex-col gap-0.5">
<span class="text-sm font-medium text-foreground">{$_("header.login")}</span>
<span class="text-xs text-muted-foreground">{$_("header.login_hint")}</span>
</div>
<span class="icon-[ri--arrow-right-s-line] h-4 w-4 text-muted-foreground"></span>
</a>
<a
class={`flex items-center gap-3 rounded-xl border px-4 py-3 transition-all duration-200 group hover:border-primary/30 hover:bg-primary/5 ${isActiveLink({ href: "/signup" }) ? "border-primary/40 bg-primary/8" : "border-border/40 bg-card/50"}`}
href="/signup"
onclick={closeMenu}
>
<div
class="flex h-8 w-8 items-center justify-center rounded-lg bg-muted/60 group-hover:bg-accent/10 transition-colors"
>
<span
class="icon-[ri--heart-add-2-line] h-4 w-4 text-muted-foreground group-hover:text-accent transition-colors"
></span>
</div>
<div class="flex flex-1 flex-col gap-0.5">
<span class="text-sm font-medium text-foreground">{$_("header.signup")}</span>
<span class="text-xs text-muted-foreground">{$_("header.signup_hint")}</span>
</div>
<span class="icon-[ri--arrow-right-s-line] h-4 w-4 text-muted-foreground"></span>
</a>
{/if}
</div>
</div>
{#if authStatus.authenticated}
<!-- Logout Button -->
<button
class="cursor-pointer flex w-full items-center gap-4 rounded-xl border border-destructive/20 bg-destructive/5 p-4 text-left backdrop-blur-sm transition-all hover:bg-destructive/10 hover:border-destructive/30 group"
onclick={handleLogout}
>
<div
class="flex h-10 w-10 items-center justify-center rounded-xl bg-destructive/10 group-hover:bg-destructive/20 transition-all"
>
<span
class="icon-[ri--logout-circle-r-line] h-4 w-4 text-destructive"
></span>
</div>
<div class="flex flex-1 flex-col gap-1">
<span class="font-medium text-foreground"
>{$_('header.logout')}</span
>
<span class="text-sm text-muted-foreground"
>{$_('header.logout_hint')}</span
>
</div>
</button>
{/if}
</div>
</div>
{/if}
</div>
</header>
</div>


@@ -1,10 +1,10 @@
<script lang="ts">
interface Props {
class?: string;
size?: string | number;
}
let { class: className = "", size = "24" }: Props = $props();
</script>
<svg
@@ -19,7 +19,6 @@ let { class: className = "", size = "24" }: Props = $props();
d="M418.813 30.625c-21.178 26.27-49.712 50.982-84.125 70.844-36.778 21.225-75.064 33.62-110.313 38.06a310.317 310.317 0 0 0 6.813 18.25c16.01.277 29.366-.434 36.406-1.5l9.47-1.53 8.436-1.28.22 10.186a307.48 307.48 0 0 1-1.095 18.72l56.625 8.843c.86-.095 1.713-.15 2.563-.157 11.188-.114 21.44 7.29 24.468 18.593.657 2.448.922 4.903.845 7.313 5.972-2.075 11.753-4.305 17.28-6.72l9.595-4.188 2.313 10.22a340.211 340.211 0 0 1 7.375 48.062C438.29 247.836 468.438 225.71 493 197.5c-3.22-36.73-16.154-78.04-39.125-117.813a290.509 290.509 0 0 0-2.22-3.78l-27.56 71.374c5.154.762 10.123 3.158 14.092 7.126 9.81 9.807 9.813 25.69 0 35.5-9.812 9.81-25.722 9.807-35.53 0-8.86-8.858-9.69-22.68-2.532-32.5l38.938-100.844a322.02 322.02 0 0 0-20.25-25.937zM51.842 118.72c-8.46 17.373-15.76 36.198-21.187 56.436-14.108 52.617-13.96 103.682-2.812 143.438 13.3-2.605 26.442-3.96 39.312-4.03 1.855-.012 3.688.02 5.53.06 20.857.48 40.98 4.332 59.97 11.5a355.064 355.064 0 0 1-1.656-34.218c0-27.8 3.135-54.377 9-78.937l2.47-10.407 9.655 4.562c29.467 13.98 66.194 23.424 106.28 25.22 5.136-20.05 8.19-39.78 9.408-58.75-35.198 4.83-75.387 2.766-116.407-8.22-38.363-10.272-72.314-26.78-99.562-46.656zm230.594 82.218c-1.535 10.452-3.615 21.03-6.218 31.687a312.754 312.754 0 0 0 46-3.97 24.98 24.98 0 0 1-1.532-21.748l-38.25-5.97zM105 201.375l4.156 18.22-21.594 4.905c8.75 5.174 13.353 15.703 10.594 26-3.32 12.394-16.045 19.758-28.437 16.438-12.394-3.32-19.76-16.075-16.44-28.47a23.235 23.235 0 0 1 3.126-6.874l-21.062 4.78-4.125-18.218 73.78-16.78zm388.594 22.813c-25.53 25.46-55.306 45.445-86.906 60.5.05 2.397.093 4.8.093 7.218 0 9.188-.354 18.232-1.03 27.125 16.635 1.33 32.045-1.7 45.344-9.374 25.925-14.962 40.608-45.694 42.5-85.47zm-338.844 3c-4.03 19.993-6.33 41.31-6.406 63.593l.125-.342c30.568 10.174 62.622 17.572 95.25 21.375l7.5.875.718 7.5 5.687 60.125-18.625 1.75-2.53-26.75a23.117 23.117 0 0 1-14.845.968c-12.393-3.32-19.76-16.042-16.438-28.436.285-1.06.647-2.08 1.063-3.063a496.627 496.627 0 0 
1-57.406-14.53c2.69 49.62 16.154 94.04 36.094 126.656 22.366 36.588 52.13 57.78 83.968 57.78 31.838.003 61.602-21.19 83.97-57.78 19.536-31.96 32.846-75.244 35.905-123.656a499.132 499.132 0 0 1-48.25 11.656c1.914 4.57 2.415 9.78 1.033 14.938-3.322 12.394-16.045 19.758-28.438 16.437a23.01 23.01 0 0 1-2.125-.686l-2.5 26.47-18.594-1.752 5.688-60.125.72-7.5 7.498-.875c29.245-3.407 57.995-9.717 85.657-18.312v-1.594c0-21.573-2.27-42.23-6.064-61.75C351.132 242.653 313.092 250 272.312 250c-43.59 0-83.986-8.658-117.562-22.813zm-87.5 105.968c-10.87.102-21.995 1.22-33.375 3.313 12.695 31.62 33.117 53.07 59 60 16.9 4.523 34.896 2.536 52.813-5.25-4.382-13.89-7.874-28.606-10.344-43.97-21.115-9.623-43.934-14.32-68.094-14.094zm137.5 80.22h130.813c-40.082 44.594-92.623 42.844-130.813 0z"
fill-opacity="1"
style="fill: currentColor; stroke: #ce47eb; stroke-width: 10px;"
></path></g
></svg
>


@@ -1,46 +1,46 @@
<script lang="ts">
import { _ } from "svelte-i18n";
import { browser } from "$app/environment";
import { onMount, onDestroy } from "svelte";
import Button from "../ui/button/button.svelte";
const { images = [] } = $props();
let isViewerOpen = $state(false);
let currentImageIndex = $state(0);
let imageLoading = $state(false);
let currentImage = $derived(images[currentImageIndex]);
let canGoPrev = $derived(currentImageIndex > 0);
let canGoNext = $derived(currentImageIndex < images.length - 1);
function openViewer(index) {
currentImageIndex = index;
isViewerOpen = true;
imageLoading = true;
document.body.style.overflow = "hidden";
}
function closeViewer() {
isViewerOpen = false;
document.body.style.overflow = "";
}
function navigatePrev() {
if (canGoPrev) {
currentImageIndex--;
imageLoading = true;
}
}
function navigateNext() {
if (canGoNext) {
currentImageIndex++;
imageLoading = true;
}
}
function downloadImage() {
const link = document.createElement("a");
link.href = currentImage.url;
link.download = currentImage.title.replace(/\s+/g, "_") + ".jpg";
@@ -48,9 +48,9 @@ function downloadImage() {
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
}
function handleKeydown(event) {
if (!isViewerOpen) return;
switch (event.key) {
@@ -72,13 +72,13 @@ function handleKeydown(event) {
downloadImage();
break;
}
}
function handleImageLoad() {
imageLoading = false;
}
onMount(() => {
if (!browser) {
return;
}
@@ -88,23 +88,21 @@ onMount(() => {
const preload = new Image();
preload.src = img.url;
});
});
onDestroy(() => {
if (!browser) {
return;
}
window.removeEventListener("keydown", handleKeydown);
document.body.style.overflow = "";
});
</script>
<!-- Gallery Grid -->
<div class="w-full mx-auto">
<div class="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-6 animate-fade-in">
{#each images as image, index (index)}
<button
onclick={() => openViewer(index)}
class="group relative aspect-square overflow-hidden rounded-xl bg-zinc-900 border border-zinc-800 transition-all duration-300 hover:scale-[1.03] hover:border-primary/50 hover:shadow-2xl hover:shadow-primary/20 focus:outline-none focus:ring-2 focus:ring-primary focus:ring-offset-2 focus:ring-offset-zinc-950"
@@ -145,14 +143,9 @@ onDestroy(() => {
<!-- Image Viewer Modal -->
{#if isViewerOpen}
<div class="fixed inset-0 z-50 flex items-center justify-center animate-fade-in">
<!-- Backdrop -->
<div class="absolute inset-0 bg-black/95 backdrop-blur-xl" onclick={closeViewer}></div>
<!-- Viewer Content -->
<div class="relative w-[90vw] h-[90vh] flex flex-col animate-slide-up">
@@ -167,8 +160,8 @@ onDestroy(() => {
{$_("image_viewer.index", {
values: {
index: currentImageIndex + 1,
size: images.length,
},
})}
</div>
<p class="text-zinc-400 max-w-2xl">


@@ -1,8 +1,8 @@
<script lang="ts">
import { _ } from "svelte-i18n";
import PeonyIcon from "../icon/peony-icon.svelte";
const { hideName = false } = $props();
</script>
<div class="relative">
@@ -11,11 +11,11 @@ const { hideName = false } = $props();
<span
class={`logo text-3xl text-foreground opacity-90 tracking-wide font-extrabold drop-shadow-x ${hideName ? "hidden sm:inline-block" : ""}`}
>
{$_("brand.name")}
</span>
<style>
.logo {
font-family: "Dancing Script", cursive;
}
</style>
