valknar
0 Followers · 0 Following · Joined on 2025-11-15
Repositories: 27

Public Activity
valknar pushed to main at valknar/docker-compose · 2025-11-21 18:57:13 +01:00
81d4058c5d · revert: back to openai prefix for vLLM OpenAI-compatible endpoint

valknar pushed to main at valknar/docker-compose · 2025-11-21 18:54:50 +01:00
4a575bc0da · fix: use hosted_vllm prefix instead of openai for vLLM streaming compatibility

valknar pushed to main at valknar/docker-compose · 2025-11-21 18:46:44 +01:00
01a345979b · fix: disable drop_params to preserve streaming metadata in LiteLLM

valknar pushed to main at valknar/docker-compose · 2025-11-21 18:42:49 +01:00
c58b5d36ba · revert: remove direct WebUI connection, focus on fixing LiteLLM streaming

valknar pushed to main at valknar/docker-compose · 2025-11-21 18:38:42 +01:00
62fcf832da · feat: add direct RunPod orchestrator connection to WebUI for streaming bypass

valknar pushed to main at valknar/paint-ui · 2025-11-21 18:31:31 +01:00
54aac626a2 · feat: add Import Image functionality to add images as new layers
valknar pushed to main at valknar/runpod · 2025-11-21 18:25:53 +01:00
7f1890517d · fix: enable eager execution for proper token streaming in vLLM
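
The eager-execution fix above corresponds to vLLM's enforce-eager option (the `--enforce-eager` flag on `vllm serve`, or `enforce_eager=True` on the Python engine). A minimal sketch only, with a placeholder model name and prompt rather than anything taken from valknar/runpod:

```python
# Minimal sketch, not code from valknar/runpod: the model name and prompt
# are placeholders. enforce_eager=True disables CUDA graph capture, the
# engine-level equivalent of `vllm serve --enforce-eager`.
from vllm import LLM, SamplingParams

llm = LLM(
    model="Qwen/Qwen2.5-7B-Instruct",  # placeholder model
    enforce_eager=True,
)

params = SamplingParams(max_tokens=64, temperature=0.7)
for request_output in llm.generate(["Hello"], params):
    print(request_output.outputs[0].text)
```
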
valknar pushed to main at valknar/runpod · 2025-11-21 18:10:56 +01:00
94080da341 · fix: remove incorrect start-vllm.sh that would break orchestrator architecture

valknar pushed to main at valknar/runpod · 2025-11-21 18:08:23 +01:00
6944e4ebd5 · feat: add vllm serve script with proper streaming support

valknar pushed to main at valknar/docker-compose · 2025-11-21 18:01:03 +01:00
dfde1df72f · fix: add /v1 suffix to vLLM api_base for proper endpoint routing
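
The api_base fix above is about where LiteLLM sends OpenAI-compatible requests: with a `/v1` suffix, chat requests land on the server's `/v1/chat/completions` route. A minimal sketch of the equivalent call through LiteLLM's Python API, with placeholder host, port, and model name (none of these are taken from the docker-compose repo):

```python
# Minimal sketch, assuming a vLLM OpenAI-compatible server at vllm:8000;
# host, port, and model name are placeholders, not values from the repo.
import litellm

response = litellm.completion(
    model="hosted_vllm/Qwen/Qwen2.5-7B-Instruct",  # hosted_vllm/ routes to an OpenAI-compatible vLLM backend
    api_base="http://vllm:8000/v1",                # /v1 suffix so the request hits /v1/chat/completions
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)

for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```
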
valknar pushed to main at valknar/docker-compose · 2025-11-21 17:55:12 +01:00
42a68bc0b5 · fix: revert to openai prefix, remove /v1 suffix from api_base

valknar pushed to main at valknar/paint-ui · 2025-11-21 17:54:01 +01:00
6a47efc164 · fix: resolve TypeScript errors in mini-map and layer-effects-panel

valknar pushed to main at valknar/docker-compose · 2025-11-21 17:52:36 +01:00
699c8537b0 · fix: use LiteLLM vLLM pass-through for qwen model

valknar pushed to main at valknar/paint-ui · 2025-11-21 17:51:43 +01:00
5c4763cb62 · feat(phase-12): add professional UI polish with status bar, navigator, and shortcuts help

valknar pushed to main at valknar/docker-compose · 2025-11-21 17:43:37 +01:00
ed4d537499 · Enable verbose logging in LiteLLM for streaming debug

valknar pushed to main at valknar/paint-ui · 2025-11-21 17:42:46 +01:00
9aa6e2d5d9 · feat(phase-11): implement comprehensive non-destructive layer effects system

valknar pushed to main at valknar/runpod · 2025-11-21 17:23:35 +01:00
d21caa56bc · fix: implement incremental streaming deltas for vLLM chat completions
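
Incremental deltas here means emitting only the newly generated text in each chunk, OpenAI-style, instead of resending the full completion every time. A self-contained sketch of that conversion, under the assumption that the backend yields cumulative text snapshots (the sample data is invented, not output from valknar/runpod):

```python
# Minimal sketch: convert cumulative text snapshots into incremental
# streaming deltas. The snapshot list is invented example data.
from typing import Iterable, Iterator


def to_deltas(cumulative_texts: Iterable[str]) -> Iterator[str]:
    """Yield only the newly added suffix of each cumulative snapshot."""
    sent = ""
    for text in cumulative_texts:
        # Normal case: the new snapshot extends what was already sent.
        delta = text[len(sent):] if text.startswith(sent) else text
        sent = text
        if delta:
            yield delta


snapshots = ["Hel", "Hello", "Hello, wor", "Hello, world!"]
print(list(to_deltas(snapshots)))  # ['Hel', 'lo', ', wor', 'ld!']
```
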
valknar pushed to main at valknar/paint-ui · 2025-11-21 17:16:15 +01:00
63a6801155 · feat: implement comprehensive canvas context menu system

valknar pushed to main at valknar/docker-compose · 2025-11-21 17:13:52 +01:00
103bbbad51 · debug: enable INFO logging in LiteLLM for troubleshooting

valknar pushed to main at valknar/docker-compose · 2025-11-21 17:06:03 +01:00
92a7436716 · fix(ai): add 600s timeout for qwen model requests via Tailscale