fix: update service_script path to vllm/server.py
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
@@ -6,7 +6,7 @@ models:
   qwen-2.5-7b:
     type: text
     framework: vllm
-    service_script: models/vllm/server.py
+    service_script: vllm/server.py
     port: 8000
     vram_gb: 14
     startup_time_seconds: 120
@@ -16,7 +16,7 @@ models:
   llama-3.1-8b:
     type: text
     framework: vllm
-    service_script: models/vllm/server.py
+    service_script: vllm/server.py
     port: 8001
     vram_gb: 17
     startup_time_seconds: 120
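A minimal sketch of why dropping the `models/` prefix matters, assuming the model manager resolves `service_script` relative to a `models/` base directory (the loader, the `MODELS_DIR` constant, and `resolve_script` are hypothetical names, not taken from this repository):

```python
from pathlib import Path

# Hypothetical base directory the service loader is assumed to resolve
# service_script against; all names here are illustrative.
MODELS_DIR = Path("models")

def resolve_script(service_script: str) -> Path:
    # Join the configured relative path onto the base directory.
    return MODELS_DIR / service_script

# Old config value: the "models/" segment gets duplicated.
print(resolve_script("models/vllm/server.py"))  # models/models/vllm/server.py
# Fixed config value: resolves to the intended script.
print(resolve_script("vllm/server.py"))         # models/vllm/server.py
```

Under that assumption, the old value pointed at a nonexistent `models/models/...` path, which this commit corrects for both model entries.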