feat: expose LiteLLM publicly for Codex CLI integration

Added Traefik configuration to make LiteLLM accessible at llm.ai.pivoine.art
for use with the @openai/codex CLI tool.

Changes:
- Added AI_LITELLM_TRAEFIK_HOST to arty.yml (llm.ai.pivoine.art)
- Updated the ai/compose.yaml litellm service with full Traefik labels (sketched below):
  - HTTP to HTTPS redirect
  - SSL termination via Let's Encrypt
  - Compression and security headers
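
The ai/compose.yaml hunk is not included in this excerpt, so the following is
only a sketch of what the "full Traefik labels" on the litellm service could
look like. The entrypoint names (web/websecure), the letsencrypt cert resolver,
the middleware names, LiteLLM's port 4000, and the assumption that arty.yml
feeds AI_LITELLM_TRAEFIK_HOST into the compose environment are all guesses, not
taken from the actual file:

  litellm:
    # ... rest of the existing service definition ...
    labels:
      - traefik.enable=true
      # HTTPS router for the host added in arty.yml
      - traefik.http.routers.litellm.rule=Host(`${AI_LITELLM_TRAEFIK_HOST}`)
      - traefik.http.routers.litellm.entrypoints=websecure
      - traefik.http.routers.litellm.tls.certresolver=letsencrypt
      - traefik.http.services.litellm.loadbalancer.server.port=4000
      # Plain-HTTP router that only redirects to HTTPS
      - traefik.http.routers.litellm-http.rule=Host(`${AI_LITELLM_TRAEFIK_HOST}`)
      - traefik.http.routers.litellm-http.entrypoints=web
      - traefik.http.routers.litellm-http.middlewares=litellm-redirect
      - traefik.http.middlewares.litellm-redirect.redirectscheme.scheme=https
      # Compression plus basic security headers
      - traefik.http.routers.litellm.middlewares=litellm-compress,litellm-headers
      - traefik.http.middlewares.litellm-compress.compress=true
      - traefik.http.middlewares.litellm-headers.headers.stsSeconds=31536000
      - traefik.http.middlewares.litellm-headers.headers.frameDeny=true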

This allows external tools such as Codex to use Claude models via an
OpenAI-compatible API endpoint.
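
A quick way to verify the endpoint (not part of the commit) is any
OpenAI-compatible client; the model alias and API key variable below are
placeholders that depend on the LiteLLM configuration:

  # Hypothetical smoke test; model name and key variable are assumptions.
  curl https://llm.ai.pivoine.art/v1/chat/completions \
    -H "Authorization: Bearer $LITELLM_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model": "claude-sonnet-4", "messages": [{"role": "user", "content": "ping"}]}'

Codex, like any other OpenAI-compatible client, would then be pointed at
https://llm.ai.pivoine.art/v1 as its base URL.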

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
commit 1d69107ebb
parent e41335d2ee
2025-11-09 17:56:34 +01:00
2 changed files with 16 additions and 2 deletions

arty.yml

@@ -182,6 +182,7 @@ envs:
   AI_VECTOR_DB: pgvector
   AI_CRAWL4AI_PORT: 11235
   AI_OPENAI_API_BASE_URLS: https://api.anthropic.com/v1
+  AI_LITELLM_TRAEFIK_HOST: llm.ai.pivoine.art
   # Asciinema
   ASCIINEMA_TRAEFIK_ENABLED: true
   ASCIINEMA_COMPOSE_PROJECT_NAME: asciinema