feat: add LiteLLM proxy for Anthropic Claude models

Added LiteLLM as an OpenAI-compatible proxy for Anthropic's API to
enable Claude models in Open WebUI.

**New Service: litellm**
- Image: ghcr.io/berriai/litellm:main-latest
- Internal proxy on port 4000
- Translates OpenAI-compatible requests from Open WebUI into Anthropic API calls
- Health check with 30s intervals
- Not exposed via Traefik (internal only)

**LiteLLM Configuration (litellm-config.yaml)** (a config sketch follows this list)
- Claude Sonnet 4 (claude-sonnet-4-20250514)
- Claude Sonnet 4.5 (claude-sonnet-4-5-20250929)
- Claude 3.5 Sonnet (claude-3-5-sonnet-20241022)
- Claude 3 Opus (claude-3-opus-20240229)
- Claude 3 Haiku (claude-3-haiku-20240307)
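
The second changed file, litellm-config.yaml, is not shown in the diff below. A minimal sketch of what it presumably contains, assuming LiteLLM's standard `model_list` format and the aliases from the "Available Models" list further down:

```yaml
# Sketch only - assumed content of ./ai/litellm-config.yaml, not the committed file.
model_list:
  - model_name: claude-sonnet-4                      # alias shown in Open WebUI
    litellm_params:
      model: anthropic/claude-sonnet-4-20250514      # underlying Anthropic model ID
      api_key: os.environ/ANTHROPIC_API_KEY          # read from the container env
  - model_name: claude-sonnet-4.5
    litellm_params:
      model: anthropic/claude-sonnet-4-5-20250929
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-opus
    litellm_params:
      model: anthropic/claude-3-opus-20240229
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-haiku
    litellm_params:
      model: anthropic/claude-3-haiku-20240307
      api_key: os.environ/ANTHROPIC_API_KEY
```

Each `model_name` is the alias exposed through the proxy's OpenAI-compatible endpoint; `litellm_params.model` with the `anthropic/` prefix tells LiteLLM which provider and model to call.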

**Open WebUI Configuration Updates**
- Changed OPENAI_API_BASE_URLS to point to LiteLLM proxy
- URL: http://litellm:4000/v1
- Added litellm as a dependency of the webui service
- Dummy API key (sk-1234) used for proxy authentication

**Why LiteLLM?**
Anthropic's API differs from OpenAI's in endpoint structure and authentication
(Claude is served at `/v1/messages` with an `x-api-key` header, while
OpenAI-style clients expect `/v1/chat/completions` with an `Authorization: Bearer`
header). LiteLLM acts as a translation layer, so Open WebUI can use Claude
models through the OpenAI-compatible interface it already supports.

**Available Models in Open WebUI** (example request after this list)
- claude-sonnet-4 (latest Claude Sonnet 4)
- claude-sonnet-4.5 (Claude Sonnet 4.5)
- claude-3-5-sonnet
- claude-3-opus
- claude-3-haiku
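
For illustration only (not part of this commit), this is roughly the OpenAI-style chat request Open WebUI ends up sending to the proxy for one of the aliases above, with the payload written as YAML for readability; LiteLLM translates it into a call to Anthropic's /v1/messages endpoint using the real ANTHROPIC_API_KEY:

```yaml
# Hypothetical request from Open WebUI to the proxy (standard OpenAI chat shape):
#   POST http://litellm:4000/v1/chat/completions
#   Authorization: Bearer sk-1234          # the dummy key from docker-compose
model: claude-sonnet-4.5                   # alias registered in litellm-config.yaml
messages:
  - role: user
    content: Hello, Claude
```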

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
commit 8eae3c650f (parent d5a3d26c97)
Date: 2025-11-08 22:58:09 +01:00
2 changed files with 59 additions and 3 deletions


@@ -33,9 +33,9 @@ services:
       # Database configuration
       DATABASE_URL: postgresql://${AI_DB_USER}:${AI_DB_PASSWORD}@ai_postgres:5432/${AI_DB_NAME}
-      # OpenAI API configuration (for Claude via Anthropic API)
-      OPENAI_API_BASE_URLS: ${AI_OPENAI_API_BASE_URLS:-https://api.anthropic.com/v1}
-      OPENAI_API_KEYS: ${ANTHROPIC_API_KEY}
+      # OpenAI API configuration (pointing to LiteLLM proxy)
+      OPENAI_API_BASE_URLS: http://litellm:4000/v1
+      OPENAI_API_KEYS: sk-1234 # Dummy key for LiteLLM proxy
       # WebUI configuration
       WEBUI_NAME: ${AI_WEBUI_NAME:-Pivoine AI}
@@ -65,6 +65,7 @@ services:
       - ai_webui_data:/app/backend/data
     depends_on:
       - ai_postgres
+      - litellm
     networks:
       - compose_network
     labels:
@@ -86,6 +87,32 @@ services:
       # Watchtower
       - 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'
+
+  # LiteLLM - Proxy to convert Anthropic API to OpenAI-compatible format
+  litellm:
+    image: ghcr.io/berriai/litellm:main-latest
+    container_name: ${AI_COMPOSE_PROJECT_NAME}_litellm
+    restart: unless-stopped
+    environment:
+      TZ: ${TIMEZONE:-Europe/Berlin}
+      ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY}
+      LITELLM_MASTER_KEY: ${AI_WEBUI_SECRET_KEY}
+    volumes:
+      - ./ai/litellm-config.yaml:/app/config.yaml:ro
+    command: ["--config", "/app/config.yaml", "--port", "4000", "--num_workers", "1"]
+    networks:
+      - compose_network
+    healthcheck:
+      test: ["CMD-SHELL", "curl -f http://localhost:4000/health || exit 1"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
+      start_period: 20s
+    labels:
+      # No Traefik exposure - internal only
+      - 'traefik.enable=false'
+      # Watchtower
+      - 'com.centurylinklabs.watchtower.enable=${WATCHTOWER_LABEL_ENABLE}'
 
   # Crawl4AI - Web scraping for LLMs (internal API, no public access)
   crawl4ai:
     image: ${AI_CRAWL4AI_IMAGE:-unclecode/crawl4ai:latest}