docker-compose/ai/litellm-config.yaml

commit cabac4b767 by Sebastian Krüger, 2025-11-11 12:33:10 +01:00
fix: use additional_drop_params to explicitly drop prompt_cache_key
According to litellm docs, drop_params only drops OpenAI parameters.
Since prompt_cache_key is an Anthropic-specific parameter, we need
to use additional_drop_params to explicitly drop it.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
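To illustrate the commit's rationale, here is a minimal sketch (not part of the repo) of the same fix applied at the LiteLLM SDK level instead of in proxy config; the prompt and cache key value are placeholders. drop_params=True only strips unsupported OpenAI parameters, while additional_drop_params names the extra parameter to remove before the request reaches Anthropic.

import os

import litellm

# prompt_cache_key would normally be forwarded to the provider; drop_params
# alone does not remove it, so additional_drop_params names it explicitly.
response = litellm.completion(
    model="anthropic/claude-sonnet-4-5-20250929",
    messages=[{"role": "user", "content": "Hello"}],
    api_key=os.environ["ANTHROPIC_API_KEY"],
    drop_params=True,
    additional_drop_params=["prompt_cache_key"],  # strip this param explicitly
    prompt_cache_key="demo-cache-key",            # dropped before the API call
)
print(response.choices[0].message.content)

The full proxy config from the repo follows.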

model_list:
  - model_name: claude-sonnet-4
    litellm_params:
      model: anthropic/claude-sonnet-4-20250514
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-sonnet-4.5
    litellm_params:
      model: anthropic/claude-sonnet-4-5-20250929
      api_key: os.environ/ANTHROPIC_API_KEY
      drop_params: true
      additional_drop_params: ["prompt_cache_key"]
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-opus
    litellm_params:
      model: anthropic/claude-3-opus-20240229
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-haiku
    litellm_params:
      model: anthropic/claude-3-haiku-20240307
      api_key: os.environ/ANTHROPIC_API_KEY

litellm_settings:
  drop_params: true
  set_verbose: true
  # Disable LiteLLM's response caching
  cache: false
  # Tolerate zero failures before cooling a deployment down
  allowed_fails: 0
  # Let LiteLLM adjust params before sending them to the provider
  modify_params: true

router_settings:
  allowed_fails: 0
  # Drop unsupported parameters on every route by default
  default_litellm_params:
    drop_params: true

general_settings:
  disable_responses_id_security: true
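
For reference, a hedged usage sketch of calling this proxy from the OpenAI SDK. The base_url, api_key, and port are assumptions for a local deployment (4000 is LiteLLM's default proxy port); adjust them to your setup. A prompt_cache_key sent by the client is stripped by additional_drop_params before the request is forwarded to Anthropic.

from openai import OpenAI

# Assumed local proxy address and placeholder key; adjust to your deployment.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-placeholder")

resp = client.chat.completions.create(
    model="claude-sonnet-4.5",  # model_name from model_list above
    messages=[{"role": "user", "content": "Hello"}],
    # Forwarded to the proxy, then dropped there before reaching Anthropic.
    extra_body={"prompt_cache_key": "demo-key"},
)
print(resp.choices[0].message.content)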