docker-compose/ai/litellm-config.yaml
Sebastian Krüger da0dc2363a fix: disable prompt caching and responses API in litellm
- Add LITELLM_DROP_PARAMS environment variable
- Disable cache in litellm_settings
- Attempt to disable responses API endpoint
- Remove invalid supports_prompt_caching parameter

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-11 12:27:06 +01:00


model_list:
  - model_name: claude-sonnet-4
    litellm_params:
      model: anthropic/claude-sonnet-4-20250514
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-sonnet-4.5
    litellm_params:
      model: anthropic/claude-sonnet-4-5-20250929
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-opus
    litellm_params:
      model: anthropic/claude-3-opus-20240229
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-haiku
    litellm_params:
      model: anthropic/claude-3-haiku-20240307
      api_key: os.environ/ANTHROPIC_API_KEY

litellm_settings:
  drop_params: true
  set_verbose: true
  # Disable prompt caching features
  cache: false

router_settings:
  allowed_fails: 0
  # Drop unsupported parameters
  default_litellm_params:
    drop_params: true

general_settings:
  disable_responses_id_security: true
  # Explicitly disable responses API endpoint
  disable_responses_api: true
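
For context, a minimal sketch of the docker-compose service that would consume this config and the LITELLM_DROP_PARAMS variable mentioned in the commit message. The service name, image tag, mount path, and port are assumptions, not taken from the repository:

```yaml
# Hypothetical compose service wiring for litellm-config.yaml.
# Image name and default port follow LiteLLM's published Docker image;
# everything else here is an illustrative assumption.
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    environment:
      # Mirrors drop_params in litellm_settings: silently strips request
      # parameters the upstream provider does not support.
      LITELLM_DROP_PARAMS: "true"
      ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY}
    volumes:
      # Mount the config shown above into the container (path assumed).
      - ./litellm-config.yaml:/app/config.yaml:ro
    command: ["--config", "/app/config.yaml"]
    ports:
      - "4000:4000"
```

With this wiring, clients would address the proxy at port 4000 using the aliases from model_list (e.g. claude-sonnet-4.5) rather than the dated Anthropic model identifiers.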