docker-compose/ai
Sebastian Krüger cabac4b767 fix: use additional_drop_params to explicitly drop prompt_cache_key
According to the litellm docs, drop_params only drops OpenAI parameters.
Since prompt_cache_key is an Anthropic-specific parameter, we need
to use additional_drop_params to explicitly drop it.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-11 12:33:10 +01:00
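
For context, litellm's proxy config accepts an additional_drop_params list per model under litellm_params, alongside drop_params. A minimal sketch of what such a change could look like; the config filename, model alias, and model id are placeholders, not taken from this repo:

```yaml
# e.g. docker-compose/ai/litellm-config.yaml (path assumed)
model_list:
  - model_name: claude-sonnet                        # alias is a placeholder
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022    # model id is a placeholder
      api_key: os.environ/ANTHROPIC_API_KEY
      drop_params: true                              # drops unsupported OpenAI params only
      # prompt_cache_key is not covered by drop_params, so list it explicitly:
      additional_drop_params: ["prompt_cache_key"]
```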