According to the litellm docs, `drop_params` only drops OpenAI parameters. Since `prompt_cache_key` is an Anthropic-specific parameter, it must be dropped explicitly via `additional_drop_params`.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
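A minimal sketch of the resulting call configuration (the model id and message are placeholders for illustration, not part of this change):

```python
# drop_params=True only strips OpenAI parameters the provider rejects, so the
# Anthropic-specific prompt_cache_key has to be named in additional_drop_params.
completion_kwargs = {
    "model": "anthropic/claude-3-5-sonnet-20241022",  # assumed model id
    "messages": [{"role": "user", "content": "hello"}],
    "drop_params": True,  # drops unsupported OpenAI params only
    "additional_drop_params": ["prompt_cache_key"],  # drop this one explicitly
}
# Passed through as: litellm.completion(**completion_kwargs)
```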