feat: enable Redis caching for LiteLLM
Configure LiteLLM to use existing Redis from core stack for caching:

- Enabled cache with Redis backend
- Set TTL to 1 hour for cached responses
- Uses core_redis container on default port

This will improve performance by caching API responses.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
@@ -30,8 +30,13 @@ model_list:
 litellm_settings:
   drop_params: true
   set_verbose: false # Disable verbose logging for better performance
-  # Disable LiteLLM caching (prompt caching at API level is separate)
-  cache: false
+  # Enable caching with Redis for better performance
+  cache: true
+  cache_params:
+    type: redis
+    host: redis
+    port: 6379
+    ttl: 3600 # Cache for 1 hour
   # Force strip specific parameters globally
   allowed_fails: 0
   # Modify params before sending to provider
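A quick way to confirm the cache is actually serving hits is to send an identical request to the proxy twice and compare latencies, then check that keys have appeared in Redis. The sketch below is a minimal check under assumptions, not part of this change: the proxy URL (http://localhost:4000), the API key, the model name, and a locally published Redis port are all placeholders to adjust for your deployment.

```python
"""Rough check that the LiteLLM Redis cache is serving repeat requests.

Assumptions (adjust for your deployment):
  - proxy reachable at http://localhost:4000
  - API key "sk-1234" is valid against the proxy
  - a model named "gpt-4o-mini" exists in model_list
  - Redis from the core stack is reachable on localhost:6379
"""
import time
import requests

BASE_URL = "http://localhost:4000"   # assumed proxy address
API_KEY = "sk-1234"                  # assumed key
HEADERS = {"Authorization": f"Bearer {API_KEY}"}
PAYLOAD = {
    "model": "gpt-4o-mini",          # assumed model name from model_list
    "messages": [{"role": "user", "content": "Say hello."}],
}

for attempt in (1, 2):
    start = time.monotonic()
    resp = requests.post(f"{BASE_URL}/v1/chat/completions", json=PAYLOAD, headers=HEADERS)
    resp.raise_for_status()
    # With cache: true, the second identical request should return noticeably
    # faster because the response is served from Redis instead of the provider.
    print(f"attempt {attempt}: {time.monotonic() - start:.2f}s")

# Optional: confirm cache entries landed in Redis (requires `pip install redis`).
import redis

r = redis.Redis(host="localhost", port=6379)  # assumed: Redis port published locally
print(f"{r.dbsize()} keys in Redis")          # should grow after cached requests
```

With `ttl: 3600`, cached entries expire after an hour, so only identical requests made within that window are served from Redis.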