- Added LiteLLM as a built-in model provider in model_provider_info.rs (see the sketch below):
  - Default base_url: http://localhost:4000/v1 (configurable via LITELLM_BASE_URL)
  - Uses the Chat wire API (OpenAI-compatible)
  - Requires the LITELLM_API_KEY environment variable
  - No OpenAI auth required (plain bearer token)
  - Positioned as the first provider in the list
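A minimal, self-contained sketch of the new entry. The struct is trimmed down for illustration, and the field names (base_url, env_key, env_key_instructions, wire_api, requires_openai_auth) are assumed to mirror the existing entries in model_provider_info.rs, not copied from the diff:

```rust
// Trimmed-down sketch; the real ModelProviderInfo carries more
// fields (retry/timeout settings, HTTP headers, query params, ...).
#[derive(Debug, Clone)]
pub enum WireApi {
    Chat, // OpenAI-compatible chat completions
}

#[derive(Debug, Clone)]
pub struct ModelProviderInfo {
    pub name: String,
    pub base_url: String,
    pub env_key: Option<String>,
    pub env_key_instructions: Option<String>,
    pub wire_api: WireApi,
    pub requires_openai_auth: bool,
}

pub fn litellm_provider() -> ModelProviderInfo {
    ModelProviderInfo {
        name: "LiteLLM".to_string(),
        // Default proxy address; overridable via LITELLM_BASE_URL
        // (resolution sketch follows the configuration bullets).
        base_url: "http://localhost:4000/v1".to_string(),
        // Sent as a plain bearer token on every request.
        env_key: Some("LITELLM_API_KEY".to_string()),
        env_key_instructions: Some(
            "Create a key on your LiteLLM proxy; see the LiteLLM docs.".to_string(),
        ),
        wire_api: WireApi::Chat,
        // No ChatGPT/OpenAI login flow is involved.
        requires_openai_auth: false,
    }
}
```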
- Updated default models to use LiteLLM's provider/model format (see below):
  - Changed from "gpt-5-codex" to "anthropic/claude-sonnet-4-20250514"
  - Updated all default model constants (OPENAI_DEFAULT_MODEL, etc.)
  - Uses the provider/model format that LiteLLM routes on
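Assumed shape of the updated constant (only OPENAI_DEFAULT_MODEL is named above; the others follow the same pattern), plus a hypothetical helper, not part of the change, showing how a provider/model name splits for LiteLLM routing:

```rust
// Assumed declaration; the constant name comes from the bullet above.
pub const OPENAI_DEFAULT_MODEL: &str = "anthropic/claude-sonnet-4-20250514";

/// Hypothetical helper illustrating LiteLLM's "provider/model" format:
/// the prefix selects the upstream provider, the rest names the model.
pub fn split_provider_model(model: &str) -> (Option<&str>, &str) {
    match model.split_once('/') {
        Some((provider, name)) => (Some(provider), name),
        None => (None, model), // bare model name, no provider prefix
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn default_model_has_provider_prefix() {
        let (provider, name) = split_provider_model(OPENAI_DEFAULT_MODEL);
        assert_eq!(provider, Some("anthropic"));
        assert_eq!(name, "claude-sonnet-4-20250514");
    }
}
```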
- Provider configuration:
  - Supports a base_url override via the LITELLM_BASE_URL environment variable (sketch below)
  - Includes helpful env_key_instructions pointing to the LiteLLM docs
  - Uses the standard retry/timeout defaults
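How the override could resolve at startup; a sketch assuming the provider reads LITELLM_BASE_URL and falls back to the built-in default (the constant and function names here are invented for illustration):

```rust
use std::env;

const LITELLM_DEFAULT_BASE_URL: &str = "http://localhost:4000/v1";

/// Resolve the proxy address: LITELLM_BASE_URL wins when set and
/// non-empty; otherwise fall back to the built-in default.
fn litellm_base_url() -> String {
    env::var("LITELLM_BASE_URL")
        .ok()
        .filter(|v| !v.is_empty())
        .unwrap_or_else(|| LITELLM_DEFAULT_BASE_URL.to_string())
}

fn main() {
    // Prints http://localhost:4000/v1 unless overridden, e.g.:
    //   LITELLM_BASE_URL=https://litellm.internal.example/v1 cargo run
    println!("LiteLLM base_url: {}", litellm_base_url());
}
```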
This makes LLMX work out of the box with a LiteLLM proxy, exposing
multiple providers (Anthropic, OpenAI, etc.) through a single
OpenAI-compatible interface.