feat: honor OPENAI_BASE_URL for the built-in openai provider (#1487)

Some users have proxies or other setups where they are ultimately
hitting OpenAI endpoints, but need a custom `base_url` rather than the
default value of `"https://api.openai.com/v1"`. This PR makes it
possible to override the `base_url` for the `openai` provider via the
`OPENAI_BASE_URL` environment variable.
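As a minimal sketch of the new behavior, the override is just an environment variable set before launching the CLI; the proxy URL below is a placeholder, not a real endpoint:

```shell
# Point the built-in openai provider at a custom endpoint
# instead of the default "https://api.openai.com/v1".
export OPENAI_BASE_URL="https://openai-proxy.example.com/v1"
```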
Author: Michael Bolin
Date: 2025-07-08 12:39:52 -07:00 (committed via GitHub)
Parent: cc58f1086d
Commit: 8d35ad0ef7
2 changed files with 11 additions and 3 deletions


@@ -94,15 +94,15 @@ env_http_headers = { "X-Example-Features": "EXAMPLE_FEATURES" }
## model_provider
Identifies which provider to use from the `model_providers` map. Defaults to `"openai"`. You can override the `base_url` for the built-in `openai` provider via the `OPENAI_BASE_URL` environment variable.
Note that if you override `model_provider`, then you likely want to override
`model` as well. For example, if you are running Ollama with Mistral locally,
then you would need to add the following to your config in addition to the new entry in the `model_providers` map:
```toml
model = "mistral"
model_provider = "ollama"
```
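For completeness, the corresponding `model_providers` entry might look like the following sketch. The keys mirror the provider schema shown earlier (e.g. `base_url`); the table name `ollama`, the display `name`, and the URL (Ollama's default local port with its OpenAI-compatible `/v1` path) are illustrative assumptions, not part of this change:

```toml
# Hypothetical provider entry for a local Ollama server.
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
```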
## approval_policy