2025-04-20 23:59:34 -04:00

export const providers: Record<
  string,
  { name: string; baseURL: string; envKey: string }
> = {
  openai: {
    name: "OpenAI",
    baseURL: "https://api.openai.com/v1",
    envKey: "OPENAI_API_KEY",
  },
  openrouter: {
    name: "OpenRouter",
    baseURL: "https://openrouter.ai/api/v1",
    envKey: "OPENROUTER_API_KEY",
  },
# Adds Azure OpenAI support (#769)
## Summary
This PR introduces support for Azure OpenAI as a provider within the
Codex CLI. Users can now configure the tool to leverage their Azure
OpenAI deployments by specifying `"azure"` as the provider in
`config.json` and setting the corresponding `AZURE_OPENAI_API_KEY` and
`AZURE_OPENAI_API_VERSION` environment variables. This functionality is
added alongside the existing provider options (OpenAI, OpenRouter,
etc.).
Related to #92
**Note:** This PR is currently in **Draft** status because tests on the
`main` branch are failing. It will be marked as ready for review once
the `main` branch is stable and tests are passing.
---
## What’s Changed
- **Configuration (`config.ts`, `providers.ts`, `README.md`):**
  - Added `"azure"` to the supported `providers` list in `providers.ts`, specifying its name, default base URL structure, and environment variable key (`AZURE_OPENAI_API_KEY`).
  - Defined the `AZURE_OPENAI_API_VERSION` environment variable in `config.ts` with a default value (`2025-03-01-preview`).
  - Updated `README.md` to:
    - Include "azure" in the list of providers.
    - Add a configuration section for Azure OpenAI, detailing the required environment variables (`AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_API_VERSION`) with examples.
- **Client Instantiation (`terminal-chat.tsx`, `singlepass-cli-app.tsx`, `agent-loop.ts`, `compact-summary.ts`, `model-utils.ts`):**
  - Modified the components and utility functions where the OpenAI client is initialized.
  - Added conditional logic to check whether the configured `provider` is `"azure"`.
  - If the provider is Azure, the `AzureOpenAI` client from the `openai` package is instantiated, using the configured `baseURL`, `apiKey` (from `AZURE_OPENAI_API_KEY`), and `apiVersion` (from `AZURE_OPENAI_API_VERSION`).
  - Otherwise, the standard `OpenAI` client is instantiated as before.
- **Dependencies:**
  - Relies on the `openai` package's built-in support for `AzureOpenAI`. No *new* external dependencies were added specifically for this Azure implementation beyond the `openai` package itself.
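
The branch described above boils down to choosing a client class and passing `apiVersion` only in the Azure case. A schematic sketch (the real code instantiates `OpenAI` or `AzureOpenAI` from the `openai` package; the `chooseClient` helper and its return shape here are illustrative only):

```typescript
// Schematic: compute which client class and constructor options each
// code path would use. The actual implementation passes these options
// to `new OpenAI(...)` or `new AzureOpenAI(...)` from the `openai` package.
type ClientChoice =
  | { client: "OpenAI"; options: { baseURL: string; apiKey: string } }
  | {
      client: "AzureOpenAI";
      options: { baseURL: string; apiKey: string; apiVersion: string };
    };

function chooseClient(
  provider: string,
  baseURL: string,
  apiKey: string,
): ClientChoice {
  if (provider === "azure") {
    // Azure additionally requires an API version; the default mirrors config.ts.
    const apiVersion =
      process.env["AZURE_OPENAI_API_VERSION"] ?? "2025-03-01-preview";
    return { client: "AzureOpenAI", options: { baseURL, apiKey, apiVersion } };
  }
  // All other providers use the standard OpenAI client as before.
  return { client: "OpenAI", options: { baseURL, apiKey } };
}
```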
---
## How to Test
*This has been tested locally and confirmed working with Azure OpenAI.*
1. **Configure `config.json`:**
Ensure your `~/.codex/config.json` (or project-specific config) includes
Azure and sets it as the active provider:
```json
{
  "providers": {
    // ... other providers
    "azure": {
      "name": "AzureOpenAI",
      "baseURL": "https://YOUR_RESOURCE_NAME.openai.azure.com", // Replace with your Azure endpoint
      "envKey": "AZURE_OPENAI_API_KEY"
    }
  },
  "provider": "azure", // Set Azure as the active provider
  "model": "o4-mini" // Use your Azure deployment name here
  // ... other config settings
}
```
2. **Set up Environment Variables:**
```bash
# Set the API Key for your Azure OpenAI resource
export AZURE_OPENAI_API_KEY="your-azure-api-key-here"
# Set the API Version (optional - defaults to `2025-03-01-preview` if not set)
# Ensure this version is supported by your Azure deployment and endpoint
export AZURE_OPENAI_API_VERSION="2025-03-01-preview"
```
3. **Get the Codex CLI by building from this PR branch:**
Clone your fork, check out this branch (`feat/azure-openai`), navigate to `codex-cli`, and build:
```bash
# cd /path/to/your/fork/codex
git checkout feat/azure-openai # Or your branch name
cd codex-cli
corepack enable
pnpm install
pnpm build
```
4. **Invoke Codex:**
Run the locally built CLI using `node` from the `codex-cli` directory:
```bash
node ./dist/cli.js "Explain the purpose of this PR"
```
*(Alternatively, if you ran `pnpm link` after building, you can use
`codex "Explain the purpose of this PR"` from anywhere)*.
5. **Verify:** Confirm that the command executes successfully and
interacts with your configured Azure OpenAI deployment.
---
## Tests
- [x] Tested locally against an Azure OpenAI deployment using API Key
authentication. Basic commands and interactions confirmed working.
---
## Checklist
- [x] Added Azure provider details to configuration files
(`providers.ts`, `config.ts`).
- [x] Implemented conditional `AzureOpenAI` client initialization based
on provider setting.
- [x] Ensured `apiVersion` is passed correctly to the Azure client.
- [x] Updated `README.md` with Azure OpenAI setup instructions.
- [x] Manually tested core functionality against a live Azure OpenAI
endpoint.
- [ ] Add/update automated tests for the Azure code path (pending `main` stability).
cc @theabhinavdas @nikodem-wrona @fouad-openai @tibo-openai (adjust as
needed)
---
I have read the CLA Document and I hereby sign the CLA
2025-05-09 18:11:32 -07:00

  azure: {
    name: "AzureOpenAI",
    baseURL: "https://YOUR_PROJECT_NAME.openai.azure.com/openai",
    envKey: "AZURE_OPENAI_API_KEY",
  },
2025-04-20 23:59:34 -04:00

  gemini: {
    name: "Gemini",
    baseURL: "https://generativelanguage.googleapis.com/v1beta/openai",
    envKey: "GEMINI_API_KEY",
  },
  ollama: {
    name: "Ollama",
    baseURL: "http://localhost:11434/v1",
    envKey: "OLLAMA_API_KEY",
  },
  mistral: {
    name: "Mistral",
    baseURL: "https://api.mistral.ai/v1",
    envKey: "MISTRAL_API_KEY",
  },
  deepseek: {
    name: "DeepSeek",
    baseURL: "https://api.deepseek.com",
    envKey: "DEEPSEEK_API_KEY",
  },
  xai: {
    name: "xAI",
    baseURL: "https://api.x.ai/v1",
    envKey: "XAI_API_KEY",
  },
  groq: {
    name: "Groq",
    baseURL: "https://api.groq.com/openai/v1",
    envKey: "GROQ_API_KEY",
  },
};
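
Each entry's `envKey` names the environment variable that holds the credential for that provider, so callers can resolve API keys without provider-specific code. A hypothetical lookup helper (not part of this file) sketches the pattern, using a trimmed copy of the record above so the example is self-contained:

```typescript
// Trimmed copy of the providers record above, for a self-contained example.
const providers: Record<
  string,
  { name: string; baseURL: string; envKey: string }
> = {
  openai: {
    name: "OpenAI",
    baseURL: "https://api.openai.com/v1",
    envKey: "OPENAI_API_KEY",
  },
  azure: {
    name: "AzureOpenAI",
    baseURL: "https://YOUR_PROJECT_NAME.openai.azure.com/openai",
    envKey: "AZURE_OPENAI_API_KEY",
  },
};

// Hypothetical helper: read the API key named by the provider's envKey.
function getApiKey(provider: string): string | undefined {
  const info = providers[provider];
  return info ? process.env[info.envKey] : undefined;
}
```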