# Adds Azure OpenAI support (#769)
## Summary

This PR introduces support for Azure OpenAI as a provider within the Codex CLI. Users can now configure the tool to leverage their Azure OpenAI deployments by specifying `"azure"` as the provider in `config.json` and setting the corresponding `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_API_VERSION` environment variables. This functionality is added alongside the existing provider options (OpenAI, OpenRouter, etc.).

Related to #92

**Note:** This PR is currently in **Draft** status because tests on the `main` branch are failing. It will be marked as ready for review once the `main` branch is stable and tests are passing.

---

## What’s Changed

- **Configuration (`config.ts`, `providers.ts`, `README.md`):**
  - Added `"azure"` to the supported `providers` list in `providers.ts`, specifying its name, default base URL structure, and environment variable key (`AZURE_OPENAI_API_KEY`).
  - Defined the `AZURE_OPENAI_API_VERSION` environment variable in `config.ts` with a default value (`2025-03-01-preview`).
  - Updated `README.md` to:
    - Include "azure" in the list of providers.
    - Add a configuration section for Azure OpenAI, detailing the required environment variables (`AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_API_VERSION`) with examples.
- **Client Instantiation (`terminal-chat.tsx`, `singlepass-cli-app.tsx`, `agent-loop.ts`, `compact-summary.ts`, `model-utils.ts`):**
  - Modified various components and utility functions where the OpenAI client is initialized.
  - Added conditional logic to check whether the configured `provider` is `"azure"`.
  - If the provider is Azure, the `AzureOpenAI` client from the `openai` package is instantiated, using the configured `baseURL`, `apiKey` (from `AZURE_OPENAI_API_KEY`), and `apiVersion` (from `AZURE_OPENAI_API_VERSION`).
  - Otherwise, the standard `OpenAI` client is instantiated as before.
- **Dependencies:**
  - Relies on the `openai` package's built-in support for `AzureOpenAI`.
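The conditional client instantiation described above can be sketched as a small selection function, independent of the `openai` package. All names below (`ClientOptions`, `selectClientOptions`) are illustrative, not the PR's actual identifiers; the real code constructs `OpenAI`/`AzureOpenAI` client instances directly.

```typescript
// Illustrative sketch of the provider-based client selection described above.
// `ClientOptions` and `selectClientOptions` are hypothetical names; the PR's
// code returns OpenAI or AzureOpenAI instances rather than plain options.

type ClientOptions = {
  kind: "azure" | "openai";
  apiKey: string;
  baseURL: string;
  apiVersion?: string; // only meaningful for Azure
};

function selectClientOptions(
  provider: string,
  env: Record<string, string | undefined>,
): ClientOptions {
  if (provider.toLowerCase() === "azure") {
    return {
      kind: "azure",
      apiKey: env["AZURE_OPENAI_API_KEY"] ?? "",
      // Placeholder endpoint; in the PR this comes from the provider config.
      baseURL: "https://YOUR_RESOURCE_NAME.openai.azure.com/openai",
      // Falls back to the default version defined in config.ts.
      apiVersion: env["AZURE_OPENAI_API_VERSION"] ?? "2025-03-01-preview",
    };
  }
  return {
    kind: "openai",
    apiKey: env["OPENAI_API_KEY"] ?? "",
    baseURL: "https://api.openai.com/v1",
  };
}
```

The key point the sketch captures is that the branch is taken purely on the configured provider string, so every call site that previously built an `OpenAI` client can delegate to one shared factory.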
No *new* external dependencies were added specifically for this Azure implementation beyond the `openai` package itself.

---

## How to Test

*This has been tested locally and confirmed working with Azure OpenAI.*

1. **Configure `config.json`:** Ensure your `~/.codex/config.json` (or project-specific config) includes Azure and sets it as the active provider:

   ```json
   {
     "providers": {
       // ... other providers
       "azure": {
         "name": "AzureOpenAI",
         "baseURL": "https://YOUR_RESOURCE_NAME.openai.azure.com", // Replace with your Azure endpoint
         "envKey": "AZURE_OPENAI_API_KEY"
       }
     },
     "provider": "azure", // Set Azure as the active provider
     "model": "o4-mini" // Use your Azure deployment name here
     // ... other config settings
   }
   ```

2. **Set up environment variables:**

   ```bash
   # Set the API key for your Azure OpenAI resource
   export AZURE_OPENAI_API_KEY="your-azure-api-key-here"

   # Set the API version (optional - defaults to `2025-03-01-preview` if not set)
   # Ensure this version is supported by your Azure deployment and endpoint
   export AZURE_OPENAI_API_VERSION="2025-03-01-preview"
   ```

3. **Get the Codex CLI by building from this PR branch:** Clone your fork, check out this branch (`feat/azure-openai`), navigate to `codex-cli`, and build:

   ```bash
   # cd /path/to/your/fork/codex
   git checkout feat/azure-openai # Or your branch name
   cd codex-cli
   corepack enable
   pnpm install
   pnpm build
   ```

4. **Invoke Codex:** Run the locally built CLI using `node` from the `codex-cli` directory:

   ```bash
   node ./dist/cli.js "Explain the purpose of this PR"
   ```

   *(Alternatively, if you ran `pnpm link` after building, you can use `codex "Explain the purpose of this PR"` from anywhere.)*

5. **Verify:** Confirm that the command executes successfully and interacts with your configured Azure OpenAI deployment.

---

## Tests

- [x] Tested locally against an Azure OpenAI deployment using API key authentication. Basic commands and interactions confirmed working.
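As background for the endpoint and API-version values used in the test setup: Azure OpenAI addresses requests per deployment, with an explicit `api-version` query parameter. The helper below is illustrative only (the CLI delegates URL construction to the `openai` package's `AzureOpenAI` client); it just shows how the resource endpoint, deployment name, and `AZURE_OPENAI_API_VERSION` fit together.

```typescript
// Illustrative helper: builds the chat-completions URL shape the Azure OpenAI
// REST API expects. The AzureOpenAI client does this internally; this sketch
// only clarifies how baseURL, deployment name, and api-version combine.
function azureChatCompletionsUrl(
  resourceEndpoint: string, // e.g. https://YOUR_RESOURCE_NAME.openai.azure.com
  deployment: string, // your Azure deployment name, e.g. "o4-mini"
  apiVersion: string, // e.g. "2025-03-01-preview"
): string {
  const base = resourceEndpoint.replace(/\/+$/, ""); // strip trailing slashes
  return `${base}/openai/deployments/${encodeURIComponent(
    deployment,
  )}/chat/completions?api-version=${encodeURIComponent(apiVersion)}`;
}
```

If a request fails with a 404 or an "api-version" error, checking the values against this URL shape is a quick way to spot a misconfigured endpoint or an unsupported version.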
---

## Checklist

- [x] Added Azure provider details to configuration files (`providers.ts`, `config.ts`).
- [x] Implemented conditional `AzureOpenAI` client initialization based on the provider setting.
- [x] Ensured `apiVersion` is passed correctly to the Azure client.
- [x] Updated `README.md` with Azure OpenAI setup instructions.
- [x] Manually tested core functionality against a live Azure OpenAI endpoint.
- [ ] Add/update automated tests for the Azure code path (pending `main` stability).

cc @theabhinavdas @nikodem-wrona @fouad-openai @tibo-openai (adjust as needed)

---

I have read the CLA Document and I hereby sign the CLA
Committed via GitHub · commit `7795272282` · parent `78843c3940` · `README.md`: 10 lines changed

---

## Diff
### README.md

```diff
@@ -98,6 +98,7 @@ export OPENAI_API_KEY="your-api-key-here"
 >
 > - openai (default)
 > - openrouter
+> - azure
 > - gemini
 > - ollama
 > - mistral
@@ -394,6 +395,11 @@ Below is a comprehensive example of `config.json` with multiple custom providers
       "baseURL": "https://api.openai.com/v1",
       "envKey": "OPENAI_API_KEY"
     },
+    "azure": {
+      "name": "AzureOpenAI",
+      "baseURL": "https://YOUR_PROJECT_NAME.openai.azure.com/openai",
+      "envKey": "AZURE_OPENAI_API_KEY"
+    },
     "openrouter": {
       "name": "OpenRouter",
       "baseURL": "https://openrouter.ai/api/v1",
@@ -455,6 +461,10 @@ For each AI provider, you need to set the corresponding API key in your environm
 # OpenAI
 export OPENAI_API_KEY="your-api-key-here"
 
+# Azure OpenAI
+export AZURE_OPENAI_API_KEY="your-azure-api-key-here"
+export AZURE_OPENAI_API_VERSION="2025-03-01-preview" # Optional
+
 # OpenRouter
 export OPENROUTER_API_KEY="your-openrouter-key-here"
 
```
### terminal-chat.tsx

```diff
@@ -13,7 +13,7 @@ import { useTerminalSize } from "../../hooks/use-terminal-size.js";
 import { AgentLoop } from "../../utils/agent/agent-loop.js";
 import { ReviewDecision } from "../../utils/agent/review.js";
 import { generateCompactSummary } from "../../utils/compact-summary.js";
-import { getBaseUrl, getApiKey, saveConfig } from "../../utils/config.js";
+import { saveConfig } from "../../utils/config.js";
 import { extractAppliedPatches as _extractAppliedPatches } from "../../utils/extract-applied-patches.js";
 import { getGitDiff } from "../../utils/get-diff.js";
 import { createInputItem } from "../../utils/input-utils.js";
@@ -23,6 +23,7 @@ import {
   calculateContextPercentRemaining,
   uniqueById,
 } from "../../utils/model-utils.js";
+import { createOpenAIClient } from "../../utils/openai-client.js";
 import { CLI_VERSION } from "../../utils/session.js";
 import { shortCwd } from "../../utils/short-path.js";
 import { saveRollout } from "../../utils/storage/save-rollout.js";
@@ -34,7 +35,6 @@ import ModelOverlay from "../model-overlay.js";
 import chalk from "chalk";
 import { Box, Text } from "ink";
 import { spawn } from "node:child_process";
-import OpenAI from "openai";
 import React, { useEffect, useMemo, useRef, useState } from "react";
 import { inspect } from "util";
 
@@ -78,10 +78,7 @@ async function generateCommandExplanation(
 ): Promise<string> {
   try {
     // Create a temporary OpenAI client
-    const oai = new OpenAI({
-      apiKey: getApiKey(config.provider),
-      baseURL: getBaseUrl(config.provider),
-    });
+    const oai = createOpenAIClient(config);
 
     // Format the command for display
     const commandForDisplay = formatCommandForDisplay(command);
```
### singlepass-cli-app.tsx

```diff
@@ -5,13 +5,7 @@ import type { FileOperation } from "../utils/singlepass/file_ops";
 
 import Spinner from "./vendor/ink-spinner"; // Third‑party / vendor components
 import TextInput from "./vendor/ink-text-input";
-import {
-  OPENAI_TIMEOUT_MS,
-  OPENAI_ORGANIZATION,
-  OPENAI_PROJECT,
-  getBaseUrl,
-  getApiKey,
-} from "../utils/config";
+import { createOpenAIClient } from "../utils/openai-client";
 import {
   generateDiffSummary,
   generateEditSummary,
@@ -26,7 +20,6 @@ import { EditedFilesSchema } from "../utils/singlepass/file_ops";
 import * as fsSync from "fs";
 import * as fsPromises from "fs/promises";
 import { Box, Text, useApp, useInput } from "ink";
-import OpenAI from "openai";
 import { zodResponseFormat } from "openai/helpers/zod";
 import path from "path";
 import React, { useEffect, useState, useRef } from "react";
@@ -399,20 +392,7 @@ export function SinglePassApp({
         files,
       });
 
-      const headers: Record<string, string> = {};
-      if (OPENAI_ORGANIZATION) {
-        headers["OpenAI-Organization"] = OPENAI_ORGANIZATION;
-      }
-      if (OPENAI_PROJECT) {
-        headers["OpenAI-Project"] = OPENAI_PROJECT;
-      }
-
-      const openai = new OpenAI({
-        apiKey: getApiKey(config.provider),
-        baseURL: getBaseUrl(config.provider),
-        timeout: OPENAI_TIMEOUT_MS,
-        defaultHeaders: headers,
-      });
+      const openai = createOpenAIClient(config);
       const chatResp = await openai.beta.chat.completions.parse({
         model: config.model,
         ...(config.flexMode ? { service_tier: "flex" } : {}),
```
### agent-loop.ts

```diff
@@ -17,6 +17,7 @@ import {
   OPENAI_PROJECT,
   getApiKey,
   getBaseUrl,
+  AZURE_OPENAI_API_VERSION,
 } from "../config.js";
 import { log } from "../logger/log.js";
 import { parseToolCallArguments } from "../parsers.js";
@@ -31,7 +32,7 @@ import {
 import { handleExecCommand } from "./handle-exec-command.js";
 import { HttpsProxyAgent } from "https-proxy-agent";
 import { randomUUID } from "node:crypto";
-import OpenAI, { APIConnectionTimeoutError } from "openai";
+import OpenAI, { APIConnectionTimeoutError, AzureOpenAI } from "openai";
 
 // Wait time before retrying after rate limit errors (ms).
 const RATE_LIMIT_RETRY_WAIT_MS = parseInt(
@@ -322,6 +323,25 @@ export class AgentLoop {
       ...(timeoutMs !== undefined ? { timeout: timeoutMs } : {}),
     });
 
+    if (this.provider.toLowerCase() === "azure") {
+      this.oai = new AzureOpenAI({
+        apiKey,
+        baseURL,
+        apiVersion: AZURE_OPENAI_API_VERSION,
+        defaultHeaders: {
+          originator: ORIGIN,
+          version: CLI_VERSION,
+          session_id: this.sessionId,
+          ...(OPENAI_ORGANIZATION
+            ? { "OpenAI-Organization": OPENAI_ORGANIZATION }
+            : {}),
+          ...(OPENAI_PROJECT ? { "OpenAI-Project": OPENAI_PROJECT } : {}),
+        },
+        httpAgent: PROXY_URL ? new HttpsProxyAgent(PROXY_URL) : undefined,
+        ...(timeoutMs !== undefined ? { timeout: timeoutMs } : {}),
+      });
+    }
+
     setSessionId(this.sessionId);
     setCurrentModel(this.model);
 
```
### compact-summary.ts

```diff
@@ -1,12 +1,14 @@
 import type { AppConfig } from "./config.js";
 import type { ResponseItem } from "openai/resources/responses/responses.mjs";
 
-import { getBaseUrl, getApiKey } from "./config.js";
-import OpenAI from "openai";
+import { createOpenAIClient } from "./openai-client.js";
+
 /**
  * Generate a condensed summary of the conversation items.
  * @param items The list of conversation items to summarize
  * @param model The model to use for generating the summary
+ * @param flexMode Whether to use the flex-mode service tier
+ * @param config The configuration object
  * @returns A concise structured summary string
  */
 /**
@@ -23,10 +25,7 @@ export async function generateCompactSummary(
   flexMode = false,
   config: AppConfig,
 ): Promise<string> {
-  const oai = new OpenAI({
-    apiKey: getApiKey(config.provider),
-    baseURL: getBaseUrl(config.provider),
-  });
+  const oai = createOpenAIClient(config);
 
   const conversationText = items
     .filter(
```
### config.ts

```diff
@@ -68,6 +68,9 @@ export const OPENAI_TIMEOUT_MS =
 export const OPENAI_BASE_URL = process.env["OPENAI_BASE_URL"] || "";
 export let OPENAI_API_KEY = process.env["OPENAI_API_KEY"] || "";
 
+export const AZURE_OPENAI_API_VERSION =
+  process.env["AZURE_OPENAI_API_VERSION"] || "2025-03-01-preview";
+
 export const DEFAULT_REASONING_EFFORT = "high";
 export const OPENAI_ORGANIZATION = process.env["OPENAI_ORGANIZATION"] || "";
 export const OPENAI_PROJECT = process.env["OPENAI_PROJECT"] || "";
```
### model-utils.ts

```diff
@@ -1,14 +1,9 @@
 import type { ResponseItem } from "openai/resources/responses/responses.mjs";
 
 import { approximateTokensUsed } from "./approximate-tokens-used.js";
-import {
-  OPENAI_ORGANIZATION,
-  OPENAI_PROJECT,
-  getBaseUrl,
-  getApiKey,
-} from "./config";
+import { getApiKey } from "./config.js";
 import { type SupportedModelId, openAiModelInfo } from "./model-info.js";
-import OpenAI from "openai";
+import { createOpenAIClient } from "./openai-client.js";
 
 const MODEL_LIST_TIMEOUT_MS = 2_000; // 2 seconds
 export const RECOMMENDED_MODELS: Array<string> = ["o4-mini", "o3"];
@@ -27,19 +22,7 @@ async function fetchModels(provider: string): Promise<Array<string>> {
   }
 
   try {
-    const headers: Record<string, string> = {};
-    if (OPENAI_ORGANIZATION) {
-      headers["OpenAI-Organization"] = OPENAI_ORGANIZATION;
-    }
-    if (OPENAI_PROJECT) {
-      headers["OpenAI-Project"] = OPENAI_PROJECT;
-    }
-
-    const openai = new OpenAI({
-      apiKey: getApiKey(provider),
-      baseURL: getBaseUrl(provider),
-      defaultHeaders: headers,
-    });
+    const openai = createOpenAIClient({ provider });
     const list = await openai.models.list();
     const models: Array<string> = [];
     for await (const model of list as AsyncIterable<{ id?: string }>) {
```
### codex-cli/src/utils/openai-client.ts (new file, +51)

```diff
@@ -0,0 +1,51 @@
+import type { AppConfig } from "./config.js";
+
+import {
+  getBaseUrl,
+  getApiKey,
+  AZURE_OPENAI_API_VERSION,
+  OPENAI_TIMEOUT_MS,
+  OPENAI_ORGANIZATION,
+  OPENAI_PROJECT,
+} from "./config.js";
+import OpenAI, { AzureOpenAI } from "openai";
+
+type OpenAIClientConfig = {
+  provider: string;
+};
+
+/**
+ * Creates an OpenAI client instance based on the provided configuration.
+ * Handles both standard OpenAI and Azure OpenAI configurations.
+ *
+ * @param config The configuration containing provider information
+ * @returns An instance of either OpenAI or AzureOpenAI client
+ */
+export function createOpenAIClient(
+  config: OpenAIClientConfig | AppConfig,
+): OpenAI | AzureOpenAI {
+  const headers: Record<string, string> = {};
+  if (OPENAI_ORGANIZATION) {
+    headers["OpenAI-Organization"] = OPENAI_ORGANIZATION;
+  }
+  if (OPENAI_PROJECT) {
+    headers["OpenAI-Project"] = OPENAI_PROJECT;
+  }
+
+  if (config.provider?.toLowerCase() === "azure") {
+    return new AzureOpenAI({
+      apiKey: getApiKey(config.provider),
+      baseURL: getBaseUrl(config.provider),
+      apiVersion: AZURE_OPENAI_API_VERSION,
+      timeout: OPENAI_TIMEOUT_MS,
+      defaultHeaders: headers,
+    });
+  }
+
+  return new OpenAI({
+    apiKey: getApiKey(config.provider),
+    baseURL: getBaseUrl(config.provider),
+    timeout: OPENAI_TIMEOUT_MS,
+    defaultHeaders: headers,
+  });
+}
```
### providers.ts

```diff
@@ -12,6 +12,11 @@ export const providers: Record<
     baseURL: "https://openrouter.ai/api/v1",
     envKey: "OPENROUTER_API_KEY",
   },
+  azure: {
+    name: "AzureOpenAI",
+    baseURL: "https://YOUR_PROJECT_NAME.openai.azure.com/openai",
+    envKey: "AZURE_OPENAI_API_KEY",
+  },
   gemini: {
     name: "Gemini",
     baseURL: "https://generativelanguage.googleapis.com/v1beta/openai",
```
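For context, the `envKey` field in a provider entry is an indirection: the CLI reads the API key from the environment variable that `envKey` names. A minimal sketch of that lookup (`resolveApiKey` is a hypothetical helper name, not the PR's `getApiKey` implementation):

```typescript
// Minimal sketch of envKey-based API key resolution, assuming a provider
// table shaped like the one in providers.ts. `resolveApiKey` is hypothetical.
type ProviderInfo = { name: string; baseURL: string; envKey: string };

const providerTable: Record<string, ProviderInfo> = {
  azure: {
    name: "AzureOpenAI",
    baseURL: "https://YOUR_PROJECT_NAME.openai.azure.com/openai",
    envKey: "AZURE_OPENAI_API_KEY",
  },
};

function resolveApiKey(
  provider: string,
  env: Record<string, string | undefined>,
): string | undefined {
  const info = providerTable[provider.toLowerCase()];
  // Look up the environment variable named by envKey, if the provider exists.
  return info ? env[info.envKey] : undefined;
}
```

This is why step 2 of the test instructions only exports `AZURE_OPENAI_API_KEY`: the provider entry, not the code, decides which variable holds the key.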