//! Registry of model providers supported by Codex.
//!
//! Providers can be defined in two places:
//! 1. Built-in defaults compiled into the binary so Codex works out-of-the-box.
//! 2. User-defined entries inside `~/.codex/config.toml` under the `model_providers`
//!    key. These override or extend the defaults at runtime.
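//!
//! As an illustration, a hypothetical user-defined entry might look like the
//! following (the provider id, name, and URL here are made up):
//!
//! ```toml
//! [model_providers.acme]
//! name = "Acme"
//! base_url = "https://api.acme.example.com/v1"
//! env_key = "ACME_API_KEY"
//! wire_api = "chat"
//! ```
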
use serde::Deserialize;
use serde::Serialize;
use std::collections::HashMap;
use std::env::VarError;
use crate::error::EnvVarError;
use crate::openai_api_key::get_openai_api_key;
/// Wire protocol that the provider speaks. Most third-party services only
/// implement the classic OpenAI Chat Completions JSON schema, whereas OpenAI
/// itself (and a handful of others) additionally expose the more modern
/// *Responses* API. The two protocols use different request/response shapes
/// and *cannot* be auto-detected at runtime, so each provider entry must
/// declare which one it expects.
#[derive(Debug, Clone, Copy, Default, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum WireApi {
    /// The experimental “Responses” API exposed by OpenAI at `/v1/responses`.
    #[default]
    Responses,
    /// Regular Chat Completions compatible with `/v1/chat/completions`.
    Chat,
}
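
// The `rename_all = "lowercase"` attribute above is what lets a
// `wire_api = "chat"` line in `config.toml` deserialize into `WireApi::Chat`.
// A minimal sketch of that round-trip, assuming `serde_json` is available as
// a dev-dependency (any serde-compatible format behaves the same way):
#[cfg(test)]
mod wire_api_serde_sketch {
    use super::WireApi;

    #[test]
    fn round_trips_lowercase_variant_names() {
        let chat: WireApi = serde_json::from_str("\"chat\"").unwrap();
        assert_eq!(chat, WireApi::Chat);
        let serialized = serde_json::to_string(&WireApi::Responses).unwrap();
        assert_eq!(serialized, "\"responses\"");
    }
}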
/// Serializable representation of a provider definition.
#[derive(Debug, Clone, Deserialize, Serialize, PartialEq)]
pub struct ModelProviderInfo {
    /// Friendly display name.
    pub name: String,
    /// Base URL for the provider's OpenAI-compatible API.
    pub base_url: String,
    /// Environment variable that stores the user's API key for this provider.
    pub env_key: Option<String>,

    /// Optional instructions to help the user get a valid value for the
    /// variable and set it.
    pub env_key_instructions: Option<String>,

    /// Which wire protocol this provider expects.
    pub wire_api: WireApi,
}
impl ModelProviderInfo {
    /// If `env_key` is `Some`, returns the API key for this provider if it is
    /// present (and non-empty) in the environment. If the key is required but
    /// cannot be found, returns an error.
    pub fn api_key(&self) -> crate::error::Result<Option<String>> {
        match &self.env_key {
            Some(env_key) => {
                let env_value = if env_key == crate::openai_api_key::OPENAI_API_KEY_ENV_VAR {
                    get_openai_api_key().map_or_else(|| Err(VarError::NotPresent), Ok)
                } else {
                    std::env::var(env_key)
                };
                env_value
                    .and_then(|v| {
                        if v.trim().is_empty() {
                            Err(VarError::NotPresent)
                        } else {
                            Ok(Some(v))
                        }
                    })
                    .map_err(|_| {
                        crate::error::CodexErr::EnvVar(EnvVarError {
                            var: env_key.clone(),
                            instructions: self.env_key_instructions.clone(),
                        })
                    })
            }
            None => Ok(None),
        }
    }
}
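
// A small sketch of the `api_key` contract documented above: providers with
// no `env_key` yield `Ok(None)`, while a required-but-missing variable is an
// error. `CODEX_TEST_SURELY_UNSET_VAR` is a made-up name chosen so the test
// neither depends on nor mutates the real environment.
#[cfg(test)]
mod api_key_sketch {
    use super::ModelProviderInfo;
    use super::WireApi;

    fn provider(env_key: Option<String>) -> ModelProviderInfo {
        ModelProviderInfo {
            name: "Test".into(),
            base_url: "https://example.com/v1".into(),
            env_key,
            env_key_instructions: None,
            wire_api: WireApi::Chat,
        }
    }

    #[test]
    fn no_env_key_means_no_api_key() {
        assert!(matches!(provider(None).api_key(), Ok(None)));
    }

    #[test]
    fn missing_env_var_is_an_error() {
        let p = provider(Some("CODEX_TEST_SURELY_UNSET_VAR".into()));
        assert!(p.api_key().is_err());
    }
}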
/// Built-in default provider list.
pub fn built_in_model_providers() -> HashMap<String, ModelProviderInfo> {
    use ModelProviderInfo as P;

    // We do not want to be in the business of adjudicating which third-party
    // providers are bundled with Codex CLI, so we only include the OpenAI
    // provider by default. Users are encouraged to extend `model_providers`
    // in config.toml with their own providers.
    [
        (
            "openai",
            P {
                name: "OpenAI".into(),
                base_url: "https://api.openai.com/v1".into(),
                env_key: Some("OPENAI_API_KEY".into()),
                env_key_instructions: Some("Create an API key (https://platform.openai.com) and export it as an environment variable.".into()),
                wire_api: WireApi::Responses,
            },
        ),
    ]
    .into_iter()
    .map(|(k, v)| (k.to_string(), v))
    .collect()
}
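
// Sketch of the policy stated in the comment inside
// `built_in_model_providers`: only the OpenAI provider ships by default, and
// it speaks the Responses wire protocol.
#[cfg(test)]
mod built_in_providers_sketch {
    use super::WireApi;
    use super::built_in_model_providers;

    #[test]
    fn only_openai_is_bundled_by_default() {
        let providers = built_in_model_providers();
        assert_eq!(providers.len(), 1);
        let openai = providers.get("openai").expect("openai should be built in");
        assert_eq!(openai.name, "OpenAI");
        assert_eq!(openai.wire_api, WireApi::Responses);
    }
}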