feat: add support for login with ChatGPT (#1212)

This does not implement the full Login with ChatGPT experience, but it
should unblock people.

**What works**

* The `codex` multitool now has a `login` subcommand, so you can run
`codex login`, which should write `CODEX_HOME/auth.json` if you complete
the flow successfully. The TUI will now read `OPENAI_API_KEY` from
`auth.json` (see the sketch after this list).
* The TUI should refresh the token if it has expired and the necessary
information is in `auth.json`.
* There is a `LoginScreen` in the TUI that tells you to run `codex
login` if both (1) your model provider expects to use `OPENAI_API_KEY`
as its env var, and (2) `OPENAI_API_KEY` is not set.
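
As a rough illustration of the lookup involved, here is a minimal
sketch, assuming `auth.json` is a JSON object that stores the key under
an `"OPENAI_API_KEY"` field (the actual schema is defined by the code in
this PR, and this sketch additionally assumes the `serde_json` crate):

```rust
use std::fs;
use std::path::Path;

/// Hypothetical sketch: pull OPENAI_API_KEY out of CODEX_HOME/auth.json.
/// The field name and file layout are assumptions, not the PR's exact schema.
fn read_api_key_from_auth_json(codex_home: &Path) -> Option<String> {
    let contents = fs::read_to_string(codex_home.join("auth.json")).ok()?;
    let json: serde_json::Value = serde_json::from_str(&contents).ok()?;
    json.get("OPENAI_API_KEY")?.as_str().map(str::to_owned)
}
```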

**What does not work**

* The `LoginScreen` does not support the login flow from within the TUI.
Instead, it tells you to quit, run `codex login`, and then run `codex`
again.
* `codex exec` does not read from `auth.json` yet, nor does it direct
the user to go through the login flow if `OPENAI_API_KEY` is not found.
* The `maybeRedeemCredits()` function from `get-api-key.tsx` has not
been ported from TypeScript to `login_with_chatgpt.py` yet; see
`codex-cli/src/utils/get-api-key.tsx` (L84-L89) at commit `a67a67f325`.

**Implementation**

Currently, the OAuth flow requires running a local webserver on
`127.0.0.1:1455`. It seemed wasteful to incur the additional binary cost
of a webserver dependency in the Rust CLI just to support login, so
instead we implement this logic in Python, as Python has a `http.server`
module as part of its standard library. Specifically, we bundle the
contents of a single Python file as a string in the Rust CLI and then
use it to spawn a subprocess as `python3 -c
{{SOURCE_FOR_PYTHON_SERVER}}`.
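
In practice this means the Python source can be embedded at compile
time and handed to the interpreter at runtime. A minimal sketch of that
wiring (the actual code in `codex-rs/login/src/lib.rs` may differ in
details such as error handling and environment setup):

```rust
use std::process::Command;

// Embed the Python login server in the Rust binary at compile time.
const SOURCE_FOR_PYTHON_SERVER: &str = include_str!("login_with_chatgpt.py");

/// Sketch: run the bundled script as `python3 -c <source>` and block
/// until the OAuth flow completes and the subprocess exits.
fn run_login_flow() -> std::io::Result<std::process::ExitStatus> {
    Command::new("python3")
        .arg("-c")
        .arg(SOURCE_FOR_PYTHON_SERVER)
        .status()
}
```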

As such, the most significant files in this PR are:

```
codex-rs/login/src/login_with_chatgpt.py
codex-rs/login/src/lib.rs
```

Now that the CLI may load `OPENAI_API_KEY` from the environment _or_
`CODEX_HOME/auth.json`, we need a new abstraction for reading/writing
this variable, so we introduce:

```
codex-rs/core/src/openai_api_key.rs
```

Note that `std::env::set_var()` is [rightfully] `unsafe` in Rust 2024,
so we store `OPENAI_API_KEY` in a `LazyLock<RwLock<Option<String>>>`,
which lets it be read and updated in a thread-safe manner.
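
To callers, the module behaves like a process-wide, thread-safe cell. A
sketch of hypothetical call sites (the `codex_core` crate path is an
assumption):

```rust
use codex_core::openai_api_key::{get_openai_api_key, set_openai_api_key};

fn main() {
    // After a successful login, cache the freshly issued key...
    set_openai_api_key("sk-example".to_string());
    // ...and any later reader (e.g. `ModelProviderInfo::api_key`) observes it.
    assert_eq!(get_openai_api_key().as_deref(), Some("sk-example"));
}
```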

Ultimately, it should be possible to go through the entire login flow
from the TUI. For now, this PR introduces a placeholder `LoginScreen`
for that flow, and the new `codex login` subcommand should be a viable
workaround until the full UI is ready.

**Testing**

Because the login flow is currently implemented in a standalone Python
file, you can test it without building any Rust code as follows:

```
rm -rf /tmp/codex_home && mkdir /tmp/codex_home
CODEX_HOME=/tmp/codex_home python3 codex-rs/login/src/login_with_chatgpt.py
```

For reference:

* the original TypeScript implementation was introduced in
https://github.com/openai/codex/pull/963
* support for redeeming credits was later added in
https://github.com/openai/codex/pull/974

**Selected diffs**

In `codex-rs/core/src/lib.rs`, the new module is exported:

```diff
@@ -27,6 +27,7 @@ mod model_provider_info;
 pub use model_provider_info::ModelProviderInfo;
 pub use model_provider_info::WireApi;
 mod models;
+pub mod openai_api_key;
 mod openai_tools;
 mod project_doc;
 pub mod protocol;
```

In `codex-rs/core/src/model_provider_info.rs`, `api_key()` now consults
the shared `OPENAI_API_KEY` value before falling back to the
environment:

```diff
@@ -11,6 +11,7 @@
 use std::collections::HashMap;
 use std::env::VarError;
 use crate::error::EnvVarError;
+use crate::openai_api_key::get_openai_api_key;
 /// Wire protocol that the provider speaks. Most third-party services only
 /// implement the classic OpenAI Chat Completions JSON schema, whereas OpenAI
@@ -52,20 +53,27 @@
     /// cannot be found, returns an error.
     pub fn api_key(&self) -> crate::error::Result<Option<String>> {
         match &self.env_key {
-            Some(env_key) => std::env::var(env_key)
-                .and_then(|v| {
-                    if v.trim().is_empty() {
-                        Err(VarError::NotPresent)
-                    } else {
-                        Ok(Some(v))
-                    }
-                })
-                .map_err(|_| {
-                    crate::error::CodexErr::EnvVar(EnvVarError {
-                        var: env_key.clone(),
-                        instructions: self.env_key_instructions.clone(),
-                    })
-                }),
+            Some(env_key) => {
+                let env_value = if env_key == crate::openai_api_key::OPENAI_API_KEY_ENV_VAR {
+                    get_openai_api_key().map_or_else(|| Err(VarError::NotPresent), Ok)
+                } else {
+                    std::env::var(env_key)
+                };
+                env_value
+                    .and_then(|v| {
+                        if v.trim().is_empty() {
+                            Err(VarError::NotPresent)
+                        } else {
+                            Ok(Some(v))
+                        }
+                    })
+                    .map_err(|_| {
+                        crate::error::CodexErr::EnvVar(EnvVarError {
+                            var: env_key.clone(),
+                            instructions: self.env_key_instructions.clone(),
+                        })
+                    })
+            }
             None => Ok(None),
         }
     }
```

And `codex-rs/core/src/openai_api_key.rs` is new:

```diff
@@ -0,0 +1,24 @@
+use std::env;
+use std::sync::LazyLock;
+use std::sync::RwLock;
+
+pub const OPENAI_API_KEY_ENV_VAR: &str = "OPENAI_API_KEY";
+
+static OPENAI_API_KEY: LazyLock<RwLock<Option<String>>> = LazyLock::new(|| {
+    let val = env::var(OPENAI_API_KEY_ENV_VAR)
+        .ok()
+        .and_then(|s| if s.is_empty() { None } else { Some(s) });
+    RwLock::new(val)
+});
+
+pub fn get_openai_api_key() -> Option<String> {
+    #![allow(clippy::unwrap_used)]
+    OPENAI_API_KEY.read().unwrap().clone()
+}
+
+pub fn set_openai_api_key(value: String) {
+    #![allow(clippy::unwrap_used)]
+    if !value.is_empty() {
+        *OPENAI_API_KEY.write().unwrap() = Some(value);
+    }
+}
```