feat: add support for login with ChatGPT (#1212)

This does not implement the full Login with ChatGPT experience, but it
should unblock people.

**What works**

* The `codex` multitool now has a `login` subcommand, so you can run
`codex login`, which should write `CODEX_HOME/auth.json` if you complete
the flow successfully. The TUI will now read `OPENAI_API_KEY` from
`auth.json` (see the sketch after this list).
* The TUI should refresh the token if it has expired and the necessary
information is in `auth.json`.
* There is a `LoginScreen` in the TUI that tells you to run `codex
login` if both (1) your model provider expects to use `OPENAI_API_KEY`
as its env var, and (2) `OPENAI_API_KEY` is not set.
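
For illustration, here is a minimal sketch of that lookup, using
`serde_json` and assuming `auth.json` stores the key in a top-level
`OPENAI_API_KEY` field. The helper name, the env-first ordering, and the
exact schema are all assumptions here, not the shipped implementation:

```rust
use std::fs;
use std::path::Path;

/// Sketch only: resolve OPENAI_API_KEY from the environment, falling
/// back to CODEX_HOME/auth.json. The top-level "OPENAI_API_KEY" field
/// is an assumption based on the description above.
fn resolve_openai_api_key(codex_home: &Path) -> Option<String> {
    if let Ok(key) = std::env::var("OPENAI_API_KEY") {
        if !key.is_empty() {
            return Some(key);
        }
    }
    let contents = fs::read_to_string(codex_home.join("auth.json")).ok()?;
    let auth: serde_json::Value = serde_json::from_str(&contents).ok()?;
    auth.get("OPENAI_API_KEY")?.as_str().map(str::to_owned)
}
```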

**What does not work**

* The `LoginScreen` does not support the login flow from within the TUI.
Instead, it tells you to quit, run `codex login`, and then run `codex`
again.
* `codex exec` does not read from `auth.json` yet, nor does it direct
the user to go through the login flow if `OPENAI_API_KEY` is not found.
* The `maybeRedeemCredits()` function from `get-api-key.tsx` has not
been ported from TypeScript to `login_with_chatgpt.py` yet:

a67a67f325/codex-cli/src/utils/get-api-key.tsx (L84-L89)

**Implementation**

Currently, the OAuth flow requires running a local webserver on
`127.0.0.1:1455`. It seemed wasteful to incur the additional binary cost
of a webserver dependency in the Rust CLI just to support login, so
instead we implement this logic in Python, as Python has a `http.server`
module as part of its standard library. Specifically, we bundle the
contents of a single Python file as a string in the Rust CLI and then
use it to spawn a subprocess as `python3 -c
{{SOURCE_FOR_PYTHON_SERVER}}`.
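
As a rough sketch of that mechanism (using `include_str!` to bundle the
file is one plausible approach; the helper name and the `CODEX_HOME`
plumbing are illustrative rather than the exact implementation):

```rust
use std::path::Path;
use std::process::Command;

// Bundle the Python source into the binary at compile time; this
// include_str! path assumes the .py file sits next to this source file.
const SOURCE_FOR_PYTHON_SERVER: &str = include_str!("login_with_chatgpt.py");

/// Sketch only: run the bundled login server as `python3 -c <source>`,
/// passing CODEX_HOME through the environment for the script to read.
fn spawn_login_server(codex_home: &Path) -> std::io::Result<std::process::ExitStatus> {
    Command::new("python3")
        .arg("-c")
        .arg(SOURCE_FOR_PYTHON_SERVER)
        .env("CODEX_HOME", codex_home)
        .status()
}
```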

As such, the most significant files in this PR are:

```
codex-rs/login/src/login_with_chatgpt.py
codex-rs/login/src/lib.rs
```

Now that the CLI may load `OPENAI_API_KEY` from the environment _or_
`CODEX_HOME/auth.json`, we need a new abstraction for reading/writing
this variable, so we introduce:

```
codex-rs/core/src/openai_api_key.rs
```

Note that `std::env::set_var()` is [rightfully] `unsafe` in Rust 2024,
so we store `OPENAI_API_KEY` in a `LazyLock<RwLock<Option<String>>>`,
which lets it be read and updated in a thread-safe manner.
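
The pattern is roughly the following (a sketch; the accessor names are
illustrative):

```rust
use std::sync::{LazyLock, RwLock};

// Seed the value from the environment once, then read and update it
// through the lock rather than calling the unsafe std::env::set_var().
static OPENAI_API_KEY: LazyLock<RwLock<Option<String>>> =
    LazyLock::new(|| RwLock::new(std::env::var("OPENAI_API_KEY").ok()));

pub fn get_openai_api_key() -> Option<String> {
    OPENAI_API_KEY.read().expect("lock poisoned").clone()
}

pub fn set_openai_api_key(value: String) {
    *OPENAI_API_KEY.write().expect("lock poisoned") = Some(value);
}
```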

Ultimately, it should be possible to go through the entire login flow
from the TUI. This PR introduces a placeholder `LoginScreen` UI for that
flow; until the full UI is ready, the new `codex login` subcommand
should be a viable workaround.

**Testing**

Because the login flow is currently implemented in a standalone Python
file, you can test it without building any Rust code as follows:

```
rm -rf /tmp/codex_home && mkdir /tmp/codex_home
CODEX_HOME=/tmp/codex_home python3 codex-rs/login/src/login_with_chatgpt.py
```

For reference:

* the original TypeScript implementation was introduced in
https://github.com/openai/codex/pull/963
* support for redeeming credits was later added in
https://github.com/openai/codex/pull/974

Commit `515b6331bd` (parent `a67a67f325`), authored by Michael Bolin on
2025-06-04 08:44:17 -07:00 and committed via GitHub. 18 changed files
with 1051 additions and 25 deletions.

First, the new `login` module is registered in the CLI crate:

```diff
@@ -1,5 +1,6 @@
 pub mod debug_sandbox;
 mod exit_status;
+pub mod login;
 pub mod proto;

 use clap::Parser;
```

`codex-rs/cli/src/login.rs` (new file, 35 lines):

```rust
use codex_common::CliConfigOverrides;
use codex_core::config::Config;
use codex_core::config::ConfigOverrides;
use codex_login::login_with_chatgpt;

pub async fn run_login_with_chatgpt(cli_config_overrides: CliConfigOverrides) -> ! {
    let cli_overrides = match cli_config_overrides.parse_overrides() {
        Ok(v) => v,
        Err(e) => {
            eprintln!("Error parsing -c overrides: {e}");
            std::process::exit(1);
        }
    };

    let config_overrides = ConfigOverrides::default();
    let config = match Config::load_with_cli_overrides(cli_overrides, config_overrides) {
        Ok(config) => config,
        Err(e) => {
            eprintln!("Error loading configuration: {e}");
            std::process::exit(1);
        }
    };

    let capture_output = false;
    match login_with_chatgpt(&config.codex_home, capture_output).await {
        Ok(_) => {
            eprintln!("Successfully logged in");
            std::process::exit(0);
        }
        Err(e) => {
            eprintln!("Error logging in: {e}");
            std::process::exit(1);
        }
    }
}
```

The multitool's entry point wires up the new subcommand:

```diff
@@ -1,6 +1,7 @@
 use clap::Parser;
 use codex_cli::LandlockCommand;
 use codex_cli::SeatbeltCommand;
+use codex_cli::login::run_login_with_chatgpt;
 use codex_cli::proto;
 use codex_common::CliConfigOverrides;
 use codex_exec::Cli as ExecCli;
@@ -36,6 +37,9 @@ enum Subcommand {
     #[clap(visible_alias = "e")]
     Exec(ExecCli),
+    /// Login with ChatGPT.
+    Login(LoginCommand),
     /// Experimental: run Codex as an MCP server.
     Mcp,
@@ -63,7 +67,10 @@ enum DebugCommand {
 }

 #[derive(Debug, Parser)]
-struct ReplProto {}
+struct LoginCommand {
+    #[clap(skip)]
+    config_overrides: CliConfigOverrides,
+}

 fn main() -> anyhow::Result<()> {
     codex_linux_sandbox::run_with_sandbox(|codex_linux_sandbox_exe| async move {
@@ -88,6 +95,10 @@ async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()
         Some(Subcommand::Mcp) => {
             codex_mcp_server::run_main(codex_linux_sandbox_exe).await?;
         }
+        Some(Subcommand::Login(mut login_cli)) => {
+            prepend_config_flags(&mut login_cli.config_overrides, cli.config_overrides);
+            run_login_with_chatgpt(login_cli.config_overrides).await;
+        }
         Some(Subcommand::Proto(mut proto_cli)) => {
             prepend_config_flags(&mut proto_cli.config_overrides, cli.config_overrides);
             proto::run_main(proto_cli).await?;
```