This does not implement the full Login with ChatGPT experience, but it
should unblock people.
**What works**
* The `codex` multitool now has a `login` subcommand, so you can run
`codex login`, which should write `CODEX_HOME/auth.json` if you complete
the flow successfully. The TUI will now read the `OPENAI_API_KEY` from
`auth.json`.
* The TUI should refresh the token if it has expired and the necessary
information is in `auth.json`.
* There is a `LoginScreen` in the TUI that tells you to run `codex
login` if both (1) your model provider expects to use `OPENAI_API_KEY`
as its env var, and (2) `OPENAI_API_KEY` is not set.
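The gating condition for that screen can be sketched roughly as follows (the function name and signature are illustrative, not the actual codex-rs API):

```rust
// Show the LoginScreen only when (1) the model provider expects its key in
// OPENAI_API_KEY and (2) that variable is not set in the environment.
// Illustrative sketch; the real check also consults CODEX_HOME/auth.json.
fn should_show_login_screen(provider_env_var: &str) -> bool {
    provider_env_var == "OPENAI_API_KEY" && std::env::var("OPENAI_API_KEY").is_err()
}
```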
**What does not work**
* The `LoginScreen` does not support the login flow from within the TUI.
Instead, it tells you to quit, run `codex login`, and then run `codex`
again.
* `codex exec` does not yet read from `auth.json`, nor does it direct the
user to go through the login flow if `OPENAI_API_KEY` is not found.
* `codex exec` does not yet read from `auth.json`, nor does it direct the
user to go through the login flow if `OPENAI_API_KEY` is not found.
* The `maybeRedeemCredits()` function from `get-api-key.tsx` has not
been ported from TypeScript to `login_with_chatgpt.py` yet:
a67a67f325/codex-cli/src/utils/get-api-key.tsx (L84-L89)
**Implementation**
Currently, the OAuth flow requires running a local webserver on
`127.0.0.1:1455`. It seemed wasteful to incur the additional binary cost
of a webserver dependency in the Rust CLI just to support login, so
instead we implement this logic in Python, as Python has a `http.server`
module as part of its standard library. Specifically, we bundle the
contents of a single Python file as a string in the Rust CLI and then
use it to spawn a subprocess as `python3 -c
{{SOURCE_FOR_PYTHON_SERVER}}`.
As such, the most significant files in this PR are:
```
codex-rs/login/src/login_with_chatgpt.py
codex-rs/login/src/lib.rs
```
Now that the CLI may load `OPENAI_API_KEY` from the environment _or_
`CODEX_HOME/auth.json`, we need a new abstraction for reading/writing
this variable, so we introduce:
```
codex-rs/core/src/openai_api_key.rs
```
Note that `std::env::set_var()` is [rightfully] `unsafe` in Rust 2024,
so we store `OPENAI_API_KEY` in a `LazyLock<RwLock<Option<String>>>`
so it can be read and updated in a thread-safe manner.
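A sketch of that pattern (names are illustrative; the real version lives in `codex-rs/core/src/openai_api_key.rs`):

```rust
use std::sync::{LazyLock, RwLock};

// Process-wide holder for the API key. Initialized lazily from the
// environment, but updatable after login without touching the (unsafe)
// process environment via std::env::set_var().
static OPENAI_API_KEY: LazyLock<RwLock<Option<String>>> =
    LazyLock::new(|| RwLock::new(std::env::var("OPENAI_API_KEY").ok()));

fn get_openai_api_key() -> Option<String> {
    OPENAI_API_KEY.read().unwrap().clone()
}

fn set_openai_api_key(value: String) {
    *OPENAI_API_KEY.write().unwrap() = Some(value);
}
```

Readers take the `RwLock` read guard and clone the value out, so no guard is held across an await point.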
Ultimately, it should be possible to go through the entire login flow
from within the TUI. For now, this PR introduces a placeholder
`LoginScreen` UI, and the new `codex login` subcommand should be a
viable workaround until the full in-TUI flow is ready.
**Testing**
Because the login flow is currently implemented in a standalone Python
file, you can test it without building any Rust code as follows:
```
rm -rf /tmp/codex_home && mkdir /tmp/codex_home
CODEX_HOME=/tmp/codex_home python3 codex-rs/login/src/login_with_chatgpt.py
```
For reference:
* the original TypeScript implementation was introduced in
https://github.com/openai/codex/pull/963
* support for redeeming credits was later added in
https://github.com/openai/codex/pull/974
`codex-core`'s `Cargo.toml` (69 lines, 1.5 KiB), which now pulls in the new `codex-login` crate:

```toml
[package]
name = "codex-core"
version = { workspace = true }
edition = "2024"

[lib]
name = "codex_core"
path = "src/lib.rs"

[lints]
workspace = true

[dependencies]
anyhow = "1"
async-channel = "2.3.1"
base64 = "0.21"
bytes = "1.10.1"
codex-apply-patch = { path = "../apply-patch" }
codex-login = { path = "../login" }
codex-mcp-client = { path = "../mcp-client" }
dirs = "6"
env-flags = "0.1.1"
eventsource-stream = "0.2.3"
fs2 = "0.4.3"
fs-err = "3.1.0"
futures = "0.3"
mcp-types = { path = "../mcp-types" }
mime_guess = "2.0"
patch = "0.7"
path-absolutize = "3.1.1"
rand = "0.9"
reqwest = { version = "0.12", features = ["json", "stream"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
strum = "0.27.1"
strum_macros = "0.27.1"
thiserror = "2.0.12"
time = { version = "0.3", features = ["formatting", "local-offset", "macros"] }
tokio = { version = "1", features = [
    "io-std",
    "macros",
    "process",
    "rt-multi-thread",
    "signal",
] }
tokio-util = "0.7.14"
toml = "0.8.20"
tracing = { version = "0.1.41", features = ["log"] }
tree-sitter = "0.25.3"
tree-sitter-bash = "0.23.3"
uuid = { version = "1", features = ["serde", "v4"] }
wildmatch = "2.4.0"

[target.'cfg(target_os = "linux")'.dependencies]
landlock = "0.4.1"
seccompiler = "0.5.0"

# Build OpenSSL from source for musl builds.
[target.x86_64-unknown-linux-musl.dependencies]
openssl-sys = { version = "*", features = ["vendored"] }

[dev-dependencies]
assert_cmd = "2"
maplit = "1.0.2"
predicates = "3"
pretty_assertions = "1.4.1"
tempfile = "3"
wiremock = "0.6"
```