feat: add support for login with ChatGPT (#1212)
This does not implement the full Login with ChatGPT experience, but it
should unblock people.
**What works**
* The `codex` multitool now has a `login` subcommand, so you can run
`codex login`, which should write `CODEX_HOME/auth.json` if you complete
the flow successfully. The TUI will now read the `OPENAI_API_KEY` from
`auth.json`.
* The TUI should refresh the token if it has expired and the necessary
information is in `auth.json`.
* There is a `LoginScreen` in the TUI that tells you to run `codex
login` if both (1) your model provider expects to use `OPENAI_API_KEY`
as its env var, and (2) `OPENAI_API_KEY` is not set.
**What does not work**
* The `LoginScreen` does not support the login flow from within the TUI.
Instead, it tells you to quit, run `codex login`, and then run `codex`
again.
* `codex exec` does not read from `auth.json` yet, nor does it direct the
user to go through the login flow if `OPENAI_API_KEY` is not found.
* The `maybeRedeemCredits()` function from `get-api-key.tsx` has not
been ported from TypeScript to `login_with_chatgpt.py` yet:
https://github.com/openai/codex/blob/a67a67f3258fc21e147b6786a143fe3e15e6d5ba/codex-cli/src/utils/get-api-key.tsx#L84-L89
**Implementation**
Currently, the OAuth flow requires running a local webserver on
`127.0.0.1:1455`. It seemed wasteful to incur the additional binary cost
of a webserver dependency in the Rust CLI just to support login, so
instead we implement this logic in Python, as Python has a `http.server`
module as part of its standard library. Specifically, we bundle the
contents of a single Python file as a string in the Rust CLI and then
use it to spawn a subprocess as `python3 -c
{{SOURCE_FOR_PYTHON_SERVER}}`.
As such, the most significant files in this PR are:
```
codex-rs/login/src/login_with_chatgpt.py
codex-rs/login/src/lib.rs
```
Now that the CLI may load `OPENAI_API_KEY` from the environment _or_
`CODEX_HOME/auth.json`, we need a new abstraction for reading/writing
this variable, so we introduce:
```
codex-rs/core/src/openai_api_key.rs
```
Note that `std::env::set_var()` is [rightfully] `unsafe` in Rust 2024,
so we use a `LazyLock<RwLock<Option<String>>>` to store `OPENAI_API_KEY`
so that it can be read in a thread-safe manner.
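That file is not reproduced below, but here is a minimal sketch of the idea (the accessor names are illustrative, not necessarily the ones in `openai_api_key.rs`):
```rust
use std::sync::LazyLock;
use std::sync::RwLock;

/// Seeded from the environment once; later updated from auth.json without
/// calling the (now unsafe) std::env::set_var().
static OPENAI_API_KEY: LazyLock<RwLock<Option<String>>> =
    LazyLock::new(|| RwLock::new(std::env::var("OPENAI_API_KEY").ok()));

pub fn get_openai_api_key() -> Option<String> {
    OPENAI_API_KEY.read().expect("poisoned lock").clone()
}

pub fn set_openai_api_key(value: String) {
    *OPENAI_API_KEY.write().expect("poisoned lock") = Some(value);
}
```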
Ultimately, it should be possible to go through the entire login flow
from the TUI. This PR introduces a placeholder `LoginScreen` UI for that
right now, though the new `codex login` subcommand introduced in this PR
should be a viable workaround until the UI is ready.
**Testing**
Because the login flow is currently implemented in a standalone Python
file, you can test it without building any Rust code as follows:
```
rm -rf /tmp/codex_home && mkdir /tmp/codex_home
CODEX_HOME=/tmp/codex_home python3 codex-rs/login/src/login_with_chatgpt.py
```
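On success, `auth.json` ends up with roughly the following shape (a sketch inferred from the `AuthDotJson` fields in `lib.rs`; the exact key names depend on its serde attributes, and the values here are placeholders):
```json
{
  "OPENAI_API_KEY": "sk-...",
  "tokens": {
    "id_token": "eyJ...",
    "access_token": "eyJ...",
    "refresh_token": "...",
    "account_id": "..."
  },
  "last_refresh": "2025-06-04T15:44:17Z"
}
```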
For reference:
* the original TypeScript implementation was introduced in
https://github.com/openai/codex/pull/963
* support for redeeming credits was later added in
https://github.com/openai/codex/pull/974
`codex-rs/login/src/lib.rs`:
```rust
use chrono::DateTime;
use chrono::Utc;
use serde::Deserialize;
use serde::Serialize;
use std::env;
use std::fs::File;
use std::fs::OpenOptions;
use std::fs::remove_file;
use std::io::Read;
use std::io::Write;
#[cfg(unix)]
use std::os::unix::fs::OpenOptionsExt;
use std::path::Path;
use std::path::PathBuf;
use std::process::Child;
use std::process::Stdio;
use std::sync::Arc;
use std::sync::Mutex;
use std::time::Duration;
use tempfile::NamedTempFile;
use tokio::process::Command;

pub use crate::token_data::TokenData;
use crate::token_data::parse_id_token;

mod token_data;

const SOURCE_FOR_PYTHON_SERVER: &str = include_str!("./login_with_chatgpt.py");

const CLIENT_ID: &str = "app_EMoamEEZ73f0CkXaXp7hrann";

pub const OPENAI_API_KEY_ENV_VAR: &str = "OPENAI_API_KEY";

#[derive(Clone, Debug, PartialEq, Copy)]
pub enum AuthMode {
    ApiKey,
    ChatGPT,
}

#[derive(Debug, Clone)]
pub struct CodexAuth {
    pub mode: AuthMode,
    api_key: Option<String>,
    auth_dot_json: Arc<Mutex<Option<AuthDotJson>>>,
    auth_file: PathBuf,
}
impl PartialEq for CodexAuth {
    fn eq(&self, other: &Self) -> bool {
        self.mode == other.mode
    }
}

impl CodexAuth {
    pub fn from_api_key(api_key: &str) -> Self {
        Self {
            api_key: Some(api_key.to_owned()),
            mode: AuthMode::ApiKey,
            auth_file: PathBuf::new(),
            auth_dot_json: Arc::new(Mutex::new(None)),
        }
    }

    /// Loads the available auth information from the auth.json or
    /// OPENAI_API_KEY environment variable.
    pub fn from_codex_home(codex_home: &Path) -> std::io::Result<Option<CodexAuth>> {
        load_auth(codex_home, true)
    }
    pub async fn get_token_data(&self) -> Result<TokenData, std::io::Error> {
        let auth_dot_json: Option<AuthDotJson> = self.get_current_auth_json();
        match auth_dot_json {
            Some(AuthDotJson {
                tokens: Some(mut tokens),
                last_refresh: Some(last_refresh),
                ..
            }) => {
                // Refresh the tokens if the last refresh happened more than
                // 28 days ago, giving the refresh request 60 seconds to finish.
                if last_refresh < Utc::now() - chrono::Duration::days(28) {
                    let refresh_response = tokio::time::timeout(
                        Duration::from_secs(60),
                        try_refresh_token(tokens.refresh_token.clone()),
                    )
                    .await
                    .map_err(|_| {
                        std::io::Error::other("timed out while refreshing OpenAI API key")
                    })?
                    .map_err(std::io::Error::other)?;

                    // Persist the refreshed tokens to auth.json and to the
                    // in-memory copy.
                    let updated_auth_dot_json = update_tokens(
                        &self.auth_file,
                        refresh_response.id_token,
                        refresh_response.access_token,
                        refresh_response.refresh_token,
                    )
                    .await?;

                    tokens = updated_auth_dot_json
                        .tokens
                        .clone()
                        .ok_or(std::io::Error::other(
                            "Token data is not available after refresh.",
                        ))?;

                    #[expect(clippy::unwrap_used)]
                    let mut auth_lock = self.auth_dot_json.lock().unwrap();
                    *auth_lock = Some(updated_auth_dot_json);
                }

                Ok(tokens)
            }
            _ => Err(std::io::Error::other("Token data is not available.")),
        }
    }
    pub async fn get_token(&self) -> Result<String, std::io::Error> {
        match self.mode {
            AuthMode::ApiKey => Ok(self.api_key.clone().unwrap_or_default()),
            AuthMode::ChatGPT => {
                // For ChatGPT auth, the bearer token is the OAuth access token.
                let access_token = self.get_token_data().await?.access_token;

                Ok(access_token)
            }
        }
    }
    pub fn get_account_id(&self) -> Option<String> {
        self.get_current_token_data()
            .and_then(|t| t.account_id.clone())
    }

    pub fn get_plan_type(&self) -> Option<String> {
        self.get_current_token_data()
            .and_then(|t| t.id_token.chatgpt_plan_type.as_ref().map(|p| p.as_string()))
    }

    fn get_current_auth_json(&self) -> Option<AuthDotJson> {
        #[expect(clippy::unwrap_used)]
        self.auth_dot_json.lock().unwrap().clone()
    }

    fn get_current_token_data(&self) -> Option<TokenData> {
        self.get_current_auth_json().and_then(|t| t.tokens.clone())
    }
    /// Consider this private to integration tests.
    pub fn create_dummy_chatgpt_auth_for_testing() -> Self {
        let auth_dot_json = AuthDotJson {
            openai_api_key: None,
            tokens: Some(TokenData {
                id_token: Default::default(),
                access_token: "Access Token".to_string(),
                refresh_token: "test".to_string(),
                account_id: Some("account_id".to_string()),
            }),
            last_refresh: Some(Utc::now()),
        };

        let auth_dot_json = Arc::new(Mutex::new(Some(auth_dot_json)));
        Self {
            api_key: None,
            mode: AuthMode::ChatGPT,
            auth_file: PathBuf::new(),
            auth_dot_json,
        }
    }
}
fn load_auth(codex_home: &Path, include_env_var: bool) -> std::io::Result<Option<CodexAuth>> {
    // First, check to see if there is a valid auth.json file. If not, we fall
    // back to AuthMode::ApiKey using the OPENAI_API_KEY environment variable
    // (if it is set).
    let auth_file = get_auth_file(codex_home);
    let auth_dot_json = match try_read_auth_json(&auth_file) {
        Ok(auth) => auth,
        // If auth.json does not exist, try to read the OPENAI_API_KEY from the
        // environment variable.
        Err(e) if e.kind() == std::io::ErrorKind::NotFound && include_env_var => {
            return match read_openai_api_key_from_env() {
                Some(api_key) => Ok(Some(CodexAuth::from_api_key(&api_key))),
                None => Ok(None),
            };
        }
        // Though if auth.json exists but is malformed, do not fall back to the
        // env var because the user may be expecting to use AuthMode::ChatGPT.
        Err(e) => {
            return Err(e);
        }
    };

    let AuthDotJson {
        openai_api_key: auth_json_api_key,
        tokens,
        last_refresh,
    } = auth_dot_json;

    // If the auth.json has an API key AND does not appear to be on a plan that
    // should prefer AuthMode::ChatGPT, use AuthMode::ApiKey.
    if let Some(api_key) = &auth_json_api_key {
        // Should any of these be AuthMode::ChatGPT with the api_key set?
        // Does AuthMode::ChatGPT indicate that there is an auth.json that is
        // "refreshable" even if we are using the API key for auth?
        match &tokens {
            Some(tokens) => {
                if tokens.is_plan_that_should_use_api_key() {
                    return Ok(Some(CodexAuth::from_api_key(api_key)));
                } else {
                    // Ignore the API key and fall through to ChatGPT auth.
                }
            }
            None => {
                // We have an API key but no tokens in the auth.json file.
                // Perhaps the user ran `codex login --api-key <KEY>` or updated
                // auth.json by hand. Either way, let's assume they are trying
                // to use their API key.
                return Ok(Some(CodexAuth::from_api_key(api_key)));
            }
        }
    }

    // For the AuthMode::ChatGPT variant, perhaps neither api_key nor
    // openai_api_key should exist?
    Ok(Some(CodexAuth {
        api_key: None,
        mode: AuthMode::ChatGPT,
        auth_file,
        auth_dot_json: Arc::new(Mutex::new(Some(AuthDotJson {
            openai_api_key: None,
            tokens,
            last_refresh,
        }))),
    }))
}
fn read_openai_api_key_from_env() -> Option<String> {
    env::var(OPENAI_API_KEY_ENV_VAR)
        .ok()
        .filter(|s| !s.is_empty())
}

pub fn get_auth_file(codex_home: &Path) -> PathBuf {
    codex_home.join("auth.json")
}
/// Delete the auth.json file inside `codex_home` if it exists. Returns `Ok(true)`
/// if a file was removed, `Ok(false)` if no auth file was present.
pub fn logout(codex_home: &Path) -> std::io::Result<bool> {
    let auth_file = get_auth_file(codex_home);
    match remove_file(&auth_file) {
        Ok(_) => Ok(true),
        Err(err) if err.kind() == std::io::ErrorKind::NotFound => Ok(false),
        Err(err) => Err(err),
    }
}
/// Represents a running login subprocess. The child can be killed by holding
/// the mutex and calling `kill()`.
#[derive(Debug, Clone)]
pub struct SpawnedLogin {
    pub child: Arc<Mutex<Child>>,
    pub stdout: Arc<Mutex<Vec<u8>>>,
    pub stderr: Arc<Mutex<Vec<u8>>>,
}
/// Spawn the ChatGPT login Python server as a child process and return a handle to its process.
pub fn spawn_login_with_chatgpt(codex_home: &Path) -> std::io::Result<SpawnedLogin> {
    let script_path = write_login_script_to_disk()?;
    let mut cmd = std::process::Command::new("python3");
    cmd.arg(&script_path)
        .env("CODEX_HOME", codex_home)
        .env("CODEX_CLIENT_ID", CLIENT_ID)
        .stdin(Stdio::null())
        .stdout(Stdio::piped())
        .stderr(Stdio::piped());

    let mut child = cmd.spawn()?;

    let stdout_buf = Arc::new(Mutex::new(Vec::new()));
    let stderr_buf = Arc::new(Mutex::new(Vec::new()));

    // Drain stdout and stderr on background threads so the child never blocks
    // on a full pipe, accumulating the output in memory.
    if let Some(mut out) = child.stdout.take() {
        let buf = stdout_buf.clone();
        std::thread::spawn(move || {
            let mut tmp = Vec::new();
            let _ = std::io::copy(&mut out, &mut tmp);
            if let Ok(mut b) = buf.lock() {
                b.extend_from_slice(&tmp);
            }
        });
    }
    if let Some(mut err) = child.stderr.take() {
        let buf = stderr_buf.clone();
        std::thread::spawn(move || {
            let mut tmp = Vec::new();
            let _ = std::io::copy(&mut err, &mut tmp);
            if let Ok(mut b) = buf.lock() {
                b.extend_from_slice(&tmp);
            }
        });
    }

    Ok(SpawnedLogin {
        child: Arc::new(Mutex::new(child)),
        stdout: stdout_buf,
        stderr: stderr_buf,
    })
}
/// Run the bundled `login_with_chatgpt.py` script with `python3`, with the
/// CODEX_HOME environment variable set to the provided `codex_home` path. If
/// the subprocess exits 0, the login flow succeeded and auth.json has been
/// written. Otherwise, return Err with any information from the subprocess.
///
/// If `capture_output` is true, the subprocess's output will be captured and
/// recorded in memory. Otherwise, the subprocess's output will be sent to the
/// current process's stdout/stderr.
pub async fn login_with_chatgpt(codex_home: &Path, capture_output: bool) -> std::io::Result<()> {
    let script_path = write_login_script_to_disk()?;
    let child = Command::new("python3")
        .arg(&script_path)
        .env("CODEX_HOME", codex_home)
        .env("CODEX_CLIENT_ID", CLIENT_ID)
        .stdin(Stdio::null())
        .stdout(if capture_output {
            Stdio::piped()
        } else {
            Stdio::inherit()
        })
        .stderr(if capture_output {
            Stdio::piped()
        } else {
            Stdio::inherit()
        })
        .spawn()?;

    let output = child.wait_with_output().await?;
    if output.status.success() {
        Ok(())
    } else {
        let stderr = String::from_utf8_lossy(&output.stderr);
        Err(std::io::Error::other(format!(
            "login_with_chatgpt subprocess failed: {stderr}"
        )))
    }
}
fn write_login_script_to_disk() -> std::io::Result<PathBuf> {
    // Write the embedded Python script to a file to avoid very long
    // command-line arguments (Windows error 206).
    let mut tmp = NamedTempFile::new()?;
    tmp.write_all(SOURCE_FOR_PYTHON_SERVER.as_bytes())?;
    tmp.flush()?;

    let (_file, path) = tmp.keep()?;
    Ok(path)
}
pub fn login_with_api_key(codex_home: &Path, api_key: &str) -> std::io::Result<()> {
    let auth_dot_json = AuthDotJson {
        openai_api_key: Some(api_key.to_string()),
        tokens: None,
        last_refresh: None,
    };
    write_auth_json(&get_auth_file(codex_home), &auth_dot_json)
}
/// Attempt to read the `auth.json` file at the given path and deserialize it
/// into the full AuthDotJson structure.
pub fn try_read_auth_json(auth_file: &Path) -> std::io::Result<AuthDotJson> {
    let mut file = File::open(auth_file)?;
    let mut contents = String::new();
    file.read_to_string(&mut contents)?;
    let auth_dot_json: AuthDotJson = serde_json::from_str(&contents)?;

    Ok(auth_dot_json)
}
```

fn write_auth_json(auth_file: &Path, auth_dot_json: &AuthDotJson) -> std::io::Result<()> {
    let json_data = serde_json::to_string_pretty(auth_dot_json)?;
    let mut options = OpenOptions::new();
    options.truncate(true).write(true).create(true);
    #[cfg(unix)]
    {
        // auth.json holds credentials, so restrict it to the owning user.
        options.mode(0o600);
    }
    let mut file = options.open(auth_file)?;
    file.write_all(json_data.as_bytes())?;
    file.flush()?;
    Ok(())
}

async fn update_tokens(
    auth_file: &Path,
    id_token: String,
    access_token: Option<String>,
    refresh_token: Option<String>,
) -> std::io::Result<AuthDotJson> {
    let mut auth_dot_json = try_read_auth_json(auth_file)?;

    let tokens = auth_dot_json.tokens.get_or_insert_with(TokenData::default);
    tokens.id_token = parse_id_token(&id_token).map_err(std::io::Error::other)?;
    if let Some(access_token) = access_token {
        tokens.access_token = access_token;
    }
    if let Some(refresh_token) = refresh_token {
        tokens.refresh_token = refresh_token;
    }
    auth_dot_json.last_refresh = Some(Utc::now());
    write_auth_json(auth_file, &auth_dot_json)?;
    Ok(auth_dot_json)
}
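
// Illustration only (an assumed composition; the real call site is outside
// this excerpt): a full refresh would chain `try_refresh_token` below with
// `update_tokens`, persisting whatever the token endpoint returns:
//
//     async fn refresh_and_persist(
//         auth_file: &Path,
//         refresh_token: String,
//     ) -> std::io::Result<AuthDotJson> {
//         let refreshed = try_refresh_token(refresh_token).await?;
//         update_tokens(
//             auth_file,
//             refreshed.id_token,
//             refreshed.access_token,
//             refreshed.refresh_token,
//         )
//         .await
//     }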

async fn try_refresh_token(refresh_token: String) -> std::io::Result<RefreshResponse> {
    let refresh_request = RefreshRequest {
        client_id: CLIENT_ID,
        grant_type: "refresh_token",
        refresh_token,
        scope: "openid profile email",
    };

    let client = reqwest::Client::new();
    let response = client
        .post("https://auth.openai.com/oauth/token")
        .header("Content-Type", "application/json")
        .json(&refresh_request)
        .send()
        .await
        .map_err(std::io::Error::other)?;

    if response.status().is_success() {
        let refresh_response = response
            .json::<RefreshResponse>()
            .await
            .map_err(std::io::Error::other)?;
        Ok(refresh_response)
    } else {
        Err(std::io::Error::other(format!(
            "Failed to refresh token: {}",
            response.status()
        )))
    }
}

#[derive(Serialize)]
struct RefreshRequest {
    client_id: &'static str,
    grant_type: &'static str,
    refresh_token: String,
    scope: &'static str,
}
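
// Illustration only (follows directly from the `Serialize` derive and field
// names above): the request body POSTed to the token endpoint serializes as:
//
//     {
//       "client_id": "<CLIENT_ID>",
//       "grant_type": "refresh_token",
//       "refresh_token": "<current refresh token>",
//       "scope": "openid profile email"
//     }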

#[derive(Deserialize, Clone)]
struct RefreshResponse {
    id_token: String,
    access_token: Option<String>,
    refresh_token: Option<String>,
}

/// Expected structure for $CODEX_HOME/auth.json.
#[derive(Deserialize, Serialize, Clone, Debug, PartialEq)]
pub struct AuthDotJson {
    #[serde(rename = "OPENAI_API_KEY")]
    pub openai_api_key: Option<String>,

    #[serde(default, skip_serializing_if = "Option::is_none")]
    pub tokens: Option<TokenData>,

    #[serde(default, skip_serializing_if = "Option::is_none")]
    pub last_refresh: Option<DateTime<Utc>>,
}
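
// Illustration only (assembled from the serde attributes above and the test
// fixtures below, not an exhaustive schema): a fully populated auth.json
// looks roughly like:
//
//     {
//       "OPENAI_API_KEY": "sk-...",
//       "tokens": {
//         "id_token": "<JWT>",
//         "access_token": "...",
//         "refresh_token": "..."
//       },
//       "last_refresh": "2025-08-06T20:41:36.232376Z"
//     }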

#[cfg(test)]
mod tests {
    #![expect(clippy::expect_used, clippy::unwrap_used)]
    use super::*;
    use crate::token_data::IdTokenInfo;
    use crate::token_data::KnownPlan;
    use crate::token_data::PlanType;
    use base64::Engine;
    use pretty_assertions::assert_eq;
    use serde_json::json;
    use tempfile::tempdir;

    const LAST_REFRESH: &str = "2025-08-06T20:41:36.232376Z";

    #[test]
    fn writes_api_key_and_loads_auth() {
        let dir = tempdir().unwrap();
        login_with_api_key(dir.path(), "sk-test-key").unwrap();
        let auth = load_auth(dir.path(), false).unwrap().unwrap();
        assert_eq!(auth.mode, AuthMode::ApiKey);
        assert_eq!(auth.api_key.as_deref(), Some("sk-test-key"));
    }

    #[test]
    fn loads_from_env_var_if_env_var_exists() {
        let dir = tempdir().unwrap();

        let env_var = std::env::var(OPENAI_API_KEY_ENV_VAR);

        if let Ok(env_var) = env_var {
            let auth = load_auth(dir.path(), true).unwrap().unwrap();
            assert_eq!(auth.mode, AuthMode::ApiKey);
            assert_eq!(auth.api_key, Some(env_var));
        }
    }

    #[tokio::test]
    async fn pro_account_with_no_api_key_uses_chatgpt_auth() {
        let codex_home = tempdir().unwrap();
        write_auth_file(
            AuthFileParams {
                openai_api_key: None,
                chatgpt_plan_type: "pro".to_string(),
            },
            codex_home.path(),
        )
        .expect("failed to write auth file");

        let CodexAuth {
            api_key,
            mode,
            auth_dot_json,
            auth_file: _,
        } = load_auth(codex_home.path(), false).unwrap().unwrap();
        assert_eq!(None, api_key);
        assert_eq!(AuthMode::ChatGPT, mode);

        let guard = auth_dot_json.lock().unwrap();
        let auth_dot_json = guard.as_ref().expect("AuthDotJson should exist");
        assert_eq!(
            &AuthDotJson {
                openai_api_key: None,
                tokens: Some(TokenData {
                    id_token: IdTokenInfo {
                        email: Some("user@example.com".to_string()),
                        chatgpt_plan_type: Some(PlanType::Known(KnownPlan::Pro)),
                    },
                    access_token: "test-access-token".to_string(),
                    refresh_token: "test-refresh-token".to_string(),
                    account_id: None,
                }),
                last_refresh: Some(
                    DateTime::parse_from_rfc3339(LAST_REFRESH)
                        .unwrap()
                        .with_timezone(&Utc)
                ),
            },
            auth_dot_json
        )
    }

    /// Even if the OPENAI_API_KEY is set in auth.json, if the plan is not in
    /// [`TokenData::is_plan_that_should_use_api_key`], it should use
    /// [`AuthMode::ChatGPT`].
    #[tokio::test]
    async fn pro_account_with_api_key_still_uses_chatgpt_auth() {
        let codex_home = tempdir().unwrap();
        write_auth_file(
            AuthFileParams {
                openai_api_key: Some("sk-test-key".to_string()),
                chatgpt_plan_type: "pro".to_string(),
            },
            codex_home.path(),
        )
        .expect("failed to write auth file");

        let CodexAuth {
            api_key,
            mode,
            auth_dot_json,
            auth_file: _,
        } = load_auth(codex_home.path(), false).unwrap().unwrap();
        assert_eq!(None, api_key);
        assert_eq!(AuthMode::ChatGPT, mode);

        let guard = auth_dot_json.lock().unwrap();
        let auth_dot_json = guard.as_ref().expect("AuthDotJson should exist");
        assert_eq!(
            &AuthDotJson {
                openai_api_key: None,
                tokens: Some(TokenData {
                    id_token: IdTokenInfo {
                        email: Some("user@example.com".to_string()),
                        chatgpt_plan_type: Some(PlanType::Known(KnownPlan::Pro)),
                    },
                    access_token: "test-access-token".to_string(),
                    refresh_token: "test-refresh-token".to_string(),
                    account_id: None,
                }),
                last_refresh: Some(
                    DateTime::parse_from_rfc3339(LAST_REFRESH)
                        .unwrap()
                        .with_timezone(&Utc)
                ),
            },
            auth_dot_json
        )
    }

    /// If the OPENAI_API_KEY is set in auth.json and it is an enterprise
    /// account, then it should use [`AuthMode::ApiKey`].
    #[tokio::test]
    async fn enterprise_account_with_api_key_uses_api_key_auth() {
        let codex_home = tempdir().unwrap();
        write_auth_file(
            AuthFileParams {
                openai_api_key: Some("sk-test-key".to_string()),
                chatgpt_plan_type: "enterprise".to_string(),
            },
            codex_home.path(),
        )
        .expect("failed to write auth file");

        let CodexAuth {
            api_key,
            mode,
            auth_dot_json,
            auth_file: _,
        } = load_auth(codex_home.path(), false).unwrap().unwrap();
        assert_eq!(Some("sk-test-key".to_string()), api_key);
        assert_eq!(AuthMode::ApiKey, mode);

        let guard = auth_dot_json.lock().expect("should unwrap");
        assert!(guard.is_none(), "auth_dot_json should be None");
    }

    struct AuthFileParams {
        openai_api_key: Option<String>,
        chatgpt_plan_type: String,
    }

    fn write_auth_file(params: AuthFileParams, codex_home: &Path) -> std::io::Result<()> {
        let auth_file = get_auth_file(codex_home);
        // Create a minimal valid JWT for the id_token field.
        #[derive(Serialize)]
        struct Header {
            alg: &'static str,
            typ: &'static str,
        }
        let header = Header {
            alg: "none",
            typ: "JWT",
        };
        let payload = serde_json::json!({
            "email": "user@example.com",
            "email_verified": true,
            "https://api.openai.com/auth": {
                "chatgpt_account_id": "bc3618e3-489d-4d49-9362-1561dc53ba53",
                "chatgpt_plan_type": params.chatgpt_plan_type,
                "chatgpt_user_id": "user-12345",
                "user_id": "user-12345",
            }
        });
        let b64 = |b: &[u8]| base64::engine::general_purpose::URL_SAFE_NO_PAD.encode(b);
        let header_b64 = b64(&serde_json::to_vec(&header)?);
        let payload_b64 = b64(&serde_json::to_vec(&payload)?);
        let signature_b64 = b64(b"sig");
        let fake_jwt = format!("{header_b64}.{payload_b64}.{signature_b64}");

        let auth_json_data = json!({
            "OPENAI_API_KEY": params.openai_api_key,
            "tokens": {
                "id_token": fake_jwt,
                "access_token": "test-access-token",
                "refresh_token": "test-refresh-token"
            },
            "last_refresh": LAST_REFRESH,
        });
        let auth_json = serde_json::to_string_pretty(&auth_json_data)?;
        std::fs::write(auth_file, auth_json)
    }

    #[test]
    fn id_token_info_handles_missing_fields() {
        // Payload without email or plan should yield None values.
        let header = serde_json::json!({"alg": "none", "typ": "JWT"});
        let payload = serde_json::json!({"sub": "123"});
        let header_b64 = base64::engine::general_purpose::URL_SAFE_NO_PAD
            .encode(serde_json::to_vec(&header).unwrap());
        let payload_b64 = base64::engine::general_purpose::URL_SAFE_NO_PAD
            .encode(serde_json::to_vec(&payload).unwrap());
        let signature_b64 = base64::engine::general_purpose::URL_SAFE_NO_PAD.encode(b"sig");
        let jwt = format!("{header_b64}.{payload_b64}.{signature_b64}");

        let info = parse_id_token(&jwt).expect("should parse");
        assert!(info.email.is_none());
        assert!(info.chatgpt_plan_type.is_none());
    }

    #[tokio::test]
    async fn loads_api_key_from_auth_json() {
        let dir = tempdir().unwrap();
        let auth_file = dir.path().join("auth.json");
        std::fs::write(
            auth_file,
            r#"
            {
                "OPENAI_API_KEY": "sk-test-key",
                "tokens": null,
                "last_refresh": null
            }
            "#,
        )
        .unwrap();

        let auth = load_auth(dir.path(), false).unwrap().unwrap();
        assert_eq!(auth.mode, AuthMode::ApiKey);
        assert_eq!(auth.api_key, Some("sk-test-key".to_string()));

        assert!(auth.get_token_data().await.is_err());
    }

    #[test]
    fn logout_removes_auth_file() -> Result<(), std::io::Error> {
        let dir = tempdir()?;
        login_with_api_key(dir.path(), "sk-test-key")?;
        assert!(dir.path().join("auth.json").exists());
        let removed = logout(dir.path())?;
        assert!(removed);
        assert!(!dir.path().join("auth.json").exists());
        Ok(())
    }
}