Commit Graph

736 Commits

Author SHA1 Message Date
Yaroslav
f146981b73 feat: add JSON schema sanitization for MCP tools to ensure compatibility with internal JsonSchema enum (#1975)

Closes: #1973 
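
A minimal sketch of what such sanitization can look like, assuming a serde_json pass that keeps only keywords the internal `JsonSchema` enum is likely to represent; the keyword list and function name are illustrative, not the actual implementation:

```
use serde_json::{Map, Value};

/// Illustrative only: keep a conservative subset of JSON Schema keywords so the
/// result maps cleanly onto a restricted internal schema representation.
fn sanitize_schema(schema: &Value) -> Value {
    match schema {
        Value::Object(obj) => {
            let mut out = Map::new();
            for (key, value) in obj {
                match key.as_str() {
                    // Property names are arbitrary; sanitize each property's schema.
                    "properties" => {
                        if let Value::Object(props) = value {
                            let props: Map<_, _> = props
                                .iter()
                                .map(|(name, subschema)| (name.clone(), sanitize_schema(subschema)))
                                .collect();
                            out.insert(key.clone(), Value::Object(props));
                        }
                    }
                    // Nested schema for array elements.
                    "items" => {
                        out.insert(key.clone(), sanitize_schema(value));
                    }
                    // Simple keywords assumed to be supported as-is.
                    "type" | "description" | "required" | "enum" => {
                        out.insert(key.clone(), value.clone());
                    }
                    // Anything else (e.g. `oneOf`, `$ref`, vendor extensions) is dropped.
                    _ => {}
                }
            }
            // Default to "object" when no type survives, so the schema stays usable.
            out.entry("type")
                .or_insert_with(|| Value::String("object".to_string()));
            Value::Object(out)
        }
        other => other.clone(),
    }
}
```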

Co-authored-by: Dylan Hurd <dylan.hurd@openai.com>
2025-08-10 17:57:39 -07:00
Michael Bolin
bff4435c80 docs: update the docs to explain how to authenticate on a headless machine (#2121)
Users on "headless" machines, such as WSL users, are understandable
having trouble authenticating successfully. To date, I have been
providing one-off user support on issues such as
https://github.com/openai/codex/issues/2000, but we need a more detailed
explanation that we can link to so that users can self-serve. This PR
aims to provide detailed information that we can link to in response to
user issues going forward.

That said, it would also be helpful if we employed heuristics to detect
this issue at runtime, and/or we should just link to these docs as part
of the `codex login` flow.
2025-08-10 14:19:27 -07:00
Michael Bolin
e87974ae83 fix: improve npm release process (#2055)
This improves the release process by introducing
`scripts/publish_to_npm.py` to automate publishing to npm (modulo the
human 2fac step).

As part of this, it updates `.github/workflows/rust-release.yml` to
create the artifact for npm using `npm pack`.

And finally, while it is long overdue, this memorializes the release
process in `docs/release_management.md`.
2025-08-08 19:07:36 -07:00
pakrym-oai
329f01b728 feat: allow esc to interrupt session (#2054)
## Summary
- allow Esc to interrupt the current session when a task is running
- document Esc as an interrupt key in status indicator

## Testing
- `just fmt`
- `just fix` *(fails: E0658 `let` expressions in this position are
unstable)*
- `cargo test --all-features` *(fails: E0658 `let` expressions in this
position are unstable)*

------
https://chatgpt.com/codex/tasks/task_i_689698cf605883208f57b0317ff6a303
2025-08-08 18:59:54 -07:00
aibrahim-oai
4a916ba914 Show ChatGPT login URL during onboarding (#2028)
## Summary
- display authentication URL in the ChatGPT sign-in screen while
onboarding

<img width="684" height="151" alt="image"
src="https://github.com/user-attachments/assets/a8c32cb0-77f6-4a3f-ae3b-6695247c994d"
/>
2025-08-09 01:30:34 +00:00
Dylan
0091930f5a [core] Allow resume after client errors (#2053)
## Summary
Allow TUI conversations to resume after the client fails out of retries.
I tested this with exec / mocked API failures as well, and it appears to
be fine. But happy to add an exec integration test as well!

## Testing
- [x] Added integration test
- [x] Tested locally
2025-08-08 18:21:19 -07:00
Dylan
a2b9f46006 [exec] Fix exec sandbox arg (#2034)
## Summary
From codex-cli 😁 
`-s/--sandbox` now correctly affects sandbox mode.

What changed
- In `codex-rs/exec/src/cli.rs`:
- Added `value_enum` to the `--sandbox` flag so Clap parses enum values into `SandboxModeCliArg`.
- This ensures values like `-s read-only`, `-s workspace-write`, and `-s danger-full-access` are recognized and propagated.

Why this fixes it
- The enum already derives `ValueEnum`, but without `#[arg(value_enum)]` Clap may not map the string into the enum, leaving the option ineffective at runtime. With `value_enum`, `sandbox_mode` is parsed and then converted to `SandboxMode` in `run_main`, which feeds into `ConfigOverrides` and ultimately into the effective `sandbox_policy`.
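
For context, a minimal standalone sketch of the pattern described above; the flag and enum mirror the description rather than the actual `cli.rs`:

```
use clap::{Parser, ValueEnum};

#[derive(Clone, Debug, ValueEnum)]
enum SandboxModeCliArg {
    ReadOnly,
    WorkspaceWrite,
    DangerFullAccess,
}

#[derive(Parser, Debug)]
struct Cli {
    /// Without `value_enum`, Clap may not map the string into the enum.
    #[arg(short = 's', long = "sandbox", value_enum)]
    sandbox: Option<SandboxModeCliArg>,
}

fn main() {
    // `-s read-only`, `-s workspace-write`, and `-s danger-full-access`
    // now parse into the corresponding variants.
    let cli = Cli::parse();
    println!("{:?}", cli.sandbox);
}
```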
2025-08-08 18:19:40 -07:00
Michael Bolin
408c7ca142 chore: remove the TypeScript code from the repository (#2048)
This deletes the bulk of the `codex-cli` folder and eliminates the logic
that builds the TypeScript code and bundles it into the release.

Since this PR modifies `.github/workflows/rust-release.yml`, to test
changes to the release process, I locally commented out all of the "is
this commit on upstream `main`" checks in
`scripts/create_github_release.sh` and ran:

```
./codex-rs/scripts/create_github_release.sh 0.20.0-alpha.4
```

Which kicked off:

https://github.com/openai/codex/actions/runs/16842085113

And the release artifacts appear legit!

https://github.com/openai/codex/releases/tag/rust-v0.20.0-alpha.4
2025-08-08 16:09:39 -07:00
Dylan
75febbdefa Update README.md (#1989)
Updates the README to clarify auth vs. api key behavior.
2025-08-08 15:19:20 -07:00
Michael Bolin
39a4d4ed8e fix: try building the npm package in CI (#2043)
Historically, the release process for the npm module has been:

- I run `codex-rs/scripts/create_github_release.sh` to kick off a
release for the native artifacts.
- I wait until it is done.
- I run `codex-cli/scripts/stage_rust_release.py` to build the npm
release locally
- I run `npm publish` from my laptop

It has been a longstanding issue to move the npm build to CI. I may
still have to do the `npm publish` manually because it requires 2fac
with `npm`, though I assume we can work that out later.

Note: I asked Codex to make these updates. They look pretty good to me,
but I'm not 100% certain, so let's just merge this, I'll kick off
another alpha build, and we'll see what happens.
2025-08-08 15:17:54 -07:00
pakrym-oai
33f266dab3 Use certifi certificate when available (#2042)
certifi has a more consistent set of Mozilla-maintained root certificates.
2025-08-08 22:15:35 +00:00
Michael Bolin
d0cf036799 feat: include Windows binary of the CLI in the npm release (#2040)
Until now, the build scripts in `codex-cli` still supported building the
old TypeScript version of the Codex CLI to give Windows users something
they could run, but we are just going to have them use the Rust version
like everyone else, so:

- updates `codex-cli/bin/codex.js` so that we run the native binary or
throw if the target platform/arch is not supported (no more conditional
usage based on `CODEX_RUST`, `use-native` file, etc.)
- drops the `--native` flag from `codex-cli/scripts/stage_release.sh`
and updates all the code paths to behave as if `--native` were passed
(i.e., it is the only way to run it now)

Tested this by running:

```
./codex-cli/scripts/stage_rust_release.py --release-version 0.20.0-alpha.2
```
2025-08-08 14:44:35 -07:00
Michael Bolin
8a26ea0fe0 fix: stop building codex-exec and codex-linux-sandbox binaries (#2036)
Release builds are taking a while, and part of the reason is that we are
building binaries that we are not really using. Adding Windows binaries
into releases (https://github.com/openai/codex/pull/2035) slows things
down, so we need to get some time back.

- `codex-exec` is basically a standalone `codex exec` that we were
offering because it's a bit smaller as it does not include all the bits
to power the TUI. We were using it in our experimental GitHub Action, so
this PR updates the Action to use `codex exec` instead.
- `codex-linux-sandbox` was a helper binary for the TypeScript version
of the CLI, but I am about to axe that, so we don't need this either.

If we decide to bring `codex-exec` back at some point, we should use a
separate instance so we can build it in parallel with `codex`. (I think
if we had beefier build machines, this wouldn't be so bad, but that's
not the case with the default runners from GitHub.)
2025-08-08 13:42:33 -07:00
Michael Bolin
18eb157000 feat: include windows binaries in GitHub releases (#2035)
We should stop shipping the old TypeScript CLI to Windows users. I did
some light testing of the Rust CLI on Windows in `cmd.exe` and it works
better than I expected!
2025-08-08 13:03:11 -07:00
aibrahim-oai
6cfee15612 Moving the compact prompt near where it's used (#2031)
- Moved the prompt for compact to core
- Renamed it to be more clear
2025-08-08 12:43:43 -07:00
Josh LeBlanc
216e9e2ed0 Fix rust build on windows (#2019)
This pull request implements a fix from #2000 and also fixes an
additional problem with path lengths on Windows that prevents the login
from displaying.

---------

Co-authored-by: Michael Bolin <bolinfest@gmail.com>
Co-authored-by: Michael Bolin <mbolin@openai.com>
2025-08-08 10:57:16 -07:00
Gabriel Peal
c3a8ab8511 Fix multiline exec command rendering (#2023)
With Ratatui, if a single line contains newlines, rendering increments y but
not x, so each subsequent line continues from the x position where the
previous line ended.
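
One common way to avoid this is to split on newlines before handing text to Ratatui, so each segment becomes its own `Line`. A rough sketch of that idea, not the actual fix:

```
use ratatui::text::Line;

/// Split raw command output on newlines so each segment renders as its own
/// Ratatui `Line`, starting at the left edge instead of continuing mid-row.
fn to_lines(output: &str) -> Vec<Line<'static>> {
    output
        .lines()
        .map(|segment| Line::from(segment.to_string()))
        .collect()
}
```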

Before
<img width="2010" height="376" alt="CleanShot 2025-08-08 at 09 13 13"
src="https://github.com/user-attachments/assets/09feefbd-c5ee-4631-8967-93ab108c352a"
/>
After
<img width="1002" height="364" alt="CleanShot 2025-08-08 at 09 11 54"
src="https://github.com/user-attachments/assets/a58b47cf-777f-436a-93d9-ab277046a577"
/>
2025-08-08 13:52:24 -04:00
pakrym-oai
307d9957fa Fix usage limit banner grammar (#2018)
## Summary
- fix typo in usage limit banner text
- update error message tests

## Testing
- `just fmt`
- `RUSTC_BOOTSTRAP=1 just fix` *(fails: `let` expressions in this
position are unstable)*
- `RUSTC_BOOTSTRAP=1 cargo test --all-features` *(fails: `let`
expressions in this position are unstable)*

------
https://chatgpt.com/codex/tasks/task_i_689610fc1fe4832081bdd1118779b60b
2025-08-08 08:50:44 -07:00
pakrym-oai
431c9299d4 Remove part of the error message (#1983) 2025-08-08 02:01:53 +00:00
easong-openai
52e12f2b6c Revert "Streaming markdown (#1920)" (#1981)
This reverts commit 2b7139859e.
2025-08-08 01:38:39 +00:00
easong-openai
2b7139859e Streaming markdown (#1920)
We wait until we have an entire line, then format it with markdown and stream it into the UI. This reduces time to first token but is the right thing to do with our current rendering model IMO. Also lets us add word wrapping!
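
A rough sketch of that buffering strategy, independent of the actual widget code: accumulate streamed deltas and only release complete lines for markdown rendering.

```
/// Illustrative line buffer: append streamed deltas and drain only complete
/// lines, leaving any trailing partial line for the next delta.
struct LineBuffer {
    pending: String,
}

impl LineBuffer {
    fn new() -> Self {
        Self { pending: String::new() }
    }

    /// Returns the complete lines made available by this delta.
    fn push(&mut self, delta: &str) -> Vec<String> {
        self.pending.push_str(delta);
        let mut complete = Vec::new();
        while let Some(idx) = self.pending.find('\n') {
            let line: String = self.pending.drain(..=idx).collect();
            complete.push(line.trim_end_matches('\n').to_string());
        }
        complete
    }
}
```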
2025-08-07 18:26:47 -07:00
pakrym-oai
fa0051190b Adjust error messages (#1969)
<img width="1378" height="285" alt="image"
src="https://github.com/user-attachments/assets/f0283378-f839-4a1f-8331-909694a04b1f"
/>
2025-08-07 18:24:34 -07:00
Michael Bolin
cd06b28d84 fix: default to credits from ChatGPT auth, when possible (#1971)
Uses this rough strategy for authentication:

```
if auth.json
	if auth.json.API_KEY is NULL # new auth
		CHAT
	else # old auth
		if plus or pro or team
			CHAT
		else 
			API_KEY
		
else OPENAI_API_KEY
```
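
A Rust rendering of the same decision tree; the types and field names here are illustrative, not the actual `CodexAuth` API:

```
// Illustrative types; the real logic lives in CodexAuth and friends.
enum AuthMode {
    Chat,
    ApiKey(String),
}

struct AuthJson {
    api_key: Option<String>,
    plan_is_plus_pro_or_team: bool,
}

fn choose_auth(auth_json: Option<AuthJson>, env_api_key: Option<String>) -> Option<AuthMode> {
    match auth_json {
        Some(auth) => match auth.api_key {
            // New-style auth.json without an API key: prefer ChatGPT credits.
            None => Some(AuthMode::Chat),
            // Old-style auth.json: Plus/Pro/Team plans still prefer ChatGPT.
            Some(key) => {
                if auth.plan_is_plus_pro_or_team {
                    Some(AuthMode::Chat)
                } else {
                    Some(AuthMode::ApiKey(key))
                }
            }
        },
        // No auth.json: fall back to OPENAI_API_KEY if it is set.
        None => env_api_key.map(AuthMode::ApiKey),
    }
}
```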

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/1970).
* __->__ #1971
* #1970
* #1966
* #1965
* #1962
2025-08-07 18:00:31 -07:00
Michael Bolin
295abf3e51 chore: change CodexAuth::from_api_key() to take &str instead of String (#1970)
Good practice and simplifies some of the call sites.

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/1970).
* #1971
* __->__ #1970
* #1966
* #1965
* #1962
2025-08-07 16:55:33 -07:00
Michael Bolin
b991c04f86 chore: move top-level load_auth() to CodexAuth::from_codex_home() (#1966)
There are two valid ways to create an instance of `CodexAuth`:
`from_api_key()` and `from_codex_home()`. Now both are static methods of
`CodexAuth` and are listed first in the implementation.

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/1966).
* #1971
* #1970
* __->__ #1966
* #1965
* #1962
2025-08-07 16:49:37 -07:00
Michael Bolin
02c9c2ecad chore: make CodexAuth::api_key a private field (#1965)
Force callers to access this information via `get_token()` rather than
messing with it directly.

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/1965).
* #1971
* #1970
* #1966
* __->__ #1965
* #1962
2025-08-07 16:40:01 -07:00
Michael Bolin
db76f32888 chore: rename CodexAuth::new() to create_dummy_codex_auth_for_testing() because it is not for general consumption (#1962)
`CodexAuth::new()` was the first method listed in `CodexAuth`, but it is
only meant to be used by tests. Rename it to
`create_dummy_chatgpt_auth_for_testing()` and move it to the end of the
implementation.

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/1962).
* #1971
* #1970
* #1966
* #1965
* __->__ #1962
2025-08-07 16:33:29 -07:00
Dylan
548466df09 [client] Tune retries and backoff (#1956)
## Summary
10 retries is a bit excessive 😅 Also updates our backoff factor to space out
requests further.
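
For illustration, the usual shape of such a retry schedule; the counts and factor below are made up, not the values this PR ships:

```
use std::time::Duration;

const MAX_RETRIES: u32 = 4;      // hypothetical; the PR lowers the retry count from 10
const BASE_DELAY_MS: u64 = 500;  // hypothetical base delay
const BACKOFF_FACTOR: f64 = 2.0; // hypothetical factor spacing requests out

/// Exponential backoff: the delay grows by BACKOFF_FACTOR with each attempt.
fn backoff_delay(attempt: u32) -> Duration {
    let millis = BASE_DELAY_MS as f64 * BACKOFF_FACTOR.powi(attempt as i32);
    Duration::from_millis(millis as u64)
}

fn main() {
    for attempt in 0..MAX_RETRIES {
        println!("attempt {attempt}: wait {:?}", backoff_delay(attempt));
    }
}
```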
2025-08-07 15:23:31 -07:00
Michael Bolin
7d67159587 fix: public load_auth() fn always called with include_env_var=true (#1961)
Apparently `include_env_var=false` was only used for testing, so clean
up the API a little to make that clear.


---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/1961).
* #1962
* __->__ #1961
2025-08-07 14:19:30 -07:00
Michael Bolin
f74fe7af7b fix: fix mistaken bitwise OR in #1949 (#1957)
This is hard for me to test conclusively because I have the macOS default
where `ctrl left/right` switches between Spaces.
2025-08-07 20:11:06 +00:00
Jeremy Rose
c787603812 ctrl+arrows also move words (#1949)
This was removed at some point, but it is a common keybind for moving by
word left/right.
2025-08-07 18:27:44 +00:00
Ed Bayes
e07776ccc9 update readme (#1948)
Co-authored-by: Alexander Embiricos <ae@openai.com>
2025-08-07 11:20:53 -07:00
pakrym-oai
f23c3066c8 Add capacity error (#1947) 2025-08-07 10:46:43 -07:00
pakrym-oai
a593b1c3ab Use different field for error type (#1945) 2025-08-07 10:20:33 -07:00
Michael Bolin
107d2ce4e7 fix: change OPENAI_DEFAULT_MODEL to "gpt-5" (#1943) 2025-08-07 10:13:13 -07:00
Ed Bayes
09adbf9132 remove composer bg (#1944)
passes local tests
2025-08-07 10:04:49 -07:00
pakrym-oai
62ed5907f9 Better usage errors (#1941)
<img width="771" height="279" alt="image"
src="https://github.com/user-attachments/assets/e56f967f-bcd7-49f7-8a94-3d88df68b65a"
/>
2025-08-07 09:46:13 -07:00
Dylan
bc28b87c7b [config] Onboarding flow with persistence (#1929)
## Summary
In collaboration with @gpeal: upgrade the onboarding flow, and persist
user settings.

---------

Co-authored-by: Gabriel Peal <gabriel@openai.com>
2025-08-07 09:27:38 -07:00
pakrym-oai
7e9ecfbc6a Rename the model (#1942) 2025-08-07 09:07:51 -07:00
pakrym-oai
c87fb83d81 Calculate remaining context based on last token usage (#1940)
We should only take the last request's size (in tokens) into account
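
A sketch of what that calculation amounts to (names are illustrative):

```
/// Illustrative: estimate remaining context from the size of the last request
/// only, rather than summing token usage across the whole session.
fn remaining_context(context_window: u64, last_request_total_tokens: u64) -> u64 {
    context_window.saturating_sub(last_request_total_tokens)
}

fn percent_remaining(context_window: u64, last_request_total_tokens: u64) -> f64 {
    if context_window == 0 {
        return 0.0;
    }
    remaining_context(context_window, last_request_total_tokens) as f64
        / context_window as f64
        * 100.0
}
```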
2025-08-07 05:17:18 -07:00
ae
81b148bda2 feat: update system prompt (#1939) 2025-08-07 04:29:50 -07:00
ae
12d29c2779 feat: add tip to upgrade to ChatGPT plan (#1938) 2025-08-07 11:10:13 +00:00
ae
c4dc6a80bf feat: improve output of /status (#1936)
Now it looks like this:
```
/status
📂 Workspace
  • Path: ~/code/codex/codex-rs
  • Approval Mode: on-request
  • Sandbox: workspace-write

👤 Account
  • Signed in with ChatGPT
  • Login: example@example.com
  • Plan: Pro

🧠 Model
  • Name: ?!?!?!?!?!
  • Provider: OpenAI

📊 Token Usage
  • Input: 11940 (+ 7999 cached)
  • Output: 2639
  • Total: 14579
```
2025-08-07 12:02:58 +01:00
ae
7c20160676 feat: /prompts slash command (#1937)
- Shows several example prompts which include @-mentions 

------
https://chatgpt.com/codex/tasks/task_i_6894779ba8cc832ca0c871d17ee06aae
2025-08-07 11:55:59 +01:00
Ed Bayes
1e4bf81653 Update copy (#1935)
Updated copy

---------

Co-authored-by: pap-openai <pap@openai.com>
2025-08-07 03:29:33 -07:00
aibrahim-oai
5589c6089b approval ui (#1933)
Asking for approval:

<img width="269" height="41" alt="image"
src="https://github.com/user-attachments/assets/b9ced569-3297-4dae-9ce7-0b015c9e14ea"
/>

Allow:

<img width="400" height="31" alt="image"
src="https://github.com/user-attachments/assets/92056b22-efda-4d49-854d-e2943d5fcf17"
/>

Reject:

<img width="372" height="30" alt="image"
src="https://github.com/user-attachments/assets/be9530a9-7d41-4800-bb42-abb9a24fc3ea"
/>

Always Approve:

<img width="410" height="36" alt="image"
src="https://github.com/user-attachments/assets/acf871ba-4c26-4501-b303-7956d0151754"
/>
2025-08-07 02:02:56 -07:00
Michael Bolin
c2c327c723 feat: change shell_environment_policy to default to inherit="all" (#1904)
Trying to use `core` as the default has been "too clever." Users can
always take responsibility for controlling the env without this setting
at all by specifying the `env` they use when calling `codex` in the
first place.

See https://github.com/openai/codex/issues/1249.
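
Conceptually, the difference between the old `core` default and the new `inherit = "all"` default looks something like this; the names and allowlist are illustrative, not the real config types:

```
use std::collections::HashMap;
use std::env;

// Illustrative policy enum; the real setting is `shell_environment_policy` in config.
enum EnvInherit {
    All,
    Core,
}

fn build_child_env(policy: EnvInherit) -> HashMap<String, String> {
    match policy {
        // New default: pass the caller's environment through unchanged.
        EnvInherit::All => env::vars().collect(),
        // Old behavior: keep only a small allowlist of "core" variables.
        EnvInherit::Core => {
            const CORE: &[&str] = &["HOME", "PATH", "USER", "SHELL", "LANG"]; // hypothetical list
            env::vars()
                .filter(|(key, _)| CORE.contains(&key.as_str()))
                .collect()
        }
    }
}
```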
2025-08-07 01:55:41 -07:00
Ed Bayes
20084facfe Add spinner animation to TUI status indicator (#1917)
## Summary
- add a pulsing dot loader before the shimmering `Working` label in the
status indicator widget and include a small test asserting the spinner
character is rendered
- also fix a small bug in the ran command header by adding a space
between the  and `Ran command`


https://github.com/user-attachments/assets/6768c9d2-e094-49cb-ad51-44bcac10aa6f
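
As an aside, time-based spinner frame selection typically looks something like the sketch below; this is purely illustrative and not the widget's actual code:

```
use std::time::{Duration, Instant};

const FRAMES: [&str; 4] = ["·", "•", "●", "•"]; // hypothetical pulsing-dot frames
const FRAME_INTERVAL: Duration = Duration::from_millis(100);

/// Pick a frame from the elapsed time so the dot "pulses" as the UI redraws.
fn spinner_frame(start: Instant) -> &'static str {
    let ticks = start.elapsed().as_millis() / FRAME_INTERVAL.as_millis();
    FRAMES[(ticks as usize) % FRAMES.len()]
}

fn main() {
    let start = Instant::now();
    // Each redraw would prepend the current frame to the shimmering "Working" label.
    println!("{} Working", spinner_frame(start));
}
```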

## Testing
- `just fmt`
- `just fix` *(failed: E0658 `let` expressions in core/src/client.rs)*
- `cargo test --all-features` *(failed: E0658 `let` expressions in
core/src/client.rs)*

------
https://chatgpt.com/codex/tasks/task_i_68941bffdb948322b0f4190bc9dbe7f6

---------

Co-authored-by: aibrahim-oai <aibrahim@openai.com>
2025-08-07 08:45:04 +00:00
Michael Bolin
13982d6b4e chore: fix outstanding review comments from the bot on #1919 (#1928)
I should have read the comments before submitting!
2025-08-07 01:30:13 -07:00
ae
0334476894 feat: parse info from auth.json and show in /status (#1923)
- `/status` renders
    ```
    signed in with chatgpt
      login: example@example.com
      plan: plus
    ```
- Setup for using this info in a few more places.

---------

Co-authored-by: Michael Bolin <mbolin@openai.com>
2025-08-07 01:27:45 -07:00