There was a bit of copypasta I put up with when we were publishing two
packages to npm, but now that it's three, I created some more scripts to
consolidate things.
With this change, I ran:
```shell
./scripts/stage_npm_packages.py --release-version 0.43.0-alpha.8 --package codex --package codex-responses-api-proxy --package codex-sdk
```
Indeed, when it finished, I ended up with:
```shell
$ tree dist
dist
└── npm
    ├── codex-npm-0.43.0-alpha.8.tgz
    ├── codex-responses-api-proxy-npm-0.43.0-alpha.8.tgz
    └── codex-sdk-npm-0.43.0-alpha.8.tgz
$ tar tzvf dist/npm/codex-sdk-npm-0.43.0-alpha.8.tgz
-rwxr-xr-x 0 0 0 25476720 Oct 26 1985 package/vendor/aarch64-apple-darwin/codex/codex
-rwxr-xr-x 0 0 0 29871400 Oct 26 1985 package/vendor/aarch64-unknown-linux-musl/codex/codex
-rwxr-xr-x 0 0 0 28368096 Oct 26 1985 package/vendor/x86_64-apple-darwin/codex/codex
-rwxr-xr-x 0 0 0 36029472 Oct 26 1985 package/vendor/x86_64-unknown-linux-musl/codex/codex
-rw-r--r-- 0 0 0 10926 Oct 26 1985 package/LICENSE
-rw-r--r-- 0 0 0 30187520 Oct 26 1985 package/vendor/aarch64-pc-windows-msvc/codex/codex.exe
-rw-r--r-- 0 0 0 35277824 Oct 26 1985 package/vendor/x86_64-pc-windows-msvc/codex/codex.exe
-rw-r--r-- 0 0 0 4842 Oct 26 1985 package/dist/index.js
-rw-r--r-- 0 0 0 1347 Oct 26 1985 package/package.json
-rw-r--r-- 0 0 0 9867 Oct 26 1985 package/dist/index.js.map
-rw-r--r-- 0 0 0 12 Oct 26 1985 package/README.md
-rw-r--r-- 0 0 0 4287 Oct 26 1985 package/dist/index.d.ts
```
This PR expands `.github/workflows/rust-release.yml` so that it also
builds and publishes the `npm` module for
`@openai/codex-responses-api-proxy` in addition to `@openai/codex`. Note
both `npm` modules are similar, in that they each contain a single `.js`
file that is a thin launcher around the appropriate native executable.
(Since we have a minimal dependency on Node.js, I also lowered the
minimum version from 20 to 16 and verified that works on my machine.)
As part of this change, we tighten up some of the docs around
`codex-responses-api-proxy` and ensure the details regarding protecting
the `OPENAI_API_KEY` in memory match the implementation.
To test the `npm` build process, I ran:
```
./codex-cli/scripts/build_npm_package.py --package codex-responses-api-proxy --version 0.43.0-alpha.3
```
which stages the `npm` module for `@openai/codex-responses-api-proxy` in
a temp directory, using the binary artifacts from
https://github.com/openai/codex/releases/tag/rust-v0.43.0-alpha.3.
This should make the `codex-responses-api-proxy` binaries available for
all platforms in a GitHub Release as well as a corresponding DotSlash
file.
Making `codex-responses-api-proxy` available as an `npm` module will be
done in a follow-up PR.
I am not sure what is going on, as
https://github.com/openai/codex/pull/3660 introduced this new logic and
I swear that CI was green before I merged that PR, but I am seeing
failures in this CI job this morning. This looks like a
non-backwards-compatible change in `gh`, but that seems unlikely...
Nevertheless, this is what I currently see on my laptop:
```
$ gh --version
gh version 2.76.2 (2025-07-30)
https://github.com/cli/cli/releases/tag/v2.76.2
$ gh run list --workflow .github/workflows/rust-release.yml --branch rust-v0.40.0 --json workflowName,url,headSha --jq 'first(.[])'
{
  "headSha": "5268705a69713752adcbd8416ef9e84a683f7aa3",
  "url": "https://github.com/openai/codex/actions/runs/17952349351",
  "workflowName": ".github/workflows/rust-release.yml"
}
```
Looking at sample output from an old GitHub issue
(https://github.com/cli/cli/issues/6678), it appears that, at least at
one point in time, the `workflowName` was _not_ the path to the
workflow.
We try to ensure ripgrep (`rg`) is provided with Codex.
- For `brew`, we declare it as a dependency of our formula:
08d82d8b00/Formula/c/codex.rb (L24)
- For `npm`, we declare `@vscode/ripgrep` as a dependency, which
installs the platform-specific binary as part of a `postinstall` script:
fdb8dadcae/codex-cli/package.json (L22)
- Users who download the CLI directly from GitHub Releases are on their
own.
In practice, I have seen `@vscode/ripgrep` fail on occasion. Here is a
trace from a GitHub workflow:
```
npm error code 1
npm error path /Users/runner/hostedtoolcache/node/20.19.5/arm64/lib/node_modules/@openai/codex/node_modules/@vscode/ripgrep
npm error command failed
npm error command sh -c node ./lib/postinstall.js
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Download attempt 1 failed, retrying in 2 seconds...
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Download attempt 2 failed, retrying in 4 seconds...
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Download attempt 3 failed, retrying in 8 seconds...
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Download attempt 4 failed, retrying in 16 seconds...
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Error: Request failed: 403
```
To eliminate this error, this PR vendors the `rg` binary into
https://www.npmjs.com/package/@openai/codex so it is guaranteed to be
present when a user runs `npm i -g @openai/codex`.
The downside of this approach is the increase in package size: we
include the `rg` executable for six architectures (in addition to the
six copies of `codex` we already include). In a follow-up, I plan to add
support for "slices" of our npm module, so that soon users will be able
to do:
```
npm install -g @openai/codex@aarch64-apple-darwin
```
Admittedly, this is a sizable change and I tried to clean some things up
in the process:
- `install_native_deps.sh` has been replaced by `install_native_deps.py`
- `stage_release.sh` and `stage_rust_release.py` have been replaced by
`build_npm_package.py`
We now vendor in a DotSlash file for ripgrep (as a modest attempt to
facilitate local testing) and then build up the npm package by:
- creating a temp directory and copying `package.json` over to it with
the target value for `"version"`
- finding the GitHub workflow that corresponds to the
`--release-version` and copying the various `codex` artifacts to the
respective `vendor/TARGET_TRIPLE/codex` folders
- downloading the `rg` artifacts specified in the DotSlash file and
copying them over to the respective `vendor/TARGET_TRIPLE/path` folders
- running `npm pack` on the temp directory if `--pack-output` is specified
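For reference, the flow above is roughly the following. This is a minimal sketch only; the real `build_npm_package.py` is written in Python and handles many more details, and the helper name, paths, and target-triple list below are illustrative assumptions rather than the actual implementation.

```typescript
import { execFileSync } from "node:child_process";
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

function stageNpmPackage(releaseVersion: string, packOutput?: string): string {
  // 1. Create a temp directory and copy package.json over with the target version.
  const stagingDir = fs.mkdtempSync(path.join(os.tmpdir(), "codex-npm-"));
  const pkg = JSON.parse(fs.readFileSync("codex-cli/package.json", "utf8"));
  pkg.version = releaseVersion;
  fs.writeFileSync(path.join(stagingDir, "package.json"), JSON.stringify(pkg, null, 2));

  // 2. Copy the codex (and rg) artifacts for each target into vendor/TARGET_TRIPLE/...
  for (const triple of ["aarch64-apple-darwin", "x86_64-unknown-linux-musl"]) {
    const dest = path.join(stagingDir, "vendor", triple, "codex");
    fs.mkdirSync(dest, { recursive: true });
    // ...download the release artifact for `triple` and place it in `dest`...
  }

  // 3. Optionally run `npm pack` on the temp directory (the resulting tarball
  // would then be moved to `packOutput`).
  if (packOutput) {
    execFileSync("npm", ["pack"], { cwd: stagingDir, stdio: "inherit" });
  }
  return stagingDir;
}
```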
To test, I downloaded the artifact produced by this CI job:
https://github.com/openai/codex/actions/runs/17961595388/job/51085840022?pr=3660
and verified that `node ./bin/codex.js 'which -a rg'` worked as
intended.
## Summary
Ripgrep is our preferred tool for file search. When users install via
`brew install codex`, it's automatically installed as a dependency. We
want to ensure that users running via an npm install also have this
tool! Microsoft has already solved this problem for VS Code - let's not
reinvent the wheel.
This approach of appending to the PATH directly might be a bit
heavy-handed, but feels reasonably robust to a variety of environment
concerns. Open to thoughts on better approaches here!
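Concretely, the approach amounts to something like the following in the launcher. This is a minimal sketch, assuming the launcher spawns the CLI as a child process; the exact wiring in `codex.js` may differ.

```typescript
import * as path from "node:path";
import { rgPath } from "@vscode/ripgrep";

// Append the directory containing the bundled `rg` binary to PATH so that any
// subprocess Codex spawns can find ripgrep even when it is not installed globally.
const rgDir = path.dirname(rgPath);
process.env.PATH = `${process.env.PATH ?? ""}${path.delimiter}${rgDir}`;
```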
## Testing
- [x] confirmed this import approach works with `node -e "const { rgPath
} = require('@vscode/ripgrep'); require('child_process').spawn(rgPath,
['--version'], { stdio: 'inherit' })"`
- [x] Ran codex.js locally with `rg` uninstalled, asked it to run `which
rg`. Output below:
```
⚡ Ran command which rg; echo $?
⎿ /Users/dylan.hurd/code/dh--npm-rg/node_modules/@vscode/ripgrep/bin/rg
0
codex
Re-running to confirm the path and exit code.
- Path: `/Users/dylan.hurd/code/dh--npm-rg/node_modules/@vscode/ripgrep/bin/rg`
- Exit code: `0`
```
This deletes the bulk of the `codex-cli` folder and eliminates the logic
that builds the TypeScript code and bundles it into the release.
Since this PR modifies `.github/workflows/rust-release.yml`, to test
changes to the release process, I locally commented out all of the "is
this commit on upstream `main`" checks in
`scripts/create_github_release.sh` and ran:
```
./codex-rs/scripts/create_github_release.sh 0.20.0-alpha.4
```
Which kicked off:
https://github.com/openai/codex/actions/runs/16842085113
And the release artifacts appear legit!
https://github.com/openai/codex/releases/tag/rust-v0.20.0-alpha.4
Historically, the release process for the npm module has been:
- I run `codex-rs/scripts/create_github_release.sh` to kick off a
release for the native artifacts.
- I wait until it is done.
- I run `codex-cli/scripts/stage_rust_release.py` to build the npm
release locally.
- I run `npm publish` from my laptop.
It has been a longstanding issue to move the npm build to CI. I may
still have to run `npm publish` manually because it requires 2FA with
`npm`, though I assume we can work that out later.
Note that I asked Codex to make these updates. While they look pretty
good to me, I'm not 100% certain, so let's just merge this, and I'll
kick off another alpha build and we'll see what happens.
Until now, the build scripts in `codex-cli` still supported building the
old TypeScript version of the Codex CLI to give Windows users something
they could run, but we are just going to have them use the Rust version
like everyone else, so this PR:
- updates `codex-cli/bin/codex.js` so that we run the native binary or
throw if the target platform/arch is not supported (no more conditional
usage based on `CODEX_RUST`, `use-native` file, etc.)
- drops the `--native` flag from `codex-cli/scripts/stage_release.sh`
and updates all the code paths to behave as if `--native` were passed
(i.e., it is the only way to run it now)
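For reference, the launcher behavior described in the first bullet looks roughly like this. It is a sketch only; the actual file layout and binary names in the published package may differ.

```typescript
import { spawnSync } from "node:child_process";
import * as path from "node:path";

// Map the Node.js platform/arch to a Rust target triple, or throw.
function targetTriple(): string {
  const { platform, arch } = process;
  if (platform === "darwin" && arch === "arm64") return "aarch64-apple-darwin";
  if (platform === "darwin" && arch === "x64") return "x86_64-apple-darwin";
  if (platform === "linux" && arch === "arm64") return "aarch64-unknown-linux-musl";
  if (platform === "linux" && arch === "x64") return "x86_64-unknown-linux-musl";
  if (platform === "win32" && arch === "x64") return "x86_64-pc-windows-msvc";
  throw new Error(`Unsupported platform: ${platform} (${arch})`);
}

// Always run the native binary; no CODEX_RUST or use-native checks remain.
const binary = path.join(__dirname, `codex-${targetTriple()}`);
const result = spawnSync(binary, process.argv.slice(2), { stdio: "inherit" });
process.exit(result.status ?? 1);
```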
Tested this by running:
```
./codex-cli/scripts/stage_rust_release.py --release-version 0.20.0-alpha.2
```
The following test script fails in the codex sandbox:
```
import multiprocessing
from multiprocessing import Lock, Process

def f(lock):
    with lock:
        print("Lock acquired in child process")

if __name__ == '__main__':
    lock = Lock()
    p = Process(target=f, args=(lock,))
    p.start()
    p.join()
```
with
```
Traceback (most recent call last):
  File "/Users/david.hao/code/codex/codex-rs/cli/test.py", line 9, in <module>
    lock = Lock()
           ^^^^^^
  File "/Users/david.hao/.local/share/uv/python/cpython-3.12.9-macos-aarch64-none/lib/python3.12/multiprocessing/context.py", line 68, in Lock
    return Lock(ctx=self.get_context())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/david.hao/.local/share/uv/python/cpython-3.12.9-macos-aarch64-none/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__
    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
  File "/Users/david.hao/.local/share/uv/python/cpython-3.12.9-macos-aarch64-none/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__
    sl = self._semlock = _multiprocessing.SemLock(
                         ^^^^^^^^^^^^^^^^^^^^^^^^^
PermissionError: [Errno 1] Operation not permitted
```
After some reading, adding this line to the sandbox configs fixes
things: macOS multiprocessing appears to use a POSIX semaphore
(`SemLock`), which opens an IPC object that the sandbox treats as a disk
write even though no file is created. I asked ChatGPT whether it is
okay to loosen this, and my impression after reading is that it is,
although I would appreciate a close look.
Breadcrumb: you can run `cargo run -- debug seatbelt --full-auto <cmd>`
to test the sandbox.
When Codex CLI is installed via `npm`, we use a `.js` wrapper script to
launch the Rust binary.
- Previously, we were not listening for signals to ensure that killing
the Node.js process would also kill the underlying Rust process.
- We also did not have a proper `exit` handler on the child process to
ensure the Node.js process exited when the child did.
This PR fixes these things and hopefully addresses
https://github.com/openai/codex/issues/1570.
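A minimal sketch of what that wrapper logic amounts to (illustrative only; `binaryPath` is a placeholder, and the real `codex.js` may differ in details):

```typescript
import { spawn } from "node:child_process";

const binaryPath = "/path/to/native/codex"; // placeholder; resolved per-platform in codex.js

const child = spawn(binaryPath, process.argv.slice(2), { stdio: "inherit" });

// Forward termination signals so that killing the Node.js process also kills
// the underlying Rust process.
for (const sig of ["SIGINT", "SIGTERM", "SIGHUP"] as const) {
  process.on(sig, () => child.kill(sig));
}

// When the child exits, exit the Node.js process with the same status, or
// re-raise the signal that terminated the child.
child.on("exit", (code, signal) => {
  if (signal) {
    process.kill(process.pid, signal);
  } else {
    process.exit(code ?? 1);
  }
});
```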
This also adds logic so that Windows falls back to the TypeScript CLI
again, which should address https://github.com/openai/codex/issues/1573.
It appears that `0.5.0` was built with `stage_release.sh` instead of
`stage_rust_release.py`, so add docs to clarify this and recommend
running `--version` on the release candidate to verify the right thing
was built.
Bumps node from 22-slim to 24-slim.
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
## Summary
Add Android platform support to Codex CLI
## What?
- Added `android` to the list of supported platforms in
`codex-cli/bin/codex.js`
- Treats Android as Linux for binary compatibility
## Why?
- Fixes "Unsupported platform: android (arm64)" error on Termux
- Enables Codex CLI usage on Android devices via Termux
- Improves platform compatibility without affecting other platforms
## How?
- Modified the platform detection switch statement to include `case
"android":`
- Android falls through to the same logic as Linux, using appropriate
ARM64 binaries
- Minimal change with no breaking effects on existing functionality
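A minimal sketch of the change (variable names are illustrative, and the choice of the musl triple is an assumption, not the exact code in `codex.js`):

```typescript
// Termux reports process.platform === "android"; treat it like Linux so the
// ARM64 Linux binary is selected.
let targetTriple: string;
switch (process.platform) {
  case "android": // fall through to the Linux logic
  case "linux":
    targetTriple =
      process.arch === "arm64"
        ? "aarch64-unknown-linux-musl"
        : "x86_64-unknown-linux-musl";
    break;
  // ...darwin and win32 cases unchanged...
  default:
    throw new Error(`Unsupported platform: ${process.platform} (${process.arch})`);
}
```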
## Testing
- Tested on Android/Termux environment
- Verified the fix resolves the platform detection error
- Confirmed no impact on other platforms
## Related Issues
Fixes the "Unsupported platform: android (arm64)" error reported by
Termux users
This is a stopgap solution before migrating the build for the npm
release to GitHub Actions (which is ultimately what should be done to
ensure hermetic builds).
The idea is that instead of continuing to create PRs like
https://github.com/openai/codex/pull/1472 where I have to check in a
change to the `WORKFLOW_URL`, this script uses `gh run list` to get the
`WORKFLOW_URL` dynamically and then threads the value through to
`install_native_deps.sh`.
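The idea, in sketch form (the flags mirror the `gh run list` invocation shown earlier in this document; the helper name is illustrative, and the real script threads the result into `install_native_deps.sh`):

```typescript
import { execFileSync } from "node:child_process";

// Ask gh for the most recent rust-release.yml run on the release branch
// instead of hard-coding WORKFLOW_URL.
function latestReleaseRunUrl(version: string): string {
  const out = execFileSync(
    "gh",
    [
      "run", "list",
      "--workflow", ".github/workflows/rust-release.yml",
      "--branch", `rust-v${version}`,
      "--json", "url,headSha",
      "--jq", "first(.[])",
    ],
    { encoding: "utf8" },
  );
  return JSON.parse(out).url;
}
```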
To create the 0.3.0 release on npm, I ran:
```shell
./codex-cli/scripts/stage_rust_release.py --release-version 0.3.0
```
and then did `npm publish --dry-run` followed by `npm publish` in the
temp directory created by `stage_rust_release.py`.
This introduces two changes to make a quick fix so we can deploy the
Rust CLI for `0.2.0` of `@openai/codex` on npm:
- Updates `WORKFLOW_URL` to point to
https://github.com/openai/codex/actions/runs/15981617627, which is the
GitHub workflow run used to create the binaries for the `0.2.0` release
we published to Homebrew.
- Adds a `--version` option to `stage_release.sh` to specify what the
`version` field in the `package.json` will be.
Locally, I ran the following:
```
./codex-cli/scripts/stage_release.sh --native --version 0.2.0
```
Previously, we only used the `--native` flag to publish to the `native`
tag of `@openai/codex` (e.g., `npm publish --tag native`), but we should
just publish this as the default tag for `0.2.0` to be consistent with
what is in Homebrew.
We can still publish one "final" version of the TypeScript CLI as 0.1.x
later.
Under the hood, this release will still contain `dist/cli.js`,
`bin/codex-linux-sandbox-x64`, and `bin/codex-x86_64-apple-darwin`,
which are not strictly necessary, but we'll fix that in `0.3.0`.
As promised on https://github.com/openai/codex/discussions/1405, we are
making the first official release of the Rust CLI as v0.2.0. As part of
this move, we are making it available in Homebrew:
https://github.com/Homebrew/homebrew-core/pull/228615
We also plan to continue making the CLI available on npm, though brew is
a bit nicer in that `brew install` downloads only the binary for your
platform, whereas an npm module is expected to contain the binaries for
_all_ supported platforms, so it is a bit more heavyweight.
A big part of this change is updating the root `README.md` to document
the behavior of the Rust CLI, which differs in a number of ways from the
TypeScript CLI. The existing `README.md` is moved to
`codex-cli/README.md` as part of this PR, as it is still applicable to
that folder.
As this is still early days for the Rust CLI, I encourage folks to
provide feedback on the command line flags and configuration options.
- Use Responses API for Azure provider endpoints
- Added a unit test to catch regression on the change from
`/chat/completions` to `/responses`
- Updated the default AOAI api version from `2025-03-01-preview` to
`2025-04-01-preview` to avoid user/400 errors due to missing summary
support in the March API version.
- Changes have been tested locally on AOAI endpoints
## Summary
This PR refactors the Codex CLI authentication flow so that
**non-OpenAI** providers (for example **azure**, or any future addition)
can supply their API key through a dedicated environment variable
without triggering the OpenAI login flow.
Key behaviours introduced:
* When `provider !== "openai"` the CLI consults `src/utils/providers.ts`
to locate the correct environment variable (`AZURE_OPENAI_API_KEY`,
`GEMINI_API_KEY`, and so on) before considering any interactive login.
* Credit redemption (`--free`) and PKCE login now run **only** when the
provider is OpenAI, eliminating unwanted browser prompts for Azure and
others.
* User-facing error messages are revamped to guide Azure users to
**[https://ai.azure.com/](https://ai.azure.com)** and show the exact
variable name they must set.
* All code paths still export `OPENAI_API_KEY` so legacy scripts
continue to operate unchanged.
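In sketch form, the provider-specific lookup looks something like this. Field names follow the `config.json` example below; the helper name and exact control flow are illustrative, not the literal `cli.tsx` code.

```typescript
interface ProviderInfo {
  name: string;
  baseURL: string;
  envKey: string; // e.g. "AZURE_OPENAI_API_KEY", "GEMINI_API_KEY"
}

function resolveApiKey(
  provider: string,
  providers: Record<string, ProviderInfo>,
): string | undefined {
  if (provider === "openai") {
    // Only the OpenAI provider falls back to --free redemption / PKCE login.
    return process.env.OPENAI_API_KEY;
  }
  const envKey = providers[provider]?.envKey;
  const key = envKey ? process.env[envKey] : undefined;
  if (key) {
    // Legacy scripts read OPENAI_API_KEY, so export it regardless of provider.
    process.env.OPENAI_API_KEY = key;
  }
  return key;
}
```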
---
## Example `config.json`
```jsonc
{
  "model": "codex-mini",
  "provider": "azure",
  "providers": {
    "azure": {
      "name": "AzureOpenAI",
      "baseURL": "https://ai-<project-name>.openai.azure.com/openai",
      "envKey": "AZURE_OPENAI_API_KEY"
    }
  },
  "history": {
    "maxSize": 1000,
    "saveHistory": true,
    "sensitivePatterns": []
  }
}
```
With this file in `~/.codex/config.json`, a single command line is
enough:
```bash
export AZURE_OPENAI_API_KEY="<your-key>"
codex "Hello from Azure"
```
No browser window opens, and the CLI runs entirely non-interactively.
---
## Rationale
The new flow enables Codex to run **asynchronously** in sandboxed
environments such as GitHub Actions pipelines. By passing `--provider
azure` (or setting it in `config.json`) and exporting the correct key,
CI/CD jobs can invoke Codex without any ChatGPT-style login or PKCE
round-trip. This unlocks fully automated testing and deployment
scenarios.
---
## What’s changed
| File | Type | Description |
| --- | --- | --- |
| `codex-cli/src/cli.tsx` | **feat / refactor** | +43 / -20 lines. Imports `providers`, adds early provider-specific key lookup, gates `--free` redemption, rewrites help text. |
| `src/utils/providers.ts` | **chore** | Now consumed by the CLI for env-var discovery. |
---
## How to test
```bash
# Azure example
export AZURE_OPENAI_API_KEY="<your-key>"
codex --provider azure "Automated run in CI"
# OpenAI example (unchanged behaviour)
codex --provider openai --login "Standard OpenAI flow"
```
Expected outcomes:
* Azure and other provider paths are non-interactive when the provider
flag is passed.
* The CLI always sets `OPENAI_API_KEY` for backward compatibility.
---
## Checklist
* [x] Logic behind provider-specific env-var lookup added.
* [x] Redundant OpenAI login steps removed for other providers.
* [x] Unit tests cover new branches.
* [x] README and sample config updated.
* [x] CI passes on all supported Node versions.
---
**Related work**
* #92
* #769
* #1321
Now that we have published a GitHub Release that contains arm64 musl
artifacts for Linux, update the following scripts to take advantage of
them:
- `dotslash-config.json` now uses musl artifacts for the `linux-aarch64`
target
- `install_native_deps.sh` for the TypeScript CLI now includes
`codex-linux-sandbox-aarch64-unknown-linux-musl` instead of
`codex-linux-sandbox-aarch64-unknown-linux-gnu` for sandboxing
- `codex-cli/bin/codex.js` now checks for `aarch64-unknown-linux-musl`
artifacts instead of `aarch64-unknown-linux-gnu` ones
This does not implement the full Login with ChatGPT experience, but it
should unblock people.
**What works**
* The `codex` multitool now has a `login` subcommand, so you can run
`codex login`, which should write `CODEX_HOME/auth.json` if you complete
the flow successfully. The TUI will now read the `OPENAI_API_KEY` from
`auth.json`.
* The TUI should refresh the token if it has expired and the necessary
information is in `auth.json`.
* There is a `LoginScreen` in the TUI that tells you to run `codex
login` if both (1) your model provider expects to use `OPENAI_API_KEY`
as its env var, and (2) `OPENAI_API_KEY` is not set.
**What does not work**
* The `LoginScreen` does not support the login flow from within the TUI.
Instead, it tells you to quit, run `codex login`, and then run `codex`
again.
* `codex exec` does not read from `auth.json` yet, nor does it direct the
user to go through the login flow if `OPENAI_API_KEY` is not found.
* The `maybeRedeemCredits()` function from `get-api-key.tsx` has not
been ported from TypeScript to `login_with_chatgpt.py` yet:
a67a67f325/codex-cli/src/utils/get-api-key.tsx (L84-L89)
**Implementation**
Currently, the OAuth flow requires running a local webserver on
`127.0.0.1:1455`. It seemed wasteful to incur the additional binary cost
of a webserver dependency in the Rust CLI just to support login, so
instead we implement this logic in Python, as Python has a `http.server`
module as part of its standard library. Specifically, we bundle the
contents of a single Python file as a string in the Rust CLI and then
use it to spawn a subprocess as `python3 -c
{{SOURCE_FOR_PYTHON_SERVER}}`.
As such, the most significant files in this PR are:
```
codex-rs/login/src/login_with_chatgpt.py
codex-rs/login/src/lib.rs
```
Now that the CLI may load `OPENAI_API_KEY` from the environment _or_
`CODEX_HOME/auth.json`, we need a new abstraction for reading/writing
this variable, so we introduce:
```
codex-rs/core/src/openai_api_key.rs
```
Note that `std::env::set_var()` is [rightfully] `unsafe` in Rust 2024,
so we store `OPENAI_API_KEY` in a `LazyLock<RwLock<Option<String>>>` so
that it can be read in a thread-safe manner.
Ultimately, it should be possible to go through the entire login flow
from the TUI. This PR introduces a placeholder `LoginScreen` UI for that
right now, though the new `codex login` subcommand introduced in this PR
should be a viable workaround until the UI is ready.
**Testing**
Because the login flow is currently implemented in a standalone Python
file, you can test it without building any Rust code as follows:
```
rm -rf /tmp/codex_home && mkdir /tmp/codex_home
CODEX_HOME=/tmp/codex_home python3 codex-rs/login/src/login_with_chatgpt.py
```
For reference:
* the original TypeScript implementation was introduced in
https://github.com/openai/codex/pull/963
* support for redeeming credits was later added in
https://github.com/openai/codex/pull/974
Added logic so that when we run `./scripts/stage_release.sh --native`
(for the `@native` version of the Node module), we drop a `use-native`
file next to `codex.js`. If present, `codex.js` will now run the Rust
CLI.
Ran `./scripts/stage_release.sh --native` and verified that when
running `codex.js` in the staged folder:
```
$ /var/folders/wm/f209bc1n2bd_r0jncn9s6j_00000gp/T/tmp.efvEvBlSN6/bin/codex.js --version
codex-cli 0.0.2505220956
```
it ran the expected Rust version of the CLI, as desired.
While here, I also updated the Rust version to one that I cut today,
which includes the new shell environment policy config option:
https://github.com/openai/codex/pull/1061. Note this may "break" some
users if the processes spawned by Codex need extra environment
variables. (We are still working to determine what the right defaults
should be for this option.)
## Summary
- add `--login` and `--free` flags to cli help
- handle `--login` and `--free` logic in cli
- factor out redeem flow into `maybeRedeemCredits`
- call new helper from login callback
When running `npm test` on `codex-cli`, the test
`agent-cancel-prev-response.test.ts` logs a significant body of text to
the console for no obvious reason.
This is not helpful, as it makes the test logs messy and much longer
than they need to be.
This change deletes the `console.log(...)` call that produces this output.
- A new `/sessions` command is available for browsing previous sessions,
as shown in the updated slash command list
- The CLI now documents and parses a new `--history` flag to browse past
sessions from the command line
- A dedicated `SessionsOverlay` component loads session metadata and
allows toggling between viewing and resuming sessions
- When the sessions overlay is opened during a chat, selecting a session
can either show the saved rollout or resume it
## Summary
- fix quoting issues in `/diff` to correctly handle files with special
characters
- add regression test for `getGitDiff` when filenames contain `$`
- relax timeout in raw-exec-process-group test
Fixes https://github.com/openai/codex/issues/943
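To illustrate the class of bug (this is a hypothetical sketch, not the actual `getGitDiff` implementation): interpolating file names into a shell string breaks when a name contains `$` or other special characters, whereas passing them as separate arguments does not.

```typescript
import { execFileSync } from "node:child_process";

// Hypothetical example of the safer pattern: passing paths as argv entries
// (after `--`) means a file named `weird$name.txt` is never expanded by a shell.
function gitDiffFor(paths: string[]): string {
  return execFileSync("git", ["diff", "--", ...paths], { encoding: "utf8" });
}
```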
## Testing
- `pnpm test`
More about codespell: https://github.com/codespell-project/codespell.
I personally have introduced it to dozens if not hundreds of projects
already, and so far the feedback has been only positive.
The CI workflow has `permissions` set only to `read`, so it should also be safe.
Let me know if you just want to take the typo fixes and drop the CI workflow.
---------
Signed-off-by: Yaroslav O. Halchenko <debian@onerussian.com>