Michael Bolin 970e466ab3 fix: switch to unbounded channel (#2874)
#2747 encouraged me to audit our codebase for similar issues, as I am
now particularly suspicious that our flaky tests are due to a racy
deadlock.

I asked Codex to audit our code, and one of its suggestions was this:

> **High-Risk Patterns**
>
> All `send_*` methods await on a bounded
`mpsc::Sender<OutgoingMessage>`. If the writer blocks, the channel fills
and the processor task blocks on send, stops draining incoming requests,
and stdin reader eventually blocks on its send. This creates a
backpressure deadlock cycle across the three tasks.
>
> **Recommendations**
> * Server outgoing path: break the backpressure cycle
> * Option A (minimal risk): Change `OutgoingMessageSender` to use an
unbounded channel to decouple producer from stdout. Add rate logging so
floods are visible.
> * Option B (bounded + drop policy): Change `send_*` to try_send and
drop messages (or coalesce) when the queue is full, logging a warning.
This prevents processor stalls at the cost of losing messages under
extreme backpressure.
> * Option C (two-stage buffer): Keep bounded channel, but have a
dedicated “egress” task that drains an unbounded internal queue, writing
to stdout with retries and a shutdown timeout. This centralizes
backpressure policy.

So this PR is Option A.
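
To make the shape of the change concrete, here is a minimal, self-contained sketch using tokio's mpsc primitives. The `OutgoingMessage` enum and the drain loop below are simplified stand-ins, not the real `OutgoingMessageSender` internals:

// Minimal sketch of Option A; the types here are simplified stand-ins,
// not the real codex-rs definitions.
use tokio::sync::mpsc;

#[derive(Debug)]
enum OutgoingMessage {
    Notification(String),
}

#[tokio::main]
async fn main() {
    // Before: mpsc::channel::<OutgoingMessage>(128). Once 128 messages were
    // queued, send().await blocked the processor task -- the first link in
    // the backpressure cycle described above.
    // After: an unbounded channel, so producers never block on send.
    let (tx, mut rx) = mpsc::unbounded_channel::<OutgoingMessage>();

    // Producer side: send() is synchronous and only errors if the receiver
    // has been dropped, so the processor task can no longer stall here.
    tx.send(OutgoingMessage::Notification("hello".to_string())).unwrap();
    drop(tx);

    // Egress side: drain the queue and write each message out.
    while let Some(msg) = rx.recv().await {
        println!("{msg:?}");
    }
}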

Indeed, we previously used a bounded channel with a capacity of `128`,
but as we discovered recently with #2776, there are certainly cases
where we can get flooded with events.

That said, `test_shell_command_approval_triggers_elicitation` just
failed on one build when I put up this PR, so clearly we are not out of
the woods yet...

**Update:** I think I found the true source of the deadlock! See
https://github.com/openai/codex/pull/2876

Codex CLI (Rust Implementation)

We provide Codex CLI as a standalone, native executable to ensure a zero-dependency install.

Installing Codex

Today, the easiest way to install Codex is via npm, though we plan to publish Codex to other package managers soon.

npm i -g @openai/codex@native
codex

You can also download a platform-specific release directly from our GitHub Releases.

What's new in the Rust CLI

While we are working to close the gap between the TypeScript and Rust implementations of Codex CLI, note that the Rust CLI has a number of features that the TypeScript CLI does not!

Config

Codex supports a rich set of configuration options. Note that the Rust CLI uses config.toml instead of config.json. See docs/config.md for details.
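
As a quick illustration (the keys below are examples only, not an exhaustive or authoritative list; docs/config.md has the full schema):

# ~/.codex/config.toml
model = "o3"
sandbox_mode = "workspace-write"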

Model Context Protocol Support

Codex CLI functions as an MCP client that can connect to MCP servers on startup. See the mcp_servers section in the configuration documentation for details.
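
For example, a server entry in ~/.codex/config.toml looks roughly like this (the server name and command are placeholders; see the configuration documentation for the exact schema):

[mcp_servers.my-server]
command = "npx"
args = ["-y", "some-mcp-server"]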

It is still experimental, but you can also launch Codex as an MCP server by running codex mcp. Use the @modelcontextprotocol/inspector to try it out:

npx @modelcontextprotocol/inspector codex mcp

Notifications

You can enable notifications by configuring a script that is run whenever the agent finishes a turn. The notify documentation includes a detailed example that explains how to get desktop notifications via terminal-notifier on macOS.
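
As a rough sketch (the script path is a placeholder; the notify documentation has the complete, working version):

# ~/.codex/config.toml
notify = ["python3", "/path/to/notify.py"]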

codex exec to run Codex programmatically/non-interactively

To run Codex non-interactively, run codex exec PROMPT (you can also pass the prompt via stdin) and Codex will work on your task until it decides that it is done and exits. Output is printed to the terminal directly. You can set the RUST_LOG environment variable to see more about what's going on.
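
For example (the prompt text is a placeholder):

codex exec "update the CHANGELOG for the next release"
echo "update the CHANGELOG for the next release" | codex exec
RUST_LOG=info codex exec "update the CHANGELOG for the next release"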

@ file search

Typing @ triggers a fuzzy-filename search over the workspace root. Use up/down to select among the results and Tab or Enter to replace the @ with the selected path. You can use Esc to cancel the search.

Esc-Esc to edit a previous message

When the chat composer is empty, press Esc to prime “backtrack” mode. Press Esc again to open a transcript preview highlighting the last user message; press Esc repeatedly to step to older user messages. Press Enter to confirm and Codex will fork the conversation from that point, trim the visible transcript accordingly, and prefill the composer with the selected user message so you can edit and resubmit it.

In the transcript preview, the footer shows an Esc edit prev hint while editing is active.

--cd/-C flag

Sometimes it is not convenient to cd to the directory you want Codex to use as the "working root" before running Codex. Fortunately, codex supports a --cd option so you can specify whatever folder you want. You can confirm that Codex is honoring --cd by double-checking the workdir it reports in the TUI at the start of a new session.
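
For example (the path is a placeholder):

codex --cd /path/to/project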

Shell completions

Generate shell completion scripts via:

codex completion bash
codex completion zsh
codex completion fish

Experimenting with the Codex Sandbox

To see what happens when a command is run under the sandbox provided by Codex, we provide the following subcommands in Codex CLI:

# macOS
codex debug seatbelt [--full-auto] [COMMAND]...

# Linux
codex debug landlock [--full-auto] [COMMAND]...

Selecting a sandbox policy via --sandbox

The Rust CLI exposes a dedicated --sandbox (-s) flag that lets you pick the sandbox policy without having to reach for the generic -c/--config option:

# Run Codex with the default, read-only sandbox
codex --sandbox read-only

# Allow the agent to write within the current workspace while still blocking network access
codex --sandbox workspace-write

# Danger! Disable sandboxing entirely (only do this if you are already running in a container or other isolated env)
codex --sandbox danger-full-access

The same setting can be persisted in ~/.codex/config.toml via the top-level sandbox_mode = "MODE" key, e.g. sandbox_mode = "workspace-write".

Code Organization

This folder is the root of a Cargo workspace. It contains quite a bit of experimental code, but here are the key crates:

  • core/ contains the business logic for Codex. Ultimately, we hope this will become a library crate that is generally useful for building other Rust/native applications that use Codex.
  • exec/ "headless" CLI for use in automation.
  • tui/ CLI that launches a fullscreen TUI built with Ratatui.
  • cli/ CLI multitool that provides the aforementioned CLIs via subcommands.