When using the OpenAI Responses API, we now record the `usage` field for
a `"response.completed"` event, which includes metrics about the number
of tokens consumed. We also introduce `openai_model_info.rs`, which
includes current data about the most common OpenAI models available via
the API (specifically `context_window` and `max_output_tokens`). If
Codex does not recognize the model, you can set `model_context_window`
and `model_max_output_tokens` explicitly in `config.toml`.
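For example, to override these values for an unrecognized model in `config.toml` (the numbers below are illustrative, not recommendations):

```toml
# Fallback limits for a model Codex does not recognize.
model_context_window = 128000
model_max_output_tokens = 4096
```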
We then introduce a new event type in `protocol.rs`, `TokenCount`,
which carries the `TokenUsage` for the most recent turn.
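As a rough sketch, the shapes involved might look something like the following; the field names are assumptions modeled on the Responses API `usage` object, not the exact definitions in `protocol.rs`:

```rust
use serde::{Deserialize, Serialize};

/// Token counts for a single turn (field names are illustrative).
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TokenUsage {
    pub input_tokens: u64,
    pub output_tokens: u64,
    pub total_tokens: u64,
}

/// Event stream sent to clients; the new variant carries usage for
/// the most recent turn.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum EventMsg {
    // ...existing variants elided...
    TokenCount(TokenUsage),
}
```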
Finally, we update the TUI to record the running sum of tokens used so
that the percentage of the available context window remaining can be
reported in the composer's placeholder text.
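The arithmetic is simple; here is a minimal sketch (the actual TUI code may additionally reserve headroom for output tokens):

```rust
/// Percentage of the context window still available, given a running
/// sum of tokens used so far. A sketch only, not the actual TUI code.
fn percent_context_remaining(tokens_used: u64, context_window: u64) -> u8 {
    if context_window == 0 {
        return 0;
    }
    let used = tokens_used.min(context_window);
    (((context_window - used) * 100) / context_window) as u8
}
```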

We could certainly get much fancier with this (such as reporting the
estimated cost of the conversation), but for now, we are just trying to
achieve feature parity with the TypeScript CLI.
Though arguably this improves upon the TypeScript CLI, which uses
heuristics to estimate the number of tokens used rather than reading
the `usage` information directly:
296996d74e/codex-cli/src/utils/approximate-tokens-used.ts (L3-L16)
Fixes https://github.com/openai/codex/issues/1242
Codex CLI (Rust Implementation)
We provide Codex CLI as a standalone, native executable to ensure a zero-dependency install.
Installing Codex
Today, the easiest way to install Codex is via npm, though we plan to publish Codex to other package managers soon.
```shell
npm i -g @openai/codex@native
codex
```
You can also download a platform-specific release directly from our GitHub Releases.
What's new in the Rust CLI
While we are working to close the gap between the TypeScript and Rust implementations of Codex CLI, note that the Rust CLI has a number of features that the TypeScript CLI does not!
Config
Codex supports a rich set of configuration options. Note that the Rust CLI uses `config.toml` instead of `config.json`. See config.md for details.
Model Context Protocol Support
Codex CLI functions as an MCP client that can connect to MCP servers on startup. See the `mcp_servers` section in the configuration documentation for details.
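For example, a hedged sketch of what an `mcp_servers` entry might look like in `config.toml` (the server name and command are illustrative; see the configuration documentation for the authoritative schema):

```toml
[mcp_servers.example]
command = "npx"
args = ["-y", "some-mcp-server"]
```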
It is still experimental, but you can also launch Codex as an MCP server by running `codex mcp`. Use the `@modelcontextprotocol/inspector` to try it out:
```shell
npx @modelcontextprotocol/inspector codex mcp
```
Notifications
You can enable notifications by configuring a script that is run whenever the agent finishes a turn. The notify documentation includes a detailed example that explains how to get desktop notifications via terminal-notifier on macOS.
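For example, a minimal sketch in `config.toml` (the script path is illustrative; see the notify documentation for details of the payload your program receives):

```toml
notify = ["python3", "/path/to/notify.py"]
```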
`codex exec` to run Codex programmatically/non-interactively
To run Codex non-interactively, run `codex exec PROMPT` (you can also pass the prompt via stdin) and Codex will work on your task until it decides that it is done and exits. Output is printed to the terminal directly. You can set the `RUST_LOG` environment variable to see more about what's going on.
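For example (the prompt text is illustrative, and reading stdin when no prompt argument is given is an assumption based on the description above):

```shell
codex exec "summarize the TODOs in this repository"

# Pass the prompt via stdin, with verbose logging enabled.
echo "summarize the TODOs in this repository" | RUST_LOG=info codex exec
```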
`--cd`/`-C` flag
Sometimes it is not convenient to `cd` to the directory you want Codex to use as the "working root" before running Codex. Fortunately, `codex` supports a `--cd` option so you can specify whatever folder you want. You can confirm that Codex is honoring `--cd` by double-checking the workdir it reports in the TUI at the start of a new session.
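For example (the path is illustrative):

```shell
# Run Codex with /path/to/project as the working root.
codex --cd /path/to/project

# The short form is equivalent.
codex -C /path/to/project
```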
Experimenting with the Codex Sandbox
To see what happens when a command is run under the sandbox provided by Codex, we provide the following subcommands in Codex CLI:
```shell
# macOS
codex debug seatbelt [-s SANDBOX_PERMISSION]... [COMMAND]...

# Linux
codex debug landlock [-s SANDBOX_PERMISSION]... [COMMAND]...
```
You can experiment with different values of `-s` to see what permissions the `COMMAND` needs to execute successfully.
Note that the exact API for the `-s` flag is currently in flux. See https://github.com/openai/codex/issues/1248 for details.
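For example, a minimal invocation with no `-s` flags (a sketch; depending on argument parsing you may need `--` to separate the command's own flags):

```shell
codex debug seatbelt -- ls -la /tmp
```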
Code Organization
This folder is the root of a Cargo workspace. It contains quite a bit of experimental code, but here are the key crates:
- `core/` contains the business logic for Codex. Ultimately, we hope this will be a library crate that is generally useful for building other Rust/native applications that use Codex.
- `exec/` "headless" CLI for use in automation.
- `tui/` CLI that launches a fullscreen TUI built with Ratatui.
- `cli/` CLI multitool that provides the aforementioned CLIs via subcommands.