Prior to this PR, we always set `reasoning` when making a request using the Responses API:

d7245cbbc9/codex-rs/core/src/client.rs (L108-L111)

However, if you tried to use the Rust CLI with `--model gpt-4.1`, the request would fail with:

```shell
"Unsupported parameter: 'reasoning.effort' is not supported with this model."
```

We take a cue from the TypeScript CLI, which does a check on the model name:

d7245cbbc9/codex-cli/src/utils/agent/agent-loop.ts (L786-L789)

This PR does a similar check, but also adds support for the following config options:

```
model_reasoning_effort = "low" | "medium" | "high" | "none"
model_reasoning_summary = "auto" | "concise" | "detailed" | "none"
```

This way, if you have a model whose name happens to start with `"o"` (or `"codex"`?), you can set these to `"none"` to explicitly disable reasoning, if necessary. (That said, it seems unlikely anyone would use the Responses API with non-OpenAI models, but we provide an escape hatch anyway.)

This PR also updates both the TUI and `codex exec` to show `reasoning effort` and `reasoning summaries` in the header.
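As a minimal sketch of how these options might be used in the config file (assuming the `config.toml` described in the configuration docs; the model name below is a hypothetical placeholder):

```toml
# Illustrative only: explicitly disable reasoning for a model whose name
# happens to start with "o" but does not support the `reasoning` field.
model = "other-model"            # hypothetical model name
model_reasoning_effort = "none"
model_reasoning_summary = "none"
```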
# Codex CLI (Rust Implementation)
We provide Codex CLI as a standalone, native executable to ensure a zero-dependency install.
## Installing Codex
Today, the easiest way to install Codex is via `npm`, though we plan to publish Codex to other package managers soon.

```shell
npm i -g @openai/codex@native
codex
```
You can also download a platform-specific release directly from our GitHub Releases.
## Config
Codex supports a rich set of configuration options. See config.md for details.
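As a quick illustration (values are examples only; see config.md for the full set of options), a `~/.codex/config.toml` might look like:

```toml
# ~/.codex/config.toml -- example values only
model = "o3"
model_reasoning_effort = "high"
model_reasoning_summary = "auto"
```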
## Model Context Protocol Support
Codex CLI functions as an MCP client that can connect to MCP servers on startup. See the `mcp_servers` section in the configuration documentation for details.
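As a sketch, an `mcp_servers` entry might look like the following (the server name, command, and arguments are placeholders; consult config.md for the exact schema):

```toml
# Placeholder values; see config.md for the supported fields.
[mcp_servers.example]
command = "npx"
args = ["-y", "example-mcp-server"]
```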
It is still experimental, but you can also launch Codex as an MCP server by running `codex mcp`. Using `@modelcontextprotocol/inspector` is a convenient way to try it out:

```shell
npx @modelcontextprotocol/inspector codex mcp
```
## Code Organization
This folder is the root of a Cargo workspace. It contains quite a bit of experimental code, but here are the key crates:
- `core/` contains the business logic for Codex. Ultimately, we hope this to be a library crate that is generally useful for building other Rust/native applications that use Codex.
- `exec/` "headless" CLI for use in automation.
- `tui/` CLI that launches a fullscreen TUI built with Ratatui.
- `cli/` CLI multitool that provides the aforementioned CLIs via subcommands.