Prior to this PR, there were two big misses in `chat_completions.rs`:

1. The loop in `stream_chat_completions()` included only items of type `ResponseItem::Message` when building the `"messages"` JSON for the `POST` request to the `chat/completions` endpoint. This PR ensures the other variants (`FunctionCall`, `LocalShellCall`, and `FunctionCallOutput`) are included as well.
2. In `process_chat_sse()`, we did not record tool calls and emitted only items of type `ResponseEvent::OutputItemDone(ResponseItem::Message)` to the stream. This PR introduces `FunctionCallState`, which accumulates the `tool_calls` deltas so that we can ultimately emit a `ResponseItem::FunctionCall` when appropriate.

While function calling now appears to work for chat completions in my local testing, there are likely edge cases that are still not covered, and this codepath would benefit from a battery of integration tests. (As part of that further cleanup, we should also work to support streaming responses in the UI.)

The other important part of this PR is some cleanup in `core/src/codex.rs`. In particular, it was hard to reason about how `run_task()` built up the list of messages to include in a request across the various cases:

- Responses API
- Chat Completions API
- Responses API used in concert with ZDR

Things are a bit cleaner now:

- `zdr_transcript` (if present) contains all messages in the history of the conversation, including function call outputs that have not yet been sent back to the model
- `pending_input` contains any messages the user has submitted while the turn is in flight that need to be injected as part of the next `POST` to the model
- `input_for_next_turn` contains the tool call outputs that have not yet been sent back to the model
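The accumulation in `process_chat_sse()` can be sketched roughly as follows. The field names and the `apply_delta` helper here are illustrative assumptions, not the actual code; the real implementation folds the streamed `tool_calls` fragments into a `FunctionCallState` until the call is complete:

```rust
/// Sketch of per-call accumulation state (names are assumptions,
/// not the actual definition in `chat_completions.rs`).
#[derive(Default, Debug)]
struct FunctionCallState {
    name: Option<String>,
    arguments: String,
    call_id: Option<String>,
}

impl FunctionCallState {
    /// Fold one streamed `tool_calls` delta into the accumulated state.
    /// The function name and JSON arguments may arrive split across chunks.
    fn apply_delta(&mut self, name: Option<&str>, arguments: Option<&str>, id: Option<&str>) {
        if let Some(n) = name {
            self.name.get_or_insert_with(String::new).push_str(n);
        }
        if let Some(a) = arguments {
            self.arguments.push_str(a);
        }
        if self.call_id.is_none() {
            self.call_id = id.map(str::to_string);
        }
    }
}

fn main() {
    let mut state = FunctionCallState::default();
    // Simulated SSE deltas for a single tool call, split across chunks.
    state.apply_delta(Some("shell"), None, Some("call_1"));
    state.apply_delta(None, Some("{\"cmd\":"), None);
    state.apply_delta(None, Some("\"ls\"}"), None);
    assert_eq!(state.name.as_deref(), Some("shell"));
    assert_eq!(state.arguments, "{\"cmd\":\"ls\"}");
    assert_eq!(state.call_id.as_deref(), Some("call_1"));
}
```

Once the stream signals the call is finished, the accumulated state would be converted into a `ResponseItem::FunctionCall` and emitted as an `OutputItemDone` event.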
# Codex CLI (Rust Implementation)
We provide Codex CLI as a standalone, native executable to ensure a zero-dependency install.
## Installing Codex
Today, the easiest way to install Codex is via npm, though we plan to publish Codex to other package managers soon.
```shell
npm i -g @openai/codex@native
codex
```
You can also download a platform-specific release directly from our GitHub Releases.
## Config
Codex supports a rich set of configuration options. See config.md for details.
## Model Context Protocol Support
Codex CLI functions as an MCP client that can connect to MCP servers on startup. See the mcp_servers section in the configuration documentation for details.
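As a rough illustration, an MCP server entry in the configuration file might look like the following. Treat the exact keys and the `docs` server name as assumptions; consult the configuration documentation for the authoritative schema:

```toml
# Hypothetical example; see the mcp_servers section of config.md
# for the actual schema. Each table defines one server to launch.
[mcp_servers.docs]
command = "npx"
args = ["-y", "some-mcp-server"]
```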
It is still experimental, but you can also launch Codex as an MCP server by running `codex mcp`. Using the `@modelcontextprotocol/inspector` is a good way to try it out:
```shell
npx @modelcontextprotocol/inspector codex mcp
```
## Code Organization
This folder is the root of a Cargo workspace. It contains quite a bit of experimental code, but here are the key crates:
- `core/` contains the business logic for Codex. Ultimately, we hope this will become a library crate that is generally useful for building other Rust/native applications that use Codex.
- `exec/` is a "headless" CLI for use in automation.
- `tui/` is a CLI that launches a fullscreen TUI built with Ratatui.
- `cli/` is a CLI multitool that provides the aforementioned CLIs via subcommands.