Dylan 4764fc1ee7 feat: Freeform apply_patch with simple shell output (#4718)
## Summary
This PR is an alternative approach to #4711: instead of changing our
storage, it parses out shell calls in the client and reserializes them
on the fly before they are sent out as part of the request.

What this changes:
1. Adds additional serialization logic when the
ApplyPatchToolType::Freeform tool type is in use.
2. Adds a --custom-apply-patch flag to enable this setting on a
session-by-session basis (see the usage sketch after this list).
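
A minimal usage sketch, assuming the flag is passed to the top-level `codex` command when starting a session:

```shell
# Opt a single session into the freeform apply_patch handling
# (assumes --custom-apply-patch is accepted by the top-level `codex` command)
codex --custom-apply-patch
```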

This change is delicate, but it is not meant to be permanent. It is the
first step in a migration:
1. (This PR) Add in-flight serialization with config (illustrated below).
2. Update the model_family default.
3. Update the serialization logic to store turn outputs in a structured
format, with serialization driven by the model_family setting.
4. Remove the in-flight rewrite logic.
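
For context, below is a rough sketch of the kind of shell-wrapped apply_patch call this in-flight rewrite targets; the file path and patch body are invented, and the exact envelope the model emits may differ.

```shell
# Illustrative only: a shell call wrapping an apply_patch heredoc.
# When ApplyPatchToolType::Freeform is in use, a call like this is parsed
# out client-side and reserialized as a freeform apply_patch tool call
# before the request is sent.
apply_patch <<'EOF'
*** Begin Patch
*** Update File: src/example.rs
@@ fn greet() {
-    println!("hi");
+    println!("hello");
*** End Patch
EOF
```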

## Test Plan
- [x] Additional unit tests added
- [x] Integration tests added
- [x] Tested locally


Codex CLI is a coding agent from OpenAI that runs locally on your computer.

If you want Codex in your code editor (VS Code, Cursor, Windsurf), install the Codex IDE extension.
If you are looking for Codex Web, the cloud-based agent from OpenAI, go to chatgpt.com/codex.

[Image: Codex CLI splash screen]


Quickstart

Installing and running Codex CLI

Install globally with your preferred package manager. If you use npm:

npm install -g @openai/codex

Alternatively, if you use Homebrew:

brew install codex

Then simply run codex to get started:

codex

You can also go to the latest GitHub Release and download the appropriate binary for your platform.

Each GitHub Release contains many executables, but in practice, you likely want one of these:

  • macOS
    • Apple Silicon/arm64: codex-aarch64-apple-darwin.tar.gz
    • x86_64 (older Mac hardware): codex-x86_64-apple-darwin.tar.gz
  • Linux
    • x86_64: codex-x86_64-unknown-linux-musl.tar.gz
    • arm64: codex-aarch64-unknown-linux-musl.tar.gz

Each archive contains a single entry with the platform baked into the name (e.g., codex-x86_64-unknown-linux-musl), so you likely want to rename it to codex after extracting it.
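
For example, on x86_64 Linux the steps might look like this (archive and entry names are taken from the list above; adjust for your platform):

```shell
# Extract the archive, then rename the platform-specific binary to `codex`
tar -xzf codex-x86_64-unknown-linux-musl.tar.gz
mv codex-x86_64-unknown-linux-musl codex
chmod +x codex   # ensure it is executable, then move it somewhere on your PATH
```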

Using Codex with your ChatGPT plan

[Image: Codex CLI login screen]

Run codex and select Sign in with ChatGPT. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Team, Edu, or Enterprise plan. Learn more about what's included in your ChatGPT plan.

You can also use Codex with an API key, but this requires additional setup. If you previously used an API key for usage-based billing, see the migration steps. If you're having trouble with login, please comment on this issue.

Model Context Protocol (MCP)

Codex CLI supports MCP servers. Enable them by adding an mcp_servers section to your ~/.codex/config.toml.
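
For example, an entry might look like the following; the server name, command, and arguments are placeholders rather than a real server:

```toml
# Placeholder MCP server entry in ~/.codex/config.toml; swap in the actual
# launch command for the MCP server you want Codex to use.
[mcp_servers.example-server]
command = "npx"
args = ["-y", "example-mcp-server"]
```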

Configuration

Codex CLI supports a rich set of configuration options, with preferences stored in ~/.codex/config.toml. For full configuration options, see Configuration.
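
As a rough sketch, a config.toml might contain entries like these (the keys and values shown are illustrative; the Configuration docs are the authoritative reference):

```toml
# Illustrative ~/.codex/config.toml settings; check the Configuration docs
# for the full list of keys and their allowed values.
approval_policy = "on-request"     # when Codex should ask before running commands
sandbox_mode = "workspace-write"   # how much of the filesystem Codex may modify
```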


Docs & FAQ


License

This repository is licensed under the Apache-2.0 License.
