llmx/codex-rs/core, latest commit: aff97ed7dd by Dylan (2025-08-05)

[core] Separate tools config from openai client (#1858)
## Summary
In an effort to make tools easier to work with and more configurable,
I'm introducing `ToolConfig` and updating `Prompt` to take in a general
list of Tools. I think this is simpler and better for a few reasons:
- We can easily assemble tools from various sources (our own harness, MCP servers, etc.) and consolidate the logic for constructing them in one place that is separate from serialization.
- client.rs no longer needs arbitrary config values; it just takes in a list of tools to serialize.

A hefty portion of the PR updates our conversion of `mcp_types::Tool` to
`OpenAITool`, but considering that @bolinfest accurately called this out
as a TODO long ago, I think it's time we tackled it.
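For context, here is a minimal sketch of what this consolidation could look like. The names and shapes below (`ToolsConfig`, `assemble`, the `McpTool` stand-in for `mcp_types::Tool`) are placeholders for illustration, not the exact types this PR introduces:

```rust
// Sketch only: names and field layouts are illustrative, not the PR's actual types.
use serde::Serialize;
use serde_json::{json, Value};

/// Stand-in for `mcp_types::Tool` as advertised by an MCP server.
pub struct McpTool {
    pub name: String,
    pub description: Option<String>,
    pub input_schema: Value,
}

/// A tool definition in the shape the OpenAI API expects to serialize.
#[derive(Serialize, Clone)]
pub struct OpenAITool {
    pub name: String,
    pub description: String,
    /// JSON Schema describing the tool's parameters.
    pub parameters: Value,
}

/// One place to gather tools from every source before handing them to the client.
pub struct ToolsConfig {
    pub builtin: Vec<OpenAITool>,
    pub mcp: Vec<McpTool>,
}

impl ToolsConfig {
    /// Flatten every source into the single list that the prompt/client consumes.
    pub fn assemble(&self) -> Vec<OpenAITool> {
        let converted = self.mcp.iter().map(|t| OpenAITool {
            name: t.name.clone(),
            description: t.description.clone().unwrap_or_default(),
            // Fall back to an empty object schema if the server omitted one.
            parameters: if t.input_schema.is_null() {
                json!({ "type": "object", "properties": {} })
            } else {
                t.input_schema.clone()
            },
        });
        self.builtin.iter().cloned().chain(converted).collect()
    }
}
```

With something along these lines, client.rs only ever sees the flattened list of tools to serialize, and the decision of which tools exist lives entirely outside it.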

## Testing
- [x] Experimented locally, no changes, as expected
- [x] Added additional unit tests
- [x] Responded to rust-review

# codex-core

This crate implements the business logic for Codex. It is designed to be used by the various Codex UIs written in Rust.

## Dependencies

Note that `codex-core` makes some assumptions about certain helper utilities being available in the environment. The current expectations, by platform, are:

### macOS

Expects `/usr/bin/sandbox-exec` to be present.
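As a rough illustration only (this is not how `codex-core` constructs its sandbox policy), checking for the helper and invoking it with an inline SBPL profile might look like:

```rust
use std::path::Path;
use std::process::Command;

fn main() -> std::io::Result<()> {
    // codex-core assumes this binary exists on macOS.
    let sandbox_exec = "/usr/bin/sandbox-exec";
    if !Path::new(sandbox_exec).exists() {
        eprintln!("sandbox-exec not found; macOS sandboxing will not work");
        std::process::exit(1);
    }

    // Run a trivial command under an allow-everything profile, purely to show
    // the invocation shape. The real policy is far more restrictive.
    let status = Command::new(sandbox_exec)
        .args(["-p", "(version 1) (allow default)", "/bin/echo", "sandboxed"])
        .status()?;
    println!("exit status: {status}");
    Ok(())
}
```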

### Linux

Expects the binary containing `codex-core` to run the equivalent of `codex debug landlock` when `arg0` is `codex-linux-sandbox`. See the `codex-arg0` crate for details.
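A minimal sketch of this arg0 dispatch pattern, assuming a placeholder `run_linux_sandbox` helper rather than the real `codex-arg0` entry point:

```rust
use std::path::Path;

fn main() {
    // The first element of std::env::args is arg0, i.e. the name the binary
    // was invoked as. Dispatch on it to decide whether to act as the Linux
    // sandbox helper instead of the normal CLI.
    let arg0 = std::env::args().next().unwrap_or_default();
    let invoked_as = Path::new(&arg0)
        .file_name()
        .and_then(|n| n.to_str())
        .unwrap_or("");

    if invoked_as == "codex-linux-sandbox" {
        // Hypothetical stand-in for running the equivalent of
        // `codex debug landlock` with the remaining arguments.
        run_linux_sandbox(std::env::args().skip(1).collect());
        return;
    }

    // ...otherwise continue with normal CLI startup.
}

fn run_linux_sandbox(_args: Vec<String>) {
    unimplemented!("apply the Landlock/seccomp policy and exec the command");
}
```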

### All Platforms

Expects the binary containing `codex-core` to simulate the virtual `apply_patch` CLI when `arg1` is `--codex-run-as-apply-patch`. See the `codex-arg0` crate for details.
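Similarly, a sketch of the arg1 check; `run_apply_patch` is a hypothetical stand-in, not the real implementation:

```rust
fn main() {
    let args: Vec<String> = std::env::args().collect();

    // If the second argument is the marker flag, behave as the virtual
    // apply_patch CLI instead of the normal Codex entry point.
    if args.get(1).map(String::as_str) == Some("--codex-run-as-apply-patch") {
        let patch = args.get(2).cloned().unwrap_or_default();
        run_apply_patch(&patch);
        return;
    }

    // ...normal CLI startup.
}

fn run_apply_patch(patch: &str) {
    // Hypothetical stand-in: parse the patch and apply it to the filesystem.
    println!("would apply patch ({} bytes)", patch.len());
}
```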