As we are [close to releasing the Rust CLI beta](https://github.com/openai/codex/discussions/1405), for the moment, let's take a more neutral stance on what it takes to be a "built-in" provider.

* For example, there seems to be a discrepancy around what the "right" configuration for Gemini is: https://github.com/openai/codex/pull/881
* And while the current "built-in" providers are all arguably "well-known" names, this raises the question of what to do about potentially less familiar providers, such as https://github.com/openai/codex/pull/1142. Do we just accept every pull request like this, or is there some set of criteria a provider has to meet to "qualify" to be bundled with Codex CLI?

I think that if we can establish clear ground rules for being a built-in provider, then we can bring this back. But until then, I would rather take a minimalist approach, because if we reversed our position later, it would break folks who were depending on the presence of the built-in providers. A sketch of the user-level alternative follows below.
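In the meantime, a provider that is not bundled can still be wired up by the user. Here is a minimal sketch, assuming a `model_providers` table in `~/.codex/config.toml` and an OpenAI-compatible chat endpoint; the key names and the base URL shown are illustrative assumptions, not a settled configuration (the Gemini PR above shows exactly this kind of disagreement):

```toml
# Hypothetical entries in ~/.codex/config.toml; key names and the base
# URL are assumptions for illustration, not a settled configuration.
model_provider = "gemini"  # select the provider defined below

[model_providers.gemini]
name = "Gemini"
base_url = "https://generativelanguage.googleapis.com/v1beta/openai"
env_key = "GEMINI_API_KEY"
wire_api = "chat"
```

A setup like this keeps the decision about which providers "qualify" out of the binary entirely: removing a built-in later cannot break a user who defines the provider in their own config.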
# codex-core
This crate implements the business logic for Codex. It is designed to be used by the various Codex UIs written in Rust.
For non-Rust UIs, we are also working to define a protocol for talking to Codex. See:
You can use the `proto` subcommand of the executable in the `cli` crate to speak the protocol, which uses newline-delimited JSON over stdin/stdout.
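As a concrete sketch of the framing (not of the message schema, which is still being defined), the following Rust program spawns `codex proto`, writes one JSON object per line to its stdin, and reads events back line by line from its stdout. The `{"id": ...}` payload is a placeholder assumption, not a real protocol message:

```rust
use std::io::{BufRead, BufReader, Write};
use std::process::{Command, Stdio};

fn main() -> std::io::Result<()> {
    // Spawn the CLI in proto mode; both sides speak one JSON object per line.
    let mut child = Command::new("codex")
        .arg("proto")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;

    // Send one newline-delimited JSON submission. This payload is a
    // placeholder for illustration; the real message shapes are defined
    // by the protocol, which is still being worked out.
    {
        let mut stdin = child.stdin.take().expect("child stdin");
        stdin.write_all(br#"{"id":"1","op":{"type":"example"}}"#)?;
        stdin.write_all(b"\n")?;
        // stdin is dropped here, closing the pipe so the child sees EOF.
    }

    // Read events back, one JSON object per line.
    let stdout = child.stdout.take().expect("child stdout");
    for line in BufReader::new(stdout).lines() {
        println!("event: {}", line?);
    }

    child.wait()?;
    Ok(())
}
```

The newline framing is the whole wire format: a non-Rust UI only needs to read and write lines, so any language with a JSON library and subprocess support can drive Codex this way.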