This is the first step toward supporting other model providers in the Rust CLI. Specifically, this PR adds support for new entries in `Config` and `ConfigOverrides` to specify a `ModelProviderInfo`, which captures the basic configuration needed for an LLM provider. This PR does not get us all the way there yet because `client.rs` still unconditionally appends `/responses` to the URL and expects the endpoint to support the OpenAI Responses API. Will fix that next!
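For concreteness, here is a minimal sketch of what such a provider entry could look like; the field names (`name`, `base_url`, `env_key`) are illustrative assumptions, not the crate's actual definition:

```rust
use serde::Deserialize;

/// Sketch of a per-provider config entry. Field names are assumptions
/// for illustration; see the crate source for the real definition.
#[derive(Debug, Clone, Deserialize)]
pub struct ModelProviderInfo {
    /// Human-readable provider name, e.g. "openai".
    pub name: String,
    /// Base URL the client should hit, e.g. "https://api.openai.com/v1".
    pub base_url: String,
    /// Name of the environment variable holding this provider's API key.
    pub env_key: String,
}
```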
# codex-core
This crate implements the business logic for Codex. It is designed to be used by the various Codex UIs written in Rust.
For non-Rust UIs, we are also working to define a protocol for talking to Codex. See:
You can use the `proto` subcommand of the executable in the `cli` crate to speak the protocol using newline-delimited JSON over stdin/stdout.
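As a rough sketch of driving that protocol from Rust, the snippet below spawns the CLI, writes one JSON request per line to its stdin, and reads one JSON response per line from its stdout. The binary name `codex` and the exact message shape are assumptions for illustration; consult the protocol documentation for the real schema.

```rust
use std::io::{BufRead, BufReader, Write};
use std::process::{Command, Stdio};

fn main() -> std::io::Result<()> {
    // Spawn the CLI in proto mode with piped stdin/stdout.
    let mut child = Command::new("codex")
        .arg("proto")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;

    // Each request is a single JSON object on its own line.
    // This message shape is a placeholder, not the real schema.
    let stdin = child.stdin.as_mut().expect("child stdin");
    writeln!(stdin, r#"{{"id":"1","op":{{"type":"interrupt"}}}}"#)?;

    // Responses come back the same way: one JSON object per line.
    let stdout = child.stdout.take().expect("child stdout");
    let mut lines = BufReader::new(stdout).lines();
    if let Some(line) = lines.next() {
        println!("received: {}", line?);
    }

    child.kill()?;
    Ok(())
}
```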