We had multiple issues with context-size calculation:

1. The `initial_prompt_tokens` calculation based on cache size is not reliable; a cache miss might set it to a much higher value. For now it is hardcoded to a safer constant.
2. The input context size for GPT-5 is 272k (that's where the 33% figure came from).

This change fixes both.
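As a rough sketch of the arithmetic behind point 2 (the constant and function names here are illustrative, not the actual codex-core implementation): with a 272k input window, about 90k tokens of usage works out to roughly 33% of context.

```rust
/// Hypothetical constants; the real values live in codex-core.
const BASELINE_PROMPT_TOKENS: u64 = 12_000; // safer hardcoded estimate
const GPT5_INPUT_CONTEXT_TOKENS: u64 = 272_000; // input window for GPT-5

/// Percentage of the input context consumed so far.
fn context_used_percent(total_tokens: u64) -> f64 {
    (total_tokens as f64 / GPT5_INPUT_CONTEXT_TOKENS as f64) * 100.0
}

fn main() {
    // A cache miss could previously inflate initial_prompt_tokens;
    // using a fixed baseline avoids that failure mode.
    let used = BASELINE_PROMPT_TOKENS + 78_000;
    println!("{:.1}% of context used", context_used_percent(used));
    // prints "33.1% of context used"
}
```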
# codex-protocol
This crate defines the types for the protocol used by the Codex CLI, including both "internal types" for communication between codex-core and codex-tui and "external types" used with Codex MCP.
This crate should have minimal dependencies.
Ideally, we should avoid material business logic in this crate; we can always introduce Ext-style extension traits in other crates to add functionality to these types.
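The Ext-style trait pattern mentioned above can be sketched as follows (the type and trait names are illustrative, not taken from the actual crates): the protocol crate holds only plain data, and a downstream crate adds behavior via a trait it defines and implements itself.

```rust
// A plain data type as it might be defined in codex-protocol:
// minimal, no business logic.
pub struct Event {
    pub id: String,
    pub payload: String,
}

// In a downstream crate (e.g. codex-core), an Ext-style trait
// attaches behavior without codex-protocol depending on it.
pub trait EventExt {
    fn summary(&self) -> String;
}

impl EventExt for Event {
    fn summary(&self) -> String {
        format!("event {}: {} bytes", self.id, self.payload.len())
    }
}

fn main() {
    let e = Event { id: "42".into(), payload: "hello".into() };
    println!("{}", e.summary()); // prints "event 42: 5 bytes"
}
```

This keeps the dependency graph one-directional: codex-protocol stays a leaf crate, while each consumer adds only the behavior it needs.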