fix: use o4-mini as the default model (#1135)

Rollback of https://github.com/openai/codex/pull/972.
This commit is contained in:
Michael Bolin
2025-05-27 09:12:55 -07:00
committed by GitHub
parent 6b5b184f21
commit 29d154cb13
2 changed files with 2 additions and 2 deletions

@@ -32,7 +32,7 @@ The `config.toml` file supports the following options:
The model that Codex should use.
```toml
-model = "o3" # overrides the default of "codex-mini-latest"
+model = "o3" # overrides the default of "o4-mini"
```
### model_provider