fix: use o4-mini as the default model (#1135)
Rollback of https://github.com/openai/codex/pull/972.
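This rollback restores `o4-mini` as the default model. A minimal `config.toml` sketch of the option the docs hunk below describes (the pinned value here is illustrative; when the key is omitted, the default applies):

```toml
# Optional: pin the model explicitly; omit to use the default ("o4-mini" after this change).
model = "o4-mini"
```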
@@ -32,7 +32,7 @@ The `config.toml` file supports the following options:
 The model that Codex should use.
 
 ```toml
-model = "o3" # overrides the default of "codex-mini-latest"
+model = "o3" # overrides the default of "o4-mini"
 ```
 
 ### model_provider