Commit Graph

48 Commits

Author SHA1 Message Date
Michael Bolin
bec51f6c05 chore: enable clippy::redundant_clone (#3489)
Created this PR by:

- adding `redundant_clone` to `[workspace.lints.clippy]` in
`codex-rs/Cargo.toml`
- running `cargo clippy --tests --fix`
- running `just fmt`

Though I had to clean up one instance of the following that resulted:

```rust
let codex = codex;
```
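For reference, a minimal example (illustrative only, not taken from the diff) of the pattern this lint flags: cloning a value that is never used again afterwards.

```rust
fn greet(name: String) -> String {
    format!("hello, {name}")
}

fn main() {
    let name = String::from("codex");
    // `name` is never used after this call, so the clone is unnecessary;
    // `clippy::redundant_clone` suggests passing `name` by value instead.
    let message = greet(name.clone());
    println!("{message}");
}
```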
2025-09-11 11:59:37 -07:00
Gabriel Peal
58d77ca4e7 Clear non-empty prompts with ctrl + c (#3285)
This updates the ctrl + c behavior to clear the current prompt if there
is text and you press ctrl + c.

I also updated the ctrl + c hint text to show `^c to interrupt` instead
of `^c to quit` if there is an active conversation.
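Roughly, the new branch in the key handling looks like the sketch below (crossterm types; the function shape is an assumption, not the actual widget code):

```rust
use crossterm::event::{KeyCode, KeyEvent, KeyModifiers};

/// Hypothetical handler: Ctrl+C on a non-empty prompt clears it; otherwise
/// the event falls through to the existing interrupt/quit handling.
fn handle_ctrl_c(key: &KeyEvent, prompt: &mut String) -> bool {
    if key.code == KeyCode::Char('c') && key.modifiers.contains(KeyModifiers::CONTROL) {
        if !prompt.is_empty() {
            prompt.clear();
            return true; // handled: prompt cleared, don't quit or interrupt
        }
    }
    false // empty prompt: let the interrupt/quit behavior apply
}
```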

Two things I don't love:
1. You can currently interrupt a conversation with escape or ctrl + c
(not related to this PR and maybe fine)
2. The bottom row hint text always says `^c to quit` but this PR doesn't
really make that worse.




https://github.com/user-attachments/assets/6eddadec-0d84-4fa7-abcb-d6f5a04e5748


Fixes https://github.com/openai/codex/issues/3126
2025-09-07 23:21:53 -04:00
pakrym-oai
0269096229 Move token usage/context information to session level (#3221)
Move context information into the main loop so it can be used to
interrupt the loop or start auto-compaction.
2025-09-06 15:19:23 +00:00
Jeremy Rose
be23fe1353 Pause status timer while modals are open (#3131)
Summary:
- pause the status timer while waiting on approval modals
- expose deterministic pause/resume helpers to avoid sleep-based tests
- simplify bottom pane timer handling now that the widget owns the clock
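A minimal sketch of a pausable clock along these lines (names and API are assumptions, not the real widget):

```rust
use std::time::{Duration, Instant};

/// Elapsed time accumulates only while running; pause/resume are
/// deterministic, so tests can drive them directly instead of sleeping.
struct PausableTimer {
    accumulated: Duration,
    running_since: Option<Instant>,
}

impl PausableTimer {
    fn new() -> Self {
        Self { accumulated: Duration::ZERO, running_since: Some(Instant::now()) }
    }

    fn pause(&mut self) {
        if let Some(start) = self.running_since.take() {
            self.accumulated += start.elapsed();
        }
    }

    fn resume(&mut self) {
        if self.running_since.is_none() {
            self.running_since = Some(Instant::now());
        }
    }

    fn elapsed(&self) -> Duration {
        self.accumulated + self.running_since.map_or(Duration::ZERO, |s| s.elapsed())
    }
}
```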
2025-09-04 12:37:43 -07:00
Jeremy Rose
e442ecedab rework message styling (#2877)
https://github.com/user-attachments/assets/cf07f62b-1895-44bb-b9c3-7a12032eb371
2025-09-02 17:29:58 +00:00
dedrisian-oai
b8e8454b3f Custom /prompts (#2696)
Adds custom `/prompts`, loaded from `~/.codex/prompts/<command>.md`.

<img width="239" height="107" alt="Screenshot 2025-08-25 at 6 22 42 PM"
src="https://github.com/user-attachments/assets/fe6ebbaa-1bf6-49d3-95f9-fdc53b752679"
/>

---

Details:

1. Adds `Op::ListCustomPrompts` to core.
2. Returns `ListCustomPromptsResponse` with list of `CustomPrompt`
(name, content).
3. TUI calls the operation on load, and populates the custom prompts
(excluding prompts that collide with builtins).
4. Selecting the custom prompt automatically sends the prompt to the
agent.
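Assuming the shapes named above, a sketch of the `CustomPrompt` payload and the discovery step (the function is hypothetical; collision filtering against builtins is omitted):

```rust
use std::fs;
use std::path::Path;

/// Assumed shape of the payload: a prompt's name (the file stem) and its
/// markdown content.
struct CustomPrompt {
    name: String,
    content: String,
}

/// Read `~/.codex/prompts/<command>.md` files into `CustomPrompt`s.
fn discover_prompts(dir: &Path) -> std::io::Result<Vec<CustomPrompt>> {
    let mut prompts = Vec::new();
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.extension().and_then(|e| e.to_str()) == Some("md") {
            if let Some(stem) = path.file_stem().and_then(|s| s.to_str()) {
                prompts.push(CustomPrompt {
                    name: stem.to_string(),
                    content: fs::read_to_string(&path)?,
                });
            }
        }
    }
    Ok(prompts)
}
```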
2025-08-29 02:16:39 +00:00
Ahmed Ibrahim
c9ca63dc1e burst paste edge cases (#2683)
This PR fixes two edge cases in handling burst paste (mainly in PowerShell).
Bugs:
- Needs a key event after the paste before the pasted items are rendered

> `ChatComposer::flush_paste_burst_if_due()` flushes on timeout. Called:
>     - Pre-render in `App` on `TuiEvent::Draw`.
>     - Via a delayed frame:
> `BottomPane::request_redraw_in(ChatComposer::recommended_paste_flush_delay())`.

- Parses the first two key events separately before burst-paste parsing starts

> When the threshold is crossed, pull the preceding burst characters out of the
> textarea, prepend them to `paste_burst_buffer`, and keep buffering.

- Integrates with #2567 to bring image pasting to Windows.
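A rough sketch of the buffer-and-flush idea (field names and the delay value are assumptions, not the real `ChatComposer` internals):

```rust
use std::time::{Duration, Instant};

/// Rapid key events are treated as a paste burst; the buffer is flushed once
/// no new character arrives within the recommended delay.
struct PasteBurst {
    buffer: String,
    last_char_at: Option<Instant>,
}

const FLUSH_DELAY: Duration = Duration::from_millis(25); // assumed value

impl PasteBurst {
    fn push_char(&mut self, ch: char) {
        self.buffer.push(ch);
        self.last_char_at = Some(Instant::now());
    }

    /// Called pre-render and from a delayed redraw; returns the buffered text
    /// once the burst has gone quiet.
    fn flush_if_due(&mut self) -> Option<String> {
        match self.last_char_at {
            Some(t) if t.elapsed() >= FLUSH_DELAY && !self.buffer.is_empty() => {
                self.last_char_at = None;
                Some(std::mem::take(&mut self.buffer))
            }
            _ => None,
        }
    }
}
```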
2025-08-28 12:54:12 -07:00
ae
274d9b413f [feat] Simplify command approval UI (#2708)
- Removed the plain "No" option, which confused the model,
  since we already have the "No, provide feedback" option,
  which works better.

# Before

<img width="476" height="168" alt="image"
src="https://github.com/user-attachments/assets/6e783d9f-dec9-4610-9cad-8442eb377a90"
/>

# After

<img width="553" height="175" alt="image"
src="https://github.com/user-attachments/assets/3cdae582-3366-47bc-9753-288930df2324"
/>
2025-08-26 10:08:06 -07:00
ae
d085f73a2a [feat] reduce bottom padding to 1 line (#2704) 2025-08-25 22:47:26 -07:00
Ahmed Ibrahim
907afc9425 Fix esc (#2661)
Esc should have other functions when it's not used in a backtracking
situation, e.g. to cancel the pop-up menu when selecting model/approvals or to
interrupt an active turn.
2025-08-25 15:38:46 -07:00
Jeremy Rose
251c4c2ba9 tui: queue messages (#2637)
https://github.com/user-attachments/assets/44349aa6-3b97-4029-99e1-5484e9a8775f
2025-08-25 21:38:38 +00:00
Ahmed Ibrahim
c6a52d611c Resume conversation from an earlier point in history (#2607)
Fixes the merge conflict from #2588.


https://github.com/user-attachments/assets/392c7c37-cf8f-4ed6-952e-8215e8c57bc4
2025-08-23 23:23:15 -07:00
Jeremy Rose
d994019f3f tui: coalesce command output; show unabridged commands in transcript (#2590)
https://github.com/user-attachments/assets/effec7c7-732a-4b61-a2ae-3cb297b6b19b
2025-08-22 16:32:31 -07:00
Jeremy Rose
bbf42f4e12 improve performance of 'cargo test -p codex-tui' (#2593)
before:

```
$ time cargo test -p codex-tui -q
[...]
cargo test -p codex-tui -q  39.89s user 10.77s system 98% cpu 51.328 total
```

after:

```
$ time cargo test -p codex-tui -q
[...]
cargo test -p codex-tui -q  1.37s user 0.64s system 29% cpu 6.699 total
```

The major offenders were the textarea fuzz test and the custom_terminal
doctests. (I think the doctests were being recompiled every time, which made
them extra slow?)
2025-08-22 14:03:58 -07:00
pap-openai
c5d21a4564 ctrl+v image + @file accepts images (#1695)
Allows ctrl+v in the TUI for images. Also, @file references that are images
are attached as raw files (and read by the model) rather than pasted as a
path that cannot be read by the model.

Re-uses the same components and interface we use for copying pasted content
in 72504f1d9c.
@aibrahim-oai as you've implemented this, mind having a look at this
one?


https://github.com/user-attachments/assets/c6c1153b-6b32-4558-b9a2-f8c57d2be710

---------

Co-authored-by: easong-openai <easong@openai.com>
Co-authored-by: Daniel Edrisian <dedrisian@openai.com>
Co-authored-by: Michael Bolin <mbolin@openai.com>
2025-08-22 17:05:43 +00:00
Jeremy Rose
e95cad1946 hide CoT by default; show headers in status indicator (#2316)
Plan is for full CoT summaries to be visible in a "transcript view" when
we implement that, but for now they're hidden.


https://github.com/user-attachments/assets/e8a1b0ef-8f2a-48ff-9625-9c3c67d92cdb
2025-08-20 16:58:56 -07:00
Jeremy Rose
0d12380c3b refactor onboarding screen to a separate "app" (#2524)
This is in preparation for adding more separate "modes" to the TUI, in
particular a "transcript mode" to view the full history once #2316 lands.

1. split apart "tui events" from "app events".
2. remove onboarding-related events from AppEvent.
3. move several general drawing tools out of App and into a new Tui
class
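An illustrative sketch of the split (the variant lists are made up; only the division of responsibilities mirrors the description above):

```rust
use crossterm::event::KeyEvent;

/// Low-level terminal events owned by the new Tui layer.
enum TuiEvent {
    Key(KeyEvent),
    Draw,
}

/// Higher-level application events; onboarding-specific variants are gone.
enum AppEvent {
    SubmitUserMessage(String),
    RequestRedraw,
}
```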
2025-08-20 20:47:24 +00:00
Jeremy Rose
61bbabe7d9 tui: switch to using tokio + EventStream for processing crossterm events (#2489)
Bringing the TUI further into tokio-land to make it easier to factor apart.

fyi @bolinfest
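The resulting event loop is roughly of this shape (a sketch; it assumes crossterm's `event-stream` feature and a stream extension trait such as `futures::StreamExt`):

```rust
use crossterm::event::{Event, EventStream};
use futures::StreamExt;

/// Poll crossterm events as an async stream inside the tokio runtime instead
/// of blocking on `event::read()` from a dedicated thread.
async fn event_loop() {
    let mut events = EventStream::new();
    while let Some(Ok(event)) = events.next().await {
        match event {
            Event::Key(_key) => { /* forward to the app/chat widget */ }
            Event::Resize(_, _) => { /* schedule a redraw */ }
            _ => {}
        }
    }
}
```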
2025-08-20 17:11:09 +00:00
Ahmed Ibrahim
e91c3d6d1c Support changing reasoning effort (#2435)
https://github.com/user-attachments/assets/50198ee8-5915-47a3-bb71-69af65add1ef

Building on #2431 and #2428
2025-08-19 17:55:07 +00:00
Jeremy Rose
7a80d3c96c replace /prompts with a rotating placeholder (#2314) 2025-08-15 19:37:10 -07:00
aibrahim-oai
d3078b9adc Show progress indicator for /diff command (#2245)
## Summary
- Show a temporary "Working on diff" state in the bottom pane
- Add `DiffResult` app event and dispatch git diff asynchronously

## Testing
- `just fmt`
- `just fix` *(fails: `let` expressions in this position are unstable)*
- `cargo test --all-features` *(fails: `let` expressions in this
position are unstable)*

------
https://chatgpt.com/codex/tasks/task_i_689a839f32b88321840a893551d5fbef
2025-08-15 15:32:41 -07:00
Jeremy Rose
b42e679227 remove "status text" in bottom line (#2279)
this used to hold the most recent log line, but it was kinda broken and
not that useful.
2025-08-14 14:10:21 -04:00
easong-openai
6340acd885 Re-add markdown streaming (#2029)
Wait for newlines, then render markdown on a line by line basis. Word wrap it for the current terminal size and then spit it out line by line into the UI. Also adds tests and fixes some UI regressions.
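In outline, the line buffering looks something like this sketch (names are illustrative; the markdown rendering and word wrapping are left as comments):

```rust
/// Deltas accumulate until a newline arrives; only completed lines are
/// emitted, so partial markdown is never displayed.
struct LineStream {
    pending: String,
}

impl LineStream {
    fn push_delta(&mut self, delta: &str) -> Vec<String> {
        self.pending.push_str(delta);
        let mut completed = Vec::new();
        while let Some(idx) = self.pending.find('\n') {
            let line: String = self.pending.drain(..=idx).collect();
            // In the real TUI this line would be rendered as markdown and
            // word-wrapped to the terminal width before being emitted.
            completed.push(line.trim_end_matches('\n').to_string());
        }
        completed
    }
}
```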
2025-08-12 17:37:28 -07:00
ae
a48372ce5d feat: add a /mention slash command (#2114)
- To help people discover @mentions.
- The command just places an `@` in the composer.
- #2115 then improves the behavior of @mentions with empty queries.
2025-08-11 14:15:41 -07:00
easong-openai
52e12f2b6c Revert "Streaming markdown (#1920)" (#1981)
This reverts commit 2b7139859e.
2025-08-08 01:38:39 +00:00
easong-openai
2b7139859e Streaming markdown (#1920)
We wait until we have an entire line, then format it with markdown and stream it into the UI. This reduces time to first token but is the right thing to do with our current rendering model IMO. Also lets us add word wrapping!
2025-08-07 18:26:47 -07:00
pakrym-oai
c87fb83d81 Calculate remaining context based on last token usage (#1940)
We should only take the last request's size (in tokens) into account
2025-08-07 05:17:18 -07:00
easong-openai
2098b40369 Scrollable slash commands (#1830)
Scrollable slash commands. Part 1 of a multi-PR series.
2025-08-06 21:23:09 -07:00
easong-openai
e0303dbac0 Rescue chat completion changes (#1846)
https://github.com/openai/codex/pull/1835 has some messed up history.

This adds support for streaming chat completions, which is useful for Ollama. We should probably view the code introduced in this PR with a very skeptical eye.

---------

Co-authored-by: Ahmed Ibrahim <aibrahim@openai.com>
2025-08-05 08:56:13 +00:00
easong-openai
906d449760 Stream model responses (#1810)
Stream the model's thoughts and responses instead of waiting for the whole
thing to come through. Very rough right now, but I'm making the risk call to push through.
2025-08-05 04:23:22 +00:00
Jeremy Rose
d62b703a21 custom textarea (#1794)
This replaces tui-textarea with a custom textarea component.

Key differences:
1. wrapped lines
2. better unicode handling
3. uses the native terminal cursor

This should perhaps be spun out into its own separate crate at some
point, but for now it's convenient to have it in-tree.
2025-08-03 11:31:35 -07:00
easong-openai
575590e4c2 Detect kitty terminals (#1748)
We want to detect kitty terminals so we can preferentially upgrade their UX without degrading older terminals.
2025-08-01 00:30:44 +00:00
Jeremy Rose
d86270696e streamline ui (#1733)
Simplify and improve many UI elements.
* Remove all-around borders in most places. These interact badly with
terminal resizing and look heavy. Prefer left-side-only borders.
* Make the viewport adjust to the size of its contents.
* <kbd>/</kbd> and <kbd>@</kbd> autocomplete boxes appear below the
prompt, instead of above it.
* Restyle the keyboard shortcut hints & move them to the left.
* Restyle the approval dialog.
* Use synchronized rendering to avoid flashing during rerenders.


https://github.com/user-attachments/assets/96f044af-283b-411c-b7fc-5e6b8a433c20

<img width="1117" height="858" alt="Screenshot 2025-07-30 at 5 29 20 PM"
src="https://github.com/user-attachments/assets/0cc0af77-8396-429b-b6ee-9feaaccdbee7"
/>
2025-07-31 00:43:21 -07:00
Jeremy Rose
f2134f6633 resizable viewport (#1732)
Proof of concept for a resizable viewport.

The general approach here is to duplicate the `Terminal` struct from
ratatui, but with our own logic. This is a "light fork" in that we are
still using all the base ratatui functions (`Buffer`, `Widget` and so
on), but we're doing our own bookkeeping at the top level to determine
where to draw everything.

This approach could use improvement; e.g., when the window is resized to a
smaller size and the UI wraps, we don't correctly clear out the artifacts from
wrapping. This is possible with a little work (i.e. tracking what parts of our
UI would have been wrapped), but this behavior is at least on par with the
existing behavior.
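A tiny sketch of the "light fork" idea: ratatui's `Buffer` and `Widget` machinery stays, and only the placement bookkeeping is ours (the function is illustrative, not the forked `Terminal` code):

```rust
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;
use ratatui::widgets::{Paragraph, Widget};

/// Render any ratatui widget into a buffer for an area we chose ourselves;
/// deciding what `area` is and where it sits on screen is the custom part.
fn render_into(area: Rect) -> Buffer {
    let mut buf = Buffer::empty(area);
    Paragraph::new("hello from the resizable viewport").render(area, &mut buf);
    buf
}
```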


https://github.com/user-attachments/assets/4eb17689-09fd-4daa-8315-c7ebc654986d


cc @joshka who might have Thoughts™
2025-07-31 00:06:55 +00:00
easong-openai
80c19ea77c Fix approval workflow (#1696)
(Hopefully) temporary solution to the invisible approvals problem -
prints commands to history when they need approval and then also prints
the result of the approval. In the near future we should be able to do
some fancy stuff with updating commands before writing them to permanent
history.

Also, ctrl-c while in the approval modal now acts as esc (aborts the command)
and puts the TUI in the state where one additional ctrl-c will exit.
2025-07-28 19:00:06 +00:00
easong-openai
58bed77ba7 Remove tab focus switching (#1694)
Previously, pressing tab would switch TUI focus to the history scrollbox; this is no longer necessary.
2025-07-27 11:04:09 -07:00
easong-openai
480e82b00d Easily Selectable History (#1672)
This update replaces the previous ratatui history widget with an
append-only log so that the terminal can handle text selection and
scrolling. It also disables streaming responses, which we'll do our best
to bring back in a later PR. It also adds a small summary of token use
after the TUI exits.
2025-07-25 01:56:40 -07:00
aibrahim-oai
bb30ab9e96 Implement redraw debounce (#1599)
## Summary
- debounce redraw events so repeated requests don't overwhelm the
terminal
- add `RequestRedraw` event and schedule redraws after 100ms
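A rough sketch of the debounce (assumed names; the real scheduling lives in the app event loop):

```rust
use std::sync::mpsc::Sender;
use std::thread;
use std::time::Duration;

enum AppEvent {
    Redraw,
}

/// Coalesce bursts of redraw requests: if a redraw is already scheduled,
/// drop the request; otherwise emit a single Redraw ~100ms from now.
/// (`redraw_pending` would be cleared when the Redraw event is handled.)
fn request_redraw(tx: Sender<AppEvent>, redraw_pending: &mut bool) {
    if *redraw_pending {
        return;
    }
    *redraw_pending = true;
    thread::spawn(move || {
        thread::sleep(Duration::from_millis(100));
        let _ = tx.send(AppEvent::Redraw);
    });
}
```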

## Testing
- `cargo clippy --tests`
- `cargo test` *(fails: Sandbox Denied errors in landlock tests)*

------
https://chatgpt.com/codex/tasks/task_i_68792a65b8b483218ec90a8f68746cd8

---------

Co-authored-by: Michael Bolin <mbolin@openai.com>
2025-07-17 12:54:55 -07:00
Michael Bolin
5b820c5ce7 feat: ctrl-d only exits when there is no user input (#1589)
While this does make it so that `ctrl-d` will not exit Codex when the
composer is not empty, `ctrl-d` will still exit Codex if it is in the
"working" state.

Fixes https://github.com/openai/codex/issues/1443.
2025-07-16 08:59:26 -07:00
aibrahim-oai
72504f1d9c Add paste summarization to Codex TUI (#1549)
## Summary
- introduce `Paste` event to avoid per-character paste handling
- collapse large pasted blocks to `[Pasted Content X lines]`
- store the real text so submission still includes it
- wire paste handling through `App`, `ChatWidget`, `BottomPane`, and
`ChatComposer`
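A sketch of the collapse step (the threshold and names are assumptions; only the placeholder format comes from the description above):

```rust
/// Large pastes are shown as a short placeholder in the composer, while the
/// full text is kept so the submitted message still contains it.
struct PastePlaceholder {
    placeholder: String,
    original: String,
}

const LARGE_PASTE_LINES: usize = 10; // assumed threshold

fn summarize_paste(pasted: String) -> PastePlaceholder {
    let lines = pasted.lines().count();
    let placeholder = if lines >= LARGE_PASTE_LINES {
        format!("[Pasted Content {lines} lines]")
    } else {
        pasted.clone()
    };
    PastePlaceholder { placeholder, original: pasted }
}
```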

## Testing
- `cargo test -p codex-tui`


------
https://chatgpt.com/codex/tasks/task_i_6871e24abf80832184d1f3ca0c61a5ee


https://github.com/user-attachments/assets/eda7412f-da30-4474-9f7c-96b49d48fbf8
2025-07-12 15:32:00 -07:00
Michael Bolin
4a341efe92 feat: highlight matching characters in fuzzy file search (#1420)
Using the new file-search API introduced in
https://github.com/openai/codex/pull/1419, matching characters are now
shown in bold in the TUI:


https://github.com/user-attachments/assets/8bbcc6c6-75a3-493f-8ea4-b2a063e09b3a

Fixes https://github.com/openai/codex/issues/1261
2025-06-28 15:04:23 -07:00
Michael Bolin
5a0f236ca4 feat: add support for @ to do file search (#1401)
Introduces support for `@` to trigger a fuzzy-filename search in the
composer. Under the hood, this leverages
https://crates.io/crates/nucleo-matcher to do the fuzzy matching and
https://crates.io/crates/ignore to build up the list of file candidates
(so that it respects `.gitignore`).

For simplicity (at least for now), we do not do any caching between
searches like VS Code does for its file search:


1d89ed699b/src/vs/workbench/services/search/node/rawSearchService.ts (L212-L218)

Because we do not do any caching, I saw queries take up to three seconds
on large repositories with hundreds of thousands of files. To that end,
we do not perform searches synchronously on each keystroke, but instead
dispatch an event to do the search on a background thread that
asynchronously reports back to the UI when the results are available.
This is largely handled by the `FileSearchManager` introduced in this
PR, which also has logic for debouncing requests so there is at most one
search in flight at a time.
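A simplified sketch of the search path under those constraints, using the `ignore` crate for the candidate walk and a plain subsequence check as a stand-in for nucleo-matcher's scoring (the real code scores with nucleo-matcher and runs through `FileSearchManager` off the UI thread):

```rust
use ignore::WalkBuilder;

/// Walk the repo (respecting .gitignore) and keep paths whose characters
/// contain the query as a case-insensitive subsequence.
fn fuzzy_files(root: &str, query: &str) -> Vec<String> {
    let is_match = |candidate: &str| {
        let mut chars = query.chars();
        let mut next = chars.next();
        for c in candidate.chars() {
            if Some(c.to_ascii_lowercase()) == next.map(|q| q.to_ascii_lowercase()) {
                next = chars.next();
            }
        }
        next.is_none()
    };
    WalkBuilder::new(root)
        .build()
        .filter_map(Result::ok)
        .filter(|e| e.file_type().is_some_and(|t| t.is_file()))
        .map(|e| e.path().display().to_string())
        .filter(|p| is_match(p))
        .collect()
}
```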

While we could potentially polish and tune this feature further, it may
already be overengineered for how it will be used, in practice, so we
can improve things going forward if it turns out that this is not "good
enough" in the wild.

Note this feature does not work like `@` in the TypeScript CLI, which
was more like directory-based tab completion. In the Rust CLI, `@`
triggers a full-repo fuzzy-filename search.

Fixes https://github.com/openai/codex/issues/1261.
2025-06-28 13:47:42 -07:00
Gabriel Peal
2e293ce903 Handle Ctrl+C quit when idle (#1402)
## Summary
- show `Ctrl+C to quit` hint when pressing Ctrl+C with no active task
- exit with Ctrl+C if the hint is already visible
- clear the hint when tasks begin or other keys are pressed


https://github.com/user-attachments/assets/931e2d7c-1c80-4b45-9908-d119f74df23c



------
https://chatgpt.com/s/cd_685ec8875a308191beaa95886dc1379e

Fixes #1245
2025-06-27 13:37:11 -04:00
Michael Bolin
fcfe43c7df feat: show number of tokens remaining in UI (#1388)
When using the OpenAI Responses API, we now record the `usage` field for
a `"response.completed"` event, which includes metrics about the number
of tokens consumed. We also introduce `openai_model_info.rs`, which
includes current data about the most common OpenAI models available via
the API (specifically `context_window` and `max_output_tokens`). If
Codex does not recognize the model, you can set `model_context_window`
and `model_max_output_tokens` explicitly in `config.toml`.

We then introduce a new event type to `protocol.rs`, `TokenCount`,
which includes the `TokenUsage` for the most recent turn.

Finally, we update the TUI to record the running sum of tokens used so
the percentage of available context window remaining can be reported via
the placeholder text for the composer:

![Screenshot 2025-06-25 at 11 20
55 PM](https://github.com/user-attachments/assets/6fd6982f-7247-4f14-84b2-2e600cb1fd49)
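The placeholder percentage is simple arithmetic over the running total and the model's context window (a sketch; saturating subtraction guards against usage exceeding the configured window):

```rust
/// Percentage of the context window still available, given the running sum
/// of tokens used and the model's context_window (possibly overridden by
/// `model_context_window` in config.toml).
fn percent_context_left(tokens_used: u64, context_window: u64) -> u8 {
    if context_window == 0 {
        return 0;
    }
    let remaining = context_window.saturating_sub(tokens_used);
    ((remaining * 100) / context_window) as u8
}
```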

We could certainly get much fancier with this (such as reporting the
estimated cost of the conversation), but for now, we are just trying to
achieve feature parity with the TypeScript CLI.

Though arguably this improves upon the TypeScript CLI, as the TypeScript
CLI uses heuristics to estimate the number of tokens used rather than
using the `usage` information directly:


296996d74e/codex-cli/src/utils/approximate-tokens-used.ts (L3-L16)

Fixes https://github.com/openai/codex/issues/1242
2025-06-25 23:31:11 -07:00
Michael Bolin
ce2ecbe72f feat: record messages from user in ~/.codex/history.jsonl (#939)
This is a large change to support a "history" feature like you would
expect in a shell like Bash.

History events are recorded in `$CODEX_HOME/history.jsonl`. Because it
is a JSONL file, it is straightforward to append new entries (as opposed
to the TypeScript CLI, which uses `$CODEX_HOME/history.json`: to keep it
valid JSON, each new entry entails rewriting the entire file). Because
it is possible for there to be multiple instances of Codex CLI writing
to `history.jsonl` at once, we use advisory file locking when working
with `history.jsonl` in `codex-rs/core/src/message_history.rs`.
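A minimal sketch of the append path (the entry shape and the use of serde_json are assumptions; the real record, the advisory lock, and the `0o600` file mode live in `message_history.rs`):

```rust
use std::fs::OpenOptions;
use std::io::Write;
use std::path::Path;

/// Append one history entry as a single JSON line.
fn append_history(codex_home: &Path, text: &str) -> std::io::Result<()> {
    let path = codex_home.join("history.jsonl");
    let mut file = OpenOptions::new().create(true).append(true).open(path)?;
    // One JSON object per line keeps appends cheap, unlike rewriting a whole
    // JSON array. The real implementation takes an advisory file lock here.
    let line = serde_json::json!({ "text": text });
    writeln!(file, "{line}")
}
```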

Because we believe history is a sufficiently useful feature, we enable
it by default. Though to provide some safety, we set the file
permissions of `history.jsonl` to `0o600` so that other users on the
system cannot read the user's history. We do not yet support a default
list of `SENSITIVE_PATTERNS` as the TypeScript CLI does:


3fdf9df133/codex-cli/src/utils/storage/command-history.ts (L10-L17)

We are going to take a more conservative approach to this list in the
Rust CLI. For example, while `/\b[A-Za-z0-9-_]{20,}\b/` might exclude
sensitive information like API tokens, it would also exclude valuable
information such as references to Git commits.

As noted in the updated documentation, users can opt-out of history by
adding the following to `config.toml`:

```toml
[history]
persistence = "none" 
```

Because `history.jsonl` could, in theory, be quite large, we take a[n
arguably overly pedantic] approach in reading history entries into
memory. Specifically, we start by telling the client the current number
of entries in the history file (`history_entry_count`) as well as the
inode (`history_log_id`) of `history.jsonl` (see the new fields on
`SessionConfiguredEvent`).

The client is responsible for keeping new entries in memory to create a
"local history," but if the user hits up enough times to go "past" the
end of local history, then the client should use the new
`GetHistoryEntryRequest` in the protocol to fetch older entries.
Specifically, it should pass the `history_log_id` it was given
originally and work backwards from `history_entry_count`. (It should
really fetch history in batches rather than one-at-a-time, but that is
something we can improve upon in subsequent PRs.)

The motivation behind this crazy scheme is that it is designed to defend
against:

* The `history.jsonl` being truncated during the session such that the
index into the history is no longer consistent with what had been read
up to that point. We do not yet have logic to enforce a `max_bytes` for
`history.jsonl`, but once we do, we will aspire to implement it in a way
that should result in a new inode for the file on most systems.
* New items from concurrent Codex CLI sessions appending to the history.
Because, in the absence of truncation, `history.jsonl` is an append-only
log, so long as the client reads backwards from `history_entry_count`,
it should always get a consistent view of history. (That said, it will
not be able to read _new_ commands from concurrent sessions, but perhaps
we will introduce a `/` command to reload latest history or something
down the road.)

Admittedly, my testing of this feature thus far has been fairly light. I
expect we will find bugs and introduce enhancements/fixes going forward.
2025-05-15 16:26:23 -07:00
Michael Bolin
3fdf9df133 chore: introduce AppEventSender to help fix clippy warnings and update to Rust 1.87 (#948)
Moving to Rust 1.87 introduced a clippy warning that
`SendError<AppEvent>` was too large.

In practice, the only thing we ever did when we got this error was log
it (if the mpsc channel is closed, then the app is likely shutting down
or something, so there's not much to do...), so this finally motivated
me to introduce `AppEventSender`, which wraps
`std::sync::mpsc::Sender<AppEvent>` with a `send()` method that invokes
`send()` on the underlying `Sender` and logs an `Err` if it gets one.
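In outline (the event variant and the use of `tracing` for logging are assumptions):

```rust
use std::sync::mpsc::Sender;

enum AppEvent {
    Redraw,
}

/// Wraps the mpsc sender; sending never returns a Result, since a closed
/// channel just means the app is shutting down, so we only log the error.
#[derive(Clone)]
struct AppEventSender {
    tx: Sender<AppEvent>,
}

impl AppEventSender {
    fn send(&self, event: AppEvent) {
        if let Err(err) = self.tx.send(event) {
            tracing::error!("failed to send AppEvent: {}", err);
        }
    }
}
```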

This greatly simplifies the code, as many functions that previously
returned `Result<(), SendError<AppEvent>>` now return `()`, so we don't
have to propagate an `Err` all over the place that we don't really
handle, anyway.

This also makes it so we can upgrade to Rust 1.87 in CI.
2025-05-15 14:50:30 -07:00
Michael Bolin
a12e4b0b31 feat: add support for commands in the Rust TUI (#935)
Introduces support for slash commands like in the TypeScript CLI. We do
not support the full set of commands yet, but the core abstraction is
there now.

In particular, we have a `SlashCommand` enum and due to thoughtful use
of the [strum](https://crates.io/crates/strum) crate, it requires
minimal boilerplate to add a new command to the list.
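A sketch of the pattern (the variant list and strum attributes are illustrative, not the real enum):

```rust
use strum::IntoEnumIterator;
use strum_macros::{AsRefStr, EnumIter, EnumString};

/// Adding a command is just adding a variant: iteration for the popup and
/// parsing of the typed name come from the derives.
#[derive(Debug, Clone, Copy, EnumIter, EnumString, AsRefStr)]
#[strum(serialize_all = "kebab-case")]
enum SlashCommand {
    Diff,
    Mention,
    Prompts,
}

fn main() {
    // Populate the popup list.
    for cmd in SlashCommand::iter() {
        let name: &str = cmd.as_ref();
        println!("/{name}");
    }
    // Parse what the user typed after '/'.
    let parsed: SlashCommand = "diff".parse().unwrap();
    println!("{parsed:?}");
}
```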

The key new piece of UI is `CommandPopup`, though the keyboard events
are still handled by `ChatComposer`. The behavior is roughly as follows:

* if the first character in the composer is `/`, the command popup is
displayed (if you really want to send a message to Codex that starts
with a `/`, simply put a space before the `/`)
* while the popup is displayed, up/down can be used to change the
selection of the popup
* if there is a selection, hitting tab completes the command, but does
not send it
* if there is a selection, hitting enter sends the command
* if the prefix of the composer matches a command, the command will be
visible in the popup so the user can see the description (commands could
take arguments, so additional text may appear after the command name
itself)


https://github.com/user-attachments/assets/39c3e6ee-eeb7-4ef7-a911-466d8184975f

Incidentally, Codex wrote almost all the code for this PR!
2025-05-14 12:55:49 -07:00
Michael Bolin
0402aef126 chore: move each view used in BottomPane into its own file (#928)
`BottomPane` was getting a bit unwieldy because it maintained a
`PaneState` enum with three variants and many of its methods had `match`
statements to handle each variant. To replace the enum, this PR:

* Introduces a `trait BottomPaneView` that has two implementations:
`StatusIndicatorView` and `ApprovalModalView`.
* Migrates `PaneState::TextInput` into its own struct, `ChatComposer`,
that does **not** implement `BottomPaneView`.
* Updates `BottomPane` so it has `composer: ChatComposer` and
`active_view: Option<Box<dyn BottomPaneView<'a> + 'a>>`. The idea is
that `active_view` takes priority and is displayed when it is `Some`;
otherwise, `ChatComposer` is displayed.
* While methods of `BottomPane` often have to check whether
`active_view` is present to decide which component to delegate to, the
code is more straightforward than before and introducing new
implementations of `BottomPaneView` should be less painful.

Because we want to retain the `TextArea` owned by `ChatComposer` even
when another view is displayed, to keep the ownership logic simple, it
seemed best to keep `ChatComposer` distinct from `BottomPaneView`.
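Schematically (the method set and field details are illustrative; only the overall arrangement mirrors the description above):

```rust
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;

/// Transient views (status indicator, approval modal) implement this trait.
trait BottomPaneView {
    fn render(&self, area: Rect, buf: &mut Buffer);
}

/// ChatComposer stays a plain struct (not a BottomPaneView) so that
/// BottomPane always owns the TextArea, even while another view is shown.
struct ChatComposer {
    draft: String,
}

struct BottomPane {
    composer: ChatComposer,
    active_view: Option<Box<dyn BottomPaneView>>,
}

impl BottomPane {
    fn render(&self, area: Rect, buf: &mut Buffer) {
        match &self.active_view {
            // The active view, when present, takes priority...
            Some(view) => view.render(area, buf),
            // ...otherwise the composer is displayed.
            None => { /* render self.composer (draft text, cursor, etc.) */ }
        }
    }
}
```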
2025-05-14 10:13:29 -07:00