Keep a 1-cell margin at the right edge of the screen in the composer
(and in the user message in history).
This lets us print clear-to-EOL one character before the end of the line in
history, so that resizing the terminal keeps the background color
(at least in iTerm/Terminal.app). It also stops the cursor in the
textarea from floating off the right edge.
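A minimal sketch of the idea, assuming a ratatui-style `Rect`; the helper name is illustrative, not the actual composer code:

```rust
use ratatui::layout::Rect;

/// Reserve a 1-cell margin on the right edge before rendering the
/// composer (or a history cell), so the last column is never written to.
fn with_right_margin(area: Rect) -> Rect {
    Rect {
        width: area.width.saturating_sub(1),
        ..area
    }
}
```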
---------
Co-authored-by: joshka-oai <joshka@openai.com>
Previously, when you entered `/di`, hit Tab to complete it to `/diff`, and then
hit Enter, it would execute `/diff`. More recently it was just sent as plain
text instead. This fixes that.
## Summary
- show the remaining context window percentage in `/status` alongside
existing token usage details
- replace the composer shortcut prompt with the context window
percentage (or an unavailable message) while a task is running
- update TUI snapshots to reflect the new context window line
## Testing
- `cargo test -p codex-tui`
------
https://chatgpt.com/codex/tasks/task_i_68dc6e7397ac8321909d7daff25a396c
Here's the logic (a rough sketch follows the list):
1. If text is empty and selector is open:
- Enter on a prompt without args should autosubmit the prompt
- Enter on a prompt with numeric args should add `/prompts:name ` to the
text input
- Enter on a prompt with named args should add `/prompts:name ARG1=""
ARG2=""` to the text input
2. If text is not empty but no args are passed:
- For prompts with numeric args -> we allow it to submit (params are
optional)
- For prompts with named args -> we throw an error (all params should
have values)
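A hypothetical sketch of that decision flow; the types and names here are illustrative, not the actual TUI code:

```rust
enum PromptArgs {
    None,
    Numeric,
    Named(Vec<String>),
}

enum SelectionResult {
    Submit(String),
    InsertIntoComposer(String),
    Error(String),
}

fn on_enter(name: &str, args: &PromptArgs, text: &str) -> SelectionResult {
    if text.is_empty() {
        // Case 1: empty composer, selector open.
        match args {
            PromptArgs::None => SelectionResult::Submit(format!("/prompts:{name}")),
            PromptArgs::Numeric => {
                SelectionResult::InsertIntoComposer(format!("/prompts:{name} "))
            }
            PromptArgs::Named(names) => {
                let placeholders: Vec<String> =
                    names.iter().map(|n| format!("{n}=\"\"")).collect();
                SelectionResult::InsertIntoComposer(format!(
                    "/prompts:{name} {}",
                    placeholders.join(" ")
                ))
            }
        }
    } else {
        // Case 2: text present but no args supplied.
        match args {
            // Named params must all have values, so reject the submission.
            PromptArgs::Named(_) => {
                SelectionResult::Error("all named params need values".to_string())
            }
            // Numeric args are optional, so let the submission through.
            _ => SelectionResult::Submit(text.to_string()),
        }
    }
}
```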
<img width="454" height="246" alt="Screenshot 2025-09-23 at 2 23 21 PM"
src="https://github.com/user-attachments/assets/fd180a1b-7d17-42ec-b231-8da48828b811"
/>
Fixes for the "? for shortcuts" hint:
- Only show the hint when composer is empty
- Don't reset footer on new task updates
- Reorder the elements
- Align the "?" and "/" with overlay on and off
Based on #4364
Instead of overwriting the contents of the composer when pressing
<kbd>Esc</kbd> when there's a queued message, prepend the queued
message(s) to the composer draft.
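Roughly, the intended behavior looks like this (a sketch; the function and its signature are made up for illustration):

```rust
// Instead of replacing the draft, join the queued messages and prepend
// them, so nothing the user already typed in the composer is lost.
fn restore_queued_into_draft(queued: Vec<String>, draft: &mut String) {
    if queued.is_empty() {
        return;
    }
    let mut restored = queued.join("\n");
    if !draft.is_empty() {
        restored.push('\n');
        restored.push_str(draft);
    }
    *draft = restored;
}
```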
The only file to review closely is the Cargo.toml.
All the other changes come from `just fix` plus a few small manual fixes.
The set of rules was chosen from the list of Clippy lints somewhat
arbitrarily, aiming to improve the readability and style of the code
while limiting the cost to productivity.
Created this PR by:
- adding `redundant_clone` to `[workspace.lints.clippy]` in
`codex-rs/Cargo.toml`
- running `cargo clippy --tests --fix`
- running `just fmt`
Though I had to clean up one instance of the following that resulted:
```rust
let codex = codex;
```
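For illustration, a toy example of the pattern the `redundant_clone` lint flags (not taken from the codebase):

```rust
fn consume(name: String) -> String {
    format!("hello, {name}")
}

fn main() {
    let name = String::from("codex");
    // `name` is never used again, so the clone is redundant;
    // `cargo clippy --fix` rewrites this call to `consume(name)`.
    let greeting = consume(name.clone());
    println!("{greeting}");
}
```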
- In the bottom line of the TUI, print the number of tokens to 3 significant
figures with an SI suffix, e.g. "1.23K" (see the sketch after this list).
- Elsewhere, where we print a number, I figure it's worth printing the exact
value, since e.g. it's a summary of your session; there we print the numbers
comma-separated.
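A rough sketch of the bottom-line formatting, assuming a helper along these lines (not the exact function in the codebase):

```rust
/// Format a token count to 3 significant figures with an SI suffix,
/// e.g. 1_234 -> "1.23K", 56_789 -> "56.8K", 1_234_567 -> "1.23M".
fn format_si(n: u64) -> String {
    const UNITS: [&str; 4] = ["", "K", "M", "G"];
    let mut value = n as f64;
    let mut unit = 0;
    while value >= 1000.0 && unit < UNITS.len() - 1 {
        value /= 1000.0;
        unit += 1;
    }
    if unit == 0 {
        return n.to_string();
    }
    // Keep 3 significant figures: 1.23K, 12.3K, 123K.
    let digits: usize = if value >= 100.0 {
        0
    } else if value >= 10.0 {
        1
    } else {
        2
    };
    format!("{value:.digits$}{}", UNITS[unit])
}
```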
This updates the Ctrl+C behavior to clear the current prompt if there
is text in the composer when you press Ctrl+C.
I also updated the Ctrl+C hint text to show `^c to interrupt` instead
of `^c to quit` when there is an active conversation.
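Roughly, the intended branching looks like this (a sketch using crossterm's event types; the surrounding state and return values are made up):

```rust
use crossterm::event::{KeyCode, KeyEvent, KeyModifiers};

// Illustrative only: how Ctrl+C could branch between clearing the prompt,
// interrupting the running task, and quitting, per the behavior above.
fn on_ctrl_c(key: KeyEvent, composer_text: &mut String, task_running: bool) -> &'static str {
    if key.code == KeyCode::Char('c') && key.modifiers.contains(KeyModifiers::CONTROL) {
        if !composer_text.is_empty() {
            composer_text.clear();
            return "cleared prompt";
        }
        if task_running {
            return "interrupt task"; // hint shows "^c to interrupt"
        }
        return "quit"; // hint shows "^c to quit"
    }
    "ignored"
}
```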
Two things I don't love:
1. You can currently interrupt a conversation with escape or ctrl + c
(not related to this PR and maybe fine)
2. The bottom row hint text always says `^c to quit` but this PR doesn't
really make that worse.
https://github.com/user-attachments/assets/6eddadec-0d84-4fa7-abcb-d6f5a04e5748
Fixes https://github.com/openai/codex/issues/3126
We had multiple issues with the context size calculation:
1. The `initial_prompt_tokens` calculation based on cache size is not
reliable; cache misses might set it to a much higher value. For now it is
hardcoded to a safer constant.
2. The input context size for GPT-5 is 272k (that's where the 33% came from).
This fixes both.
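A sketch of the corrected "context left" math. The 272_000 figure is the GPT-5 input context size mentioned above; `BASELINE_TOKENS` is a placeholder for the "safer constant" (the real value is an assumption here):

```rust
const INPUT_CONTEXT_WINDOW: u64 = 272_000;
const BASELINE_TOKENS: u64 = 12_000; // placeholder, not the actual constant

fn percent_context_left(total_tokens_used: u64) -> u8 {
    // Measure usage against the usable window, ignoring the fixed baseline
    // instead of a cache-derived initial_prompt_tokens estimate.
    let usable = INPUT_CONTEXT_WINDOW.saturating_sub(BASELINE_TOKENS);
    let used = total_tokens_used.saturating_sub(BASELINE_TOKENS).min(usable);
    (100 - used * 100 / usable) as u8
}
```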
#### Summary
- render the edit-queued-message shortcut with the ⌥ modifier on macOS
builds
- add a helper for status indicator snapshot suffixes
- record macOS-specific snapshots for the status indicator widget
## Summary
Pressing Enter with an empty composer was treated as a submission, which
queued a blank message while a task was running. This PR suppresses
submission when there is no text and no attachments.
## Root Cause
- `ChatComposer` returned `Submitted` even when the trimmed text was empty.
`ChatWidget` then queued it during a running task, leading to an empty
item appearing in the queued list and being popped later with no effect.
## Changes
- `ChatComposer` Enter handling: if the trimmed text is empty and there are no
attached images, return `None` instead of `Submitted`.
- No changes to `ChatWidget`; the behavior naturally stops queuing blanks at
the source.
## Code Paths
- Modified: `tui/src/bottom_pane/chat_composer.rs`
- Tests added:
- `tui/src/bottom_pane/chat_composer.rs`: `empty_enter_returns_none`
- `tui/src/chatwidget/tests.rs`:
`empty_enter_during_task_does_not_queue`
## Result
### Before
https://github.com/user-attachments/assets/a40e2f6d-42ba-4a82-928b-8f5458f5884d
### After
https://github.com/user-attachments/assets/958900b7-a566-44fc-b16c-b80380739c92
Adds custom `/prompts` commands sourced from `~/.codex/prompts/<command>.md`.
<img width="239" height="107" alt="Screenshot 2025-08-25 at 6 22 42 PM"
src="https://github.com/user-attachments/assets/fe6ebbaa-1bf6-49d3-95f9-fdc53b752679"
/>
---
Details:
1. Adds `Op::ListCustomPrompts` to core.
2. Returns `ListCustomPromptsResponse` with list of `CustomPrompt`
(name, content).
3. TUI calls the operation on load, and populates the custom prompts
(excluding prompts that collide with builtins).
4. Selecting the custom prompt automatically sends the prompt to the
agent.
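A hypothetical sketch of the discovery step (steps 1–3): read `~/.codex/prompts/*.md` into (name, content) pairs, skipping names that collide with built-ins. The function and its signature are assumptions for illustration:

```rust
use std::fs;
use std::path::Path;

fn load_custom_prompts(dir: &Path, builtins: &[&str]) -> Vec<(String, String)> {
    let mut prompts = Vec::new();
    let Ok(entries) = fs::read_dir(dir) else {
        return prompts;
    };
    for entry in entries.flatten() {
        let path = entry.path();
        // Only consider Markdown files; the file stem becomes the command name.
        if path.extension().and_then(|e| e.to_str()) != Some("md") {
            continue;
        }
        let Some(name) = path.file_stem().and_then(|s| s.to_str()) else {
            continue;
        };
        // Skip prompts that collide with built-in slash commands.
        if builtins.contains(&name) {
            continue;
        }
        if let Ok(content) = fs::read_to_string(&path) {
            prompts.push((name.to_string(), content));
        }
    }
    prompts
}
```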
This PR fixes two edge cases in handling burst paste (mainly on
PowerShell).
Bugs:
- Needs a key event after paste to render the pasted items
> `ChatComposer::flush_paste_burst_if_due()` flushes on timeout. Called:
> - Pre-render in App on `TuiEvent::Draw`.
> - Via a delayed frame:
> `BottomPane::request_redraw_in(ChatComposer::recommended_paste_flush_delay())`.
- Parses the first two key events separately before burst-paste parsing starts
> When the threshold is crossed, pull the preceding burst chars out of the
textarea and prepend them to `paste_burst_buffer`, then keep buffering.
- Integrates with #2567 to bring image pasting to Windows.
Prevented panics when deleting placeholders near multibyte characters by
clamping the cursor to a valid boundary and using `get`-based slicing.
Added a regression test to ensure backspacing after multibyte text
leaves placeholders intact without crashing.
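A sketch of the boundary clamping described above (helper names are illustrative): walk the byte index back until it lands on a UTF-8 char boundary, then slice with `get` so an out-of-range or mid-character index can never panic.

```rust
fn clamp_to_char_boundary(s: &str, mut idx: usize) -> usize {
    if idx > s.len() {
        return s.len();
    }
    // Step back until the index sits on a valid UTF-8 boundary.
    while idx > 0 && !s.is_char_boundary(idx) {
        idx -= 1;
    }
    idx
}

fn safe_prefix(s: &str, idx: usize) -> &str {
    let idx = clamp_to_char_boundary(s, idx);
    // `get` returns None instead of panicking on a bad range.
    s.get(..idx).unwrap_or("")
}
```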
---------
Co-authored-by: Ahmed Ibrahim <aibrahim@openai.com>
Esc and Ctrl+C while a task is running should do the same thing. There
were some cases where pressing Esc would leave a "stuck" widget in the
history; this fixes that and cleans up the logic so there's just one
path for interrupting the task. Also clean up some subtly mishandled key
events (e.g. Ctrl+D would quit the app while an approval modal was
showing if the textarea was empty).
---------
Co-authored-by: Ahmed Ibrahim <aibrahim@openai.com>
In this PR:
- [x] Add support for dragging / copying image files into chat.
- [x] Don't remove image placeholders when submitting.
- [x] Add tests.
Works for:
- Image files
- Dragging macOS screenshots (Terminal, iTerm)
Todos:
- [ ] In some terminals (VS Code, Windows PowerShell, and remote
SSH sessions), copy-pasting a file streams the escaped filepath as individual
key events rather than a single Paste event. We'll need a function (in a
separate PR) for detecting these paste events.
Esc should keep its other functions when it's not used in a
backtracking situation, e.g. cancelling the pop-up menu when selecting a
model/approval mode, or interrupting an active turn.
Introduce a minimal paste-burst heuristic in the chat composer so Enter
is treated as a newline during paste-like bursts (plain chars arriving
in very short intervals), avoiding premature submit after the first line
on Windows consoles that lack bracketed paste.
- Detect tight sequences of plain Char events; open a short window where
Enter inserts a newline instead of submitting.
- Extend the window on newline to handle blank lines in pasted content.
- No behavior change for terminals that already emit Event::Paste; no
OS/env toggles added.
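A sketch of that heuristic; the thresholds and names are illustrative, not the actual values in the composer:

```rust
use std::time::{Duration, Instant};

// Plain chars arriving within a few milliseconds of each other open a short
// window during which Enter is inserted as a newline instead of submitting.
const BURST_GAP: Duration = Duration::from_millis(8);
const WINDOW_AFTER_BURST: Duration = Duration::from_millis(120);

struct PasteBurst {
    last_char_at: Option<Instant>,
    window_until: Option<Instant>,
}

impl PasteBurst {
    fn on_plain_char(&mut self, now: Instant) {
        if let Some(prev) = self.last_char_at {
            if now.duration_since(prev) <= BURST_GAP {
                // Looks like a paste: keep the newline-instead-of-submit
                // window open a little past the last burst character.
                self.window_until = Some(now + WINDOW_AFTER_BURST);
            }
        }
        self.last_char_at = Some(now);
    }

    fn enter_inserts_newline(&mut self, now: Instant) -> bool {
        match self.window_until {
            Some(until) if now <= until => {
                // Extend on newline so blank lines inside a paste don't submit.
                self.window_until = Some(now + WINDOW_AFTER_BURST);
                true
            }
            _ => false,
        }
    }
}
```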
Allow Ctrl+V in the TUI for images. `@file` references that are images are
attached as raw image files (and read by the model) rather than pasted as a
path that the model cannot read.
Reuses the components and the same interface we use for copying pasted
content in
72504f1d9c.
@aibrahim-oai as you've implemented this, mind having a look at this
one?
https://github.com/user-attachments/assets/c6c1153b-6b32-4558-b9a2-f8c57d2be710
---------
Co-authored-by: easong-openai <easong@openai.com>
Co-authored-by: Daniel Edrisian <dedrisian@openai.com>
Co-authored-by: Michael Bolin <mbolin@openai.com>