llmx/llmx-rs/tui/src/bottom_pane/chat_composer_history.rs
Sebastian Krüger 3c7efc58c8 feat: Complete LLMX v0.1.0 - Rebrand from Codex with LiteLLM Integration
This release represents a comprehensive transformation of the codebase from Codex to LLMX,
enhanced with LiteLLM integration to support 100+ LLM providers through a unified API.

## Major Changes

### Phase 1: Repository & Infrastructure Setup
- Established new repository structure and branching strategy
- Created comprehensive project documentation (CLAUDE.md, LITELLM-SETUP.md)
- Set up development environment and tooling configuration

### Phase 2: Rust Workspace Transformation
- Renamed all Rust crates from `codex-*` to `llmx-*` (30+ crates)
- Updated package names, binary names, and workspace members
- Renamed core modules: codex.rs → llmx.rs, codex_delegate.rs → llmx_delegate.rs
- Updated all internal references, imports, and type names (import example after this list)
- Renamed directories: codex-rs/ → llmx-rs/, codex-backend-openapi-models/ → llmx-backend-openapi-models/
- Fixed all Rust compilation errors after mass rename
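
For example, internal imports now reference the renamed crates; the `llmx_core` path below is taken from the TUI source in this commit, while the commented line shows the pre-rename form:

```rust
// Before the rename:
// use codex_core::protocol::Op;

// After the rename (as used in tui/src/bottom_pane/chat_composer_history.rs):
use llmx_core::protocol::Op;
```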

### Phase 3: LiteLLM Integration
- Integrated LiteLLM for multi-provider LLM support (Anthropic, OpenAI, Azure, Google AI, AWS Bedrock, etc.)
- Implemented OpenAI-compatible Chat Completions API support
- Added model family detection and provider-specific handling
- Updated authentication to support LiteLLM API keys
- Renamed environment variables: OPENAI_BASE_URL → LLMX_BASE_URL
- Added LLMX_API_KEY for unified authentication (see the environment sketch after this list)
- Enhanced error handling for Chat Completions API responses
- Implemented fallback mechanisms between Responses API and Chat Completions API
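
As a rough sketch of how the renamed variables can be consumed, assuming a hypothetical `LlmxEnv` helper and a placeholder default URL (neither is the project's actual implementation):

```rust
use std::env;

/// Hypothetical helper illustrating the LLMX_* variables introduced above.
struct LlmxEnv {
    base_url: String,
    api_key: Option<String>,
}

impl LlmxEnv {
    fn from_env() -> Self {
        // LLMX_BASE_URL replaces OPENAI_BASE_URL; the localhost default is an
        // assumption for illustration (a typical LiteLLM proxy address).
        let base_url = env::var("LLMX_BASE_URL")
            .unwrap_or_else(|_| "http://localhost:4000".to_string());
        // LLMX_API_KEY is the unified credential added in this release.
        let api_key = env::var("LLMX_API_KEY").ok();
        Self { base_url, api_key }
    }
}
```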

### Phase 4: TypeScript/Node.js Components
- Renamed npm package: @codex/codex-cli → @valknar/llmx
- Updated TypeScript SDK to use new LLMX APIs and endpoints
- Fixed all TypeScript compilation and linting errors
- Updated SDK tests to support both API backends
- Enhanced mock server to handle multiple API formats
- Updated build scripts for cross-platform packaging

### Phase 5: Configuration & Documentation
- Updated all configuration files to use LLMX naming
- Rewrote README and documentation for LLMX branding
- Updated config paths: ~/.codex/ → ~/.llmx/ (illustrated after this list)
- Added comprehensive LiteLLM setup guide
- Updated all user-facing strings and help text
- Created release plan and migration documentation
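
A minimal sketch of resolving the new config location, assuming a simple `HOME`-based lookup (the helper below is illustrative, not the code shipped in this release):

```rust
use std::env;
use std::path::PathBuf;

/// Illustrative helper: ~/.llmx/ replaces the old ~/.codex/ directory.
fn llmx_config_dir() -> Option<PathBuf> {
    env::var("HOME").ok().map(|home| PathBuf::from(home).join(".llmx"))
}
```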

### Phase 6: Testing & Validation
- Fixed all Rust tests for new naming scheme
- Updated snapshot tests in TUI (36 frame files)
- Fixed authentication storage tests
- Updated Chat Completions payload and SSE tests
- Fixed SDK tests for new API endpoints
- Ensured compatibility with Claude Sonnet 4.5 model
- Fixed test environment variables (LLMX_API_KEY, LLMX_BASE_URL)

### Phase 7: Build & Release Pipeline
- Updated GitHub Actions workflows for LLMX binary names
- Fixed rust-release.yml to reference llmx-rs/ instead of codex-rs/
- Updated CI/CD pipelines for new package names
- Made Apple code signing optional in release workflow
- Enhanced npm packaging resilience for partial platform builds
- Added Windows sandbox support to workspace
- Updated dotslash configuration for new binary names

### Phase 8: Final Polish
- Renamed all assets (.github images, labels, templates)
- Updated VSCode and DevContainer configurations
- Fixed all clippy warnings and formatting issues
- Applied cargo fmt and prettier formatting across codebase
- Updated issue templates and pull request templates
- Fixed all remaining UI text references

## Technical Details

**Breaking Changes:**
- Binary name changed from `codex` to `llmx`
- Config directory changed from `~/.codex/` to `~/.llmx/`
- Environment variables renamed (CODEX_* → LLMX_*); a migration-check sketch follows this list
- npm package renamed to `@valknar/llmx`
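
A hedged sketch of the kind of startup check a wrapper could run to surface these renames (this function is illustrative only and not part of the release):

```rust
use std::env;
use std::path::Path;

/// Illustrative-only check for leftovers from the old Codex naming.
fn warn_on_stale_codex_setup() {
    // Config directory: ~/.codex/ has been replaced by ~/.llmx/.
    if let Ok(home) = env::var("HOME") {
        if Path::new(&home).join(".codex").exists() {
            eprintln!("note: found ~/.codex/; LLMX now reads ~/.llmx/");
        }
    }
    // Environment variables: the CODEX_*/OPENAI_* names are replaced by LLMX_*.
    if env::var_os("OPENAI_BASE_URL").is_some() && env::var_os("LLMX_BASE_URL").is_none() {
        eprintln!("note: OPENAI_BASE_URL is set but LLMX_BASE_URL is not; LLMX reads LLMX_BASE_URL");
    }
}
```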

**New Features:**
- Support for 100+ LLM providers via LiteLLM
- Unified authentication with LLMX_API_KEY
- Enhanced model provider detection and handling
- Improved error handling and fallback mechanisms

**Files Changed:**
- 578 files modified across Rust, TypeScript, and documentation
- 30+ Rust crates renamed and updated
- Complete rebrand of UI, CLI, and documentation
- All tests updated and passing

**Dependencies:**
- Updated Cargo.lock with new package names
- Updated npm dependencies in llmx-cli
- Enhanced OpenAPI models for LLMX backend

This release establishes LLMX as a standalone project with comprehensive LiteLLM
integration, preserving the existing feature set (aside from the renames listed
under Breaking Changes) while opening support for a wide ecosystem of LLM providers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Co-Authored-By: Sebastian Krüger <support@pivoine.art>
2025-11-12 20:40:44 +01:00

use std::collections::HashMap;

use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use llmx_core::protocol::Op;

/// State machine that manages shell-style history navigation (Up/Down) inside
/// the chat composer. This struct is intentionally decoupled from the
/// rendering widget so the logic remains isolated and easier to test.
pub(crate) struct ChatComposerHistory {
    /// Identifier of the history log as reported by `SessionConfiguredEvent`.
    history_log_id: Option<u64>,
    /// Number of entries already present in the persistent cross-session
    /// history file when the session started.
    history_entry_count: usize,
    /// Messages submitted by the user *during this UI session* (newest at END).
    local_history: Vec<String>,
    /// Cache of persistent history entries fetched on demand.
    fetched_history: HashMap<usize, String>,
    /// Current cursor within the combined (persistent + local) history. `None`
    /// indicates the user is *not* currently browsing history.
    history_cursor: Option<isize>,
    /// The text that was last inserted into the composer as a result of
    /// history navigation. Used to decide if further Up/Down presses should be
    /// treated as navigation versus normal cursor movement.
    last_history_text: Option<String>,
}

impl ChatComposerHistory {
    pub fn new() -> Self {
        Self {
            history_log_id: None,
            history_entry_count: 0,
            local_history: Vec::new(),
            fetched_history: HashMap::new(),
            history_cursor: None,
            last_history_text: None,
        }
    }

    /// Update metadata when a new session is configured.
    pub fn set_metadata(&mut self, log_id: u64, entry_count: usize) {
        self.history_log_id = Some(log_id);
        self.history_entry_count = entry_count;
        self.fetched_history.clear();
        self.local_history.clear();
        self.history_cursor = None;
        self.last_history_text = None;
    }

    /// Record a message submitted by the user in the current session so it can
    /// be recalled later.
    pub fn record_local_submission(&mut self, text: &str) {
        if text.is_empty() {
            return;
        }
        self.history_cursor = None;
        self.last_history_text = None;
        // Avoid inserting a duplicate if identical to the previous entry.
        if self.local_history.last().is_some_and(|prev| prev == text) {
            return;
        }
        self.local_history.push(text.to_string());
    }

    /// Reset navigation tracking so the next Up key resumes from the latest entry.
    pub fn reset_navigation(&mut self) {
        self.history_cursor = None;
        self.last_history_text = None;
    }

    /// Should Up/Down key presses be interpreted as history navigation given
    /// the current composer text and cursor position?
    pub fn should_handle_navigation(&self, text: &str, cursor: usize) -> bool {
        if self.history_entry_count == 0 && self.local_history.is_empty() {
            return false;
        }
        if text.is_empty() {
            return true;
        }
        // The composer is not empty: only navigate when the cursor is at the
        // start and the text matches the last recalled history entry, so
        // regular editing is not hijacked.
        if cursor != 0 {
            return false;
        }
        matches!(&self.last_history_text, Some(prev) if prev == text)
    }

    /// Handle <Up>. Returns the text that should replace the composer contents
    /// when it is available immediately; persistent entries that are not yet
    /// cached are requested asynchronously and delivered via
    /// [`Self::on_entry_response`].
    pub fn navigate_up(&mut self, app_event_tx: &AppEventSender) -> Option<String> {
        let total_entries = self.history_entry_count + self.local_history.len();
        if total_entries == 0 {
            return None;
        }

        let next_idx = match self.history_cursor {
            None => (total_entries as isize) - 1,
            Some(0) => return None, // already at oldest
            Some(idx) => idx - 1,
        };

        self.history_cursor = Some(next_idx);
        self.populate_history_at_index(next_idx as usize, app_event_tx)
    }

    /// Handle <Down>.
    pub fn navigate_down(&mut self, app_event_tx: &AppEventSender) -> Option<String> {
        let total_entries = self.history_entry_count + self.local_history.len();
        if total_entries == 0 {
            return None;
        }

        let next_idx_opt = match self.history_cursor {
            None => return None, // not browsing
            Some(idx) if (idx as usize) + 1 >= total_entries => None,
            Some(idx) => Some(idx + 1),
        };

        match next_idx_opt {
            Some(idx) => {
                self.history_cursor = Some(idx);
                self.populate_history_at_index(idx as usize, app_event_tx)
            }
            None => {
                // Past the newest entry: clear the composer and exit browsing mode.
                self.history_cursor = None;
                self.last_history_text = None;
                Some(String::new())
            }
        }
    }

    /// Integrate a GetHistoryEntryResponse event.
    pub fn on_entry_response(
        &mut self,
        log_id: u64,
        offset: usize,
        entry: Option<String>,
    ) -> Option<String> {
        if self.history_log_id != Some(log_id) {
            return None;
        }

        let text = entry?;
        self.fetched_history.insert(offset, text.clone());

        if self.history_cursor == Some(offset as isize) {
            self.last_history_text = Some(text.clone());
            return Some(text);
        }
        None
    }

    // ---------------------------------------------------------------------
    // Internal helpers
    // ---------------------------------------------------------------------

    fn populate_history_at_index(
        &mut self,
        global_idx: usize,
        app_event_tx: &AppEventSender,
    ) -> Option<String> {
        if global_idx >= self.history_entry_count {
            // Local entry.
            if let Some(text) = self
                .local_history
                .get(global_idx - self.history_entry_count)
            {
                self.last_history_text = Some(text.clone());
                return Some(text.clone());
            }
        } else if let Some(text) = self.fetched_history.get(&global_idx) {
            self.last_history_text = Some(text.clone());
            return Some(text.clone());
        } else if let Some(log_id) = self.history_log_id {
            let op = Op::GetHistoryEntryRequest {
                offset: global_idx,
                log_id,
            };
            app_event_tx.send(AppEvent::LlmxOp(op));
        }
        None
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::app_event::AppEvent;
    use llmx_core::protocol::Op;
    use tokio::sync::mpsc::unbounded_channel;

    #[test]
    fn duplicate_submissions_are_not_recorded() {
        let mut history = ChatComposerHistory::new();

        // Empty submissions are ignored.
        history.record_local_submission("");
        assert_eq!(history.local_history.len(), 0);

        // First entry is recorded.
        history.record_local_submission("hello");
        assert_eq!(history.local_history.len(), 1);
        assert_eq!(history.local_history.last().unwrap(), "hello");

        // Identical consecutive entry is skipped.
        history.record_local_submission("hello");
        assert_eq!(history.local_history.len(), 1);

        // Different entry is recorded.
        history.record_local_submission("world");
        assert_eq!(history.local_history.len(), 2);
        assert_eq!(history.local_history.last().unwrap(), "world");
    }

    #[test]
    fn navigation_with_async_fetch() {
        let (tx, mut rx) = unbounded_channel::<AppEvent>();
        let tx = AppEventSender::new(tx);

        let mut history = ChatComposerHistory::new();
        // Pretend there are 3 persistent entries.
        history.set_metadata(1, 3);

        // First Up should request offset 2 (latest) and await async data.
        assert!(history.should_handle_navigation("", 0));
        assert!(history.navigate_up(&tx).is_none()); // don't replace the text yet

        // Verify that an AppEvent::LlmxOp with the correct GetHistoryEntryRequest was sent.
        let event = rx.try_recv().expect("expected AppEvent to be sent");
        let AppEvent::LlmxOp(history_request1) = event else {
            panic!("unexpected event variant");
        };
        assert_eq!(
            Op::GetHistoryEntryRequest {
                log_id: 1,
                offset: 2
            },
            history_request1
        );

        // Inject the async response.
        assert_eq!(
            Some("latest".into()),
            history.on_entry_response(1, 2, Some("latest".into()))
        );

        // Next Up should move to offset 1.
        assert!(history.navigate_up(&tx).is_none()); // don't replace the text yet

        // Verify second LlmxOp event for offset 1.
        let event2 = rx.try_recv().expect("expected second event");
        let AppEvent::LlmxOp(history_request_2) = event2 else {
            panic!("unexpected event variant");
        };
        assert_eq!(
            Op::GetHistoryEntryRequest {
                log_id: 1,
                offset: 1
            },
            history_request_2
        );
        assert_eq!(
            Some("older".into()),
            history.on_entry_response(1, 1, Some("older".into()))
        );
    }

    #[test]
    fn reset_navigation_resets_cursor() {
        let (tx, _rx) = unbounded_channel::<AppEvent>();
        let tx = AppEventSender::new(tx);

        let mut history = ChatComposerHistory::new();
        history.set_metadata(1, 3);
        history.fetched_history.insert(1, "command2".into());
        history.fetched_history.insert(2, "command3".into());

        assert_eq!(Some("command3".into()), history.navigate_up(&tx));
        assert_eq!(Some("command2".into()), history.navigate_up(&tx));

        history.reset_navigation();
        assert!(history.history_cursor.is_none());
        assert!(history.last_history_text.is_none());

        assert_eq!(Some("command3".into()), history.navigate_up(&tx));
    }
}