llmx/llmx-rs/tui/src/onboarding/windows.rs
Sebastian Krüger · 3c7efc58c8 · feat: Complete LLMX v0.1.0 - Rebrand from Codex with LiteLLM Integration
This release represents a comprehensive transformation of the codebase from Codex to LLMX,
enhanced with LiteLLM integration to support 100+ LLM providers through a unified API.

## Major Changes

### Phase 1: Repository & Infrastructure Setup
- Established new repository structure and branching strategy
- Created comprehensive project documentation (CLAUDE.md, LITELLM-SETUP.md)
- Set up development environment and tooling configuration

### Phase 2: Rust Workspace Transformation
- Renamed all Rust crates from `codex-*` to `llmx-*` (30+ crates)
- Updated package names, binary names, and workspace members
- Renamed core modules: codex.rs → llmx.rs, codex_delegate.rs → llmx_delegate.rs
- Updated all internal references, imports, and type names
- Renamed directories: codex-rs/ → llmx-rs/, codex-backend-openapi-models/ → llmx-backend-openapi-models/
- Fixed all Rust compilation errors after mass rename

### Phase 3: LiteLLM Integration
- Integrated LiteLLM for multi-provider LLM support (Anthropic, OpenAI, Azure, Google AI, AWS Bedrock, etc.)
- Implemented OpenAI-compatible Chat Completions API support
- Added model family detection and provider-specific handling
- Updated authentication to support LiteLLM API keys
- Renamed environment variables: OPENAI_BASE_URL → LLMX_BASE_URL
- Added LLMX_API_KEY for unified authentication (see the sketch after this list)
- Enhanced error handling for Chat Completions API responses
- Implemented fallback mechanisms between Responses API and Chat Completions API
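
The environment-variable rename and the unified key are the pieces most likely to touch downstream setups. A rough sketch of how the renamed variables fit together is below; the default URL and function name are illustrative only, not the actual llmx-core resolution logic:

```rust
use std::env;

// Illustrative default only; a local LiteLLM proxy commonly listens on port 4000.
const DEFAULT_BASE_URL: &str = "http://localhost:4000";

/// Hypothetical helper: `LLMX_BASE_URL` replaces the old `OPENAI_BASE_URL`, and
/// `LLMX_API_KEY` is the single credential handed to whichever provider LiteLLM
/// routes the request to.
fn resolve_llmx_endpoint() -> (String, Option<String>) {
    let base_url = env::var("LLMX_BASE_URL").unwrap_or_else(|_| DEFAULT_BASE_URL.to_string());
    let api_key = env::var("LLMX_API_KEY").ok();
    (base_url, api_key)
}
```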

### Phase 4: TypeScript/Node.js Components
- Renamed npm package: @codex/codex-cli → @valknar/llmx
- Updated TypeScript SDK to use new LLMX APIs and endpoints
- Fixed all TypeScript compilation and linting errors
- Updated SDK tests to support both API backends
- Enhanced mock server to handle multiple API formats
- Updated build scripts for cross-platform packaging

### Phase 5: Configuration & Documentation
- Updated all configuration files to use LLMX naming
- Rewrote README and documentation for LLMX branding
- Updated config paths: ~/.codex/ → ~/.llmx/
- Added comprehensive LiteLLM setup guide
- Updated all user-facing strings and help text
- Created release plan and migration documentation

### Phase 6: Testing & Validation
- Fixed all Rust tests for new naming scheme
- Updated snapshot tests in TUI (36 frame files)
- Fixed authentication storage tests
- Updated Chat Completions payload and SSE tests
- Fixed SDK tests for new API endpoints
- Ensured compatibility with the Claude Sonnet 4.5 model
- Fixed test environment variables (LLMX_API_KEY, LLMX_BASE_URL)

### Phase 7: Build & Release Pipeline
- Updated GitHub Actions workflows for LLMX binary names
- Fixed rust-release.yml to reference llmx-rs/ instead of codex-rs/
- Updated CI/CD pipelines for new package names
- Made Apple code signing optional in release workflow
- Enhanced npm packaging resilience for partial platform builds
- Added Windows sandbox support to workspace
- Updated dotslash configuration for new binary names

### Phase 8: Final Polish
- Renamed all assets (.github images, labels, templates)
- Updated VSCode and DevContainer configurations
- Fixed all clippy warnings and formatting issues
- Applied cargo fmt and prettier formatting across codebase
- Updated issue templates and pull request templates
- Fixed all remaining UI text references

## Technical Details

**Breaking Changes:**
- Binary name changed from `codex` to `llmx`
- Config directory changed from `~/.codex/` to `~/.llmx/` (see the sketch after this list)
- Environment variables renamed (CODEX_* → LLMX_*)
- npm package renamed to `@valknar/llmx`
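
For downstream scripts the path and variable renames are the main migration work. A minimal sketch of resolving the new config directory, assuming an `LLMX_HOME` override exists by analogy with the old `CODEX_HOME` (an assumption, not confirmed by this commit):

```rust
use std::env;
use std::path::PathBuf;

/// Illustrative only: prefer an explicit `LLMX_HOME`, otherwise fall back to
/// the documented default of `~/.llmx/` (previously `~/.codex/`).
fn llmx_config_dir() -> Option<PathBuf> {
    if let Some(dir) = env::var_os("LLMX_HOME") {
        return Some(PathBuf::from(dir));
    }
    // On Windows, `USERPROFILE` would be the usual stand-in for `HOME`.
    env::var_os("HOME").map(|home| PathBuf::from(home).join(".llmx"))
}
```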

**New Features:**
- Support for 100+ LLM providers via LiteLLM
- Unified authentication with LLMX_API_KEY
- Enhanced model provider detection and handling
- Improved error handling and fallback mechanisms between the Responses and Chat Completions APIs (illustrated below)
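
The fallback noted above can be pictured roughly as follows; the error type and request functions are hypothetical stand-ins, not real llmx-core items:

```rust
/// Hypothetical error type for the sketch below.
enum ApiError {
    /// The endpoint does not implement the Responses API.
    Unsupported,
    Other(String),
}

// Stand-in request functions; real transport code is omitted.
fn send_responses_request(_prompt: &str) -> Result<String, ApiError> {
    Err(ApiError::Unsupported)
}

fn send_chat_completions_request(prompt: &str) -> Result<String, ApiError> {
    Ok(format!("(chat completion for: {prompt})"))
}

/// Try the Responses API first and fall back to Chat Completions only when the
/// provider does not support it; genuine request errors are surfaced as-is.
fn complete(prompt: &str) -> Result<String, ApiError> {
    match send_responses_request(prompt) {
        Ok(text) => Ok(text),
        Err(ApiError::Unsupported) => send_chat_completions_request(prompt),
        Err(other) => Err(other),
    }
}
```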

**Files Changed:**
- 578 files modified across Rust, TypeScript, and documentation
- 30+ Rust crates renamed and updated
- Complete rebrand of UI, CLI, and documentation
- All tests updated and passing

**Dependencies:**
- Updated Cargo.lock with new package names
- Updated npm dependencies in llmx-cli
- Enhanced OpenAPI models for LLMX backend

This release establishes LLMX as a standalone project with comprehensive LiteLLM
integration, preserving the existing feature set (aside from the breaking renames
noted above) while opening support for a wide ecosystem of LLM providers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Co-Authored-By: Sebastian Krüger <support@pivoine.art>
2025-11-12 20:40:44 +01:00

206 lines · 6.6 KiB · Rust

use std::path::PathBuf;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyEventKind;
use llmx_core::config::edit::ConfigEditsBuilder;
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;
use ratatui::prelude::Widget;
use ratatui::style::Color;
use ratatui::style::Stylize;
use ratatui::text::Line;
use ratatui::widgets::Paragraph;
use ratatui::widgets::WidgetRef;
use ratatui::widgets::Wrap;
use crate::onboarding::onboarding_screen::KeyboardHandler;
use crate::onboarding::onboarding_screen::StepStateProvider;
use super::onboarding_screen::StepState;
pub(crate) const WSL_INSTRUCTIONS: &str = r#"Install WSL2 by opening PowerShell as Administrator and running:
# Install WSL using the default Linux distribution (Ubuntu).
# See https://learn.microsoft.com/en-us/windows/wsl/install for more info
wsl --install
# Restart your computer, then start a shell inside of Windows Subsystem for Linux
wsl
# Install Node.js in WSL via nvm
# Documentation: https://learn.microsoft.com/en-us/windows/dev-environment/javascript/nodejs-on-wsl
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/master/install.sh | bash && export NVM_DIR="$HOME/.nvm" && \. "$NVM_DIR/nvm.sh"
nvm install 22
# Install and run LLMX in WSL
npm install --global @valknar/llmx
llmx
# Additional details and instructions for how to install and run LLMX in WSL:
https://developers.openai.com/llmx/windows"#;

/// Onboarding step shown on Windows that recommends running LLMX under WSL2.
pub(crate) struct WindowsSetupWidget {
    pub llmx_home: PathBuf,
    pub selection: Option<WindowsSetupSelection>,
    pub highlighted: WindowsSetupSelection,
    pub error: Option<String>,
    exit_requested: bool,
}

/// The two choices offered by the Windows setup step.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum WindowsSetupSelection {
    Continue,
    Install,
}

impl WindowsSetupWidget {
    pub fn new(llmx_home: PathBuf) -> Self {
        Self {
            llmx_home,
            selection: None,
            highlighted: WindowsSetupSelection::Install,
            error: None,
            exit_requested: false,
        }
    }

    /// Persist the acknowledgement via `ConfigEditsBuilder` so this step is not
    /// shown again; on failure, keep the step visible and surface the error.
    fn handle_continue(&mut self) {
        self.highlighted = WindowsSetupSelection::Continue;
        match ConfigEditsBuilder::new(&self.llmx_home)
            .set_windows_wsl_setup_acknowledged(true)
            .apply_blocking()
        {
            Ok(()) => {
                self.selection = Some(WindowsSetupSelection::Continue);
                self.exit_requested = false;
                self.error = None;
            }
            Err(err) => {
                tracing::error!("Failed to persist Windows onboarding acknowledgement: {err:?}");
                self.error = Some(format!("Failed to update config: {err}"));
                self.selection = None;
            }
        }
    }

    /// Mark the install option as chosen and request app exit so the user can set up WSL2.
    fn handle_install(&mut self) {
        self.highlighted = WindowsSetupSelection::Install;
        self.selection = Some(WindowsSetupSelection::Install);
        self.exit_requested = true;
    }

    /// True when the user chose to exit and install WSL2.
    pub fn exit_requested(&self) -> bool {
        self.exit_requested
    }
}

impl WidgetRef for &WindowsSetupWidget {
    fn render_ref(&self, area: Rect, buf: &mut Buffer) {
        let mut lines: Vec<Line> = vec![
            Line::from(vec![
                "> ".into(),
                "To use all LLMX features, we recommend running LLMX in Windows Subsystem for Linux (WSL2)".bold(),
            ]),
            Line::from(vec![" ".into(), "WSL allows LLMX to run Agent mode in a sandboxed environment with better data protections in place.".into()]),
            Line::from(vec![" ".into(), "Learn more: https://developers.openai.com/llmx/windows".into()]),
            Line::from(""),
        ];

        let create_option =
            |idx: usize, option: WindowsSetupSelection, text: &str| -> Line<'static> {
                if self.highlighted == option {
                    Line::from(format!("> {}. {text}", idx + 1)).cyan()
                } else {
                    Line::from(format!(" {}. {}", idx + 1, text))
                }
            };

        lines.push(create_option(
            0,
            WindowsSetupSelection::Install,
            "Exit and install WSL2",
        ));
        lines.push(create_option(
            1,
            WindowsSetupSelection::Continue,
            "Continue anyway",
        ));
        lines.push("".into());

        if let Some(error) = &self.error {
            lines.push(Line::from(format!(" {error}")).fg(Color::Red));
            lines.push("".into());
        }

        lines.push(Line::from(vec![" Press Enter to continue".dim()]));

        Paragraph::new(lines)
            .wrap(Wrap { trim: false })
            .render(area, buf);
    }
}

impl KeyboardHandler for WindowsSetupWidget {
    fn handle_key_event(&mut self, key_event: KeyEvent) {
        if key_event.kind == KeyEventKind::Release {
            return;
        }
        match key_event.code {
            KeyCode::Up | KeyCode::Char('k') => {
                self.highlighted = WindowsSetupSelection::Install;
            }
            KeyCode::Down | KeyCode::Char('j') => {
                self.highlighted = WindowsSetupSelection::Continue;
            }
            KeyCode::Char('1') => self.handle_install(),
            KeyCode::Char('2') => self.handle_continue(),
            KeyCode::Enter => match self.highlighted {
                WindowsSetupSelection::Install => self.handle_install(),
                WindowsSetupSelection::Continue => self.handle_continue(),
            },
            _ => {}
        }
    }
}

impl StepStateProvider for WindowsSetupWidget {
    fn get_step_state(&self) -> StepState {
        match self.selection {
            Some(WindowsSetupSelection::Continue) => StepState::Hidden,
            Some(WindowsSetupSelection::Install) => StepState::Complete,
            None => StepState::InProgress,
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use tempfile::TempDir;

    #[test]
    fn windows_step_hidden_after_continue() {
        let temp_dir = TempDir::new().expect("temp dir");
        let mut widget = WindowsSetupWidget::new(temp_dir.path().to_path_buf());
        assert_eq!(widget.get_step_state(), StepState::InProgress);
        widget.handle_continue();
        assert_eq!(widget.get_step_state(), StepState::Hidden);
        assert!(!widget.exit_requested());
    }

    #[test]
    fn windows_step_complete_after_install_selection() {
        let temp_dir = TempDir::new().expect("temp dir");
        let mut widget = WindowsSetupWidget::new(temp_dir.path().to_path_buf());
        widget.handle_install();
        assert_eq!(widget.get_step_state(), StepState::Complete);
        assert!(widget.exit_requested());
    }
}