Commit Graph

6 Commits

Sebastian Krüger
70cb667a2c Fix UI text: Replace remaining "Codex" with "LLMX"
Updated user-facing strings throughout the TUI:

Slash commands (see the sketch after these lists):
- "instructions for Codex" → "instructions for LLMX"
- "ask Codex to undo" → "ask LLMX to undo"
- "exit Codex" → "exit LLMX"
- "what Codex can do" → "what LLMX can do"
- "log out of Codex" → "log out of LLMX"

Onboarding screens:
- "running Codex" → "running LLMX"
- "allow Codex" → "allow LLMX"
- "use Codex" → "use LLMX"
- "autonomy to grant Codex" → "autonomy to grant LLMX"
- "Codex can make mistakes" → "LLMX can make mistakes"
- "Codex will use" → "LLMX will use"

Chat composer:
- "Ask Codex to do anything" → "Ask LLMX to do anything"

Schema name:
- "codex_output_schema" → "llmx_output_schema"

Files changed: 7 files in TUI and core

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-11 15:28:25 +01:00
Sebastian Krüger
7be8b00b05 Phase 6: Testing & Validation - Additional Fixes
Fixed remaining references found during testing:

Rust source code fixes:
- Updated CLI bin_name and override_usage: codex → llmx (see the sketch after this list)
- Updated test examples in wsl_paths.rs
- Updated GitHub URLs: github.com/openai/codex → github.com/valknar/llmx
- Updated directory references: ~/.codex/ → ~/.llmx/
- Updated documentation link: "Codex docs" → "LLMX docs"
- Updated feedback URL to point to valknar/llmx repository
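
A minimal sketch of the bin_name/override_usage change, assuming the CLI is built with clap's builder API; the exact builder chain and usage string in llmx-cli may differ:

```rust
// Sketch assuming clap's builder API; the real LLMX CLI definition may differ.
use clap::Command;

fn build_cli() -> Command {
    Command::new("llmx")
        // Previously "codex": the name shown in help and error output.
        .bin_name("llmx")
        // Previously started with "codex ...": the usage line shown in --help.
        .override_usage("llmx [OPTIONS] [PROMPT]")
}

fn main() {
    // Printing help shows the renamed binary and usage line.
    build_cli().print_help().ok();
}
```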

Configuration files:
- Regenerated llmx-cli/package-lock.json with updated package name
- Updated pnpm-lock.yaml

Test results:
- TypeScript SDK build: ✓ Success
- TypeScript lint: ✓ Pass
- Rust tests: 12/13 passed (1 locale-specific test failure unrelated to rename)
- Rust release build: In progress

Files changed: 22 files (49 insertions, 46 deletions)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-11 15:03:50 +01:00
Sebastian Krüger
a6c537ac50 Phase 2 Part 2: Fix Rust compilation errors after rename
Fixed all compilation errors resulting from the crate rename:

- Fixed the FunctionCallOutputPayload::from() double-reference issue in response_processing.rs:61
  - Removed the unnecessary &, since call_tool_result was already a reference (sketched below)
- Fixed the spawn_task method call in llmx.rs by passing the Arc<Session> by reference
- Fixed Arc type inference by adding explicit type annotations
- Fixed tokio::join! type inference by creating the futures before the macro invocation
- Updated all remaining crate::codex imports to crate::llmx across:
  - tools/orchestrator.rs
  - tools/handlers/shell.rs
  - apply_patch.rs
  - compact.rs
  - unified_exec/mod.rs
  - tools/context.rs
  - tools/sandboxing.rs
  - tools/events.rs
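
A self-contained sketch of the double-reference fix, with Payload and ToolResult as hypothetical stand-ins for FunctionCallOutputPayload and the call_tool_result type:

```rust
// Hypothetical stand-ins; the real types live in the LLMX core crate.
struct ToolResult(String);
struct Payload(String);

impl From<&ToolResult> for Payload {
    fn from(r: &ToolResult) -> Self {
        Payload(r.0.clone())
    }
}

fn process(call_tool_result: &ToolResult) -> Payload {
    // Before: Payload::from(&call_tool_result) passed a &&ToolResult, which
    // has no matching From impl and fails to compile.
    // After: the binding is already a reference, so pass it through directly.
    Payload::from(call_tool_result)
}

fn main() {
    let result = ToolResult("ok".to_string());
    println!("{}", process(&result).0);
}
```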

Successfully verified with cargo check --bin llmx (31.50s compile time).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-11 14:41:08 +01:00
Sebastian Krüger
27909b7495 Phase 3: LiteLLM Integration
- Added LiteLLM as built-in model provider in model_provider_info.rs:
  - Default base_url: http://localhost:4000/v1 (configurable via LITELLM_BASE_URL)
  - Uses Chat wire API (OpenAI-compatible)
  - Requires LITELLM_API_KEY environment variable
  - No OpenAI auth required (simple bearer token)
  - Positioned as first provider in list

- Updated default models to use LiteLLM format:
  - Changed from "gpt-5-codex" to "anthropic/claude-sonnet-4-20250514"
  - Updated all default model constants (OPENAI_DEFAULT_MODEL, etc.)
  - Uses provider/model format compatible with LiteLLM

- Provider configuration:
  - Supports base_url override via environment variable
  - Includes helpful env_key_instructions pointing to LiteLLM docs
  - Uses standard retry/timeout defaults

This makes LLMX work out of the box with a LiteLLM proxy, supporting
multiple providers (Anthropic, OpenAI, etc.) through a single interface.
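
A minimal, self-contained sketch of the provider defaults described above; the real ModelProviderInfo struct in model_provider_info.rs has more fields, so the field names and types here are simplified assumptions:

```rust
// Simplified stand-in for the real ModelProviderInfo entry; field names and
// types here are assumptions made for illustration only.
use std::env;

struct ProviderInfo {
    name: &'static str,
    base_url: String,
    env_key: &'static str,
    wire_api: &'static str, // "chat" = OpenAI-compatible Chat Completions API
}

fn litellm_provider() -> ProviderInfo {
    ProviderInfo {
        name: "LiteLLM",
        // LITELLM_BASE_URL overrides the default local proxy address.
        base_url: env::var("LITELLM_BASE_URL")
            .unwrap_or_else(|_| "http://localhost:4000/v1".to_string()),
        // Bearer token read from the environment; no OpenAI auth involved.
        env_key: "LITELLM_API_KEY",
        wire_api: "chat",
    }
}

fn main() {
    let p = litellm_provider();
    println!("{} -> {} (auth via {}, wire API: {})",
        p.name, p.base_url, p.env_key, p.wire_api);
}
```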

🤖 Generated with Claude Code
2025-11-11 14:33:00 +01:00
Sebastian Krüger
cb8d941adf Phase 2: Rust Workspace Transformation (Part 1)
- Renamed directory: codex-backend-openapi-models -> llmx-backend-openapi-models
- Updated all Cargo.toml files:
  - Package names: codex-* -> llmx-*
  - Library names: codex_* -> llmx_*
  - Workspace dependencies updated
- Renamed Rust source files:
  - codex*.rs -> llmx*.rs (all modules)
  - codex_conversation -> llmx_conversation
  - codex_delegate -> llmx_delegate
  - codex_message_processor -> llmx_message_processor
  - codex_tool_* -> llmx_tool_*
- Updated all Rust imports:
  - use codex_* -> use llmx_*
  - mod codex* -> mod llmx*
- Updated environment variables in code:
  - CODEX_HOME -> LLMX_HOME (see the sketch after this list)
  - .codex -> .llmx paths
- Updated protocol crate lib name for proper linking
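
A hedged sketch of the environment-variable and path rename, assuming a simple precedence of LLMX_HOME over ~/.llmx; the actual resolution logic in the codebase may differ:

```rust
// Hedged sketch of the renamed lookup; the real implementation in the LLMX
// codebase may differ (Windows home-directory handling is omitted here).
use std::env;
use std::path::PathBuf;

fn llmx_home() -> PathBuf {
    // LLMX_HOME (formerly CODEX_HOME) takes precedence; otherwise fall back
    // to ~/.llmx (formerly ~/.codex).
    match env::var("LLMX_HOME") {
        Ok(dir) if !dir.is_empty() => PathBuf::from(dir),
        _ => {
            let home = env::var("HOME").unwrap_or_else(|_| ".".to_string());
            PathBuf::from(home).join(".llmx")
        }
    }
}

fn main() {
    println!("{}", llmx_home().display());
}
```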

Note: Some compilation errors remain (type inference issues), but all
renaming is complete. Compilation will be fixed in the next phase.

🤖 Generated with Claude Code
2025-11-11 14:29:57 +01:00
Sebastian Krüger
f237fe560d Phase 1: Repository & Infrastructure Setup
- Renamed directories: codex-rs -> llmx-rs, codex-cli -> llmx-cli
- Updated package.json files:
  - Root: llmx-monorepo
  - CLI: @llmx/llmx
  - SDK: @llmx/llmx-sdk
- Updated pnpm workspace configuration
- Renamed binary: codex.js -> llmx.js
- Updated environment variables: CODEX_* -> LLMX_*
- Changed repository URLs to valknar/llmx

🤖 Generated with Claude Code
2025-11-11 14:01:52 +01:00