This release represents a comprehensive transformation of the codebase from Codex to LLMX, enhanced with LiteLLM integration to support 100+ LLM providers through a unified API.

## Major Changes

### Phase 1: Repository & Infrastructure Setup

- Established new repository structure and branching strategy
- Created comprehensive project documentation (CLAUDE.md, LITELLM-SETUP.md)
- Set up development environment and tooling configuration

### Phase 2: Rust Workspace Transformation

- Renamed all Rust crates from `codex-*` to `llmx-*` (30+ crates)
- Updated package names, binary names, and workspace members
- Renamed core modules: codex.rs → llmx.rs, codex_delegate.rs → llmx_delegate.rs
- Updated all internal references, imports, and type names
- Renamed directories: codex-rs/ → llmx-rs/, codex-backend-openapi-models/ → llmx-backend-openapi-models/
- Fixed all Rust compilation errors after the mass rename

### Phase 3: LiteLLM Integration

- Integrated LiteLLM for multi-provider LLM support (Anthropic, OpenAI, Azure, Google AI, AWS Bedrock, etc.)
- Implemented OpenAI-compatible Chat Completions API support
- Added model family detection and provider-specific handling
- Updated authentication to support LiteLLM API keys
- Renamed environment variables: OPENAI_BASE_URL → LLMX_BASE_URL (see the first sketch after this list)
- Added LLMX_API_KEY for unified authentication
- Enhanced error handling for Chat Completions API responses
- Implemented fallback mechanisms between the Responses API and the Chat Completions API (see the second sketch after this list)
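In practice, pointing the CLI at a LiteLLM proxy only involves the two renamed variables. The snippet below is a minimal sketch, not code from this repository: the helper functions are hypothetical, the fallback to the legacy variable is an illustrative assumption rather than a documented behavior, and `http://localhost:4000` simply reflects a common local LiteLLM proxy address.

```rust
use std::env;

/// Resolve the OpenAI-compatible base URL, preferring the new LLMX_BASE_URL
/// name. Falling back to the legacy OPENAI_BASE_URL and then to a localhost
/// LiteLLM proxy are assumptions made for this sketch only.
fn resolve_base_url() -> String {
    env::var("LLMX_BASE_URL")
        .or_else(|_| env::var("OPENAI_BASE_URL")) // legacy, pre-rename name
        .unwrap_or_else(|_| "http://localhost:4000".to_string())
}

/// Resolve the unified API key introduced in this release.
fn resolve_api_key() -> Option<String> {
    env::var("LLMX_API_KEY").ok()
}

fn main() {
    println!("base url: {}", resolve_base_url());
    println!("api key configured: {}", resolve_api_key().is_some());
}
```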
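The fallback between the two APIs can be pictured as a simple retry path. Everything named below (`CompletionError`, `send_responses`, `send_chat_completions`) is hypothetical scaffolding for illustration; the actual types and client functions in the codebase may differ.

```rust
/// Hypothetical error type for illustration only; not from the llmx codebase.
#[derive(Debug)]
enum CompletionError {
    /// The endpoint behind LLMX_BASE_URL does not implement the Responses API.
    ResponsesUnsupported,
    Other(String),
}

/// Stub standing in for the real Responses API client.
fn send_responses(_prompt: &str) -> Result<String, CompletionError> {
    Err(CompletionError::ResponsesUnsupported)
}

/// Stub standing in for the OpenAI-compatible Chat Completions client that
/// LiteLLM exposes for every provider it proxies.
fn send_chat_completions(prompt: &str) -> Result<String, CompletionError> {
    Ok(format!("(chat completions reply to: {prompt})"))
}

/// Try the Responses API first and fall back to Chat Completions when the
/// provider does not support it, mirroring the fallback described above.
fn complete_with_fallback(prompt: &str) -> Result<String, CompletionError> {
    match send_responses(prompt) {
        Ok(text) => Ok(text),
        Err(CompletionError::ResponsesUnsupported) => send_chat_completions(prompt),
        Err(other) => Err(other),
    }
}

fn main() {
    match complete_with_fallback("hello") {
        Ok(text) => println!("{text}"),
        Err(err) => eprintln!("request failed: {err:?}"),
    }
}
```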
### Phase 4: TypeScript/Node.js Components

- Renamed npm package: @codex/codex-cli → @valknar/llmx
- Updated TypeScript SDK to use new LLMX APIs and endpoints
- Fixed all TypeScript compilation and linting errors
- Updated SDK tests to support both API backends
- Enhanced mock server to handle multiple API formats
- Updated build scripts for cross-platform packaging

### Phase 5: Configuration & Documentation

- Updated all configuration files to use LLMX naming
- Rewrote README and documentation for LLMX branding
- Updated config paths: ~/.codex/ → ~/.llmx/
- Added comprehensive LiteLLM setup guide
- Updated all user-facing strings and help text
- Created release plan and migration documentation

### Phase 6: Testing & Validation

- Fixed all Rust tests for new naming scheme
- Updated snapshot tests in TUI (36 frame files)
- Fixed authentication storage tests
- Updated Chat Completions payload and SSE tests
- Fixed SDK tests for new API endpoints
- Ensured compatibility with Claude Sonnet 4.5 model
- Fixed test environment variables (LLMX_API_KEY, LLMX_BASE_URL)

### Phase 7: Build & Release Pipeline

- Updated GitHub Actions workflows for LLMX binary names
- Fixed rust-release.yml to reference llmx-rs/ instead of codex-rs/
- Updated CI/CD pipelines for new package names
- Made Apple code signing optional in release workflow
- Enhanced npm packaging resilience for partial platform builds
- Added Windows sandbox support to workspace
- Updated dotslash configuration for new binary names

### Phase 8: Final Polish

- Renamed all assets (.github images, labels, templates)
- Updated VSCode and DevContainer configurations
- Fixed all clippy warnings and formatting issues
- Applied cargo fmt and prettier formatting across codebase
- Updated issue templates and pull request templates
- Fixed all remaining UI text references

## Technical Details

**Breaking Changes:**

- Binary name changed from `codex` to `llmx`
- Config directory changed from `~/.codex/` to `~/.llmx/`
- Environment variables renamed (CODEX_* → LLMX_*)
- npm package renamed to `@valknar/llmx`

**New Features:**

- Support for 100+ LLM providers via LiteLLM
- Unified authentication with LLMX_API_KEY
- Enhanced model provider detection and handling
- Improved error handling and fallback mechanisms

**Files Changed:**

- 578 files modified across Rust, TypeScript, and documentation
- 30+ Rust crates renamed and updated
- Complete rebrand of UI, CLI, and documentation
- All tests updated and passing

**Dependencies:**

- Updated Cargo.lock with new package names
- Updated npm dependencies in llmx-cli
- Enhanced OpenAPI models for LLMX backend

This release establishes LLMX as a standalone project with comprehensive LiteLLM integration, preserving the existing feature set while opening support for a wide ecosystem of LLM providers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Co-Authored-By: Sebastian Krüger <support@pivoine.art>
use clap::Parser;
use clap::ValueEnum;
use llmx_common::CliConfigOverrides;
use std::path::PathBuf;

#[derive(Parser, Debug)]
#[command(version)]
pub struct Cli {
    /// Action to perform. If omitted, runs a new non-interactive session.
    #[command(subcommand)]
    pub command: Option<Command>,

    /// Optional image(s) to attach to the initial prompt.
    #[arg(long = "image", short = 'i', value_name = "FILE", value_delimiter = ',', num_args = 1..)]
    pub images: Vec<PathBuf>,

    /// Model the agent should use.
    #[arg(long, short = 'm')]
    pub model: Option<String>,

    #[arg(long = "oss", default_value_t = false)]
    pub oss: bool,

    /// Select the sandbox policy to use when executing model-generated shell
    /// commands.
    #[arg(long = "sandbox", short = 's', value_enum)]
    pub sandbox_mode: Option<llmx_common::SandboxModeCliArg>,

    /// Configuration profile from config.toml to specify default options.
    #[arg(long = "profile", short = 'p')]
    pub config_profile: Option<String>,

    /// Convenience alias for low-friction sandboxed automatic execution (-a on-failure, --sandbox workspace-write).
    #[arg(long = "full-auto", default_value_t = false)]
    pub full_auto: bool,

    /// Skip all confirmation prompts and execute commands without sandboxing.
    /// EXTREMELY DANGEROUS. Intended solely for running in environments that are externally sandboxed.
    #[arg(
        long = "dangerously-bypass-approvals-and-sandbox",
        alias = "yolo",
        default_value_t = false,
        conflicts_with = "full_auto"
    )]
    pub dangerously_bypass_approvals_and_sandbox: bool,

    /// Tell the agent to use the specified directory as its working root.
    #[clap(long = "cd", short = 'C', value_name = "DIR")]
    pub cwd: Option<PathBuf>,

    /// Allow running Llmx outside a Git repository.
    #[arg(long = "skip-git-repo-check", default_value_t = false)]
    pub skip_git_repo_check: bool,

    /// Path to a JSON Schema file describing the model's final response shape.
    #[arg(long = "output-schema", value_name = "FILE")]
    pub output_schema: Option<PathBuf>,

    #[clap(skip)]
    pub config_overrides: CliConfigOverrides,

    /// Specifies color settings for use in the output.
    #[arg(long = "color", value_enum, default_value_t = Color::Auto)]
    pub color: Color,

    /// Print events to stdout as JSONL.
    #[arg(long = "json", alias = "experimental-json", default_value_t = false)]
    pub json: bool,

    /// Specifies file where the last message from the agent should be written.
    #[arg(long = "output-last-message", short = 'o', value_name = "FILE")]
    pub last_message_file: Option<PathBuf>,

    /// Initial instructions for the agent. If not provided as an argument (or
    /// if `-` is used), instructions are read from stdin.
    #[arg(value_name = "PROMPT", value_hint = clap::ValueHint::Other)]
    pub prompt: Option<String>,
}

#[derive(Debug, clap::Subcommand)]
pub enum Command {
    /// Resume a previous session by id or pick the most recent with --last.
    Resume(ResumeArgs),
}

#[derive(Parser, Debug)]
pub struct ResumeArgs {
    /// Conversation/session id (UUID). When provided, resumes this session.
    /// If omitted, use --last to pick the most recent recorded session.
    #[arg(value_name = "SESSION_ID")]
    pub session_id: Option<String>,

    /// Resume the most recent recorded session (newest) without specifying an id.
    #[arg(long = "last", default_value_t = false, conflicts_with = "session_id")]
    pub last: bool,

    /// Prompt to send after resuming the session. If `-` is used, read from stdin.
    #[arg(value_name = "PROMPT", value_hint = clap::ValueHint::Other)]
    pub prompt: Option<String>,
}

#[derive(Debug, Clone, Copy, Default, PartialEq, Eq, ValueEnum)]
#[value(rename_all = "kebab-case")]
pub enum Color {
    Always,
    Never,
    #[default]
    Auto,
}
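For reference, the flags above combine roughly as in the following sketch. It assumes this parser backs the non-interactive `llmx` entry point and that the test module sits next to the `Cli` definition; the model string is only an example value, and `workspace-write` is taken from the `--full-auto` doc comment above rather than verified against `SandboxModeCliArg`.

```rust
#[cfg(test)]
mod tests {
    use super::*;
    use clap::Parser;

    #[test]
    fn parses_a_typical_invocation() {
        // Equivalent to:
        //   llmx -m claude-sonnet-4-5 --sandbox workspace-write --json "Summarize the repo"
        let cli = Cli::parse_from([
            "llmx",
            "-m",
            "claude-sonnet-4-5",
            "--sandbox",
            "workspace-write",
            "--json",
            "Summarize the repo",
        ]);

        assert_eq!(cli.model.as_deref(), Some("claude-sonnet-4-5"));
        assert!(cli.json);
        assert_eq!(cli.prompt.as_deref(), Some("Summarize the repo"));
    }
}
```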