feat: Complete LLMX v0.1.0 - Rebrand from Codex with LiteLLM Integration
This release represents a comprehensive transformation of the codebase from Codex to LLMX, enhanced with LiteLLM integration to support 100+ LLM providers through a unified API.

## Major Changes

### Phase 1: Repository & Infrastructure Setup

- Established new repository structure and branching strategy
- Created comprehensive project documentation (CLAUDE.md, LITELLM-SETUP.md)
- Set up development environment and tooling configuration

### Phase 2: Rust Workspace Transformation

- Renamed all Rust crates from `codex-*` to `llmx-*` (30+ crates)
- Updated package names, binary names, and workspace members
- Renamed core modules: codex.rs → llmx.rs, codex_delegate.rs → llmx_delegate.rs
- Updated all internal references, imports, and type names
- Renamed directories: codex-rs/ → llmx-rs/, codex-backend-openapi-models/ → llmx-backend-openapi-models/
- Fixed all Rust compilation errors after mass rename

### Phase 3: LiteLLM Integration

- Integrated LiteLLM for multi-provider LLM support (Anthropic, OpenAI, Azure, Google AI, AWS Bedrock, etc.)
- Implemented OpenAI-compatible Chat Completions API support
- Added model family detection and provider-specific handling
- Updated authentication to support LiteLLM API keys
- Renamed environment variables: OPENAI_BASE_URL → LLMX_BASE_URL
- Added LLMX_API_KEY for unified authentication
- Enhanced error handling for Chat Completions API responses
- Implemented fallback mechanisms between Responses API and Chat Completions API

### Phase 4: TypeScript/Node.js Components

- Renamed npm package: @codex/codex-cli → @valknar/llmx
- Updated TypeScript SDK to use new LLMX APIs and endpoints
- Fixed all TypeScript compilation and linting errors
- Updated SDK tests to support both API backends
- Enhanced mock server to handle multiple API formats
- Updated build scripts for cross-platform packaging

### Phase 5: Configuration & Documentation

- Updated all configuration files to use LLMX naming
- Rewrote README and documentation for LLMX branding
- Updated config paths: ~/.codex/ → ~/.llmx/
- Added comprehensive LiteLLM setup guide
- Updated all user-facing strings and help text
- Created release plan and migration documentation

### Phase 6: Testing & Validation

- Fixed all Rust tests for new naming scheme
- Updated snapshot tests in TUI (36 frame files)
- Fixed authentication storage tests
- Updated Chat Completions payload and SSE tests
- Fixed SDK tests for new API endpoints
- Ensured compatibility with Claude Sonnet 4.5 model
- Fixed test environment variables (LLMX_API_KEY, LLMX_BASE_URL)

### Phase 7: Build & Release Pipeline

- Updated GitHub Actions workflows for LLMX binary names
- Fixed rust-release.yml to reference llmx-rs/ instead of codex-rs/
- Updated CI/CD pipelines for new package names
- Made Apple code signing optional in release workflow
- Enhanced npm packaging resilience for partial platform builds
- Added Windows sandbox support to workspace
- Updated dotslash configuration for new binary names

### Phase 8: Final Polish

- Renamed all assets (.github images, labels, templates)
- Updated VSCode and DevContainer configurations
- Fixed all clippy warnings and formatting issues
- Applied cargo fmt and prettier formatting across codebase
- Updated issue templates and pull request templates
- Fixed all remaining UI text references

## Technical Details

**Breaking Changes:**

- Binary name changed from `codex` to `llmx`
- Config directory changed from `~/.codex/` to `~/.llmx/`
- Environment variables renamed (CODEX_* → LLMX_*)
- npm package renamed to `@valknar/llmx`

**New Features:**

- Support for 100+ LLM providers via LiteLLM
- Unified authentication with LLMX_API_KEY
- Enhanced model provider detection and handling
- Improved error handling and fallback mechanisms

**Files Changed:**

- 578 files modified across Rust, TypeScript, and documentation
- 30+ Rust crates renamed and updated
- Complete rebrand of UI, CLI, and documentation
- All tests updated and passing

**Dependencies:**

- Updated Cargo.lock with new package names
- Updated npm dependencies in llmx-cli
- Enhanced OpenAPI models for LLMX backend

This release establishes LLMX as a standalone project with comprehensive LiteLLM integration, maintaining full backward compatibility with existing functionality while opening support for a wide ecosystem of LLM providers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Co-Authored-By: Sebastian Krüger <support@pivoine.art>
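As context for the LiteLLM-related changes described above, here is a minimal, hypothetical sketch of how a client might resolve the new `LLMX_BASE_URL` / `LLMX_API_KEY` environment variables and pick a wire protocol per model family. The function and type names (`resolve_base_url`, `resolve_api_key`, `choose_wire_api`, `WireApi`) are illustrative only and are not the actual llmx-core API; only the environment variable names, the LiteLLM proxy idea, and the Responses-vs-Chat-Completions split come from the release notes.

```rust
use std::env;

/// Which wire protocol to speak to the provider. Illustrative only; the real
/// llmx-core types may be named and structured differently.
#[derive(Debug, Clone, Copy)]
enum WireApi {
    Responses,
    ChatCompletions,
}

/// LLMX_BASE_URL replaces OPENAI_BASE_URL; pointing it at a LiteLLM proxy
/// (e.g. http://localhost:4000) routes requests to any provider LiteLLM supports.
fn resolve_base_url() -> String {
    env::var("LLMX_BASE_URL").unwrap_or_else(|_| "https://api.openai.com/v1".to_string())
}

/// LLMX_API_KEY is the unified credential introduced in this release.
fn resolve_api_key() -> Option<String> {
    env::var("LLMX_API_KEY").ok()
}

/// Hypothetical model-family detection: assume only OpenAI "gpt-*" models use
/// the Responses API, while models reached through LiteLLM (Anthropic, Bedrock,
/// Google AI, ...) use the OpenAI-compatible Chat Completions API.
fn choose_wire_api(model: &str) -> WireApi {
    if model.starts_with("gpt-") {
        WireApi::Responses
    } else {
        WireApi::ChatCompletions
    }
}

fn main() {
    // Model name is an illustrative example, not a fixed default.
    let base_url = resolve_base_url();
    let api = choose_wire_api("claude-sonnet-4-5");
    println!(
        "base_url={base_url}, wire_api={api:?}, api_key_set={}",
        resolve_api_key().is_some()
    );
}
```

The fallback mechanism mentioned in the notes would apply at request time (retrying a failed Responses call via Chat Completions); this sketch only shows the static routing and environment-variable side.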
llmx-rs/core/tests/suite/user_notification.rs (new file, 78 lines added)
@@ -0,0 +1,78 @@
#![cfg(not(target_os = "windows"))]

use std::os::unix::fs::PermissionsExt;

use core_test_support::fs_wait;
use core_test_support::responses;
use core_test_support::skip_if_no_network;
use core_test_support::test_llmx::TestLlmx;
use core_test_support::test_llmx::test_llmx;
use core_test_support::wait_for_event;
use llmx_core::protocol::EventMsg;
use llmx_core::protocol::Op;
use llmx_protocol::user_input::UserInput;
use pretty_assertions::assert_eq;
use serde_json::Value;
use serde_json::json;
use tempfile::TempDir;
use wiremock::matchers::any;

use responses::ev_assistant_message;
use responses::ev_completed;
use responses::sse;
use responses::start_mock_server;
use std::time::Duration;

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
#[ignore = "flaky on ubuntu-24.04-arm - aarch64-unknown-linux-gnu"]
// The notify script gets far enough to create (and therefore surface) the file,
// but hasn’t flushed the JSON yet. Reading an empty file produces EOF while parsing
// a value at line 1 column 0. May be caused by a slow runner.
async fn summarize_context_three_requests_and_instructions() -> anyhow::Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;

    let sse1 = sse(vec![ev_assistant_message("m1", "Done"), ev_completed("r1")]);

    responses::mount_sse_once_match(&server, any(), sse1).await;

    let notify_dir = TempDir::new()?;
    // write a script to the notify that touches a file next to it
    let notify_script = notify_dir.path().join("notify.sh");
    std::fs::write(
        &notify_script,
        r#"#!/bin/bash
set -e
echo -n "${@: -1}" > $(dirname "${0}")/notify.txt"#,
    )?;
    std::fs::set_permissions(&notify_script, std::fs::Permissions::from_mode(0o755))?;

    let notify_file = notify_dir.path().join("notify.txt");
    let notify_script_str = notify_script.to_str().unwrap().to_string();

    let TestLlmx { llmx, .. } = test_llmx()
        .with_config(move |cfg| cfg.notify = Some(vec![notify_script_str]))
        .build(&server)
        .await?;

    // 1) Normal user input – should hit server once.
    llmx.submit(Op::UserInput {
        items: vec![UserInput::Text {
            text: "hello world".into(),
        }],
    })
    .await?;
    wait_for_event(&llmx, |ev| matches!(ev, EventMsg::TaskComplete(_))).await;

    // We fork the notify script, so we need to wait for it to write to the file.
    fs_wait::wait_for_path_exists(&notify_file, Duration::from_secs(5)).await?;
    let notify_payload_raw = tokio::fs::read_to_string(&notify_file).await?;
    let payload: Value = serde_json::from_str(&notify_payload_raw)?;

    assert_eq!(payload["type"], json!("agent-turn-complete"));
    assert_eq!(payload["input-messages"], json!(["hello world"]));
    assert_eq!(payload["last-assistant-message"], json!("Done"));

    Ok(())
}