
# Advanced

If you already lean on LLMX every day and just need a little more control, this page collects the knobs you are most likely to reach for: tweak defaults in Config, add extra tools through Model Context Protocol support, and script full runs with `llmx exec`. Jump to the section you need and keep building.

## Config quickstart

Most day-to-day tuning lives in `config.toml`: set approval and sandbox presets, pin model defaults, and add MCP server launchers. The Config guide walks through every option and provides copy-paste examples for common setups.
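
For orientation, a minimal `config.toml` might look like the sketch below. The key names are illustrative assumptions, not the authoritative schema; the Config guide documents the real option names. The preset values shown (`on-request`, `workspace-write`, `o4-mini`) come from the tool reference later on this page.

```toml
# Minimal sketch of ~/.llmx/config.toml. Key names here are assumptions
# for illustration; the Config guide has the authoritative schema.

# Pin a default model so you don't pass it on every invocation.
model = "o4-mini"

# Approval preset: when to ask before running model-generated commands.
approval_policy = "on-request"

# Sandbox preset: how much of the filesystem a session may write to.
sandbox_mode = "workspace-write"
```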

## Tracing / verbose logging

Because LLMX is written in Rust, it honors the `RUST_LOG` environment variable to configure its logging behavior.

The TUI defaults to `RUST_LOG=llmx_core=info,llmx_tui=info,llmx_rmcp_client=info`, and log messages are written to `~/.llmx/log/llmx-tui.log`, so you can leave the following running in a separate terminal to monitor log messages as they are written:

```shell
tail -F ~/.llmx/log/llmx-tui.log
```

By comparison, the non-interactive mode (`llmx exec`) defaults to `RUST_LOG=error`, but messages are printed inline, so there is no need to monitor a separate file.
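
For a one-off verbose run, you can override the default inline, as with any program that honors `RUST_LOG`. The module filter below reuses a crate name from the TUI default above, and the prompt is just a placeholder:

```shell
# Raise llmx exec from its error-level default to debug for the core crate,
# leaving everything else at the default.
RUST_LOG=llmx_core=debug llmx exec "list the TODO comments in this repo"
```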

See the Rust documentation on `RUST_LOG` for more information on the configuration options.

## Model Context Protocol (MCP)

The LLMX CLI and IDE extension are MCP clients, which means they can be configured to connect to MCP servers. For more information, refer to the config docs.
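
As a sketch, a server launcher entry in `config.toml` could look like the following. Both the `mcp_servers` table name and the `command`/`args` fields are assumptions about the schema, and the npm package name is made up; the config docs describe the actual format.

```toml
# Hypothetical MCP server launcher; table and field names are assumptions,
# and the npm package is a placeholder. See the config docs for the real schema.
[mcp_servers.docs-search]
command = "npx"
args = ["-y", "@example/docs-search-mcp"]
```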

## Using LLMX as an MCP Server

The LLMX CLI can also be run as an MCP server via `llmx mcp-server`. For example, you can use `llmx mcp-server` to make LLMX available as a tool inside a multi-agent framework like the OpenAI Agents SDK. Use `llmx mcp` separately to add, list, get, and remove MCP server launchers in your configuration.

### LLMX MCP Server Quickstart

You can launch an LLMX MCP server with the Model Context Protocol Inspector:

```shell
npx @modelcontextprotocol/inspector llmx mcp-server
```

Send a `tools/list` request and you will see that there are two tools available:

`llmx` - Run an LLMX session. Accepts configuration parameters matching the LLMX Config struct. The `llmx` tool takes the following properties:

| Property | Type | Description |
| --- | --- | --- |
| `prompt` (required) | string | The initial user prompt to start the LLMX conversation. |
| `approval-policy` | string | Approval policy for shell commands generated by the model: `untrusted`, `on-failure`, `on-request`, `never`. |
| `base-instructions` | string | The set of instructions to use instead of the default ones. |
| `config` | object | Individual config settings that will override what is in `$LLMX_HOME/config.toml`. |
| `cwd` | string | Working directory for the session. If relative, it is resolved against the server process's current directory. |
| `model` | string | Optional override for the model name (e.g. `o3`, `o4-mini`). |
| `profile` | string | Configuration profile from `config.toml` to specify default options. |
| `sandbox` | string | Sandbox mode: `read-only`, `workspace-write`, or `danger-full-access`. |

`llmx-reply` - Continue an LLMX session by providing the conversation id and the next prompt. The `llmx-reply` tool takes the following properties:

| Property | Type | Description |
| --- | --- | --- |
| `prompt` (required) | string | The next user prompt to continue the LLMX conversation. |
| `conversationId` (required) | string | The id of the conversation to continue. |
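
Calling these tools over the wire is a standard MCP `tools/call` exchange. The sketch below continues a session with `llmx-reply`; it assumes you captured the conversation id from the events of an earlier `llmx` call, and the id value shown is made up.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "llmx-reply",
    "arguments": {
      "conversationId": "made-up-conversation-id",
      "prompt": "Now add a reset button to the game."
    }
  }
}
```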

### Trying it Out

> [!TIP]
> LLMX often takes a few minutes to run. To accommodate this, adjust the MCP inspector's Request and Total timeouts to 600000ms (10 minutes) under ⛭ Configuration.

Use the MCP inspector and `llmx mcp-server` to build a simple tic-tac-toe game with the following settings:

- `approval-policy`: `never`
- `prompt`: Implement a simple tic-tac-toe game with HTML, JavaScript, and CSS. Write the game in a single file called `index.html`.
- `sandbox`: `workspace-write`

Click "Run Tool" and you should see a list of events emitted from the LLMX MCP server as it builds the game.