
npm i -g @valknarthing/llmx

LLMX CLI is a coding agent powered by LiteLLM that runs locally on your computer.

This project is a community fork with enhanced support for multiple LLM providers via LiteLLM.
Original project: github.com/openai/codex


Quickstart

Installing and running LLMX CLI

Install globally with npm:

npm install -g @valknarthing/llmx

Then run llmx to get started:

llmx

You can also go to the latest GitHub Release and download the appropriate binary for your platform.

Each GitHub Release contains many executables, but in practice, you likely want one of these:

  • macOS
    • Apple Silicon/arm64: llmx-aarch64-apple-darwin.tar.gz
    • x86_64 (older Mac hardware): llmx-x86_64-apple-darwin.tar.gz
  • Linux
    • x86_64: llmx-x86_64-unknown-linux-musl.tar.gz
    • arm64: llmx-aarch64-unknown-linux-musl.tar.gz

Each archive contains a single entry with the platform baked into the name (e.g., llmx-x86_64-unknown-linux-musl), so you likely want to rename it to llmx after extracting it.
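The extract-and-rename flow above can be sketched as follows, using the Linux x86_64 musl build as an example; adjust the archive name for your platform, and note that the install location (/usr/local/bin) is just a common choice, not a requirement:

```shell
# Sketch of the manual install flow, assuming the release archive has already
# been downloaded into the current directory.
archive=llmx-x86_64-unknown-linux-musl.tar.gz
binary="${archive%.tar.gz}"          # the entry inside is named like the archive
if [ -f "$archive" ]; then
  tar -xzf "$archive"
  mv "$binary" llmx                  # rename to plain `llmx`
  chmod +x llmx
  sudo mv llmx /usr/local/bin/       # put it somewhere on your PATH
fi
```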

Using LLMX with LiteLLM

LLMX is powered by LiteLLM, which provides access to 100+ LLM providers including OpenAI, Anthropic, Google, Azure, AWS Bedrock, and more.

Quick Start with LiteLLM:

# Set your LiteLLM server URL (default: http://localhost:4000/v1)
export LLMX_BASE_URL="http://localhost:4000/v1"
export LLMX_API_KEY="your-api-key"

# Run LLMX
llmx "hello world"
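Before launching llmx, it can help to confirm the LiteLLM proxy is actually reachable. This is a hypothetical sanity check, not part of llmx itself; it relies on LiteLLM exposing the OpenAI-compatible /models endpoint, and reuses the environment variables from the quick start above:

```shell
# Hypothetical pre-flight check: list the models the LiteLLM proxy serves.
BASE_URL="${LLMX_BASE_URL:-http://localhost:4000/v1}"
curl -sf "$BASE_URL/models" \
  -H "Authorization: Bearer ${LLMX_API_KEY:-sk-placeholder}" \
  || echo "LiteLLM server not reachable at $BASE_URL"
```

If this prints a JSON model list, llmx should be able to talk to the same endpoint.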

Configuration: See LITELLM-SETUP.md for detailed setup instructions.

You can also use LLMX with ChatGPT or OpenAI API keys. For authentication options, see the authentication docs.

Model Context Protocol (MCP)

LLMX can access MCP servers. To configure them, refer to the config docs.

Configuration

LLMX CLI supports a rich set of configuration options, with preferences stored in ~/.llmx/config.toml. For full configuration options, see Configuration.
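For illustration, a minimal ~/.llmx/config.toml pointing at a local LiteLLM server might look like the sketch below. The key names (model, model_provider, and the model_providers table with name, base_url, and env_key) follow the upstream Codex config format and are assumed to carry over to this fork; the model name and provider id are placeholders. Check the Configuration docs for the authoritative schema.

```toml
# Hypothetical sketch of ~/.llmx/config.toml — verify key names against the docs.
model = "gpt-4o"                  # placeholder model name
model_provider = "litellm"        # assumed provider id for the entry below

[model_providers.litellm]
name = "LiteLLM"
base_url = "http://localhost:4000/v1"
env_key = "LLMX_API_KEY"          # env var holding the API key
```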


Docs & FAQ


License

This repository is licensed under the Apache-2.0 License.
