This release includes two critical fixes:

1. fix: accept `*** Create File:` as an alias for `*** Add File:` in the patch parser
   - Claude sometimes uses the `Create File` syntax instead of `Add File`
   - The parser now accepts both markers, preventing validation failures
   - The error message now lists both valid syntaxes
2. fix: increase the default `max_tokens` from 8192 to 20480
   - Claude Sonnet 4.5 was getting cut off mid-task
   - The new default is 5 × 4096 = 20480 tokens; Claude Sonnet 4.5 supports up to 64K tokens
   - This gives Claude enough space to complete comprehensive tasks
```shell
npm i -g @valknarthing/llmx
```
LLMX CLI is a coding agent powered by LiteLLM that runs locally on your computer.
This project is a community fork with enhanced support for multiple LLM providers via LiteLLM.
Original project: github.com/openai/codex
## Quickstart

### Installing and running LLMX CLI
Install globally with npm:

```shell
npm install -g @valknarthing/llmx
```

Then run `llmx` to get started:

```shell
llmx
```
You can also go to the latest GitHub Release and download the appropriate binary for your platform.
Each GitHub Release contains many executables, but in practice, you likely want one of these:
- macOS
  - Apple Silicon/arm64: `llmx-aarch64-apple-darwin.tar.gz`
  - x86_64 (older Mac hardware): `llmx-x86_64-apple-darwin.tar.gz`
- Linux
  - x86_64: `llmx-x86_64-unknown-linux-musl.tar.gz`
  - arm64: `llmx-aarch64-unknown-linux-musl.tar.gz`
Each archive contains a single entry with the platform baked into the name (e.g., `llmx-x86_64-unknown-linux-musl`), so you likely want to rename it to `llmx` after extracting it.
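The extract-and-rename flow can be sketched as below. The archive name is the Linux x86_64 one from the list above; the "stand-in" block at the top only fabricates a dummy archive so the sketch is self-contained — in practice you would use the file downloaded from the Releases page instead.

```shell
# Work in a scratch directory.
mkdir -p /tmp/llmx-demo && cd /tmp/llmx-demo

# --- demo only: fabricate a stand-in for the downloaded archive ---
printf '#!/bin/sh\necho llmx' > llmx-x86_64-unknown-linux-musl
chmod +x llmx-x86_64-unknown-linux-musl
tar -czf llmx-x86_64-unknown-linux-musl.tar.gz llmx-x86_64-unknown-linux-musl
rm llmx-x86_64-unknown-linux-musl
# ------------------------------------------------------------------

# The actual steps: extract the archive, rename the single entry
# to `llmx`, and make sure it is executable.
tar -xzf llmx-x86_64-unknown-linux-musl.tar.gz
mv llmx-x86_64-unknown-linux-musl llmx
chmod +x llmx
./llmx
```

After renaming, move `llmx` somewhere on your `PATH` (e.g., `~/.local/bin`).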
## Using LLMX with LiteLLM
LLMX is powered by LiteLLM, which provides access to 100+ LLM providers including OpenAI, Anthropic, Google, Azure, AWS Bedrock, and more.
Quick Start with LiteLLM:
```shell
# Set your LiteLLM server URL (default: http://localhost:4000/v1)
export LLMX_BASE_URL="http://localhost:4000/v1"
export LLMX_API_KEY="your-api-key"

# Run LLMX
llmx "hello world"
```
Configuration: See `LITELLM-SETUP.md` for detailed setup instructions.
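If you do not have a LiteLLM server running yet, a minimal proxy configuration looks roughly like the sketch below. The alias and model id are assumptions for illustration; `LITELLM-SETUP.md` and the LiteLLM docs are the authoritative reference for the format.

```yaml
# config.yaml — minimal LiteLLM proxy sketch (model names are assumptions)
model_list:
  - model_name: claude-sonnet                 # alias your client requests
    litellm_params:
      model: anthropic/claude-sonnet-4-5      # hypothetical provider/model id
      api_key: os.environ/ANTHROPIC_API_KEY   # read from the environment
```

You would then start the proxy (e.g., `litellm --config config.yaml --port 4000`) and point `LLMX_BASE_URL` at it.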
You can also use LLMX with ChatGPT or OpenAI API keys. For authentication options, see the authentication docs.
## Model Context Protocol (MCP)
LLMX can access MCP servers. To configure them, refer to the config docs.
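For orientation, codex-style forks typically declare MCP servers in the config file. The snippet below is a hypothetical entry — the table name, server name, and command are assumptions, so check the config docs for the exact schema:

```toml
# ~/.llmx/config.toml — hypothetical MCP server entry
[mcp_servers.filesystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
```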
## Configuration

LLMX CLI supports a rich set of configuration options, with preferences stored in `~/.llmx/config.toml`. For full configuration options, see Configuration.
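As a sketch of what that file might contain — the key names here are assumptions based on the upstream codex config format, not a reference; the Configuration docs list the supported keys:

```toml
# ~/.llmx/config.toml — illustrative sketch only (assumed key names)
model = "claude-sonnet"           # alias defined in your LiteLLM config
approval_policy = "on-request"    # assumed codex-style approval setting
```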
## Docs & FAQ
- Getting started
- Configuration
- Sandbox & approvals
- Authentication
- Automating LLMX
- Advanced
- Zero data retention (ZDR)
- Contributing
- Install & build
- FAQ
## License
This repository is licensed under the Apache-2.0 License.