npm i -g @llmx/llmx
or
brew install --cask llmx
LLMX CLI is a coding agent powered by LiteLLM that runs locally on your computer.
This project is a community fork with enhanced support for multiple LLM providers via LiteLLM.
Original project: github.com/openai/codex
Quickstart
Installing and running LLMX CLI
Install globally with your preferred package manager. If you use npm:
npm install -g @llmx/llmx
Alternatively, if you use Homebrew:
brew install --cask llmx
Then simply run llmx to get started:
llmx
If you're running into upgrade issues with Homebrew, see the FAQ entry on brew upgrade llmx.
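As a hedged sketch, the usual upgrade sequence for the cask looks like the following; the FAQ covers the cases where this is not enough.
brew update
brew upgrade --cask llmx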
You can also go to the latest GitHub Release and download the appropriate binary for your platform.
Each GitHub Release contains many executables, but in practice, you likely want one of these:
- macOS
  - Apple Silicon/arm64: llmx-aarch64-apple-darwin.tar.gz
  - x86_64 (older Mac hardware): llmx-x86_64-apple-darwin.tar.gz
- Linux
  - x86_64: llmx-x86_64-unknown-linux-musl.tar.gz
  - arm64: llmx-aarch64-unknown-linux-musl.tar.gz
Each archive contains a single entry with the platform baked into the name (e.g., llmx-x86_64-unknown-linux-musl), so you likely want to rename it to llmx after extracting it.
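For example, a manual install on x86_64 Linux might look like the sketch below; the download URL assumes the release assets keep the names listed above, and /usr/local/bin is just one possible install location.
# Download and extract the archive for your platform (URL pattern is illustrative)
curl -LO https://github.com/valknar/llmx/releases/latest/download/llmx-x86_64-unknown-linux-musl.tar.gz
tar -xzf llmx-x86_64-unknown-linux-musl.tar.gz
# The extracted binary has the platform baked into its name; rename it and put it on your PATH
mv llmx-x86_64-unknown-linux-musl llmx
chmod +x llmx
sudo mv llmx /usr/local/bin/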
Using LLMX with your ChatGPT plan
Run llmx and select Sign in with ChatGPT. We recommend signing into your ChatGPT account to use LLMX as part of your Plus, Pro, Team, Edu, or Enterprise plan. Learn more about what's included in your ChatGPT plan.
You can also use LLMX with an API key, but this requires additional setup. If you previously used an API key for usage-based billing, see the migration steps. If you're having trouble with login, please comment on this issue.
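As a rough, hedged sketch of the API-key route (the variable name below assumes an OpenAI-compatible backend; the key your LiteLLM provider expects may differ, and the full setup steps are in the migration docs linked above):
# Hypothetical example: export the provider API key, then launch llmx as usual
export OPENAI_API_KEY="your-key-here"
llmx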
Model Context Protocol (MCP)
LLMX can access MCP servers. To configure them, refer to the config docs.
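As a minimal sketch, assuming LLMX keeps upstream Codex's layout where MCP servers are declared under an mcp_servers table in config.toml (the server name and command below are purely illustrative):
# Append a hypothetical MCP server entry to ~/.llmx/config.toml
cat >> ~/.llmx/config.toml <<'EOF'
[mcp_servers.example]
command = "npx"
args = ["-y", "some-mcp-server"]
EOF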
Configuration
LLMX CLI supports a rich set of configuration options, with preferences stored in ~/.llmx/config.toml. For full configuration options, see Configuration.
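For illustration only, a starter file might look like this; the key names assume the fork mirrors upstream Codex's config schema, which may not hold for every option:
# Create a hypothetical starter ~/.llmx/config.toml
mkdir -p ~/.llmx
cat > ~/.llmx/config.toml <<'EOF'
model = "gpt-4.1"               # whichever model your LiteLLM setup routes to
approval_policy = "on-request"  # ask before running commands that need approval
EOF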
Docs & FAQ
- Getting started
- Configuration
- Sandbox & approvals
- Authentication
- Automating LLMX
- Advanced
- Zero data retention (ZDR)
- Contributing
- Install & build
- FAQ
- Open source fund
License
This repository is licensed under the Apache-2.0 License.

