Updated all documentation and configuration files.

Documentation changes:

- Updated README.md to describe LLMX as a LiteLLM-powered fork
- Updated CLAUDE.md with LiteLLM integration details
- Updated 50+ markdown files across docs/, llmx-rs/, llmx-cli/, and sdk/
- Changed all references: codex → llmx, Codex → LLMX
- Updated package references: @openai/codex → @llmx/llmx
- Updated repository URLs: github.com/openai/codex → github.com/valknar/llmx

Configuration changes:

- Updated .github/dependabot.yaml
- Updated .github workflow files
- Updated cliff.toml (changelog configuration)
- Updated Cargo.toml comments

Key branding updates:

- Project description: "coding agent from OpenAI" → "coding agent powered by LiteLLM"
- Added attribution to the original OpenAI Codex project
- Documented LiteLLM integration benefits

Files changed: 51 files (559 insertions, 559 deletions)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
## Containerized Development
We provide the following options to facilitate LLMX development in a container. This is particularly useful for verifying the Linux build when working on a macOS host.
### Docker
To build the Docker image locally for x64 and then run it with the repo mounted under `/workspace`:

```shell
LLMX_DOCKER_IMAGE_NAME=llmx-linux-dev
docker build --platform=linux/amd64 -t "$LLMX_DOCKER_IMAGE_NAME" ./.devcontainer
docker run --platform=linux/amd64 --rm -it \
  -e CARGO_TARGET_DIR=/workspace/llmx-rs/target-amd64 \
  -v "$PWD":/workspace -w /workspace/llmx-rs \
  "$LLMX_DOCKER_IMAGE_NAME"
```
Note that `/workspace/target` will contain the binaries built for your host platform, so we pass `-e CARGO_TARGET_DIR=/workspace/llmx-rs/target-amd64` to `docker run` so that the binaries built inside the container are written to a separate directory.
For arm64, specify `--platform=linux/arm64` instead for both `docker build` and `docker run`.
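The build-and-run steps above can be wrapped in a small helper so the platform is passed once. This is a sketch, not a script shipped with the repo: the function name `dev_container`, the image-name scheme, and the `DRY_RUN` switch are all assumptions for illustration.

```shell
#!/usr/bin/env bash
# Hypothetical helper (not part of the repo): build and run the LLMX dev
# container for a given platform ("linux/amd64" or "linux/arm64").
dev_container() {
  local platform="${1:-linux/amd64}"
  local arch="${platform#linux/}"                # e.g. "amd64" or "arm64"
  local image="llmx-linux-dev-${arch}"           # assumed image-name scheme

  local build_cmd=(docker build "--platform=${platform}" -t "${image}" ./.devcontainer)
  local run_cmd=(docker run "--platform=${platform}" --rm -it \
    -e "CARGO_TARGET_DIR=/workspace/llmx-rs/target-${arch}" \
    -v "${PWD}:/workspace" -w /workspace/llmx-rs "${image}")

  if [[ "${DRY_RUN:-0}" == "1" ]]; then
    # Print the commands instead of executing them.
    printf '%s\n' "${build_cmd[*]}" "${run_cmd[*]}"
  else
    "${build_cmd[@]}" && "${run_cmd[@]}"
  fi
}

# Example: show what would run for arm64 without invoking Docker.
DRY_RUN=1 dev_container linux/arm64
```

Deriving `CARGO_TARGET_DIR` from the platform keeps the per-architecture target directories separate automatically, matching the `target-amd64` convention used above.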
Currently, the Dockerfile works for both x64 and arm64 Linux, though you must run `rustup target add x86_64-unknown-linux-musl` yourself to install the musl toolchain for x64.
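Concretely, the one-time musl setup inside the container looks like the following. This is a sketch of the setup step, assuming `rustup` and `cargo` are available in the container:

```shell
# One-time setup: install the x64 musl toolchain.
rustup target add x86_64-unknown-linux-musl

# Then build the statically linked x64 binary.
cargo build --target x86_64-unknown-linux-musl
```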
### VS Code
VS Code recognizes the `devcontainer.json` file and gives you the option to develop LLMX in a container. Currently, `devcontainer.json` builds and runs the arm64 flavor of the container.
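For orientation, a `devcontainer.json` that pins the container to arm64 might look like the sketch below. The field values here are assumptions for illustration; see the actual `.devcontainer/devcontainer.json` in the repo for the real configuration.

```jsonc
{
  // Hypothetical sketch, not the repo's actual file.
  "name": "llmx-linux-dev",
  "build": { "dockerfile": "Dockerfile" },
  "runArgs": ["--platform=linux/arm64"],
  "workspaceFolder": "/workspace/llmx-rs"
}
```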
From the integrated terminal in VS Code, you can build either arm64 flavor (GNU or musl):

```shell
cargo build --target aarch64-unknown-linux-musl
cargo build --target aarch64-unknown-linux-gnu
```