This release represents a comprehensive transformation of the codebase from Codex to LLMX, enhanced with LiteLLM integration to support 100+ LLM providers through a unified API.

## Major Changes

### Phase 1: Repository & Infrastructure Setup

- Established new repository structure and branching strategy
- Created comprehensive project documentation (CLAUDE.md, LITELLM-SETUP.md)
- Set up development environment and tooling configuration

### Phase 2: Rust Workspace Transformation

- Renamed all Rust crates from `codex-*` to `llmx-*` (30+ crates)
- Updated package names, binary names, and workspace members
- Renamed core modules: `codex.rs` → `llmx.rs`, `codex_delegate.rs` → `llmx_delegate.rs`
- Updated all internal references, imports, and type names
- Renamed directories: `codex-rs/` → `llmx-rs/`, `codex-backend-openapi-models/` → `llmx-backend-openapi-models/`
- Fixed all Rust compilation errors after mass rename

### Phase 3: LiteLLM Integration

- Integrated LiteLLM for multi-provider LLM support (Anthropic, OpenAI, Azure, Google AI, AWS Bedrock, etc.)
- Implemented OpenAI-compatible Chat Completions API support
- Added model family detection and provider-specific handling
- Updated authentication to support LiteLLM API keys
- Renamed environment variables: `OPENAI_BASE_URL` → `LLMX_BASE_URL`
- Added `LLMX_API_KEY` for unified authentication
- Enhanced error handling for Chat Completions API responses
- Implemented fallback mechanisms between Responses API and Chat Completions API

### Phase 4: TypeScript/Node.js Components

- Renamed npm package: `@codex/codex-cli` → `@valknar/llmx`
- Updated TypeScript SDK to use new LLMX APIs and endpoints
- Fixed all TypeScript compilation and linting errors
- Updated SDK tests to support both API backends
- Enhanced mock server to handle multiple API formats
- Updated build scripts for cross-platform packaging

### Phase 5: Configuration & Documentation

- Updated all configuration files to use LLMX naming
- Rewrote README and documentation for LLMX branding
- Updated config paths: `~/.codex/` → `~/.llmx/`
- Added comprehensive LiteLLM setup guide
- Updated all user-facing strings and help text
- Created release plan and migration documentation

### Phase 6: Testing & Validation

- Fixed all Rust tests for new naming scheme
- Updated snapshot tests in TUI (36 frame files)
- Fixed authentication storage tests
- Updated Chat Completions payload and SSE tests
- Fixed SDK tests for new API endpoints
- Ensured compatibility with Claude Sonnet 4.5 model
- Fixed test environment variables (`LLMX_API_KEY`, `LLMX_BASE_URL`)

### Phase 7: Build & Release Pipeline

- Updated GitHub Actions workflows for LLMX binary names
- Fixed `rust-release.yml` to reference `llmx-rs/` instead of `codex-rs/`
- Updated CI/CD pipelines for new package names
- Made Apple code signing optional in release workflow
- Enhanced npm packaging resilience for partial platform builds
- Added Windows sandbox support to workspace
- Updated dotslash configuration for new binary names

### Phase 8: Final Polish

- Renamed all assets (.github images, labels, templates)
- Updated VSCode and DevContainer configurations
- Fixed all clippy warnings and formatting issues
- Applied cargo fmt and prettier formatting across codebase
- Updated issue templates and pull request templates
- Fixed all remaining UI text references

## Technical Details

**Breaking Changes:**

- Binary name changed from `codex` to `llmx`
- Config directory changed from `~/.codex/` to `~/.llmx/`
- Environment variables renamed (`CODEX_*` → `LLMX_*`)
- npm package renamed to `@valknar/llmx`
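For existing installs, the sketch below shows one way to migrate based on the renames above. The LiteLLM endpoint URL and API key are placeholders, and copying the config directory assumes its on-disk format is unchanged; see LITELLM-SETUP.md for the authoritative steps.

```bash
# Migration sketch only -- placeholder values, adapt to your environment.
# Point LLMX at an OpenAI-compatible LiteLLM endpoint (was OPENAI_BASE_URL).
export LLMX_BASE_URL="http://localhost:4000"   # example: a local LiteLLM proxy
export LLMX_API_KEY="sk-your-litellm-key"      # unified authentication key

# Carry over existing configuration to the new location.
[ -d ~/.codex ] && [ ! -d ~/.llmx ] && cp -R ~/.codex ~/.llmx

# The binary is now `llmx` instead of `codex`.
llmx --help
```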
**New Features:**

- Support for 100+ LLM providers via LiteLLM
- Unified authentication with `LLMX_API_KEY`
- Enhanced model provider detection and handling
- Improved error handling and fallback mechanisms

**Files Changed:**

- 578 files modified across Rust, TypeScript, and documentation
- 30+ Rust crates renamed and updated
- Complete rebrand of UI, CLI, and documentation
- All tests updated and passing

**Dependencies:**

- Updated `Cargo.lock` with new package names
- Updated npm dependencies in `llmx-cli`
- Enhanced OpenAPI models for LLMX backend

This release establishes LLMX as a standalone project with comprehensive LiteLLM integration, maintaining full backward compatibility with existing functionality while opening support for a wide ecosystem of LLM providers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Co-Authored-By: Sebastian Krüger <support@pivoine.art>
# Contributing
This project is under active development and the code will likely change pretty significantly.
At the moment, we only plan to prioritize reviewing external contributions for bugs or security fixes.
If you want to add a new feature or change the behavior of an existing one, please open an issue proposing the feature and get approval from an OpenAI team member before spending time building it.
New contributions that don't go through this process may be closed if they aren't aligned with our current roadmap or conflict with other priorities/upcoming features.
## Development workflow
- Create a topic branch from `main`, e.g. `feat/interactive-prompt`.
- Keep your changes focused. Multiple unrelated fixes should be opened as separate PRs.
- Ensure your change is free of lint warnings and test failures.
## Writing high-impact code changes
- Start with an issue. Open a new one or comment on an existing discussion so we can agree on the solution before code is written.
- Add or update tests. Every new feature or bug-fix should come with test coverage that fails before your change and passes afterwards. 100% coverage is not required, but aim for meaningful assertions.
- Document behaviour. If your change affects user-facing behaviour, update the README, inline help (`llmx --help`), or relevant example projects.
- Keep commits atomic. Each commit should compile and the tests should pass. This makes reviews and potential rollbacks easier.
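When adding the test coverage described above, it is often quickest to iterate on a single test before running the full suite. A sketch, where the crate and test names are placeholders:

```bash
# Run one test in one crate with its output shown (placeholder names).
cargo test -p llmx-core my_new_behaviour_test -- --nocapture
```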
## Opening a pull request
- Fill in the PR template (or include similar information) - What? Why? How?
- Include a link to a bug report or enhancement request in the issue tracker
- Run all checks locally (
cargo test && cargo clippy --tests && cargo fmt -- --config imports_granularity=Item). CI failures that could have been caught locally slow down the process. - Make sure your branch is up-to-date with
mainand that you have resolved merge conflicts. - Mark the PR as Ready for review only when you believe it is in a merge-able state.
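The same local checks as separate commands, assuming the Rust workspace lives under `llmx-rs/`:

```bash
cd llmx-rs
cargo test
cargo clippy --tests
cargo fmt -- --config imports_granularity=Item
```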
## Review process
- One maintainer will be assigned as a primary reviewer.
- If your PR adds a new feature that was not previously discussed and approved, we may choose to close your PR (see Contributing).
- We may ask for changes - please do not take this personally. We value the work, but we also value consistency and long-term maintainability.
- When there is consensus that the PR meets the bar, a maintainer will squash-and-merge.
## Community values
- Be kind and inclusive. Treat others with respect; we follow the Contributor Covenant.
- Assume good intent. Written communication is hard - err on the side of generosity.
- Teach & learn. If you spot something confusing, open an issue or PR with improvements.
## Getting help
If you run into problems setting up the project, would like feedback on an idea, or just want to say hi - please open a Discussion or jump into the relevant issue. We are happy to help.
Together we can make LLMX CLI an incredible tool. Happy hacking! 🚀
## Contributor license agreement (CLA)
All contributors must accept the CLA. The process is lightweight:
1. Open your pull request.
2. Paste the following comment (or reply `recheck` if you've signed before): `I have read the CLA Document and I hereby sign the CLA`
3. The CLA-Assistant bot records your signature in the repo and marks the status check as passed.
No special Git commands, email attachments, or commit footers required.
## Quick fixes

| Scenario | Command |
| --- | --- |
| Amend last commit | `git commit --amend -s --no-edit && git push -f` |
The DCO check blocks merges until every commit in the PR carries the footer (with squash this is just the one).
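For reference, signing off with `-s` appends the trailer the DCO check looks for, using the name and email from your Git config:

```bash
# Sign off a commit; Git appends the DCO trailer from user.name / user.email.
git commit -s -m "fix: describe the change"
# Resulting trailer in the commit message:
#   Signed-off-by: Your Name <you@example.com>
```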
## Releasing llmx
For admins only.
Make sure you are on main and have no local changes. Then run:

    VERSION=0.2.0 # Can also be 0.2.0-alpha.1 or any valid Rust version.
    ./llmx-rs/scripts/create_github_release.sh "$VERSION"
This makes a local commit on top of `main` with the version set to `$VERSION` in `llmx-rs/Cargo.toml` (note that on `main`, we leave the version as `version = "0.0.0"`). It then pushes that commit with the tag `rust-v${VERSION}`, which kicks off the release workflow and creates a new GitHub Release named `$VERSION`.
If everything looks good in the generated GitHub Release, uncheck the pre-release box so it is the latest release.
Create a PR to update Cask/c/llmx.rb on Homebrew.
## Security & responsible AI
Have you discovered a vulnerability or have concerns about model output? Please e-mail security@openai.com and we will respond promptly.