# @llmx/llmx-responses-api-proxy
<p align="center"><code>npm i -g @llmx/llmx-responses-api-proxy</code> to install <code>llmx-responses-api-proxy</code></p>
This package distributes the prebuilt [LLMX Responses API proxy binary](https://github.com/valknar/llmx/tree/main/llmx-rs/responses-api-proxy) for macOS, Linux, and Windows.
To see available options, run:
```shell
node ./bin/llmx-responses-api-proxy.js --help
```
Refer to [`llmx-rs/responses-api-proxy/README.md`](https://github.com/valknar/llmx/blob/main/llmx-rs/responses-api-proxy/README.md) for detailed documentation.