Phase 5: Configuration & Documentation
Updated all documentation and configuration files:

Documentation changes:
- Updated README.md to describe LLMX as LiteLLM-powered fork
- Updated CLAUDE.md with LiteLLM integration details
- Updated 50+ markdown files across docs/, llmx-rs/, llmx-cli/, sdk/
- Changed all references: codex → llmx, Codex → LLMX
- Updated package references: @openai/codex → @llmx/llmx
- Updated repository URLs: github.com/openai/codex → github.com/valknar/llmx

Configuration changes:
- Updated .github/dependabot.yaml
- Updated .github workflow files
- Updated cliff.toml (changelog configuration)
- Updated Cargo.toml comments

Key branding updates:
- Project description: "coding agent from OpenAI" → "coding agent powered by LiteLLM"
- Added attribution to original OpenAI Codex project
- Documented LiteLLM integration benefits

Files changed: 51 files (559 insertions, 559 deletions)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
@@ -1,19 +1,19 @@
-# Codex MCP Server Interface [experimental]
+# LLMX MCP Server Interface [experimental]
 
-This document describes Codex’s experimental MCP server interface: a JSON‑RPC API that runs over the Model Context Protocol (MCP) transport to control a local Codex engine.
+This document describes LLMX’s experimental MCP server interface: a JSON‑RPC API that runs over the Model Context Protocol (MCP) transport to control a local LLMX engine.
 
 - Status: experimental and subject to change without notice
-- Server binary: `codex mcp-server` (or `codex-mcp-server`)
+- Server binary: `llmx mcp-server` (or `llmx-mcp-server`)
 - Transport: standard MCP over stdio (JSON‑RPC 2.0, line‑delimited)
 
 ## Overview
 
-Codex exposes a small set of MCP‑compatible methods to create and manage conversations, send user input, receive live events, and handle approval prompts. The types are defined in `protocol/src/mcp_protocol.rs` and re‑used by the MCP server implementation in `mcp-server/`.
+LLMX exposes a small set of MCP‑compatible methods to create and manage conversations, send user input, receive live events, and handle approval prompts. The types are defined in `protocol/src/mcp_protocol.rs` and re‑used by the MCP server implementation in `mcp-server/`.
 
 At a glance:
 
 - Conversations
-  - `newConversation` → start a Codex session
+  - `newConversation` → start a LLMX session
   - `sendUserMessage` / `sendUserTurn` → send user input into a conversation
   - `interruptConversation` → stop the current turn
   - `listConversations`, `resumeConversation`, `archiveConversation`
@@ -29,25 +29,25 @@ At a glance:
   - `applyPatchApproval`, `execCommandApproval`
 - Notifications (server → client)
   - `loginChatGptComplete`, `authStatusChange`
-  - `codex/event` stream with agent events
+  - `llmx/event` stream with agent events
 
 See code for full type definitions and exact shapes: `protocol/src/mcp_protocol.rs`.
 
 ## Starting the server
 
-Run Codex as an MCP server and connect an MCP client:
+Run LLMX as an MCP server and connect an MCP client:
 
 ```bash
-codex mcp-server | your_mcp_client
+llmx mcp-server | your_mcp_client
 ```
 
 For a simple inspection UI, you can also try:
 
 ```bash
-npx @modelcontextprotocol/inspector codex mcp-server
+npx @modelcontextprotocol/inspector llmx mcp-server
 ```
 
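Many MCP clients can also launch the server themselves from a configuration entry rather than a pipe. A minimal sketch, assuming a client that uses the common `mcpServers` convention (the key names and the `llmx` entry are illustrative; the exact schema depends on the client):

```json
{
  "mcpServers": {
    "llmx": {
      "command": "llmx",
      "args": ["mcp-server"]
    }
  }
}
```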
-Use the separate `codex mcp` subcommand to manage configured MCP server launchers in `config.toml`.
+Use the separate `llmx mcp` subcommand to manage configured MCP server launchers in `config.toml`.
 
 ## Conversations
 
@@ -55,7 +55,7 @@ Start a new session with optional overrides:
 
 Request `newConversation` params (subset):
 
-- `model`: string model id (e.g. "o3", "gpt-5", "gpt-5-codex")
+- `model`: string model id (e.g. "o3", "gpt-5", "gpt-5-llmx")
 - `profile`: optional named profile
 - `cwd`: optional working directory
 - `approvalPolicy`: `untrusted` | `on-request` | `on-failure` | `never`
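For illustration, a `newConversation` request assembled from the parameters above might look roughly like this (the `id` and parameter values are examples; see `protocol/src/mcp_protocol.rs` for the exact request and response types). The response is expected to provide the conversation identifier that later calls pass as `conversationId`:

```json
{
  "jsonrpc": "2.0",
  "id": 10,
  "method": "newConversation",
  "params": {
    "model": "gpt-5",
    "cwd": "/path/to/project",
    "approvalPolicy": "on-request"
  }
}
```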
@@ -78,7 +78,7 @@ List/resume/archive: `listConversations`, `resumeConversation`, `archiveConversa
 
 ## Models
 
-Fetch the catalog of models available in the current Codex build with `model/list`. The request accepts optional pagination inputs:
+Fetch the catalog of models available in the current LLMX build with `model/list`. The request accepts optional pagination inputs:
 
 - `pageSize` – number of models to return (defaults to a server-selected value)
 - `cursor` – opaque string from the previous response’s `nextCursor`
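As a sketch, a paginated `model/list` request using the inputs above could look like the following (the cursor string is illustrative; pass the `nextCursor` value returned by the previous page):

```json
{
  "jsonrpc": "2.0",
  "id": 11,
  "method": "model/list",
  "params": {
    "pageSize": 20,
    "cursor": "nextCursor-from-previous-response"
  }
}
```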
@@ -98,14 +98,14 @@ Each response yields:
 
 While a conversation runs, the server sends notifications:
 
-- `codex/event` with the serialized Codex event payload. The shape matches `core/src/protocol.rs`’s `Event` and `EventMsg` types. Some notifications include a `_meta.requestId` to correlate with the originating request.
+- `llmx/event` with the serialized LLMX event payload. The shape matches `core/src/protocol.rs`’s `Event` and `EventMsg` types. Some notifications include a `_meta.requestId` to correlate with the originating request.
 - Auth notifications via method names `loginChatGptComplete` and `authStatusChange`.
 
 Clients should render events and, when present, surface approval requests (see next section).
 
 ## Approvals (server → client)
 
-When Codex needs approval to apply changes or run commands, the server issues JSON‑RPC requests to the client:
+When LLMX needs approval to apply changes or run commands, the server issues JSON‑RPC requests to the client:
 
 - `applyPatchApproval { conversationId, callId, fileChanges, reason?, grantRoot? }`
 - `execCommandApproval { conversationId, callId, command, cwd, reason? }`
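To illustrate the reversed direction of these calls, a server-issued `execCommandApproval` request and a client reply might look roughly like the sketch below. The parameter names come from the list above; the `callId` value, the command encoding, and the `decision` response field are assumptions for illustration, since this excerpt does not spell out the response shape:

```json
{ "jsonrpc": "2.0", "id": 42, "method": "execCommandApproval", "params": { "conversationId": "c7b0…", "callId": "call_1", "command": ["git", "status"], "cwd": "/path/to/project", "reason": "inspect working tree" } }

{ "jsonrpc": "2.0", "id": 42, "result": { "decision": "approved" } }
```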
@@ -131,10 +131,10 @@ Server responds:
 Then send input:
 
 ```json
-{ "jsonrpc": "2.0", "id": 2, "method": "sendUserMessage", "params": { "conversationId": "c7b0…", "items": [{ "type": "text", "text": "Hello Codex" }] } }
+{ "jsonrpc": "2.0", "id": 2, "method": "sendUserMessage", "params": { "conversationId": "c7b0…", "items": [{ "type": "text", "text": "Hello LLMX" }] } }
 ```
 
-While processing, the server emits `codex/event` notifications containing agent output, approvals, and status updates.
+While processing, the server emits `llmx/event` notifications containing agent output, approvals, and status updates.
 
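For reference, each `llmx/event` notification is a standard JSON‑RPC notification; a rough sketch follows. The method name and `_meta.requestId` are documented above, while the inner `id`/`msg` fields are illustrative stand-ins for the `Event`/`EventMsg` shapes defined in `core/src/protocol.rs`:

```json
{ "jsonrpc": "2.0", "method": "llmx/event", "params": { "_meta": { "requestId": 2 }, "id": "sub-1", "msg": { "type": "agent_message", "message": "Hello!" } } }
```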
 ## Compatibility and stability
 
@@ -6,22 +6,22 @@ NOTE: The code might not completely match this spec. There are a few minor chang
 
 ## Entities
 
-These are the entities that exist on the codex backend. The intent of this section is to establish vocabulary and construct a shared mental model for the `Codex` core system.
+These are the entities that exist on the llmx backend. The intent of this section is to establish vocabulary and construct a shared mental model for the `LLMX` core system.
 
 0. `Model`
    - In our case, this is the Responses REST API
-1. `Codex`
-   - The core engine of codex
+1. `LLMX`
+   - The core engine of llmx
    - Runs locally, either in a background thread or separate process
    - Communicated to via a queue pair – SQ (Submission Queue) / EQ (Event Queue)
    - Takes user input, makes requests to the `Model`, executes commands and applies patches.
 2. `Session`
-   - The `Codex`'s current configuration and state
-   - `Codex` starts with no `Session`, and it is initialized by `Op::ConfigureSession`, which should be the first message sent by the UI.
+   - The `LLMX`'s current configuration and state
+   - `LLMX` starts with no `Session`, and it is initialized by `Op::ConfigureSession`, which should be the first message sent by the UI.
    - The current `Session` can be reconfigured with additional `Op::ConfigureSession` calls.
    - Any running execution is aborted when the session is reconfigured.
 3. `Task`
-   - A `Task` is `Codex` executing work in response to user input.
+   - A `Task` is `LLMX` executing work in response to user input.
    - `Session` has at most one `Task` running at a time.
    - Receiving `Op::UserInput` starts a `Task`
    - Consists of a series of `Turn`s
@@ -35,28 +35,28 @@ These are entities exit on the codex backend. The intent of this section is to e
    - One cycle of iteration in a `Task`, consists of:
      - A request to the `Model` - (initially) prompt + (optional) `last_response_id`, or (in loop) previous turn output
      - The `Model` streams responses back in an SSE, which are collected until "completed" message and the SSE terminates
-     - `Codex` then executes command(s), applies patch(es), and outputs message(s) returned by the `Model`
+     - `LLMX` then executes command(s), applies patch(es), and outputs message(s) returned by the `Model`
      - Pauses to request approval when necessary
    - The output of one `Turn` is the input to the next `Turn`
    - A `Turn` yielding no output terminates the `Task`
 
-The term "UI" is used to refer to the application driving `Codex`. This may be the CLI / TUI chat-like interface that users operate, or it may be a GUI interface like a VSCode extension. The UI is external to `Codex`, as `Codex` is intended to be operated by arbitrary UI implementations.
+The term "UI" is used to refer to the application driving `LLMX`. This may be the CLI / TUI chat-like interface that users operate, or it may be a GUI interface like a VSCode extension. The UI is external to `LLMX`, as `LLMX` is intended to be operated by arbitrary UI implementations.
 
 When a `Turn` completes, the `response_id` from the `Model`'s final `response.completed` message is stored in the `Session` state to resume the thread given the next `Op::UserInput`. The `response_id` is also returned in the `EventMsg::TurnComplete` to the UI, which can be used to fork the thread from an earlier point by providing it in the `Op::UserInput`.
 
-Since only 1 `Task` can be run at a time, for parallel tasks it is recommended that a single `Codex` be run for each thread of work.
+Since only 1 `Task` can be run at a time, for parallel tasks it is recommended that a single `LLMX` be run for each thread of work.
 
 ## Interface
 
-- `Codex`
+- `LLMX`
   - Communicates with UI via a `SQ` (Submission Queue) and `EQ` (Event Queue).
 - `Submission`
-  - These are messages sent on the `SQ` (UI -> `Codex`)
+  - These are messages sent on the `SQ` (UI -> `LLMX`)
   - Has a string ID provided by the UI, referred to as `sub_id`
   - `Op` refers to the enum of all possible `Submission` payloads
     - This enum is `non_exhaustive`; variants can be added at future dates
 - `Event`
-  - These are messages sent on the `EQ` (`Codex` -> UI)
+  - These are messages sent on the `EQ` (`LLMX` -> UI)
   - Each `Event` has a non-unique ID, matching the `sub_id` from the `Op::UserInput` that started the current task.
   - `EventMsg` refers to the enum of all possible `Event` payloads
     - This enum is `non_exhaustive`; variants can be added at future dates
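As a purely illustrative sketch of this queue pair, one `Submission` and a matching `Event` might serialize roughly as follows (one JSON object per line, Submission first). Only `sub_id`, `Op`, and `EventMsg` come from the spec above; the concrete field and variant names here are assumptions:

```json
{ "id": "sub-1", "op": { "type": "user_input", "items": [{ "type": "text", "text": "Fix the failing test" }] } }

{ "id": "sub-1", "msg": { "type": "task_started" } }
```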
@@ -98,16 +98,16 @@ sequenceDiagram
 participant user as User
 end
 box Daemon
-participant codex as Codex
+participant llmx as LLMX
 participant session as Session
 participant task as Task
 end
 box Rest API
 participant agent as Model
 end
-user->>codex: Op::ConfigureSession
-codex-->>session: create session
-codex->>user: Event::SessionConfigured
+user->>llmx: Op::ConfigureSession
+llmx-->>session: create session
+llmx->>user: Event::SessionConfigured
 user->>session: Op::UserInput
 session-->>+task: start task
 task->>user: Event::TaskStarted