feat: support mcp_servers in config.toml (#829)

This adds initial support for MCP servers in the style of Claude Desktop
and Cursor. Note that this PR is the bare minimum to get things working
end to end: all configured MCP servers are launched every time Codex is
run, there is no recovery for MCP servers that crash, etc.

(Also, I took a shortcut and changed some fields of `Session` to
`pub(crate)`, which means there are now circular deps between `codex.rs`
and `mcp_tool_call.rs`; I will clean that up in a subsequent PR.)

`codex-rs/README.md` is updated as part of this PR to explain how to use
this feature. There is a bit of plumbing to route the new settings from
`Config` to the business logic in `codex.rs`. The most significant chunks
of new code are in `mcp_connection_manager.rs` (which defines the
`McpConnectionManager` struct) and `mcp_tool_call.rs`, which is
responsible for tool calls.
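
For reference, the `mcp_servers` table presumably deserializes into a small
serde struct hanging off `Config`; here is a minimal sketch with illustrative
names (the actual definitions in the codebase may differ):

```rust
use std::collections::HashMap;

use serde::Deserialize;

/// Hypothetical shape of one `[mcp_servers.<name>]` entry in config.toml.
#[derive(Debug, Deserialize)]
pub struct McpServerConfig {
    /// Executable to spawn for this MCP server.
    pub command: String,
    /// Arguments passed to `command`.
    #[serde(default)]
    pub args: Vec<String>,
}

#[derive(Debug, Deserialize)]
pub struct Config {
    /// MCP servers keyed by name, e.g. "weather".
    #[serde(default)]
    pub mcp_servers: HashMap<String, McpServerConfig>,
    // ... the rest of the Codex settings are elided ...
}
```

With the config.toml shown below, `mcp_servers` would contain a single
"weather" entry.
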
This PR also introduces new `McpToolCallBegin` and `McpToolCallEnd`
event types to the protocol, but does not add any handlers for them.
(See https://github.com/openai/codex/pull/836 for initial usage.)
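
Judging from the handler code in `mcp_tool_call.rs` below, the two variants
carry roughly the following data. This is a sketch only: the field types
(in particular the result type) are assumptions, not the exact definitions
in the `protocol` module:

```rust
// Sketch of the two new variants; the real `EventMsg` has many more
// variants, and `serde_json::Value` stands in for the actual MCP
// call-tool result type.
pub enum EventMsg {
    McpToolCallBegin {
        call_id: String,
        server: String,
        tool: String,
        arguments: Option<serde_json::Value>,
    },
    McpToolCallEnd {
        call_id: String,
        success: bool,
        result: Option<serde_json::Value>,
    },
    // ... existing variants elided ...
}
```
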
To test, I added the following to my `~/.codex/config.toml`:
```toml
# Local build of https://github.com/hideya/mcp-server-weather-js
[mcp_servers.weather]
command = "/Users/mbolin/code/mcp-server-weather-js/dist/index.js"
args = []
```
And then I ran the following:
```
codex-rs$ cargo run --bin codex exec 'what is the weather in san francisco'
[2025-05-06T22:40:05] Task started: 1
[2025-05-06T22:40:18] Agent message: Here’s the latest National Weather Service forecast for San Francisco (downtown, near 37.77° N, 122.42° W):
This Afternoon (Tue):
• Sunny, high near 69 °F
• West-southwest wind around 12 mph
Tonight:
• Partly cloudy, low around 52 °F
• SW wind 7–10 mph
...
```

Note that Codex itself is not able to make network calls, so it would not
normally be able to get live weather information like this. However, the
weather MCP server is currently not run under the Codex sandbox, so it is
able to hit `api.weather.gov` and fetch current weather information.

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/829).
* #836
* __->__ #829

use std::time::Duration;

use tracing::error;

use crate::codex::Session;
use crate::models::FunctionCallOutputPayload;
use crate::models::ResponseInputItem;
use crate::protocol::Event;
use crate::protocol::EventMsg;

/// Handles the specified tool call and dispatches the appropriate
/// `McpToolCallBegin` and `McpToolCallEnd` events to the `Session`.
pub(crate) async fn handle_mcp_tool_call(
    sess: &Session,
    sub_id: &str,
    call_id: String,
    server: String,
    tool_name: String,
    arguments: String,
    timeout: Option<Duration>,
) -> ResponseInputItem {
    // Parse the `arguments` as JSON. An empty string is OK, but invalid JSON
    // is not.
    let arguments_value = if arguments.trim().is_empty() {
        None
    } else {
        match serde_json::from_str::<serde_json::Value>(&arguments) {
            Ok(value) => Some(value),
            Err(e) => {
                error!("failed to parse tool call arguments: {e}");
                return ResponseInputItem::FunctionCallOutput {
                    call_id: call_id.clone(),
                    output: FunctionCallOutputPayload {
                        content: format!("err: {e}"),
                        success: Some(false),
                    },
                };
            }
        }
    };
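
    // Emit an `McpToolCallBegin` event so interested front-ends can surface
    // the in-flight tool call (no handlers are added in this PR; see the
    // description above).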
    let tool_call_begin_event = EventMsg::McpToolCallBegin {
        call_id: call_id.clone(),
        server: server.clone(),
        tool: tool_name.clone(),
        arguments: arguments_value.clone(),
    };
    notify_mcp_tool_call_event(sess, sub_id, tool_call_begin_event).await;

    // Perform the tool call.
    let (tool_call_end_event, tool_call_err) = match sess
        .call_tool(&server, &tool_name, arguments_value, timeout)
        .await
    {
        Ok(result) => (
            EventMsg::McpToolCallEnd {
                call_id,
                success: !result.is_error.unwrap_or(false),
                result: Some(result),
            },
            None,
        ),
        Err(e) => (
            EventMsg::McpToolCallEnd {
                call_id,
                success: false,
                result: None,
            },
            Some(e),
        ),
    };

    notify_mcp_tool_call_event(sess, sub_id, tool_call_end_event.clone()).await;

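    // Unpack the end event so its fields can be reused when building the
    // `FunctionCallOutput` returned to the model.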
    let EventMsg::McpToolCallEnd {
        call_id,
        success,
        result,
    } = tool_call_end_event
    else {
        unimplemented!("unexpected event type");
    };
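
    // Serialize the tool result to JSON for the model; if the call failed,
    // report the error text instead.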
    ResponseInputItem::FunctionCallOutput {
        call_id,
        output: FunctionCallOutputPayload {
            content: result.map_or_else(
                || format!("err: {tool_call_err:?}"),
                |result| {
                    serde_json::to_string(&result)
                        .unwrap_or_else(|e| format!("JSON serialization error: {e}"))
                },
            ),
            success: Some(success),
        },
    }
}

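/// Wraps `event` in an `Event` tagged with the submission id and forwards it
/// to the session via `Session::send_event`.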
async fn notify_mcp_tool_call_event(sess: &Session, sub_id: &str, event: EventMsg) {
    sess.send_event(Event {
        id: sub_id.to_string(),
        msg: event,
    })
    .await;
}