## Summary
- Factor `load_config_as_toml` into `core::config_loader` so config
loading is reusable across callers.
- Layer `~/.codex/config.toml`, optional `~/.codex/managed_config.toml`,
and macOS managed preferences (base64) with recursive table merging and
scoped threads per source.
## Config Flow
```
Managed prefs (macOS profile: com.openai.codex/config_toml_base64)
        ▲
        │
~/.codex/managed_config.toml   (optional file-based override)
        ▲
        │
~/.codex/config.toml           (user-defined settings)
```
- The loader searches under the resolved `CODEX_HOME` directory
(defaults to `~/.codex`).
- Managed configs let administrators ship fleet-wide overrides via
device profiles, which is useful for enforcing settings such as
sandbox or approval defaults.
- Nested tables merge recursively: overlays are applied key-by-key to
child tables, while scalar and array values replace the prior layer
entirely. This lets admins add or tweak individual fields without
clobbering unrelated user settings; see the sketch below.
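A minimal sketch of the merge behavior (the `[sandbox]` table and its keys are hypothetical, for illustration only):
```toml
# ~/.codex/config.toml (user-defined settings)
[sandbox]
mode = "workspace-write"
network_access = true
```
```toml
# ~/.codex/managed_config.toml (admin override; only names the field it changes)
[sandbox]
network_access = false
```
After the recursive merge, the effective `[sandbox]` table keeps `mode = "workspace-write"` from the user layer and takes `network_access = false` from the managed layer.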
## Streamable HTTP MCP Servers
This PR adds support for streamable HTTP MCP servers when
`experimental_use_rmcp_client` is enabled.
To set one up, add a new MCP server config entry with a `url`:
```toml
[mcp_servers.figma]
url = "http://127.0.0.1:3845/mcp"
```
It also supports an optional `bearer_token`, which is sent in the
`Authorization` header. The full OAuth flow is not supported yet.
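For example (the token value here is a placeholder):
```toml
[mcp_servers.figma]
url = "http://127.0.0.1:3845/mcp"
# Sent as a bearer token in the Authorization header
bearer_token = "YOUR_TOKEN_HERE"
```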
Config parsing fails with an error if a server entry mixes and matches
transport fields (like `command` + `bearer_token`, or `url` + `env`).
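To illustrate, here is a sketch of an entry the parser would reject because it combines the stdio-style `command` with the HTTP-only `bearer_token` (the server name and command are hypothetical):
```toml
# Invalid: mixes stdio (`command`) and streamable HTTP (`bearer_token`) fields
[mcp_servers.broken]
command = "some-mcp-server"
bearer_token = "YOUR_TOKEN_HERE"
```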
The best way to review this is to start with `core/src` and then
`rmcp-client/src/rmcp_client.rs`. The rest is tests and propagating
the `Transport` struct through the codebase.
Example with the Figma MCP:
<img width="5084" height="1614" alt="CleanShot 2025-09-26 at 13 35 40"
src="https://github.com/user-attachments/assets/eaf2771e-df3e-4300-816b-184d7dec5a28"
/>