fix: vendor ripgrep in the npm module (#3660)
We try to ensure ripgrep (`rg`) is provided with Codex:

- For `brew`, we declare it as a dependency of our formula: 08d82d8b00/Formula/c/codex.rb (L24)
- For `npm`, we declare `@vscode/ripgrep` as a dependency, which installs the platform-specific binary as part of a `postinstall` script: fdb8dadcae/codex-cli/package.json (L22)
- Users who download the CLI directly from GitHub Releases are on their own.

In practice, I have seen `@vscode/ripgrep` fail on occasion. Here is a trace from a GitHub workflow:

```
npm error code 1
npm error path /Users/runner/hostedtoolcache/node/20.19.5/arm64/lib/node_modules/@openai/codex/node_modules/@vscode/ripgrep
npm error command failed
npm error command sh -c node ./lib/postinstall.js
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Download attempt 1 failed, retrying in 2 seconds...
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Download attempt 2 failed, retrying in 4 seconds...
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Download attempt 3 failed, retrying in 8 seconds...
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Download attempt 4 failed, retrying in 16 seconds...
npm error Finding release for v13.0.0-13
npm error GET https://api.github.com/repos/microsoft/ripgrep-prebuilt/releases/tags/v13.0.0-13
npm error Deleting invalid download cache
npm error Error: Request failed: 403
```

To eliminate this error, this PR vendors the `rg` binary into https://www.npmjs.com/package/@openai/codex so it is guaranteed to be included when a user runs `npm i -g @openai/codex`. The downside of this approach is the increase in package size: we include the `rg` executable for six architectures (in addition to the six copies of `codex` we already include). In a follow-up, I plan to add support for "slices" of our npm module, so that users will soon be able to do:

```
npm install -g @openai/codex@aarch64-apple-darwin
```

Admittedly, this is a sizable change, and I tried to clean some things up in the process:

- `install_native_deps.sh` has been replaced by `install_native_deps.py`
- `stage_release.sh` and `stage_rust_release.py` have been replaced by `build_npm_package.py`

We now vendor a DotSlash file for ripgrep (as a modest attempt to facilitate local testing) and then build up the package by:

- creating a temp directory and copying `package.json` over to it with the target value for `"version"`
- finding the GitHub workflow run that corresponds to the `--release-version` and copying the various `codex` artifacts to the respective `vendor/TARGET_TRIPLE/codex` folder
- downloading the `rg` artifacts specified in the DotSlash file and copying them over to the respective `vendor/TARGET_TRIPLE/path` folder
- running `npm pack` on the temp directory if `--pack-output` is specified

To test, I downloaded the artifact produced by this CI job: https://github.com/openai/codex/actions/runs/17961595388/job/51085840022?pr=3660 and verified that `node ./bin/codex.js 'which -a rg'` worked as intended.
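The staging steps above imply a fixed per-architecture layout under `vendor/`. Here is a minimal sketch of that layout in Python; the exact binary filenames under `path/` (`rg` / `rg.exe`) are my assumption based on the DotSlash manifest, not something the steps above spell out:

```python
# Sketch of the vendor/ layout the staged npm package should contain.
# Assumption: Windows targets use ".exe" suffixes for both codex and rg.

def vendored_files(target_triple: str) -> list[str]:
    """Return the per-architecture paths staged under vendor/."""
    exe = ".exe" if "windows" in target_triple else ""
    return [
        f"vendor/{target_triple}/codex/codex{exe}",  # the Codex CLI binary
        f"vendor/{target_triple}/path/rg{exe}",      # ripgrep, prepended to PATH
    ]

TARGETS = [
    "x86_64-unknown-linux-musl",
    "aarch64-unknown-linux-musl",
    "x86_64-apple-darwin",
    "aarch64-apple-darwin",
    "x86_64-pc-windows-msvc",
    "aarch64-pc-windows-msvc",
]

layout = [p for t in TARGETS for p in vendored_files(t)]
```

With six targets and two binaries each, this is twelve vendored executables per published package, which is the size trade-off discussed above.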
`.github/workflows/ci.yml` (20 changes):

```diff
@@ -27,12 +27,26 @@ jobs:
       - name: Install dependencies
        run: pnpm install --frozen-lockfile

-      # Run all tasks using workspace filters
+      # build_npm_package.py requires DotSlash when staging releases.
+      - uses: facebook/install-dotslash@v2

-      - name: Ensure staging a release works.
+      - name: Stage npm package
        env:
          GH_TOKEN: ${{ github.token }}
-        run: ./codex-cli/scripts/stage_release.sh
+        run: |
+          set -euo pipefail
+          CODEX_VERSION=0.40.0
+          PACK_OUTPUT="${RUNNER_TEMP}/codex-npm.tgz"
+          python3 ./codex-cli/scripts/build_npm_package.py \
+            --release-version "$CODEX_VERSION" \
+            --pack-output "$PACK_OUTPUT"
+          echo "PACK_OUTPUT=$PACK_OUTPUT" >> "$GITHUB_ENV"
+
+      - name: Upload staged npm package artifact
+        uses: actions/upload-artifact@v4
+        with:
+          name: codex-npm-staging
+          path: ${{ env.PACK_OUTPUT }}

       - name: Ensure root README.md contains only ASCII and certain Unicode code points
        run: ./scripts/asciicheck.py README.md
```
`.github/workflows/rust-release.yml` (13 changes):

```diff
@@ -193,21 +193,18 @@ jobs:
           version="${GITHUB_REF_NAME#rust-v}"
           echo "name=${version}" >> $GITHUB_OUTPUT

+      # build_npm_package.py requires DotSlash when staging releases.
+      - uses: facebook/install-dotslash@v2
       - name: Stage npm package
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          set -euo pipefail
          TMP_DIR="${RUNNER_TEMP}/npm-stage"
-          python3 codex-cli/scripts/stage_rust_release.py \
+          ./codex-cli/scripts/build_npm_package.py \
            --release-version "${{ steps.release_name.outputs.name }}" \
-            --tmp "${TMP_DIR}"
-          mkdir -p dist/npm
-          # Produce an npm-ready tarball using `npm pack` and store it in dist/npm.
-          # We then rename it to a stable name used by our publishing script.
-          (cd "$TMP_DIR" && npm pack --pack-destination "${GITHUB_WORKSPACE}/dist/npm")
-          mv "${GITHUB_WORKSPACE}"/dist/npm/*.tgz \
-            "${GITHUB_WORKSPACE}/dist/npm/codex-npm-${{ steps.release_name.outputs.name }}.tgz"
+            --staging-dir "${TMP_DIR}" \
+            --pack-output "${GITHUB_WORKSPACE}/dist/npm/codex-npm-${{ steps.release_name.outputs.name }}.tgz"

       - name: Create GitHub Release
        uses: softprops/action-gh-release@v2
```
`codex-cli/.gitignore` (8 changes):

```diff
@@ -1,7 +1 @@
-# Added by ./scripts/install_native_deps.sh
-/bin/codex-aarch64-apple-darwin
-/bin/codex-aarch64-unknown-linux-musl
-/bin/codex-linux-sandbox-arm64
-/bin/codex-linux-sandbox-x64
-/bin/codex-x86_64-apple-darwin
-/bin/codex-x86_64-unknown-linux-musl
+/vendor/
```
`codex-cli/bin/codex.js`:

```diff
@@ -1,6 +1,7 @@
 #!/usr/bin/env node
 // Unified entry point for the Codex CLI.

+import { existsSync } from "fs";
 import path from "path";
 import { fileURLToPath } from "url";

@@ -40,10 +41,10 @@ switch (platform) {
   case "win32":
     switch (arch) {
       case "x64":
-        targetTriple = "x86_64-pc-windows-msvc.exe";
+        targetTriple = "x86_64-pc-windows-msvc";
        break;
      case "arm64":
-        targetTriple = "aarch64-pc-windows-msvc.exe";
+        targetTriple = "aarch64-pc-windows-msvc";
        break;
      default:
        break;
@@ -57,7 +58,10 @@ if (!targetTriple) {
   throw new Error(`Unsupported platform: ${platform} (${arch})`);
 }

-const binaryPath = path.join(__dirname, "..", "bin", `codex-${targetTriple}`);
+const vendorRoot = path.join(__dirname, "..", "vendor");
+const archRoot = path.join(vendorRoot, targetTriple);
+const codexBinaryName = process.platform === "win32" ? "codex.exe" : "codex";
+const binaryPath = path.join(archRoot, "codex", codexBinaryName);

 // Use an asynchronous spawn instead of spawnSync so that Node is able to
 // respond to signals (e.g. Ctrl-C / SIGINT) while the native binary is
@@ -66,23 +70,6 @@ const binaryPath = path.join(__dirname, "..", "bin", `codex-${targetTriple}`);
 // receives a fatal signal, both processes exit in a predictable manner.
 const { spawn } = await import("child_process");

-async function tryImport(moduleName) {
-  try {
-    // eslint-disable-next-line node/no-unsupported-features/es-syntax
-    return await import(moduleName);
-  } catch (err) {
-    return null;
-  }
-}
-
-async function resolveRgDir() {
-  const ripgrep = await tryImport("@vscode/ripgrep");
-  if (!ripgrep?.rgPath) {
-    return null;
-  }
-  return path.dirname(ripgrep.rgPath);
-}
-
 function getUpdatedPath(newDirs) {
   const pathSep = process.platform === "win32" ? ";" : ":";
   const existingPath = process.env.PATH || "";
@@ -94,9 +81,9 @@ function getUpdatedPath(newDirs) {
 }

 const additionalDirs = [];
-const rgDir = await resolveRgDir();
-if (rgDir) {
-  additionalDirs.push(rgDir);
+const pathDir = path.join(archRoot, "path");
+if (existsSync(pathDir)) {
+  additionalDirs.push(pathDir);
 }
 const updatedPath = getUpdatedPath(additionalDirs);
```
`codex-cli/bin/rg` (new executable file, 79 lines):

```
#!/usr/bin/env dotslash

{
  "name": "rg",
  "platforms": {
    "macos-aarch64": {
      "size": 1787248,
      "hash": "blake3",
      "digest": "8d9942032585ea8ee805937634238d9aee7b210069f4703c88fbe568e26fb78a",
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-aarch64-apple-darwin/rg",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-aarch64-apple-darwin.tar.gz"
        }
      ]
    },
    "linux-aarch64": {
      "size": 2047405,
      "hash": "blake3",
      "digest": "0b670b8fa0a3df2762af2fc82cc4932f684ca4c02dbd1260d4f3133fd4b2a515",
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-aarch64-unknown-linux-gnu/rg",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-aarch64-unknown-linux-gnu.tar.gz"
        }
      ]
    },
    "macos-x86_64": {
      "size": 2082672,
      "hash": "blake3",
      "digest": "e9b862fc8da3127f92791f0ff6a799504154ca9d36c98bf3e60a81c6b1f7289e",
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-x86_64-apple-darwin/rg",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-x86_64-apple-darwin.tar.gz"
        }
      ]
    },
    "linux-x86_64": {
      "size": 2566310,
      "hash": "blake3",
      "digest": "f73cca4e54d78c31f832c7f6e2c0b4db8b04fa3eaa747915727d570893dbee76",
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-x86_64-unknown-linux-musl/rg",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-x86_64-unknown-linux-musl.tar.gz"
        }
      ]
    },
    "windows-x86_64": {
      "size": 2058893,
      "hash": "blake3",
      "digest": "a8ce1a6fed4f8093ee997e57f33254e94b2cd18e26358b09db599c89882eadbd",
      "format": "zip",
      "path": "ripgrep-14.1.1-x86_64-pc-windows-msvc/rg.exe",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-x86_64-pc-windows-msvc.zip"
        }
      ]
    },
    "windows-aarch64": {
      "size": 1667740,
      "hash": "blake3",
      "digest": "47b971a8c4fca1d23a4e7c19bd4d88465ebc395598458133139406d3bf85f3fa",
      "format": "zip",
      "path": "rg.exe",
      "providers": [
        {
          "url": "https://github.com/microsoft/ripgrep-prebuilt/releases/download/v13.0.0-13/ripgrep-v13.0.0-13-aarch64-pc-windows-msvc.zip"
        }
      ]
    }
  }
}
```
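A DotSlash file like the one above is an executable: a shebang line followed by a JSON body describing per-platform artifacts. A minimal sketch of reading one (assuming a plain-JSON body, which holds for this manifest; real DotSlash also tolerates comments):

```python
import json

# Trimmed sample in the same shape as codex-cli/bin/rg above.
SAMPLE = """#!/usr/bin/env dotslash
{
  "name": "rg",
  "platforms": {
    "macos-aarch64": {
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-aarch64-apple-darwin/rg",
      "providers": [
        {"url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-aarch64-apple-darwin.tar.gz"}
      ]
    }
  }
}
"""

def parse_dotslash(text: str) -> dict:
    # Drop the shebang line, then parse the remaining JSON document.
    body = text.split("\n", 1)[1] if text.startswith("#!") else text
    return json.loads(body)

manifest = parse_dotslash(SAMPLE)
# Map each platform key to the first provider URL, as a release script might.
download_urls = {
    platform: spec["providers"][0]["url"]
    for platform, spec in manifest["platforms"].items()
}
```

This is roughly what `build_npm_package.py` relies on when it "downloads the `rg` artifacts specified in the DotSlash file."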
`codex-cli/package-lock.json` (101 changes, generated):

```diff
@@ -2,118 +2,17 @@
   "name": "@openai/codex",
   "version": "0.0.0-dev",
   "lockfileVersion": 3,
-  "requires": true,
   "packages": {
     "": {
       "name": "@openai/codex",
       "version": "0.0.0-dev",
       "license": "Apache-2.0",
-      "dependencies": {
-        "@vscode/ripgrep": "^1.15.14"
-      },
       "bin": {
         "codex": "bin/codex.js"
       },
       "engines": {
         "node": ">=20"
       }
-    },
-    "node_modules/@vscode/ripgrep": {
-      "version": "1.15.14",
-      "resolved": "https://registry.npmjs.org/@vscode/ripgrep/-/ripgrep-1.15.14.tgz",
-      "integrity": "sha512-/G1UJPYlm+trBWQ6cMO3sv6b8D1+G16WaJH1/DSqw32JOVlzgZbLkDxRyzIpTpv30AcYGMkCf5tUqGlW6HbDWw==",
-      "hasInstallScript": true,
-      "license": "MIT",
-      "dependencies": {
-        "https-proxy-agent": "^7.0.2",
-        "proxy-from-env": "^1.1.0",
-        "yauzl": "^2.9.2"
-      }
-    },
-    "node_modules/agent-base": {
-      "version": "7.1.4",
-      "resolved": "https://registry.npmjs.org/agent-base/-/agent-base-7.1.4.tgz",
-      "integrity": "sha512-MnA+YT8fwfJPgBx3m60MNqakm30XOkyIoH1y6huTQvC0PwZG7ki8NacLBcrPbNoo8vEZy7Jpuk7+jMO+CUovTQ==",
-      "license": "MIT",
-      "engines": {
-        "node": ">= 14"
-      }
-    },
-    "node_modules/buffer-crc32": {
-      "version": "0.2.13",
-      "resolved": "https://registry.npmjs.org/buffer-crc32/-/buffer-crc32-0.2.13.tgz",
-      "integrity": "sha512-VO9Ht/+p3SN7SKWqcrgEzjGbRSJYTx+Q1pTQC0wrWqHx0vpJraQ6GtHx8tvcg1rlK1byhU5gccxgOgj7B0TDkQ==",
-      "license": "MIT",
-      "engines": {
-        "node": "*"
-      }
-    },
-    "node_modules/debug": {
-      "version": "4.4.1",
-      "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.1.tgz",
-      "integrity": "sha512-KcKCqiftBJcZr++7ykoDIEwSa3XWowTfNPo92BYxjXiyYEVrUQh2aLyhxBCwww+heortUFxEJYcRzosstTEBYQ==",
-      "license": "MIT",
-      "dependencies": {
-        "ms": "^2.1.3"
-      },
-      "engines": {
-        "node": ">=6.0"
-      },
-      "peerDependenciesMeta": {
-        "supports-color": {
-          "optional": true
-        }
-      }
-    },
-    "node_modules/fd-slicer": {
-      "version": "1.1.0",
-      "resolved": "https://registry.npmjs.org/fd-slicer/-/fd-slicer-1.1.0.tgz",
-      "integrity": "sha512-cE1qsB/VwyQozZ+q1dGxR8LBYNZeofhEdUNGSMbQD3Gw2lAzX9Zb3uIU6Ebc/Fmyjo9AWWfnn0AUCHqtevs/8g==",
-      "license": "MIT",
-      "dependencies": {
-        "pend": "~1.2.0"
-      }
-    },
-    "node_modules/https-proxy-agent": {
-      "version": "7.0.6",
-      "resolved": "https://registry.npmjs.org/https-proxy-agent/-/https-proxy-agent-7.0.6.tgz",
-      "integrity": "sha512-vK9P5/iUfdl95AI+JVyUuIcVtd4ofvtrOr3HNtM2yxC9bnMbEdp3x01OhQNnjb8IJYi38VlTE3mBXwcfvywuSw==",
-      "license": "MIT",
-      "dependencies": {
-        "agent-base": "^7.1.2",
-        "debug": "4"
-      },
-      "engines": {
-        "node": ">= 14"
-      }
-    },
-    "node_modules/ms": {
-      "version": "2.1.3",
-      "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
-      "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
-      "license": "MIT"
-    },
-    "node_modules/pend": {
-      "version": "1.2.0",
-      "resolved": "https://registry.npmjs.org/pend/-/pend-1.2.0.tgz",
-      "integrity": "sha512-F3asv42UuXchdzt+xXqfW1OGlVBe+mxa2mqI0pg5yAHZPvFmY3Y6drSf/GQ1A86WgWEN9Kzh/WrgKa6iGcHXLg==",
-      "license": "MIT"
-    },
-    "node_modules/proxy-from-env": {
-      "version": "1.1.0",
-      "resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
-      "integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
-      "license": "MIT"
-    },
-    "node_modules/yauzl": {
-      "version": "2.10.0",
-      "resolved": "https://registry.npmjs.org/yauzl/-/yauzl-2.10.0.tgz",
-      "integrity": "sha512-p4a9I6X6nu6IhoGmBqAcbJy1mlC4j27vEPZX9F4L4/vZT3Lyq1VkFHw/V/PUcB9Buo+DG3iHkT0x3Qya58zc3g==",
-      "license": "MIT",
-      "dependencies": {
-        "buffer-crc32": "~0.2.3",
-        "fd-slicer": "~1.1.0"
-      }
-    }
     }
   }
 }
```
|||||||
@@ -11,17 +11,11 @@
|
|||||||
},
|
},
|
||||||
"files": [
|
"files": [
|
||||||
"bin",
|
"bin",
|
||||||
"dist"
|
"vendor"
|
||||||
],
|
],
|
||||||
"repository": {
|
"repository": {
|
||||||
"type": "git",
|
"type": "git",
|
||||||
"url": "git+https://github.com/openai/codex.git",
|
"url": "git+https://github.com/openai/codex.git",
|
||||||
"directory": "codex-cli"
|
"directory": "codex-cli"
|
||||||
},
|
|
||||||
"dependencies": {
|
|
||||||
"@vscode/ripgrep": "^1.15.14"
|
|
||||||
},
|
|
||||||
"devDependencies": {
|
|
||||||
"prettier": "^3.3.3"
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
````diff
@@ -5,5 +5,7 @@ Run the following:
 To build the 0.2.x or later version of the npm module, which runs the Rust version of the CLI, build it as follows:

 ```bash
-./codex-cli/scripts/stage_rust_release.py --release-version 0.6.0
+./codex-cli/scripts/build_npm_package.py --release-version 0.6.0
 ```
+
+Note this will create `./codex-cli/vendor/` as a side-effect.
````
`codex-cli/scripts/build_npm_package.py` (new executable file, 294 lines):

```python
#!/usr/bin/env python3
"""Stage and optionally package the @openai/codex npm module."""

import argparse
import json
import re
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path

SCRIPT_DIR = Path(__file__).resolve().parent
CODEX_CLI_ROOT = SCRIPT_DIR.parent
REPO_ROOT = CODEX_CLI_ROOT.parent
GITHUB_REPO = "openai/codex"


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Build or stage the Codex CLI npm package.")
    parser.add_argument(
        "--version",
        help="Version number to write to package.json inside the staged package.",
    )
    parser.add_argument(
        "--release-version",
        help=(
            "Version to stage for npm release. When provided, the script also resolves the "
            "matching rust-release workflow unless --workflow-url is supplied."
        ),
    )
    parser.add_argument(
        "--workflow-url",
        help="Optional GitHub Actions workflow run URL used to download native binaries.",
    )
    parser.add_argument(
        "--staging-dir",
        type=Path,
        help=(
            "Directory to stage the package contents. Defaults to a new temporary directory "
            "if omitted. The directory must be empty when provided."
        ),
    )
    parser.add_argument(
        "--tmp",
        dest="staging_dir",
        type=Path,
        help=argparse.SUPPRESS,
    )
    parser.add_argument(
        "--pack-output",
        type=Path,
        help="Path where the generated npm tarball should be written.",
    )
    return parser.parse_args()


def main() -> int:
    args = parse_args()

    version = args.version
    release_version = args.release_version
    if release_version:
        if version and version != release_version:
            raise RuntimeError("--version and --release-version must match when both are provided.")
        version = release_version

    if not version:
        raise RuntimeError("Must specify --version or --release-version.")

    staging_dir, created_temp = prepare_staging_dir(args.staging_dir)

    try:
        stage_sources(staging_dir, version)

        workflow_url = args.workflow_url
        resolved_head_sha: str | None = None
        if not workflow_url:
            if release_version:
                workflow = resolve_release_workflow(version)
                workflow_url = workflow["url"]
                resolved_head_sha = workflow.get("headSha")
            else:
                workflow_url = resolve_latest_alpha_workflow_url()
        elif release_version:
            try:
                workflow = resolve_release_workflow(version)
                resolved_head_sha = workflow.get("headSha")
            except Exception:
                resolved_head_sha = None

        if release_version and resolved_head_sha:
            print(f"should `git checkout {resolved_head_sha}`")

        if not workflow_url:
            raise RuntimeError("Unable to determine workflow URL for native binaries.")

        install_native_binaries(staging_dir, workflow_url)

        if release_version:
            staging_dir_str = str(staging_dir)
            print(
                f"Staged version {version} for release in {staging_dir_str}\n\n"
                "Verify the CLI:\n"
                f"  node {staging_dir_str}/bin/codex.js --version\n"
                f"  node {staging_dir_str}/bin/codex.js --help\n\n"
            )
        else:
            print(f"Staged package in {staging_dir}")

        if args.pack_output is not None:
            output_path = run_npm_pack(staging_dir, args.pack_output)
            print(f"npm pack output written to {output_path}")
    finally:
        if created_temp:
            # Preserve the staging directory for further inspection.
            pass

    return 0


def prepare_staging_dir(staging_dir: Path | None) -> tuple[Path, bool]:
    if staging_dir is not None:
        staging_dir = staging_dir.resolve()
        staging_dir.mkdir(parents=True, exist_ok=True)
        if any(staging_dir.iterdir()):
            raise RuntimeError(f"Staging directory {staging_dir} is not empty.")
        return staging_dir, False

    temp_dir = Path(tempfile.mkdtemp(prefix="codex-npm-stage-"))
    return temp_dir, True


def stage_sources(staging_dir: Path, version: str) -> None:
    bin_dir = staging_dir / "bin"
    bin_dir.mkdir(parents=True, exist_ok=True)

    shutil.copy2(CODEX_CLI_ROOT / "bin" / "codex.js", bin_dir / "codex.js")
    rg_manifest = CODEX_CLI_ROOT / "bin" / "rg"
    if rg_manifest.exists():
        shutil.copy2(rg_manifest, bin_dir / "rg")

    readme_src = REPO_ROOT / "README.md"
    if readme_src.exists():
        shutil.copy2(readme_src, staging_dir / "README.md")

    with open(CODEX_CLI_ROOT / "package.json", "r", encoding="utf-8") as fh:
        package_json = json.load(fh)
    package_json["version"] = version

    with open(staging_dir / "package.json", "w", encoding="utf-8") as out:
        json.dump(package_json, out, indent=2)
        out.write("\n")


def install_native_binaries(staging_dir: Path, workflow_url: str | None) -> None:
    cmd = ["./scripts/install_native_deps.py"]
    if workflow_url:
        cmd.extend(["--workflow-url", workflow_url])
    cmd.append(str(staging_dir))
    subprocess.check_call(cmd, cwd=CODEX_CLI_ROOT)


def resolve_latest_alpha_workflow_url() -> str:
    version = determine_latest_alpha_version()
    workflow_url = fetch_workflow_url_for_version(version)
    if not workflow_url:
        raise RuntimeError(f"Unable to locate workflow for version {version}.")
    return workflow_url


def determine_latest_alpha_version() -> str:
    releases = list_releases()
    best_key: tuple[int, int, int, int] | None = None
    best_version: str | None = None
    pattern = re.compile(r"^rust-v(\d+)\.(\d+)\.(\d+)-alpha\.(\d+)$")
    for release in releases:
        tag = release.get("tag_name", "")
        match = pattern.match(tag)
        if not match:
            continue
        key = tuple(int(match.group(i)) for i in range(1, 5))
        if best_key is None or key > best_key:
            best_key = key
            best_version = (
                f"{match.group(1)}.{match.group(2)}.{match.group(3)}-alpha.{match.group(4)}"
            )

    if best_version is None:
        raise RuntimeError("No alpha releases found when resolving workflow URL.")
    return best_version


def list_releases() -> list[dict]:
    stdout = subprocess.check_output(
        ["gh", "api", f"/repos/{GITHUB_REPO}/releases?per_page=100"],
        text=True,
    )
    try:
        releases = json.loads(stdout or "[]")
    except json.JSONDecodeError as exc:
        raise RuntimeError("Unable to parse releases JSON.") from exc
    if not isinstance(releases, list):
        raise RuntimeError("Unexpected response when listing releases.")
    return releases


def fetch_workflow_url_for_version(version: str) -> str | None:
    ref = f"rust-v{version}"
    stdout = subprocess.check_output(
        [
            "gh",
            "run",
            "list",
            "--branch",
            ref,
            "--limit",
            "20",
            "--json",
            "workflowName,url",
        ],
        text=True,
    )

    try:
        runs = json.loads(stdout or "[]")
    except json.JSONDecodeError as exc:
        raise RuntimeError("Unable to parse workflow run listing.") from exc

    for run in runs:
        if run.get("workflowName") == "rust-release":
            url = run.get("url")
            if url:
                return url
    return None


def resolve_release_workflow(version: str) -> dict:
    stdout = subprocess.check_output(
        [
            "gh",
            "run",
            "list",
            "--branch",
            f"rust-v{version}",
            "--json",
            "workflowName,url,headSha",
            "--jq",
            'first(.[] | select(.workflowName == "rust-release"))',
        ],
        text=True,
    )
    workflow = json.loads(stdout)
    if not workflow:
        raise RuntimeError(f"Unable to find rust-release workflow for version {version}.")
    return workflow


def run_npm_pack(staging_dir: Path, output_path: Path) -> Path:
    output_path = output_path.resolve()
    output_path.parent.mkdir(parents=True, exist_ok=True)

    with tempfile.TemporaryDirectory(prefix="codex-npm-pack-") as pack_dir_str:
        pack_dir = Path(pack_dir_str)
        stdout = subprocess.check_output(
            ["npm", "pack", "--json", "--pack-destination", str(pack_dir)],
            cwd=staging_dir,
            text=True,
        )
        try:
            pack_output = json.loads(stdout)
        except json.JSONDecodeError as exc:
            raise RuntimeError("Failed to parse npm pack output.") from exc

        if not pack_output:
            raise RuntimeError("npm pack did not produce an output tarball.")

        tarball_name = pack_output[0].get("filename") or pack_output[0].get("name")
        if not tarball_name:
            raise RuntimeError("Unable to determine npm pack output filename.")

        tarball_path = pack_dir / tarball_name
        if not tarball_path.exists():
            raise RuntimeError(f"Expected npm pack output not found: {tarball_path}")

        shutil.move(str(tarball_path), output_path)

    return output_path


if __name__ == "__main__":
    import sys

    sys.exit(main())
```
318
codex-cli/scripts/install_native_deps.py
Executable file
318
codex-cli/scripts/install_native_deps.py
Executable file
@@ -0,0 +1,318 @@
#!/usr/bin/env python3
"""Install Codex native binaries (Rust CLI plus ripgrep helpers)."""

import argparse
import json
import os
import shutil
import subprocess
import tarfile
import tempfile
import zipfile
from concurrent.futures import ThreadPoolExecutor, as_completed
from pathlib import Path
from typing import Iterable, Sequence
from urllib.parse import urlparse
from urllib.request import urlopen

SCRIPT_DIR = Path(__file__).resolve().parent
CODEX_CLI_ROOT = SCRIPT_DIR.parent
DEFAULT_WORKFLOW_URL = "https://github.com/openai/codex/actions/runs/17952349351"  # rust-v0.40.0
VENDOR_DIR_NAME = "vendor"
RG_MANIFEST = CODEX_CLI_ROOT / "bin" / "rg"
CODEX_TARGETS = (
    "x86_64-unknown-linux-musl",
    "aarch64-unknown-linux-musl",
    "x86_64-apple-darwin",
    "aarch64-apple-darwin",
    "x86_64-pc-windows-msvc",
    "aarch64-pc-windows-msvc",
)

RG_TARGET_PLATFORM_PAIRS: list[tuple[str, str]] = [
    ("x86_64-unknown-linux-musl", "linux-x86_64"),
    ("aarch64-unknown-linux-musl", "linux-aarch64"),
    ("x86_64-apple-darwin", "macos-x86_64"),
    ("aarch64-apple-darwin", "macos-aarch64"),
    ("x86_64-pc-windows-msvc", "windows-x86_64"),
    ("aarch64-pc-windows-msvc", "windows-aarch64"),
]
RG_TARGET_TO_PLATFORM = {target: platform for target, platform in RG_TARGET_PLATFORM_PAIRS}
DEFAULT_RG_TARGETS = [target for target, _ in RG_TARGET_PLATFORM_PAIRS]


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Install native Codex binaries.")
    parser.add_argument(
        "--workflow-url",
        help=(
            "GitHub Actions workflow URL that produced the artifacts. Defaults to a "
            "known good run when omitted."
        ),
    )
    parser.add_argument(
        "root",
        nargs="?",
        type=Path,
        help=(
            "Directory containing package.json for the staged package. If omitted, the "
            "repository checkout is used."
        ),
    )
    return parser.parse_args()


def main() -> int:
    args = parse_args()

    codex_cli_root = (args.root or CODEX_CLI_ROOT).resolve()
    vendor_dir = codex_cli_root / VENDOR_DIR_NAME
    vendor_dir.mkdir(parents=True, exist_ok=True)

    workflow_url = (args.workflow_url or DEFAULT_WORKFLOW_URL).strip()
    if not workflow_url:
        workflow_url = DEFAULT_WORKFLOW_URL

    workflow_id = workflow_url.rstrip("/").split("/")[-1]

    with tempfile.TemporaryDirectory(prefix="codex-native-artifacts-") as artifacts_dir_str:
        artifacts_dir = Path(artifacts_dir_str)
        _download_artifacts(workflow_id, artifacts_dir)
        install_codex_binaries(artifacts_dir, vendor_dir, CODEX_TARGETS)

    fetch_rg(vendor_dir, DEFAULT_RG_TARGETS, manifest_path=RG_MANIFEST)

    print(f"Installed native dependencies into {vendor_dir}")
    return 0


def fetch_rg(
    vendor_dir: Path,
    targets: Sequence[str] | None = None,
    *,
    manifest_path: Path,
) -> list[Path]:
    """Download ripgrep binaries described by the DotSlash manifest."""

    if targets is None:
        targets = DEFAULT_RG_TARGETS

    if not manifest_path.exists():
        raise FileNotFoundError(f"DotSlash manifest not found: {manifest_path}")

    manifest = _load_manifest(manifest_path)
    platforms = manifest.get("platforms", {})

    vendor_dir.mkdir(parents=True, exist_ok=True)

    targets = list(targets)
    if not targets:
        return []

    task_configs: list[tuple[str, str, dict]] = []
    for target in targets:
        platform_key = RG_TARGET_TO_PLATFORM.get(target)
        if platform_key is None:
            raise ValueError(f"Unsupported ripgrep target '{target}'.")

        platform_info = platforms.get(platform_key)
        if platform_info is None:
            raise RuntimeError(f"Platform '{platform_key}' not found in manifest {manifest_path}.")

        task_configs.append((target, platform_key, platform_info))

    results: dict[str, Path] = {}
    max_workers = min(len(task_configs), max(1, (os.cpu_count() or 1)))

    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        future_map = {
            executor.submit(
                _fetch_single_rg,
                vendor_dir,
                target,
                platform_key,
                platform_info,
                manifest_path,
            ): target
            for target, platform_key, platform_info in task_configs
        }

        for future in as_completed(future_map):
            target = future_map[future]
            results[target] = future.result()

    return [results[target] for target in targets]


def _download_artifacts(workflow_id: str, dest_dir: Path) -> None:
    cmd = [
        "gh",
        "run",
        "download",
        "--dir",
        str(dest_dir),
        "--repo",
        "openai/codex",
        workflow_id,
    ]
    subprocess.check_call(cmd)


def install_codex_binaries(
    artifacts_dir: Path, vendor_dir: Path, targets: Iterable[str]
) -> list[Path]:
    targets = list(targets)
    if not targets:
        return []

    results: dict[str, Path] = {}
    max_workers = min(len(targets), max(1, (os.cpu_count() or 1)))

    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        future_map = {
            executor.submit(_install_single_codex_binary, artifacts_dir, vendor_dir, target): target
            for target in targets
        }

        for future in as_completed(future_map):
            target = future_map[future]
            results[target] = future.result()

    return [results[target] for target in targets]


def _install_single_codex_binary(artifacts_dir: Path, vendor_dir: Path, target: str) -> Path:
    artifact_subdir = artifacts_dir / target
    archive_name = _archive_name_for_target(target)
    archive_path = artifact_subdir / archive_name
    if not archive_path.exists():
        raise FileNotFoundError(f"Expected artifact not found: {archive_path}")

    dest_dir = vendor_dir / target / "codex"
    dest_dir.mkdir(parents=True, exist_ok=True)

    binary_name = "codex.exe" if "windows" in target else "codex"
    dest = dest_dir / binary_name
    dest.unlink(missing_ok=True)
    extract_archive(archive_path, "zst", None, dest)
    if "windows" not in target:
        dest.chmod(0o755)
    return dest


def _archive_name_for_target(target: str) -> str:
    if "windows" in target:
        return f"codex-{target}.exe.zst"
    return f"codex-{target}.zst"


def _fetch_single_rg(
    vendor_dir: Path,
    target: str,
    platform_key: str,
    platform_info: dict,
    manifest_path: Path,
) -> Path:
    providers = platform_info.get("providers", [])
    if not providers:
        raise RuntimeError(f"No providers listed for platform '{platform_key}' in {manifest_path}.")

    url = providers[0]["url"]
    archive_format = platform_info.get("format", "zst")
    archive_member = platform_info.get("path")

    dest_dir = vendor_dir / target / "path"
    dest_dir.mkdir(parents=True, exist_ok=True)

    is_windows = platform_key.startswith("win")
    binary_name = "rg.exe" if is_windows else "rg"
    dest = dest_dir / binary_name

    with tempfile.TemporaryDirectory() as tmp_dir_str:
        tmp_dir = Path(tmp_dir_str)
        archive_filename = os.path.basename(urlparse(url).path)
        download_path = tmp_dir / archive_filename
        _download_file(url, download_path)

        dest.unlink(missing_ok=True)
        extract_archive(download_path, archive_format, archive_member, dest)

    if not is_windows:
        dest.chmod(0o755)

    return dest


def _download_file(url: str, dest: Path) -> None:
    dest.parent.mkdir(parents=True, exist_ok=True)
    with urlopen(url) as response, open(dest, "wb") as out:
        shutil.copyfileobj(response, out)


def extract_archive(
    archive_path: Path,
    archive_format: str,
    archive_member: str | None,
    dest: Path,
) -> None:
    dest.parent.mkdir(parents=True, exist_ok=True)

    if archive_format == "zst":
        output_path = archive_path.parent / dest.name
        subprocess.check_call(
            ["zstd", "-f", "-d", str(archive_path), "-o", str(output_path)]
        )
        shutil.move(str(output_path), dest)
        return

    if archive_format == "tar.gz":
        if not archive_member:
            raise RuntimeError("Missing 'path' for tar.gz archive in DotSlash manifest.")
        with tarfile.open(archive_path, "r:gz") as tar:
            try:
                member = tar.getmember(archive_member)
            except KeyError as exc:
                raise RuntimeError(
                    f"Entry '{archive_member}' not found in archive {archive_path}."
                ) from exc
            tar.extract(member, path=archive_path.parent, filter="data")
        extracted = archive_path.parent / archive_member
        shutil.move(str(extracted), dest)
        return

    if archive_format == "zip":
        if not archive_member:
            raise RuntimeError("Missing 'path' for zip archive in DotSlash manifest.")
        with zipfile.ZipFile(archive_path) as archive:
            try:
                with archive.open(archive_member) as src, open(dest, "wb") as out:
                    shutil.copyfileobj(src, out)
            except KeyError as exc:
                raise RuntimeError(
                    f"Entry '{archive_member}' not found in archive {archive_path}."
                ) from exc
        return

    raise RuntimeError(f"Unsupported archive format '{archive_format}'.")


def _load_manifest(manifest_path: Path) -> dict:
    cmd = ["dotslash", "--", "parse", str(manifest_path)]
    stdout = subprocess.check_output(cmd, text=True)
    try:
        manifest = json.loads(stdout)
    except json.JSONDecodeError as exc:
        raise RuntimeError(f"Invalid DotSlash manifest output from {manifest_path}.") from exc

    if not isinstance(manifest, dict):
        raise RuntimeError(
            f"Unexpected DotSlash manifest structure for {manifest_path}: {type(manifest)!r}"
        )

    return manifest


if __name__ == "__main__":
    import sys

    sys.exit(main())
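`fetch_rg` above assumes the parsed DotSlash manifest exposes a `platforms` map whose entries each carry a `providers` list, an archive `format`, and (for tar/zip archives) the member `path`. A sketch of that shape and the provider selection, using placeholder URLs rather than the real ripgrep entries:

```python
# Illustrative manifest structure; field names follow the DotSlash format, but
# the URL and member path here are made-up placeholders.
manifest = {
    "platforms": {
        "macos-aarch64": {
            "format": "tar.gz",
            "path": "ripgrep-13.0.0/rg",
            "providers": [
                {"url": "https://example.com/ripgrep-aarch64-apple-darwin.tar.gz"}
            ],
        },
    }
}


def provider_url(manifest: dict, platform_key: str) -> str:
    # Mirrors _fetch_single_rg: take the first provider listed for the platform.
    platform_info = manifest["platforms"][platform_key]
    providers = platform_info.get("providers", [])
    if not providers:
        raise RuntimeError(f"No providers listed for platform '{platform_key}'.")
    return providers[0]["url"]


print(provider_url(manifest, "macos-aarch64"))
```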
@@ -1,94 +0,0 @@
#!/usr/bin/env bash

# Install native runtime dependencies for codex-cli.
#
# Usage
#   install_native_deps.sh [--workflow-url URL] [CODEX_CLI_ROOT]
#
# The optional CODEX_CLI_ROOT is the path that contains package.json. Omitting
# it installs the binaries into the repository's own bin/ folder to support
# local development.

set -euo pipefail

# ------------------
# Parse arguments
# ------------------

CODEX_CLI_ROOT=""

# Until we start publishing stable GitHub releases, we have to grab the binaries
# from the GitHub Action that created them. Update the URL below to point to the
# appropriate workflow run:
WORKFLOW_URL="https://github.com/openai/codex/actions/runs/17417194663" # rust-v0.28.0

while [[ $# -gt 0 ]]; do
  case "$1" in
    --workflow-url)
      shift || { echo "--workflow-url requires an argument"; exit 1; }
      if [ -n "$1" ]; then
        WORKFLOW_URL="$1"
      fi
      ;;
    *)
      if [[ -z "$CODEX_CLI_ROOT" ]]; then
        CODEX_CLI_ROOT="$1"
      else
        echo "Unexpected argument: $1" >&2
        exit 1
      fi
      ;;
  esac
  shift
done

# ----------------------------------------------------------------------------
# Determine where the binaries should be installed.
# ----------------------------------------------------------------------------

if [ -n "$CODEX_CLI_ROOT" ]; then
  # The caller supplied a release root directory.
  BIN_DIR="$CODEX_CLI_ROOT/bin"
else
  # No argument; fall back to the repo's own bin directory.
  # Resolve the path of this script, then walk up to the repo root.
  SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
  CODEX_CLI_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
  BIN_DIR="$CODEX_CLI_ROOT/bin"
fi

# Make sure the destination directory exists.
mkdir -p "$BIN_DIR"

# ----------------------------------------------------------------------------
# Download and decompress the artifacts from the GitHub Actions workflow.
# ----------------------------------------------------------------------------

WORKFLOW_ID="${WORKFLOW_URL##*/}"

ARTIFACTS_DIR="$(mktemp -d)"
trap 'rm -rf "$ARTIFACTS_DIR"' EXIT

# NB: The GitHub CLI `gh` must be installed and authenticated.
gh run download --dir "$ARTIFACTS_DIR" --repo openai/codex "$WORKFLOW_ID"

# x64 Linux
zstd -d "$ARTIFACTS_DIR/x86_64-unknown-linux-musl/codex-x86_64-unknown-linux-musl.zst" \
  -o "$BIN_DIR/codex-x86_64-unknown-linux-musl"
# ARM64 Linux
zstd -d "$ARTIFACTS_DIR/aarch64-unknown-linux-musl/codex-aarch64-unknown-linux-musl.zst" \
  -o "$BIN_DIR/codex-aarch64-unknown-linux-musl"
# x64 macOS
zstd -d "$ARTIFACTS_DIR/x86_64-apple-darwin/codex-x86_64-apple-darwin.zst" \
  -o "$BIN_DIR/codex-x86_64-apple-darwin"
# ARM64 macOS
zstd -d "$ARTIFACTS_DIR/aarch64-apple-darwin/codex-aarch64-apple-darwin.zst" \
  -o "$BIN_DIR/codex-aarch64-apple-darwin"
# x64 Windows
zstd -d "$ARTIFACTS_DIR/x86_64-pc-windows-msvc/codex-x86_64-pc-windows-msvc.exe.zst" \
  -o "$BIN_DIR/codex-x86_64-pc-windows-msvc.exe"
# ARM64 Windows
zstd -d "$ARTIFACTS_DIR/aarch64-pc-windows-msvc/codex-aarch64-pc-windows-msvc.exe.zst" \
  -o "$BIN_DIR/codex-aarch64-pc-windows-msvc.exe"

echo "Installed native dependencies into $BIN_DIR"
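The removed script's `WORKFLOW_ID="${WORKFLOW_URL##*/}"` and the new Python script's `workflow_url.rstrip("/").split("/")[-1]` do the same thing: reduce the workflow URL to its final path segment, the numeric run id. The Python form can be sketched as:

```python
def workflow_id_from_url(workflow_url: str) -> str:
    # Keep only the final path segment, tolerating a trailing slash.
    return workflow_url.rstrip("/").split("/")[-1]


print(workflow_id_from_url("https://github.com/openai/codex/actions/runs/17417194663"))
# 17417194663
```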
@@ -1,120 +0,0 @@
#!/usr/bin/env bash
# -----------------------------------------------------------------------------
# stage_release.sh
# -----------------------------------------------------------------------------
# Stages an npm release for @openai/codex.
#
# Usage:
#
#   --tmp <dir> : Use <dir> instead of a freshly created temp directory.
#   -h|--help   : Print usage.
#
# -----------------------------------------------------------------------------

set -euo pipefail

# Helper - usage / flag parsing

usage() {
  cat <<EOF
Usage: $(basename "$0") [--tmp DIR] [--version VERSION]

Options
  --tmp DIR   Use DIR to stage the release (defaults to a fresh mktemp dir)
  --version   Specify the version to release (defaults to a timestamp-based version)
  -h, --help  Show this help

Legacy positional argument: the first non-flag argument is still interpreted
as the temporary directory (for backwards compatibility) but is deprecated.
EOF
  exit "${1:-0}"
}

TMPDIR=""
# Default to a timestamp-based version (keep same scheme as before)
VERSION="$(printf '0.1.%d' "$(date +%y%m%d%H%M)")"
WORKFLOW_URL=""

# Manual flag parser - Bash getopts does not handle GNU long options well.
while [[ $# -gt 0 ]]; do
  case "$1" in
    --tmp)
      shift || { echo "--tmp requires an argument"; usage 1; }
      TMPDIR="$1"
      ;;
    --tmp=*)
      TMPDIR="${1#*=}"
      ;;
    --version)
      shift || { echo "--version requires an argument"; usage 1; }
      VERSION="$1"
      ;;
    --workflow-url)
      shift || { echo "--workflow-url requires an argument"; exit 1; }
      WORKFLOW_URL="$1"
      ;;
    -h|--help)
      usage 0
      ;;
    --*)
      echo "Unknown option: $1" >&2
      usage 1
      ;;
    *)
      echo "Unexpected extra argument: $1" >&2
      usage 1
      ;;
  esac
  shift
done

# If the caller did not specify a directory, create a fresh temporary one.
if [[ -z "$TMPDIR" ]]; then
  TMPDIR="$(mktemp -d)"
fi

# Ensure the directory exists, then resolve to an absolute path.
mkdir -p "$TMPDIR"
TMPDIR="$(cd "$TMPDIR" && pwd)"

# Main build logic

echo "Staging release in $TMPDIR"

# The script lives in codex-cli/scripts/ - change into codex-cli root so that
# relative paths keep working.
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
CODEX_CLI_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"

pushd "$CODEX_CLI_ROOT" >/dev/null

# 1. Build the JS artifacts ---------------------------------------------------

# Paths inside the staged package
mkdir -p "$TMPDIR/bin"

cp -r bin/codex.js "$TMPDIR/bin/codex.js"
cp ../README.md "$TMPDIR" || true # README is one level up - ignore if missing

# Modify package.json - bump version and optionally add the native directory to
# the files array so that the binaries are published to npm.

jq --arg version "$VERSION" \
  '.version = $version' \
  package.json > "$TMPDIR/package.json"

# 2. Native runtime deps (sandbox plus optional Rust binaries)

./scripts/install_native_deps.sh --workflow-url "$WORKFLOW_URL" "$TMPDIR"

popd >/dev/null

echo "Staged version $VERSION for release in $TMPDIR"

echo "Verify the CLI:"
echo "  node ${TMPDIR}/bin/codex.js --version"
echo "  node ${TMPDIR}/bin/codex.js --help"

# Print final hint for convenience
echo "Next: cd \"$TMPDIR\" && npm publish"
@@ -1,70 +0,0 @@
#!/usr/bin/env python3

import argparse
import json
import subprocess
import sys
from pathlib import Path


def main() -> int:
    parser = argparse.ArgumentParser(
        description="""Stage a release for the npm module.

Run this after the GitHub Release has been created and use
`--release-version` to specify the version to release.

Optionally pass `--tmp` to control the temporary staging directory that will be
forwarded to stage_release.sh.
"""
    )
    parser.add_argument(
        "--release-version", required=True, help="Version to release, e.g., 0.3.0"
    )
    parser.add_argument(
        "--tmp",
        help="Optional path to stage the npm package; forwarded to stage_release.sh",
    )
    args = parser.parse_args()
    version = args.release_version

    gh_run = subprocess.run(
        [
            "gh",
            "run",
            "list",
            "--branch",
            f"rust-v{version}",
            "--json",
            "workflowName,url,headSha",
            "--jq",
            'first(.[] | select(.workflowName == "rust-release"))',
        ],
        stdout=subprocess.PIPE,
        check=True,
    )
    gh_run.check_returncode()
    workflow = json.loads(gh_run.stdout)
    sha = workflow["headSha"]

    print(f"should `git checkout {sha}`")

    current_dir = Path(__file__).parent.resolve()
    cmd = [
        str(current_dir / "stage_release.sh"),
        "--version",
        version,
        "--workflow-url",
        workflow["url"],
    ]
    if args.tmp:
        cmd.extend(["--tmp", args.tmp])

    stage_release = subprocess.run(cmd)
    stage_release.check_returncode()

    return 0


if __name__ == "__main__":
    sys.exit(main())
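The removed script's `--jq 'first(.[] | select(.workflowName == "rust-release"))'` filter picks the first run whose workflow name matches from the `gh run list` JSON. The same selection expressed in Python, against a made-up payload:

```python
# Hypothetical `gh run list --json workflowName,url,headSha` output; the URLs
# and SHAs are placeholders for illustration only.
runs = [
    {"workflowName": "ci", "url": "https://example.com/runs/1", "headSha": "aaa"},
    {"workflowName": "rust-release", "url": "https://example.com/runs/2", "headSha": "bbb"},
]


def first_rust_release(runs: list[dict]) -> dict:
    # Equivalent of jq: first(.[] | select(.workflowName == "rust-release"))
    return next(run for run in runs if run["workflowName"] == "rust-release")


print(first_rust_release(runs)["headSha"])  # bbb
```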