diff --git a/CHANGELOG.md b/CHANGELOG.md
index e005428..14d5c40 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -2,6 +2,35 @@
All notable changes to the Toolpath workspace are documented here.
+## `path resume` — one-shot resume into a coding agent — 2026-05-09
+
+`path-cli` 0.9.0. New subcommand `path resume <input>` that fetches a
+Toolpath document (Pathbase URL, `owner/repo/slug` shorthand, local
+file, or cache id), validates it as a single agent-bearing `Path`,
+launches an `fzf` picker over installed coding-agent harnesses
+(`--harness X` skips the picker), projects the session into that
+harness's on-disk layout under `-C, --cwd P` (default: shell cwd),
+and `execvp`'s the harness's resume command (`claude -r <id>`,
+`gemini --resume <id>`, `codex resume <id>`, `opencode --session <id>`,
+`pi --session <id>`). On Windows the harness is spawned and waited on
+with the exit code propagated.
+
+Source-harness inference reads `path.meta.source` (`claude-code` /
+`gemini-cli` / `codex` / `opencode` / `pi`) with actor-string
+fallback; the picker pre-selects the source when it's installed.
+
+Implementation introduces five `pub(crate)` `project_<harness>`
+helpers in `cmd_export.rs` that compose the existing build + write
+pairs and return the projected session id. `cmd_resume.rs` adds an
+`ExecStrategy` trait (`RealExec` for production, `RecordingExec` for
+tests) so the integration tests can exercise the full
+resolve→pick→project pipeline without launching a real harness.
+
+Also fixed an unrelated env-var race in
+`cmd_export::tests::opencode_writes_into_db_with_project` that
+cleared `$HOME` on cleanup without restoring; this had been quietly
+flaking the parallel test suite.
+
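The per-harness resume recipes in this entry map one-to-one onto a small argv builder. The sketch below is illustrative only — `Harness` and `resume_argv` are assumed names, not `path-cli`'s actual items:

```rust
// Sketch: build the (binary, args) pair for each harness's resume
// command, mirroring the recipes listed in the changelog entry above.
#[derive(Clone, Copy, Debug, PartialEq)]
enum Harness {
    Claude,
    Gemini,
    Codex,
    Opencode,
    Pi,
}

fn resume_argv(h: Harness, session_id: &str) -> (&'static str, Vec<String>) {
    let s = session_id.to_string();
    match h {
        Harness::Claude => ("claude", vec!["-r".into(), s]),
        Harness::Gemini => ("gemini", vec!["--resume".into(), s]),
        Harness::Codex => ("codex", vec!["resume".into(), s]),
        Harness::Opencode => ("opencode", vec!["--session".into(), s]),
        Harness::Pi => ("pi", vec!["--session".into(), s]),
    }
}
```

Keeping this a pure function (no exec, no I/O) is what makes the exec step trivially mockable in tests.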
## Conversation-stack realignment onto `toolpath` 0.4 + path-cli schema vendoring
Republish of every `toolpath-convo`-consuming crate so they pin the
diff --git a/CLAUDE.md b/CLAUDE.md
index 010d558..1556973 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -89,6 +89,10 @@ cargo run -p path-cli -- share
cargo run -p path-cli -- share --harness claude --session <id> --project /path/to/project
cargo run -p path-cli -- share --url https://my-pathbase.example
+# Resume a Toolpath document into your coding agent of choice (interactive harness picker)
+cargo run -p path-cli -- resume <input>
+cargo run -p path-cli -- resume --harness claude -C /path/to/project
+
+# Export toolpath documents into external formats. <id> is a cache id or a file path.
+cargo run -p path-cli -- export claude --input <id> --project /tmp/sandbox
+cargo run -p path-cli -- export claude --input <id> --output conv.jsonl
@@ -161,7 +165,7 @@ Tests live alongside the code (`#[cfg(test)] mod tests`), plus `path-cli` has in
- `toolpath-opencode`: 43 unit + 1 doc test (SQLite reader, JSON payload serde, provider assembly, snapshot-based derive, tool-input fallback for gitignored paths)
- `toolpath-pi`: 123 unit + 4 doc tests (types, paths, error, reader, io, provider)
- `toolpath-dot`: 30 unit + 2 doc tests (render, visual conventions, escaping)
-- `path-cli`: 187 unit + 31 integration tests (import/export/cache, track sessions, merge, validate, roundtrip, render-md snapshots, deprecation aliases, pathbase HTTP mock-server tests, fzf-friendly TSV output). For an end-to-end check against a real Pathbase deployment, run `scripts/test-pathbase-live.sh ` — it does an anon round-trip in a sandboxed config dir and, if you're logged into that URL, an authed pathstash round-trip too.
+- `path-cli`: 260 unit + 63 integration tests (import/export/cache, track sessions, merge, validate, roundtrip, render-md snapshots, deprecation aliases, pathbase HTTP mock-server tests, fzf-friendly TSV output, `path resume` orchestration with injectable `ExecStrategy`). For an end-to-end check against a real Pathbase deployment, run `scripts/test-pathbase-live.sh <url>` — it does an anon round-trip in a sandboxed config dir and, if you're logged into that URL, an authed pathstash round-trip too.
- `toolpath-cli`: 0 tests (it's a one-line `path_cli::run()` shim crate that exists only so `cargo install toolpath-cli` keeps installing the `path` binary)
Validate example documents: `for f in examples/*.json; do cargo run -p path-cli -- validate --input "$f"; done`
@@ -224,3 +228,4 @@ Build the site after changes: `cd site && pnpm run build` (should produce 7 page
- Interactive session selection: `path import <harness>` (claude / gemini / pi / codex / opencode) auto-launches `fzf` when stdin and stderr are TTYs, `fzf` is on `$PATH`, and no `--session` was given. Multi-select (TAB) produces a `Graph` document; single-select produces a `Path`. The picker uses `path show --…` as its `--preview` command. When fzf isn't available, it falls back to most-recent (with `--project`) or prints the manual recipe (without). `path list --format tsv` is the documented machine-readable surface — column 1 is the project (for claude/gemini/pi) or session id (for codex/opencode), and the trailing column carries `first_user_message` so consumers can fuzzy-match by topic.
- Conversation metadata title field: `toolpath-claude::ConversationMetadata`, `toolpath-gemini::ConversationMetadata`, and `toolpath-pi::SessionMeta` all expose `first_user_message: Option<String>` — the first non-empty user-prompt text. Populated cheaply during the metadata pass (single-pass for Claude/Gemini; one extra short read for Pi). Used by the picker UI but useful for any "list sessions by topic" surface.
- `path share` is the one-shot equivalent of `path import | path export pathbase`. It probes installed agent harnesses (claude/gemini/codex/opencode/pi), aggregates their sessions into a single fzf picker, and ranks rows whose project (claude/gemini/pi) or recorded cwd (codex/opencode) canonicalizes to the current directory at the top. `--harness` narrows the picker to one provider; `--harness X --session Y` (and `--project P` for keyed providers) skips the picker entirely. Pathbase flags (`--url`, `--anon`, `--repo`, `--slug`, `--public`) match `path export pathbase`. By default the derived doc is written to the cache like `import` does; pass `--no-cache` to skip.
+- `path resume <input>` is the inverse of `path share`. It accepts a Pathbase URL, an `owner/repo/slug` shorthand, a local toolpath JSON file, or a cache id; resolves it (caching URL fetches under `~/.toolpath/documents/` unless `--no-cache`); validates that the document is a single agent-bearing `Path`; then opens an `fzf` harness picker (skipped with `--harness X`). The picker pre-selects the source harness inferred from `path.meta.source` (`claude-code`/`gemini-cli`/`codex`/`opencode`/`pi`) when it's installed. After picking, `path resume` projects the session into the harness's on-disk layout under the chosen working directory (default: shell cwd; override with `-C, --cwd P`) and `execvp`'s the harness's resume command (`claude -r <id>` / `gemini --resume <id>` / `codex resume <id>` / `opencode --session <id>` / `pi --session <id>`). On Windows it spawns and waits, propagating the exit code. The exec is mockable via `cmd_resume::ExecStrategy` — production uses `RealExec`; integration tests use `RecordingExec` to capture the recipe without launching a real harness.
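The input-resolution order this bullet describes can be sketched as a standalone classifier. `InputShape` and `classify` are hypothetical names for illustration; the real logic lives inside `cmd_resume`:

```rust
// Sketch of `path resume <input>`'s documented resolution order:
// URL → Pathbase shorthand → local file → cache id.
use std::path::Path;

#[derive(Debug, PartialEq)]
enum InputShape {
    PathbaseUrl,
    PathbaseShorthand, // owner/repo/slug
    FilePath,
    CacheId,
}

fn classify(raw: &str) -> InputShape {
    if raw.starts_with("http://") || raw.starts_with("https://") {
        return InputShape::PathbaseUrl;
    }
    // A bare shorthand is three non-empty `/`-separated segments that
    // don't name an existing file.
    let parts: Vec<&str> = raw.split('/').collect();
    if parts.len() == 3 && parts.iter().all(|p| !p.is_empty()) && !Path::new(raw).is_file() {
        return InputShape::PathbaseShorthand;
    }
    if Path::new(raw).is_file() {
        return InputShape::FilePath;
    }
    InputShape::CacheId
}
```

Ordering matters: the URL check must come first so `https://host/a/b` is never mistaken for a shorthand.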
diff --git a/Cargo.lock b/Cargo.lock
index eaf8882..3ee21a6 100644
--- a/Cargo.lock
+++ b/Cargo.lock
@@ -1589,7 +1589,7 @@ dependencies = [
[[package]]
name = "path-cli"
-version = "0.8.0"
+version = "0.9.0"
dependencies = [
"anyhow",
"assert_cmd",
diff --git a/Cargo.toml b/Cargo.toml
index 6fbe6e2..a07371f 100644
--- a/Cargo.toml
+++ b/Cargo.toml
@@ -33,7 +33,7 @@ toolpath-github = { version = "0.3.0", path = "crates/toolpath-github" }
toolpath-dot = { version = "0.2.0", path = "crates/toolpath-dot" }
toolpath-md = { version = "0.4.0", path = "crates/toolpath-md" }
toolpath-pi = { version = "0.3.0", path = "crates/toolpath-pi" }
-path-cli = { version = "0.8.0", path = "crates/path-cli" }
+path-cli = { version = "0.9.0", path = "crates/path-cli" }
pathbase-client = { version = "0.1.0", path = "crates/pathbase-client" }
reqwest = { version = "0.13", default-features = false, features = ["blocking", "json", "rustls"] }
diff --git a/README.md b/README.md
index a5d96cc..2d11c5b 100644
--- a/README.md
+++ b/README.md
@@ -108,6 +108,11 @@ path export pathbase --input claude-<id>
# (full URL or bare `<owner>/<repo>/<slug>` triple)
path import pathbase https://pathbase.dev/alex/pathstash/path-pr-42
+# Resume a Toolpath document into your coding agent of choice (interactive
+# harness picker; project the session and exec the harness's resume command)
+path resume https://pathbase.dev/alex/pathstash/path-pr-42
+path resume claude-<id> --harness claude -C /path/to/project
+
# Query for dead ends (abandoned approaches)
path query dead-ends --input doc.json
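The cache behavior behind `path resume` (use a cached fetch on hit, re-fetch and overwrite on `--force`, bypass reads and writes entirely on `--no-cache`) reduces to a small decision function. A sketch under assumed names — `CachePlan`/`plan` are not `path-cli` API:

```rust
// Sketch of path resume's documented cache policy.
#[derive(Debug, PartialEq)]
enum CachePlan {
    UseCached,
    Fetch { write_back: bool },
}

fn plan(force: bool, no_cache: bool, cache_hit: bool) -> CachePlan {
    if no_cache {
        // Never read an existing entry, never write the fetched body.
        return CachePlan::Fetch { write_back: false };
    }
    if !force && cache_hit {
        return CachePlan::UseCached;
    }
    // Cache miss, or the user asked for a re-fetch: land the new bytes.
    CachePlan::Fetch { write_back: true }
}
```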
diff --git a/crates/path-cli/Cargo.toml b/crates/path-cli/Cargo.toml
index 8d34099..c400c2e 100644
--- a/crates/path-cli/Cargo.toml
+++ b/crates/path-cli/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "path-cli"
-version = "0.8.0"
+version = "0.9.0"
edition.workspace = true
license.workspace = true
repository = "https://github.com/empathic/toolpath"
diff --git a/crates/path-cli/src/cmd_export.rs b/crates/path-cli/src/cmd_export.rs
index e8d10aa..d4bb800 100644
--- a/crates/path-cli/src/cmd_export.rs
+++ b/crates/path-cli/src/cmd_export.rs
@@ -239,6 +239,112 @@ pub(crate) struct PathbaseUploadArgs {
pub(crate) public: bool,
}
+// ── pub(crate) project_<harness> wrappers ────────────────────────────
+//
+// These compose the private build + write helpers below and return the
+// projected session id. They are called by `path resume`; the existing
+// `run_<harness>` functions are untouched.
+
+/// Project `path` into a Claude session under `project_dir` and return
+/// the resulting session id.
+#[cfg(not(target_os = "emscripten"))]
+pub(crate) fn project_claude(
+ path: &toolpath::v1::Path,
+ project_dir: &std::path::Path,
+) -> Result<String> {
+ let conv = build_claude_conversation(path)?;
+ let jsonl = serialize_jsonl(&conv)?;
+ write_into_claude_project(&conv, &jsonl, project_dir)?;
+ Ok(conv.session_id)
+}
+
+/// Project `path` into a Gemini session under `project_dir` and return
+/// the resulting session UUID.
+#[cfg(not(target_os = "emscripten"))]
+pub(crate) fn project_gemini(
+ path: &toolpath::v1::Path,
+ project_dir: &std::path::Path,
+) -> Result<String> {
+ use toolpath_convo::ConversationProjector;
+ let project_dir = std::fs::canonicalize(project_dir)
+ .with_context(|| format!("resolve project path {}", project_dir.display()))?;
+ let project_path = project_dir.to_string_lossy().to_string();
+
+ let view = toolpath_convo::extract_conversation(path);
+ let project_hash = toolpath_gemini::paths::project_hash(&project_path);
+ let projector = toolpath_gemini::project::GeminiProjector::new()
+ .with_project_hash(project_hash)
+ .with_project_path(project_path.clone());
+ let conv = projector
+ .project(&view)
+ .map_err(|e| anyhow::anyhow!("Projection failed: {}", e))?;
+ if conv.session_uuid.is_empty() {
+ anyhow::bail!("Projected conversation has no session UUID");
+ }
+ write_into_gemini_project(&conv, &project_path)?;
+ Ok(conv.session_uuid)
+}
+
+/// Project `path` into a Codex session and return the resulting session id.
+#[cfg(not(target_os = "emscripten"))]
+pub(crate) fn project_codex(
+ path: &toolpath::v1::Path,
+ project_dir: &std::path::Path,
+) -> Result<String> {
+ use toolpath_convo::ConversationProjector;
+ let project_dir = std::fs::canonicalize(project_dir)
+ .with_context(|| format!("resolve project path {}", project_dir.display()))?;
+ let cwd_str = project_dir.to_string_lossy().to_string();
+
+ let view = toolpath_convo::extract_conversation(path);
+ let projector = toolpath_codex::project::CodexProjector::new().with_cwd(cwd_str);
+ let session = projector
+ .project(&view)
+ .map_err(|e| anyhow::anyhow!("Projection failed: {}", e))?;
+ if session.id.is_empty() {
+ anyhow::bail!("Projected session has no id");
+ }
+ write_into_codex_project(&session)?;
+ Ok(session.id)
+}
+
+/// Project `path` into an opencode session under `project_dir` and return
+/// the resulting session id.
+#[cfg(not(target_os = "emscripten"))]
+pub(crate) fn project_opencode(
+ path: &toolpath::v1::Path,
+ project_dir: &std::path::Path,
+) -> Result<String> {
+ let session = build_opencode_session(path, Some(project_dir))?;
+ let id = session.id.clone();
+ write_into_opencode_db(&session, project_dir)?;
+ Ok(id)
+}
+
+/// Project `path` into a Pi session under `project_dir` and return the
+/// resulting session id.
+#[cfg(not(target_os = "emscripten"))]
+pub(crate) fn project_pi(
+ path: &toolpath::v1::Path,
+ project_dir: &std::path::Path,
+) -> Result<String> {
+ use toolpath_convo::ConversationProjector;
+ let project_dir = std::fs::canonicalize(project_dir)
+ .with_context(|| format!("resolve project path {}", project_dir.display()))?;
+ let cwd_str = project_dir.to_string_lossy().to_string();
+
+ let view = toolpath_convo::extract_conversation(path);
+ let projector = toolpath_pi::project::PiProjector::new().with_cwd(cwd_str.clone());
+ let session = projector
+ .project(&view)
+ .map_err(|e| anyhow::anyhow!("Projection failed: {}", e))?;
+ if session.header.id.is_empty() {
+ anyhow::bail!("Projected session has no id");
+ }
+ write_into_pi_project(&session, &cwd_str)?;
+ Ok(session.header.id)
+}
+
fn run_claude(input: String, project: Option<String>, output: Option<String>) -> Result<()> {
#[cfg(target_os = "emscripten")]
{
@@ -2471,6 +2577,8 @@ mod tests {
let _g = crate::config::TEST_ENV_LOCK
.lock()
.unwrap_or_else(|e| e.into_inner());
+ let prev_home = std::env::var_os("HOME");
+ let prev_xdg = std::env::var_os("XDG_DATA_HOME");
unsafe {
std::env::set_var("HOME", &fake_home);
std::env::remove_var("XDG_DATA_HOME");
@@ -2481,7 +2589,14 @@ mod tests {
None,
);
unsafe {
- std::env::remove_var("HOME");
+ match prev_home {
+ Some(v) => std::env::set_var("HOME", v),
+ None => std::env::remove_var("HOME"),
+ }
+ match prev_xdg {
+ Some(v) => std::env::set_var("XDG_DATA_HOME", v),
+ None => std::env::remove_var("XDG_DATA_HOME"),
+ }
}
result.expect("export opencode --project");
@@ -2513,4 +2628,271 @@ mod tests {
let err = run_opencode(input_path.to_string_lossy().to_string(), None, None).unwrap_err();
assert!(err.to_string().contains("single-path"));
}
+
+ // ── project_<harness> wrapper tests ──────────────────────────────
+
+ /// Build a minimal `toolpath::v1::Path` with a single `conversation.append`
+ /// step using the given `artifact_key` (e.g. `"claude-code://my-session"`).
+ /// The projectors read `view.id` from the first `<provider>://` artifact
+ /// key they see, so this gives them a non-empty session id to work with.
+ fn make_convo_path(artifact_key: &str) -> toolpath::v1::Path {
+ let mut extra = HashMap::new();
+ extra.insert("role".to_string(), serde_json::json!("user"));
+ extra.insert("text".to_string(), serde_json::json!("hello"));
+ let step = toolpath::v1::Step {
+ step: toolpath::v1::StepIdentity {
+ id: "s1".to_string(),
+ parents: vec![],
+ actor: "human:test".to_string(),
+ timestamp: "2026-01-01T00:00:00Z".to_string(),
+ },
+ change: {
+ let mut m = HashMap::new();
+ m.insert(
+ artifact_key.to_string(),
+ toolpath::v1::ArtifactChange {
+ raw: None,
+ structural: Some(toolpath::v1::StructuralChange {
+ change_type: "conversation.append".to_string(),
+ extra,
+ }),
+ },
+ );
+ m
+ },
+ meta: None,
+ };
+ toolpath::v1::Path {
+ path: toolpath::v1::PathIdentity {
+ id: "test-path".to_string(),
+ base: None,
+ head: "s1".to_string(),
+ graph_ref: None,
+ },
+ steps: vec![step],
+ meta: None,
+ }
+ }
+
+ #[test]
+ fn project_claude_returns_session_id_and_writes_jsonl() {
+ let temp = tempfile::tempdir().unwrap();
+ let fake_home = temp.path().join("home");
+ std::fs::create_dir_all(&fake_home).unwrap();
+ let cwd = temp.path().join("proj");
+ std::fs::create_dir_all(&cwd).unwrap();
+
+ // Use a deterministic session id embedded in the artifact key.
+ let session_id = "claude-wrapper-test-session";
+ let path = make_convo_path(&format!("claude-code://{}", session_id));
+
+ let _g = crate::config::TEST_ENV_LOCK
+ .lock()
+ .unwrap_or_else(|e| e.into_inner());
+ let prior_home = std::env::var_os("HOME");
+ unsafe {
+ std::env::set_var("HOME", &fake_home);
+ }
+ let result = project_claude(&path, &cwd);
+ unsafe {
+ match prior_home {
+ Some(v) => std::env::set_var("HOME", v),
+ None => std::env::remove_var("HOME"),
+ }
+ }
+
+ let returned_id = result.expect("project_claude should succeed");
+ assert_eq!(returned_id, session_id);
+
+ let claude_projects = fake_home.join(".claude/projects");
+ assert!(
+ claude_projects.exists(),
+ "claude projects dir missing under HOME"
+ );
+ }
+
+ #[test]
+ fn project_gemini_returns_session_id_and_writes_chat_file() {
+ let temp = tempfile::tempdir().unwrap();
+ let fake_home = temp.path().join("home");
+ std::fs::create_dir_all(&fake_home).unwrap();
+ let cwd = temp.path().join("proj");
+ std::fs::create_dir_all(&cwd).unwrap();
+
+ let session_uuid = "11111111-2222-3333-4444-aaaaaaaaaaaa";
+ let path = make_convo_path(&format!("gemini-cli://{}", session_uuid));
+
+ let _g = crate::config::TEST_ENV_LOCK
+ .lock()
+ .unwrap_or_else(|e| e.into_inner());
+ let prior_home = std::env::var_os("HOME");
+ unsafe {
+ std::env::set_var("HOME", &fake_home);
+ }
+ let result = project_gemini(&path, &cwd);
+ unsafe {
+ match prior_home {
+ Some(v) => std::env::set_var("HOME", v),
+ None => std::env::remove_var("HOME"),
+ }
+ }
+
+ let returned_id = result.expect("project_gemini should succeed");
+ assert_eq!(returned_id, session_uuid);
+
+ let gemini_tmp = fake_home.join(".gemini/tmp");
+ assert!(gemini_tmp.exists(), "gemini tmp dir missing under HOME");
+ }
+
+ #[test]
+ fn project_codex_returns_session_id_and_writes_rollout() {
+ let temp = tempfile::tempdir().unwrap();
+ let fake_home = temp.path().join("home");
+ std::fs::create_dir_all(&fake_home).unwrap();
+ let cwd = temp.path().join("proj");
+ std::fs::create_dir_all(&cwd).unwrap();
+
+ let session_uuid = "019dabc6-cccc-dddd-eeee-ffffffffffff";
+ let path = make_convo_path(&format!("codex://{}", session_uuid));
+
+ let _g = crate::config::TEST_ENV_LOCK
+ .lock()
+ .unwrap_or_else(|e| e.into_inner());
+ let prior_home = std::env::var_os("HOME");
+ unsafe {
+ std::env::set_var("HOME", &fake_home);
+ }
+ let result = project_codex(&path, &cwd);
+ unsafe {
+ match prior_home {
+ Some(v) => std::env::set_var("HOME", v),
+ None => std::env::remove_var("HOME"),
+ }
+ }
+
+ let returned_id = result.expect("project_codex should succeed");
+ assert_eq!(returned_id, session_uuid);
+
+ let codex_sessions = fake_home.join(".codex/sessions");
+ assert!(codex_sessions.exists(), "codex sessions dir missing");
+ }
+
+ #[test]
+ fn project_opencode_returns_session_id_and_inserts_row() {
+ let temp = tempfile::tempdir().unwrap();
+ let fake_home = temp.path().join("home");
+ std::fs::create_dir_all(&fake_home).unwrap();
+ let cwd = temp.path().join("proj");
+ std::fs::create_dir_all(&cwd).unwrap();
+
+ // Bootstrap the opencode DB (no public schema helper exists; inline
+ // the same DDL used in the existing opencode_writes_into_db_with_project test).
+ let data_dir = fake_home.join(".local/share/opencode");
+ std::fs::create_dir_all(&data_dir).unwrap();
+ let db_path = data_dir.join("opencode.db");
+ {
+ let conn = rusqlite::Connection::open(&db_path).unwrap();
+ conn.execute_batch(
+ r#"
+ CREATE TABLE project (
+ id text PRIMARY KEY, worktree text NOT NULL, vcs text, name text,
+ icon_url text, icon_color text,
+ time_created integer NOT NULL, time_updated integer NOT NULL,
+ time_initialized integer, sandboxes text NOT NULL, commands text
+ );
+ CREATE TABLE session (
+ id text PRIMARY KEY, project_id text NOT NULL, parent_id text,
+ slug text NOT NULL, directory text NOT NULL, title text NOT NULL,
+ version text NOT NULL, share_url text,
+ summary_additions integer, summary_deletions integer,
+ summary_files integer, summary_diffs text, revert text, permission text,
+ time_created integer NOT NULL, time_updated integer NOT NULL,
+ time_compacting integer, time_archived integer, workspace_id text
+ );
+ CREATE TABLE message (
+ id text PRIMARY KEY, session_id text NOT NULL,
+ time_created integer NOT NULL, time_updated integer NOT NULL,
+ data text NOT NULL
+ );
+ CREATE TABLE part (
+ id text PRIMARY KEY, message_id text NOT NULL, session_id text NOT NULL,
+ time_created integer NOT NULL, time_updated integer NOT NULL,
+ data text NOT NULL
+ );
+ "#,
+ )
+ .unwrap();
+ }
+
+ // opencode session ids are derived from view.id via mint_session_id,
+ // which adds the `ses_` prefix if not already present.
+ let path = make_convo_path("opencode://ses_wrapper-test");
+
+ let _g = crate::config::TEST_ENV_LOCK
+ .lock()
+ .unwrap_or_else(|e| e.into_inner());
+ let prior_home = std::env::var_os("HOME");
+ let prior_xdg = std::env::var_os("XDG_DATA_HOME");
+ unsafe {
+ std::env::set_var("HOME", &fake_home);
+ std::env::remove_var("XDG_DATA_HOME");
+ }
+ let result = project_opencode(&path, &cwd);
+ unsafe {
+ match prior_home {
+ Some(v) => std::env::set_var("HOME", v),
+ None => std::env::remove_var("HOME"),
+ }
+ match prior_xdg {
+ Some(v) => std::env::set_var("XDG_DATA_HOME", v),
+ None => std::env::remove_var("XDG_DATA_HOME"),
+ }
+ }
+
+ let returned_id = result.expect("project_opencode should succeed");
+ assert_eq!(returned_id, "ses_wrapper-test");
+
+ let conn = rusqlite::Connection::open(&db_path).unwrap();
+ let count: i64 = conn
+ .query_row(
+ "SELECT COUNT(*) FROM session WHERE id = ?1",
+ [&returned_id],
+ |r| r.get(0),
+ )
+ .unwrap();
+ assert_eq!(count, 1, "expected one session row with id {returned_id}");
+ }
+
+ #[test]
+ fn project_pi_returns_session_id_and_writes_jsonl() {
+ let temp = tempfile::tempdir().unwrap();
+ let fake_home = temp.path().join("home");
+ std::fs::create_dir_all(&fake_home).unwrap();
+ let cwd = temp.path().join("proj");
+ std::fs::create_dir_all(&cwd).unwrap();
+
+ let session_id = "pi-wrapper-test-session";
+ let path = make_convo_path(&format!("pi://{}", session_id));
+
+ let _g = crate::config::TEST_ENV_LOCK
+ .lock()
+ .unwrap_or_else(|e| e.into_inner());
+ let prior_home = std::env::var_os("HOME");
+ unsafe {
+ std::env::set_var("HOME", &fake_home);
+ }
+ let result = project_pi(&path, &cwd);
+ unsafe {
+ match prior_home {
+ Some(v) => std::env::set_var("HOME", v),
+ None => std::env::remove_var("HOME"),
+ }
+ }
+
+ let returned_id = result.expect("project_pi should succeed");
+ assert_eq!(returned_id, session_id);
+
+ let pi_sessions = fake_home.join(".pi/agent/sessions");
+ assert!(pi_sessions.exists(), "pi sessions dir missing");
+ }
}
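The `$HOME`/`XDG_DATA_HOME` save/restore dance these tests repeat by hand (and the race fix this changeset makes) can equivalently be packaged as an RAII guard, so a panicking test can't leak state into the parallel suite. A sketch only — the crate restores variables manually, and `EnvGuard` is a hypothetical name:

```rust
// Sketch: set an env var for a scope and restore the previous value
// (or remove the var) on drop, even if the scope panics.
use std::ffi::OsString;

struct EnvGuard {
    key: &'static str,
    prev: Option<OsString>,
}

impl EnvGuard {
    fn set(key: &'static str, value: &str) -> Self {
        let prev = std::env::var_os(key);
        std::env::set_var(key, value);
        EnvGuard { key, prev }
    }
}

impl Drop for EnvGuard {
    fn drop(&mut self) {
        match self.prev.take() {
            Some(v) => std::env::set_var(self.key, v),
            None => std::env::remove_var(self.key),
        }
    }
}
```

Note this only helps under the same serialization lock the tests already take (`TEST_ENV_LOCK`); process environment is global, so a guard without the lock would still race.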
diff --git a/crates/path-cli/src/cmd_import.rs b/crates/path-cli/src/cmd_import.rs
index 217b3b0..9860592 100644
--- a/crates/path-cli/src/cmd_import.rs
+++ b/crates/path-cli/src/cmd_import.rs
@@ -1362,6 +1362,39 @@ fn project_short(p: &str) -> String {
out.join("/")
}
+/// Compute the local cache id a Pathbase ref would land at, without
+/// hitting the network. Lets `path resume` probe the cache before
+/// deciding whether to fetch.
+#[cfg(not(target_os = "emscripten"))]
+pub(crate) fn pathbase_cache_id_of(target: &str, url_flag: Option<&str>) -> Result<String> {
+ let (_base, ref_) = parse_pathbase_ref(target, url_flag)?;
+ let PathRef { owner, repo, slug } = ref_;
+ Ok(make_id("pathbase", &format!("{owner}-{repo}-{slug}")))
+}
+
+/// Fetch a Pathbase ref (`https://host/owner/repo/slug` URL or bare
+/// `owner/repo/slug` triple) and parse it as a toolpath document. Used
+/// by `path import pathbase` and by `path resume <input>`.
+#[cfg(not(target_os = "emscripten"))]
+pub(crate) fn pathbase_fetch_to_doc(target: &str, url_flag: Option<&str>) -> Result<DerivedDoc> {
+ use crate::cmd_pathbase::{credentials_path, load_session, paths_download, resolve_url};
+
+ let (base, ref_) = parse_pathbase_ref(target, url_flag)?;
+ let stored = load_session(&credentials_path()?)?;
+ let base_url = base
+ .or_else(|| stored.as_ref().map(|s| s.url.clone()))
+ .unwrap_or_else(|| resolve_url(None));
+
+ let token = stored.as_ref().map(|s| s.token.as_str());
+
+ let PathRef { owner, repo, slug } = ref_;
+ let body = paths_download(&base_url, token, &owner, &repo, &slug)?;
+ let cache_id = make_id("pathbase", &format!("{owner}-{repo}-{slug}"));
+ let doc = Graph::from_json(&body)
+ .map_err(|e| anyhow::anyhow!("server returned a non-toolpath document: {e}"))?;
+ Ok(DerivedDoc { cache_id, doc })
+}
+
fn derive_pathbase(target: String, url_flag: Option<String>) -> Result<DerivedDoc> {
#[cfg(target_os = "emscripten")]
{
@@ -1371,22 +1404,7 @@ fn derive_pathbase(target: String, url_flag: Option<String>) -> Result<DerivedDoc> {
}
#[cfg(test)]
-mod tests {
+pub(crate) mod tests {
use super::*;
fn sample() -> StoredSession {
@@ -735,13 +735,13 @@ mod tests {
/// A one-shot HTTP/1.1 responder. Binds to 127.0.0.1 on a free port,
/// reads one request (headers + body), writes a canned response, closes.
- struct MockServer {
+ pub(crate) struct MockServer {
port: u16,
thread: Option<std::thread::JoinHandle<Result<()>>>,
}
impl MockServer {
- fn start(status_line: &'static str, body: &'static str) -> Self {
+ pub(crate) fn start(status_line: &'static str, body: &'static str) -> Self {
use std::io::{BufRead, BufReader, Write};
use std::net::TcpListener;
@@ -794,7 +794,7 @@ mod tests {
}
}
- fn base(&self) -> String {
+ pub(crate) fn base(&self) -> String {
format!("http://127.0.0.1:{}", self.port)
}
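The one-shot responder pattern `MockServer` implements — bind to a free localhost port, serve exactly one request with a canned response, then close — can be sketched with nothing but the standard library. Illustrative only; the real `MockServer` additionally reads the full request headers and body:

```rust
// Sketch: bind 127.0.0.1 on an OS-assigned port, answer one request
// with a canned HTTP/1.1 response, then let the connection close.
use std::io::{Read, Write};
use std::net::TcpListener;

fn one_shot(status_line: &'static str, body: &'static str) -> (u16, std::thread::JoinHandle<()>) {
    let listener = TcpListener::bind("127.0.0.1:0").expect("bind free port");
    let port = listener.local_addr().unwrap().port();
    let handle = std::thread::spawn(move || {
        let (mut stream, _) = listener.accept().expect("accept one request");
        // Drain one read of the request; a real responder would parse
        // headers and honor Content-Length.
        let mut buf = [0u8; 4096];
        let _ = stream.read(&mut buf);
        let resp = format!(
            "{status_line}\r\nContent-Length: {}\r\nConnection: close\r\n\r\n{body}",
            body.len()
        );
        let _ = stream.write_all(resp.as_bytes());
    });
    (port, handle)
}
```

Binding port 0 and reading the assigned port back avoids the flaky "pick a random port and hope" pattern in parallel test runs.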
diff --git a/crates/path-cli/src/cmd_resume.rs b/crates/path-cli/src/cmd_resume.rs
new file mode 100644
index 0000000..8021838
--- /dev/null
+++ b/crates/path-cli/src/cmd_resume.rs
@@ -0,0 +1,969 @@
+//! `path resume <input>` — fetch / load a Toolpath document, pick an
+//! installed coding-agent harness, project the session into that
+//! harness's on-disk layout, and exec the harness's resume command.
+//!
+//! ## Inputs
+//!
+//! `<input>` is resolved in this order:
+//! 1. `https://` / `http://` URL → fetched via `pathbase-client`,
+//! cached unless `--no-cache`.
+//! 2. `owner/repo/slug` shorthand → same Pathbase fetch flow.
+//! 3. Existing file path → read directly.
+//! 4. Otherwise treated as a cache id under `~/.toolpath/documents/`.
+//!
+//! ## Harness selection
+//!
+//! With `--harness X`, `X` is validated against `$PATH` and used.
+//! Without `--harness`, an `fzf` picker shows installed harnesses
+//! with the source harness pre-selected. Source comes from
+//! `path.meta.source` (`claude-code`, `gemini-cli`, `codex`,
+//! `opencode`, `pi`) with actor-string fallback.
+//!
+//! ## Project directory
+//!
+//! `-C / --cwd P` overrides the shell cwd. The harness is exec'd
+//! with cwd set to P and the on-disk projection is keyed on P.
+//!
+//! ## Launch
+//!
+//! On Unix the harness binary is `execvp`'d, replacing the current
+//! process. On Windows it's spawned and waited on with the exit
+//! code propagated. If `exec` itself fails (e.g. the binary disappears
+//! between PATH check and exec), the recipe is printed to stderr.
+//!
+//! Exec is mockable via [`ExecStrategy`]: production uses [`RealExec`],
+//! integration tests use [`RecordingExec`] to capture
+//! `(binary, args, cwd)` without launching anything.
+//!
+//! See `docs/superpowers/specs/2026-05-08-path-resume-command-design.md`
+//! for the full design.
+
+#![cfg(not(target_os = "emscripten"))]
+
+use anyhow::{Context, Result};
+use clap::Args;
+use std::path::PathBuf;
+
+/// Re-exported so external callers (integration tests, future consumers)
+/// can construct [`ResumeArgs`] without depending on the `cmd_share`
+/// module directly.
+pub use crate::cmd_share::HarnessArg;
+
+#[derive(Args, Debug)]
+pub struct ResumeArgs {
+ /// Toolpath document to resume from. Accepted shapes: a Pathbase
+ /// URL (`https://host/owner/repo/slug`), a bare Pathbase shorthand
+ /// (`owner/repo/slug`), a path to a local toolpath JSON file, or a
+ /// cache id (e.g. `claude-abc`, `pathbase-foo-bar-baz`).
+ pub input: String,
+
+ /// Working directory to run the resumed harness from. Defaults to
+ /// the current shell cwd. The on-disk projection is keyed on this
+ /// directory and the harness will be exec'd with cwd set to it.
+ #[arg(short = 'C', long)]
+ pub cwd: Option,
+
+ /// Pin the resume target. Skips the interactive picker.
+ #[arg(long, value_enum)]
+ pub harness: Option,
+
+ /// Skip the cache entirely when fetching from Pathbase: don't read
+ /// an existing entry, don't write the fetched body. Useful for
+ /// ephemeral environments where you don't want the cache to grow.
+ #[arg(long)]
+ pub no_cache: bool,
+
+ /// Force a re-fetch from Pathbase even if a cache entry exists,
+ /// overwriting it with the new bytes. Default behavior is to use
+ /// the cached doc on hit and never round-trip.
+ #[arg(long)]
+ pub force: bool,
+
+ /// Pathbase server URL. Falls back to the stored session's URL,
+ /// then `$PATHBASE_URL`, then `https://pathbase.dev`.
+ #[arg(long)]
+ pub url: Option,
+}
+
+pub fn run(args: ResumeArgs) -> Result<()> {
+ run_with_strategy(args, &RealExec)
+}
+
+/// Internal entry point that the integration tests call with a
+/// `RecordingExec` strategy. Production callers use [`run`].
+pub fn run_with_strategy(args: ResumeArgs, exec: &dyn ExecStrategy) -> Result<()> {
+ let (graph, source_harness) = resolve_input(&args)?;
+ let path = ensure_path_with_agent(&graph)?;
+
+ let cwd = match args.cwd.as_ref() {
+ Some(p) => std::fs::canonicalize(p)
+ .with_context(|| format!("resolve cwd path {}", p.display()))?,
+ None => std::env::current_dir()?,
+ };
+
+ let target = pick_harness(args.harness, source_harness, None)?;
+ eprintln!(
+ "Picked harness: {}{}",
+ target.name(),
+ if Some(target) == source_harness { " (source)" } else { "" }
+ );
+
+ let session_id = project_into_harness(path, target, &cwd)?;
+ let argv = argv_for(target, &session_id);
+ exec_harness(target.name(), &argv, &cwd, exec)
+}
+
+use toolpath::v1::{Graph, Path as TPath, PathOrRef};
+
+/// Read a path's source harness from `meta.source` (set by
+/// `toolpath-convo::derive_path` to the provider id), falling back to
+/// actor-string sniffing across the path's steps.
+pub(crate) fn infer_source_harness(path: &TPath) -> Option<crate::cmd_share::Harness> {
+ use crate::cmd_share::Harness;
+ let meta_source = path.meta.as_ref().and_then(|m| m.source.as_deref());
+ if let Some(source) = meta_source {
+ match source {
+ "claude-code" => return Some(Harness::Claude),
+ "gemini-cli" => return Some(Harness::Gemini),
+ "codex" => return Some(Harness::Codex),
+ "opencode" => return Some(Harness::Opencode),
+ "pi" => return Some(Harness::Pi),
+ _ => {} // fall through to actor sniffing
+ }
+ }
+ for step in &path.steps {
+ let actor = &step.step.actor;
+ if actor.starts_with("agent:claude-code") {
+ return Some(Harness::Claude);
+ }
+ if actor.starts_with("agent:gemini-cli") || actor.starts_with("agent:gemini") {
+ return Some(Harness::Gemini);
+ }
+ if actor.starts_with("agent:codex") {
+ return Some(Harness::Codex);
+ }
+ if actor.starts_with("agent:opencode") {
+ return Some(Harness::Opencode);
+ }
+ if actor.starts_with("agent:pi") {
+ return Some(Harness::Pi);
+ }
+ }
+ None
+}
+
+/// Validate that a parsed Toolpath document is a single inline Path
+/// carrying at least one `agent:*` actor. Returns the inner Path borrow
+/// on success.
+pub(crate) fn ensure_path_with_agent(g: &Graph) -> Result<&TPath> {
+ if g.paths.is_empty() {
+ anyhow::bail!("resume needs a `Path`; expected one path, got an empty graph");
+ }
+ if g.paths.len() > 1 {
+ anyhow::bail!(
+ "resume needs a single `Path`; input is a graph with {} paths. \
+ Pick one with `path query …` or split first.",
+ g.paths.len()
+ );
+ }
+ let path = match &g.paths[0] {
+ PathOrRef::Path(p) => p.as_ref(),
+ PathOrRef::Ref(_) => anyhow::bail!(
+ "resume needs an inline `Path`; got a $ref. Resolve it first with `path import` or fetch the document."
+ ),
+ };
+ let has_agent = path
+ .steps
+ .iter()
+ .any(|s| s.step.actor.starts_with("agent:"));
+ if !has_agent {
+ anyhow::bail!(
+ "no agent session in input — `path resume` only works on harness-derived paths"
+ );
+ }
+ Ok(path)
+}
+
+/// Resolve the user-supplied `` argument into a parsed `Graph`
+/// plus the source harness inferred from its single inline path (if
+/// any). See spec § "Input resolution" for the order.
+pub(crate) fn resolve_input(
+ args: &ResumeArgs,
+) -> Result<(Graph, Option<crate::cmd_share::Harness>)> {
+ let raw = args.input.as_str();
+
+ enum Shape<'a> {
+ PathbaseUrl(&'a str),
+ PathbaseShorthand(&'a str),
+ FilePath(&'a str),
+ CacheId(&'a str),
+ }
+
+ let shape = if raw.starts_with("http://") || raw.starts_with("https://") {
+ Shape::PathbaseUrl(raw)
+ } else if looks_like_pathbase_shorthand(raw) {
+ Shape::PathbaseShorthand(raw)
+ } else if std::path::Path::new(raw).is_file() {
+ Shape::FilePath(raw)
+ } else {
+ Shape::CacheId(raw)
+ };
+
+ let graph: Graph = match shape {
+ Shape::PathbaseUrl(u) | Shape::PathbaseShorthand(u) => {
+ // Probe the local cache before going to the network. The cache
+ // id is purely a function of (owner, repo, slug), so we can
+ // compute it without fetching. `--force` skips the probe and
+ // re-fetches; `--no-cache` skips both the probe AND the post-
+ // fetch write (still useful for ephemeral environments).
+ let cache_id = crate::cmd_import::pathbase_cache_id_of(u, args.url.as_deref())?;
+ if !args.force
+ && !args.no_cache
+ && let Ok(cache_path) = crate::cmd_cache::cache_path(&cache_id)
+ && cache_path.exists()
+ {
+ let json = std::fs::read_to_string(&cache_path)
+ .with_context(|| format!("read {}", cache_path.display()))?;
+ eprintln!("Resolved {} → {} (cached)", raw, cache_id);
+ Graph::from_json(&json)
+ .map_err(|e| anyhow::anyhow!("cached toolpath document is invalid: {}", e))?
+ } else {
+ let derived = crate::cmd_import::pathbase_fetch_to_doc(u, args.url.as_deref())?;
+ if !args.no_cache {
+ // force=true here: we either short-circuited above
+ // (cache miss) or the user explicitly passed --force,
+ // and either way we want the new bytes to land.
+ crate::cmd_cache::write_cached(&derived.cache_id, &derived.doc, true)?;
+ eprintln!("Resolved {} → {}", raw, derived.cache_id);
+ }
+ derived.doc
+ }
+ }
+ Shape::FilePath(p) => {
+ let json = std::fs::read_to_string(p).with_context(|| format!("read {}", p))?;
+ Graph::from_json(&json)
+ .map_err(|e| anyhow::anyhow!("not a valid toolpath document: {}", e))?
+ }
+ Shape::CacheId(id) => {
+ let file = crate::cmd_cache::cache_ref(id).map_err(|e| {
+ anyhow::anyhow!(
+ "couldn't resolve `{}` as a URL, file path, or cache id: {}",
+ raw,
+ e
+ )
+ })?;
+ let json = std::fs::read_to_string(&file)
+ .with_context(|| format!("read {}", file.display()))?;
+ Graph::from_json(&json)
+ .map_err(|e| anyhow::anyhow!("not a valid toolpath document: {}", e))?
+ }
+ };
+
+ let harness = graph.single_path().and_then(infer_source_harness);
+ Ok((graph, harness))
+}
+
+/// Probe `$PATH` (or `path_override`, for tests) for a given binary name.
+/// Cross-platform: on Windows, also tries `.exe`.
+pub(crate) fn binary_on_path(name: &str, path_override: Option<&std::path::Path>) -> bool {
+    let dirs: Vec<std::path::PathBuf> = match path_override {
+ Some(p) => vec![p.to_path_buf()],
+ None => std::env::var_os("PATH")
+ .map(|p| std::env::split_paths(&p).collect())
+ .unwrap_or_default(),
+ };
+ for d in dirs {
+ let candidate = d.join(name);
+ if candidate.is_file() {
+ return true;
+ }
+ #[cfg(windows)]
+ {
+ let exe = d.join(format!("{name}.exe"));
+ if exe.is_file() {
+ return true;
+ }
+ }
+ }
+ false
+}
+
+const ALL_HARNESSES: &[crate::cmd_share::Harness] = &[
+ crate::cmd_share::Harness::Claude,
+ crate::cmd_share::Harness::Gemini,
+ crate::cmd_share::Harness::Codex,
+ crate::cmd_share::Harness::Opencode,
+ crate::cmd_share::Harness::Pi,
+];
+
+/// Decide which harness to resume in.
+///
+/// - If `arg` is `Some`, validate the named harness is on PATH and return it.
+/// - Otherwise, enumerate installed harnesses and launch the fzf picker.
+/// `source` is used to label the source row in the picker UI.
+///
+/// `path_override` is `None` in production; tests pass `Some(dir)` to fake `$PATH`.
+pub(crate) fn pick_harness(
+    arg: Option<HarnessArg>,
+    source: Option<crate::cmd_share::Harness>,
+ path_override: Option<&std::path::Path>,
+) -> Result<crate::cmd_share::Harness> {
+ use crate::cmd_share::Harness;
+
+ if let Some(a) = arg {
+ let h = Harness::from_arg(a);
+ if !binary_on_path(h.name(), path_override) {
+ anyhow::bail!(
+ "harness `{}` isn't on PATH; install it or pick another with `--harness`",
+ h.name()
+ );
+ }
+ return Ok(h);
+ }
+
+    let installed: Vec<Harness> = ALL_HARNESSES
+ .iter()
+ .copied()
+ .filter(|h| binary_on_path(h.name(), path_override))
+ .collect();
+
+ if installed.is_empty() {
+ anyhow::bail!(
+ "no installed harnesses found on PATH; install one of: claude, gemini, codex, opencode, pi"
+ );
+ }
+
+ interactive_pick(&installed, source)
+}
+
+fn interactive_pick(
+ installed: &[crate::cmd_share::Harness],
+    source: Option<crate::cmd_share::Harness>,
+) -> Result<crate::cmd_share::Harness> {
+ if !crate::fzf::available() {
+ anyhow::bail!(
+            "interactive picker requires `fzf` on PATH and a TTY; pass `--harness <name>` or rerun in a terminal"
+ );
+ }
+    let mut lines: Vec<String> = Vec::with_capacity(installed.len());
+ for h in installed {
+ let suffix = if Some(*h) == source { " (source)" } else { "" };
+ lines.push(format!("{}{}", h.symbol(), suffix));
+ }
+
+ let header = match source {
+ Some(s) => format!("pick a harness to resume in (source: {})", s.name()),
+ None => "pick a harness to resume in".to_string(),
+ };
+
+ let opts = crate::fzf::PickOptions {
+ with_nth: "1..",
+ header: Some(&header),
+ ..Default::default()
+ };
+ let selected = match crate::fzf::pick(&lines, &opts)
+ .map_err(|e| anyhow::anyhow!("fzf failed: {}", e))?
+ {
+ crate::fzf::PickResult::Selected(rows) => rows.into_iter().next().unwrap_or_default(),
+ crate::fzf::PickResult::Cancelled => std::process::exit(130),
+ crate::fzf::PickResult::NoMatch => {
+ anyhow::bail!("fzf returned no match — picker UI was empty?");
+ }
+ };
+
+ for h in installed {
+ if selected.starts_with(h.symbol()) {
+ return Ok(*h);
+ }
+ }
+ anyhow::bail!("picker returned an unrecognized row: {selected}")
+}
+
+/// Static map from harness to resume-argv shape. Lives here because
+/// it's a per-harness CLI convention, not a projection concern.
+pub(crate) fn argv_for(harness: crate::cmd_share::Harness, session_id: &str) -> Vec<String> {
+ use crate::cmd_share::Harness;
+ match harness {
+ Harness::Claude => vec!["-r".into(), session_id.into()],
+ Harness::Gemini => vec!["--resume".into(), session_id.into()],
+ Harness::Codex => vec!["resume".into(), session_id.into()],
+ Harness::Opencode => vec!["--session".into(), session_id.into()],
+ Harness::Pi => vec!["--session".into(), session_id.into()],
+ }
+}
+
+/// Project a Path into the chosen harness's on-disk layout under `cwd`,
+/// returning the projected session id.
+pub(crate) fn project_into_harness(
+ path: &TPath,
+ harness: crate::cmd_share::Harness,
+ cwd: &std::path::Path,
+) -> Result<String> {
+ use crate::cmd_share::Harness;
+ match harness {
+ Harness::Claude => crate::cmd_export::project_claude(path, cwd),
+ Harness::Gemini => crate::cmd_export::project_gemini(path, cwd),
+ Harness::Codex => crate::cmd_export::project_codex(path, cwd),
+ Harness::Opencode => crate::cmd_export::project_opencode(path, cwd),
+ Harness::Pi => crate::cmd_export::project_pi(path, cwd),
+ }
+}
+
+/// What `exec_harness` saw (for tests).
+#[derive(Debug, Clone, Default)]
+pub struct CapturedExec {
+ pub binary: String,
+    pub args: Vec<String>,
+ pub cwd: std::path::PathBuf,
+}
+
+/// Pluggable exec backend. Production uses `RealExec` (`execvp` on
+/// Unix, spawn-and-wait on Windows). Tests use `RecordingExec`.
+pub trait ExecStrategy {
+ fn exec(&self, binary: &str, args: &[String], cwd: &std::path::Path) -> Result<()>;
+}
+
+/// Production implementation. On Unix this never returns on success
+/// (the current process is replaced); on Windows it spawns the child,
+/// waits, and propagates the exit code.
+pub struct RealExec;
+
+impl ExecStrategy for RealExec {
+ fn exec(&self, binary: &str, args: &[String], cwd: &std::path::Path) -> Result<()> {
+ let mut cmd = std::process::Command::new(binary);
+ cmd.args(args);
+ cmd.current_dir(cwd);
+
+ eprintln!(
+ "Resuming: {} {} (cwd: {})",
+ binary,
+ args.join(" "),
+ cwd.display()
+ );
+
+ #[cfg(unix)]
+ {
+ use std::os::unix::process::CommandExt;
+ // exec only returns if it fails.
+ let err = cmd.exec();
+ anyhow::bail!(
+ "couldn't exec `{}`: {}. Recipe: {} {} (run from {})",
+ binary,
+ err,
+ binary,
+ args.join(" "),
+ cwd.display()
+ );
+ }
+ #[cfg(not(unix))]
+ {
+ let status = cmd.spawn()
+ .with_context(|| format!("spawn {}", binary))?
+ .wait()
+ .with_context(|| format!("wait for {}", binary))?;
+ std::process::exit(status.code().unwrap_or(1));
+ }
+ }
+}
+
+/// Recording strategy for tests. `captured()` returns the most recent
+/// invocation.
+#[derive(Default)]
+pub struct RecordingExec {
+    inner: std::sync::Mutex<CapturedExec>,
+}
+
+impl RecordingExec {
+ pub fn captured(&self) -> CapturedExec {
+ self.inner.lock().unwrap().clone()
+ }
+}
+
+impl ExecStrategy for RecordingExec {
+ fn exec(&self, binary: &str, args: &[String], cwd: &std::path::Path) -> Result<()> {
+ let mut g = self.inner.lock().unwrap();
+ *g = CapturedExec {
+ binary: binary.to_string(),
+ args: args.to_vec(),
+ cwd: cwd.to_path_buf(),
+ };
+ Ok(())
+ }
+}
+
+pub(crate) fn exec_harness(
+ binary: &str,
+ args: &[String],
+ cwd: &std::path::Path,
+ strategy: &dyn ExecStrategy,
+) -> Result<()> {
+ strategy.exec(binary, args, cwd)
+}
+
+fn looks_like_pathbase_shorthand(s: &str) -> bool {
+ // Three non-empty slash-separated segments, none containing whitespace
+ // or starting with a dot/slash (which would indicate a relative or
+ // absolute path).
+ if s.starts_with('.') || s.starts_with('/') {
+ return false;
+ }
+ let segs: Vec<&str> = s.split('/').collect();
+ segs.len() == 3 && segs.iter().all(|s| !s.is_empty() && !s.contains(char::is_whitespace))
+}
+
+#[cfg(test)]
+mod tests {
+ use super::*;
+
+ #[test]
+ fn run_with_strategy_records_invocation_for_file_input_with_explicit_harness() {
+ let _env = crate::config::TEST_ENV_LOCK.lock().unwrap_or_else(|e| e.into_inner());
+ let _home = scoped_home_for_resume();
+ let _path_guard = ScopedPathForResume::with_binaries(&["claude"]);
+ let cwd = tempfile::tempdir().unwrap();
+ let doc_file = cwd.path().join("doc.json");
+
+ // Build a minimal path with a conversation.append step that
+ // project_claude can consume, reusing the existing helper.
+ let mut path = make_convo_path_for_resume("claude-code://resume-test-session");
+ // Overwrite the actor to agent:claude-code so run_with_strategy can
+ // pass the ensure_path_with_agent check.
+ path.steps[0].step.actor = "agent:claude-code".to_string();
+
+ let graph = toolpath::v1::Graph::from_path(path);
+ std::fs::write(&doc_file, graph.to_json().unwrap()).unwrap();
+
+ let args = ResumeArgs {
+ input: doc_file.to_string_lossy().to_string(),
+ cwd: Some(cwd.path().to_path_buf()),
+ harness: Some(HarnessArg::Claude),
+ no_cache: false, force: false, url: None,
+ };
+
+ let recorder = RecordingExec::default();
+ run_with_strategy(args, &recorder).unwrap();
+
+ let cap = recorder.captured();
+ assert_eq!(cap.binary, "claude");
+ assert_eq!(cap.args[0], "-r");
+ assert_eq!(cap.cwd, std::fs::canonicalize(cwd.path()).unwrap());
+ }
+
+ use crate::cmd_share::Harness;
+ use toolpath::v1::{Graph, PathMeta, PathOrRef};
+
+ fn make_step_with_actor(id: &str, actor: &str) -> toolpath::v1::Step {
+ toolpath::v1::Step::new(id, actor, "2026-01-01T00:00:00Z")
+ .with_raw_change("src/main.rs", "@@ -1 +1 @@\n-old\n+new")
+ }
+
+ fn make_path_with_actor(actor: &str) -> toolpath::v1::Path {
+ use toolpath::v1::{Path, PathIdentity};
+ let step = make_step_with_actor("s1", actor);
+ Path {
+ path: PathIdentity {
+ id: "p1".to_string(),
+ base: None,
+ head: "s1".to_string(),
+ graph_ref: None,
+ },
+ steps: vec![step],
+ meta: None,
+ }
+ }
+
+ #[test]
+ fn infer_source_harness_meta_source_wins() {
+ let mut path = make_path_with_actor("agent:codex");
+ path.meta = Some(PathMeta {
+ source: Some("claude-code".to_string()),
+ ..Default::default()
+ });
+ assert_eq!(infer_source_harness(&path), Some(Harness::Claude));
+ }
+
+ #[test]
+ fn infer_source_harness_meta_source_unknown_falls_through_to_actor() {
+ let mut path = make_path_with_actor("agent:gemini-cli");
+ path.meta = Some(PathMeta {
+ source: Some("something-bespoke".to_string()),
+ ..Default::default()
+ });
+ assert_eq!(infer_source_harness(&path), Some(Harness::Gemini));
+ }
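+
+    // The actor sniff in `infer_source_harness` also matches the bare
+    // `agent:gemini` prefix (no `-cli` suffix); a small check of that
+    // fallback arm, mirroring the sibling sniff tests below.
+    #[test]
+    fn infer_source_harness_actor_sniff_bare_gemini_prefix() {
+        let path = make_path_with_actor("agent:gemini");
+        assert_eq!(infer_source_harness(&path), Some(Harness::Gemini));
+    }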
+
+ #[test]
+ fn infer_source_harness_actor_sniff_codex() {
+ let path = make_path_with_actor("agent:codex");
+ assert_eq!(infer_source_harness(&path), Some(Harness::Codex));
+ }
+
+ #[test]
+ fn infer_source_harness_actor_sniff_opencode() {
+ let path = make_path_with_actor("agent:opencode");
+ assert_eq!(infer_source_harness(&path), Some(Harness::Opencode));
+ }
+
+ #[test]
+ fn infer_source_harness_actor_sniff_pi() {
+ let path = make_path_with_actor("agent:pi");
+ assert_eq!(infer_source_harness(&path), Some(Harness::Pi));
+ }
+
+ #[test]
+ fn infer_source_harness_returns_none_when_no_signal() {
+ let path = make_path_with_actor("human:alex");
+ assert_eq!(infer_source_harness(&path), None);
+ }
+
+ #[test]
+ fn ensure_path_with_agent_accepts_single_path_with_agent_actor() {
+ let g = Graph::from_path(make_path_with_actor("agent:claude-code"));
+ assert!(ensure_path_with_agent(&g).is_ok());
+ }
+
+ #[test]
+ fn ensure_path_with_agent_rejects_empty_graph() {
+ let mut g = Graph::from_path(make_path_with_actor("agent:claude-code"));
+ g.paths.clear();
+ let err = ensure_path_with_agent(&g).unwrap_err();
+ assert!(err.to_string().contains("expected"));
+ assert!(err.to_string().contains("empty"));
+ }
+
+ #[test]
+ fn ensure_path_with_agent_rejects_multi_path_graph() {
+ let mut g = Graph::from_path(make_path_with_actor("agent:claude-code"));
+ g.paths
+ .push(PathOrRef::Path(Box::new(make_path_with_actor("agent:claude-code"))));
+ let err = ensure_path_with_agent(&g).unwrap_err();
+ let s = err.to_string();
+ assert!(s.contains("single `Path`"), "actual: {s}");
+ assert!(s.contains("2 paths"), "actual: {s}");
+ }
+
+ #[test]
+ fn ensure_path_with_agent_rejects_agentless_path() {
+ let g = Graph::from_path(make_path_with_actor("human:alex"));
+ let err = ensure_path_with_agent(&g).unwrap_err();
+ assert!(err.to_string().contains("no agent session"));
+ }
+
+ #[test]
+ fn ensure_path_with_agent_rejects_path_ref_only_graph() {
+ use toolpath::v1::PathRef;
+ let mut g = Graph::from_path(make_path_with_actor("agent:claude-code"));
+ g.paths = vec![PathOrRef::Ref(PathRef {
+ ref_url: "$ref://something".into(),
+ })];
+ let err = ensure_path_with_agent(&g).unwrap_err();
+ assert!(err.to_string().contains("inline `Path`"), "actual: {}", err);
+ }
+
+ #[test]
+ fn resolve_input_file_path() {
+ let tmp = tempfile::tempdir().unwrap();
+ let p = tmp.path().join("doc.json");
+ let graph = toolpath::v1::Graph::from_path(make_path_with_actor("agent:claude-code"));
+ std::fs::write(&p, graph.to_json().unwrap()).unwrap();
+
+ let args = ResumeArgs {
+ input: p.to_string_lossy().to_string(),
+ cwd: None,
+ harness: None,
+ no_cache: false,
+ force: false,
+ url: None,
+ };
+ let (g, harness) = resolve_input(&args).unwrap();
+ let _path = ensure_path_with_agent(&g).unwrap();
+ assert_eq!(harness, Some(Harness::Claude));
+ }
+
+ #[test]
+ fn resolve_input_url_dispatches_to_pathbase_fetch() {
+ let _env = crate::config::TEST_ENV_LOCK.lock().unwrap_or_else(|e| e.into_inner());
+ use crate::cmd_pathbase::tests::MockServer;
+ let body = {
+ let mut path = make_path_with_actor("agent:codex");
+ path.meta = Some(toolpath::v1::PathMeta {
+ source: Some("codex".to_string()),
+ ..Default::default()
+ });
+ toolpath::v1::Graph::from_path(path).to_json().unwrap()
+ };
+ // MockServer::start requires &'static str — leak the body to satisfy this.
+ let body_static: &'static str = Box::leak(body.into_boxed_str());
+ let server = MockServer::start("HTTP/1.1 200 OK", body_static);
+
+ let args = ResumeArgs {
+ input: format!("{}/alex/pathstash/p", server.base()),
+ cwd: None,
+ harness: None,
+ no_cache: true, // skip cache write in tests
+ force: false,
+ url: None,
+ };
+ let (g, harness) = resolve_input(&args).unwrap();
+ let _ = ensure_path_with_agent(&g).unwrap();
+ assert_eq!(harness, Some(Harness::Codex));
+ }
+
+ #[test]
+ fn resolve_input_url_uses_cache_on_hit_without_refetching() {
+ // Regression for the second-invocation cache-hit error: re-running
+    // `path resume <url>` should silently reuse the cached doc instead
+ // of erroring. We seed the cache with a known-good doc, point the
+ // input at a 500-erroring mock server (so any network round-trip
+ // would surface as an error), and confirm resolve_input still
+ // returns the cached graph.
+ let _env = crate::config::TEST_ENV_LOCK.lock().unwrap_or_else(|e| e.into_inner());
+
+ // Pin TOOLPATH_CONFIG_DIR to a tempdir so we don't pollute the
+ // user's real cache.
+ let cfg_dir = tempfile::tempdir().unwrap();
+ let prev_cfg = std::env::var_os("TOOLPATH_CONFIG_DIR");
+ unsafe {
+ std::env::set_var("TOOLPATH_CONFIG_DIR", cfg_dir.path());
+ }
+
+ // Seed the cache with a codex-source graph.
+ let cache_id = "pathbase-alex-pathstash-cached-fixture";
+ let documents = cfg_dir.path().join("documents");
+ std::fs::create_dir_all(&documents).unwrap();
+ let cached_graph = {
+ let mut path = make_path_with_actor("agent:codex");
+ path.meta = Some(toolpath::v1::PathMeta {
+ source: Some("codex".to_string()),
+ ..Default::default()
+ });
+ toolpath::v1::Graph::from_path(path)
+ };
+ std::fs::write(
+ documents.join(format!("{cache_id}.json")),
+ cached_graph.to_json().unwrap(),
+ )
+ .unwrap();
+
+ // Mock server that 500s any request — proves we never call out.
+ use crate::cmd_pathbase::tests::MockServer;
+ let server = MockServer::start("HTTP/1.1 500 Internal Server Error", "boom");
+
+ let args = ResumeArgs {
+ input: format!("{}/alex/pathstash/cached-fixture", server.base()),
+ cwd: None,
+ harness: None,
+ no_cache: false,
+ force: false,
+ url: None,
+ };
+ let result = resolve_input(&args);
+
+ // Restore env before asserting so a panic doesn't poison sibling tests.
+ unsafe {
+ match prev_cfg {
+ Some(v) => std::env::set_var("TOOLPATH_CONFIG_DIR", v),
+ None => std::env::remove_var("TOOLPATH_CONFIG_DIR"),
+ }
+ }
+
+ let (g, harness) = result.expect("resolve_input should reuse cache without refetching");
+ let _ = ensure_path_with_agent(&g).unwrap();
+ assert_eq!(harness, Some(Harness::Codex));
+ }
+
+ #[test]
+ fn resolve_input_unresolvable_errors_clearly() {
+ let _env = crate::config::TEST_ENV_LOCK.lock().unwrap_or_else(|e| e.into_inner());
+ let args = ResumeArgs {
+ input: "definitely/not/a/real/cache/id".to_string(),
+ cwd: None,
+ harness: None,
+ no_cache: false,
+ force: false,
+ url: None,
+ };
+ let err = resolve_input(&args).unwrap_err();
+ let s = err.to_string();
+ assert!(s.contains("couldn't resolve"), "actual: {s}");
+ }
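+
+    // `looks_like_pathbase_shorthand` drives the `Shape` dispatch in
+    // `resolve_input` but has no direct coverage; a sketch of the
+    // accept/reject boundary cases implied by its definition.
+    #[test]
+    fn looks_like_pathbase_shorthand_boundaries() {
+        assert!(looks_like_pathbase_shorthand("alex/pathstash/p"));
+        assert!(!looks_like_pathbase_shorthand("./rel/ative/path"));
+        assert!(!looks_like_pathbase_shorthand("/abs/olute/path"));
+        assert!(!looks_like_pathbase_shorthand("only/two"));
+        assert!(!looks_like_pathbase_shorthand("has space/in/segment"));
+        assert!(!looks_like_pathbase_shorthand("empty//segment"));
+    }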
+
+ fn fake_path_with(binaries: &[&str]) -> tempfile::TempDir {
+ let td = tempfile::tempdir().unwrap();
+ for b in binaries {
+ let p = td.path().join(b);
+ std::fs::write(&p, "#!/bin/sh\nexit 0\n").unwrap();
+ #[cfg(unix)]
+ {
+ use std::os::unix::fs::PermissionsExt;
+ let mut perm = std::fs::metadata(&p).unwrap().permissions();
+ perm.set_mode(0o755);
+ std::fs::set_permissions(&p, perm).unwrap();
+ }
+ }
+ td
+ }
+
+ #[test]
+ fn binary_on_path_finds_present_binary() {
+ let td = fake_path_with(&["claude"]);
+ assert!(binary_on_path("claude", Some(td.path())));
+ assert!(!binary_on_path("gemini", Some(td.path())));
+ }
+
+ #[test]
+ fn pick_harness_explicit_arg_validates_path() {
+ let td = fake_path_with(&["claude"]);
+ let result = pick_harness(Some(HarnessArg::Claude), None, Some(td.path()));
+ assert_eq!(result.unwrap(), Harness::Claude);
+
+ let err = pick_harness(Some(HarnessArg::Gemini), None, Some(td.path())).unwrap_err();
+ assert!(err.to_string().contains("`gemini` isn't on PATH"));
+ }
+
+ #[test]
+ fn pick_harness_zero_installed_errors() {
+ let td = fake_path_with(&[]);
+ let err = pick_harness(None, Some(Harness::Claude), Some(td.path())).unwrap_err();
+ assert!(
+ err.to_string().contains("no installed harnesses")
+ || err.to_string().contains("no harnesses on PATH"),
+ "actual: {}",
+ err
+ );
+ }
+
+ #[test]
+ fn argv_for_returns_harness_specific_shape() {
+ assert_eq!(argv_for(Harness::Claude, "abc"), vec!["-r".to_string(), "abc".to_string()]);
+ assert_eq!(argv_for(Harness::Gemini, "abc"), vec!["--resume".to_string(), "abc".to_string()]);
+ assert_eq!(argv_for(Harness::Codex, "abc"), vec!["resume".to_string(), "abc".to_string()]);
+ assert_eq!(argv_for(Harness::Opencode, "abc"), vec!["--session".to_string(), "abc".to_string()]);
+ assert_eq!(argv_for(Harness::Pi, "abc"), vec!["--session".to_string(), "abc".to_string()]);
+ }
+
+ #[test]
+ fn project_into_harness_claude_round_trip() {
+ let _env = crate::config::TEST_ENV_LOCK.lock().unwrap_or_else(|e| e.into_inner());
+ let _home = scoped_home_for_resume();
+ let cwd = tempfile::tempdir().unwrap();
+ let path = make_convo_path_for_resume("claude-code://resume-test-session");
+
+ let session_id = project_into_harness(&path, Harness::Claude, cwd.path()).unwrap();
+ assert!(!session_id.is_empty());
+ }
+
+ /// Build a minimal `toolpath::v1::Path` with a single `conversation.append`
+ /// step using the given `artifact_key` (e.g. `"claude-code://my-session"`).
+ /// Required for projectors that extract the session id from the artifact key.
+ fn make_convo_path_for_resume(artifact_key: &str) -> toolpath::v1::Path {
+ use std::collections::HashMap;
+ let mut extra = HashMap::new();
+ extra.insert("role".to_string(), serde_json::json!("user"));
+ extra.insert("text".to_string(), serde_json::json!("hello"));
+ let step = toolpath::v1::Step {
+ step: toolpath::v1::StepIdentity {
+ id: "s1".to_string(),
+ parents: vec![],
+ actor: "human:test".to_string(),
+ timestamp: "2026-01-01T00:00:00Z".to_string(),
+ },
+ change: {
+ let mut m = HashMap::new();
+ m.insert(
+ artifact_key.to_string(),
+ toolpath::v1::ArtifactChange {
+ raw: None,
+ structural: Some(toolpath::v1::StructuralChange {
+ change_type: "conversation.append".to_string(),
+ extra,
+ }),
+ },
+ );
+ m
+ },
+ meta: None,
+ };
+ toolpath::v1::Path {
+ path: toolpath::v1::PathIdentity {
+ id: "test-path".to_string(),
+ base: None,
+ head: "s1".to_string(),
+ graph_ref: None,
+ },
+ steps: vec![step],
+ meta: None,
+ }
+ }
+
+ fn scoped_home_for_resume() -> ScopedHomeForResume {
+ ScopedHomeForResume::new()
+ }
+
+ struct ScopedPathForResume {
+ _bin_dir: tempfile::TempDir,
+        prev: Option<std::ffi::OsString>,
+ }
+
+ impl ScopedPathForResume {
+ /// Prepends a tempdir containing the named binaries to `PATH` for
+ /// the guard's lifetime.
+ fn with_binaries(binaries: &[&str]) -> Self {
+ let bin_dir = fake_path_with(binaries);
+ let prev = std::env::var_os("PATH");
+ let new_path = std::env::join_paths(
+ std::iter::once(bin_dir.path().to_path_buf())
+ .chain(std::env::split_paths(&prev.clone().unwrap_or_default())),
+ )
+ .unwrap();
+ unsafe { std::env::set_var("PATH", new_path); }
+ Self { _bin_dir: bin_dir, prev }
+ }
+ }
+
+ impl Drop for ScopedPathForResume {
+ fn drop(&mut self) {
+ unsafe {
+ match &self.prev {
+ Some(v) => std::env::set_var("PATH", v),
+ None => std::env::remove_var("PATH"),
+ }
+ }
+ }
+ }
+
+    struct ScopedHomeForResume { _td: tempfile::TempDir, prev: Option<std::ffi::OsString> }
+
+ impl ScopedHomeForResume {
+ fn new() -> Self {
+ let td = tempfile::tempdir().unwrap();
+ let prev = std::env::var_os("HOME");
+ unsafe { std::env::set_var("HOME", td.path()); }
+ Self { _td: td, prev }
+ }
+ }
+
+ impl Drop for ScopedHomeForResume {
+ fn drop(&mut self) {
+ unsafe {
+ match &self.prev {
+ Some(v) => std::env::set_var("HOME", v),
+ None => std::env::remove_var("HOME"),
+ }
+ }
+ }
+ }
+
+ #[test]
+ fn exec_strategy_recording_captures_invocation() {
+ let recorder = RecordingExec::default();
+ let strategy: &dyn ExecStrategy = &recorder;
+ exec_harness("claude", &["-r".into(), "abc123".into()], std::path::Path::new("/tmp/x"), strategy)
+ .unwrap();
+
+ let captured = recorder.captured();
+ assert_eq!(captured.binary, "claude");
+ assert_eq!(captured.args, vec!["-r".to_string(), "abc123".to_string()]);
+ assert_eq!(captured.cwd, std::path::PathBuf::from("/tmp/x"));
+ }
+}
diff --git a/crates/path-cli/src/lib.rs b/crates/path-cli/src/lib.rs
index c7f53af..63b840d 100644
--- a/crates/path-cli/src/lib.rs
+++ b/crates/path-cli/src/lib.rs
@@ -14,6 +14,8 @@ mod cmd_project;
mod cmd_query;
mod cmd_render;
#[cfg(not(target_os = "emscripten"))]
+pub mod cmd_resume;
+#[cfg(not(target_os = "emscripten"))]
mod cmd_share;
#[cfg(not(target_os = "emscripten"))]
mod cmd_show;
@@ -122,6 +124,13 @@ enum Commands {
#[command(flatten)]
args: cmd_share::ShareArgs,
},
+ /// Resume an agent session into the chosen harness, projecting the
+ /// document and exec'ing the harness's resume command.
+ #[cfg(not(target_os = "emscripten"))]
+ Resume {
+ #[command(flatten)]
+ args: cmd_resume::ResumeArgs,
+ },
// ── Deprecated aliases ────────────────────────────────────────────
#[command(hide = true, about = "[deprecated] Use `path import`")]
@@ -168,6 +177,8 @@ pub fn run() -> Result<()> {
Commands::Auth { op } => cmd_auth::run(op),
#[cfg(not(target_os = "emscripten"))]
Commands::Share { args } => cmd_share::run(args),
+ #[cfg(not(target_os = "emscripten"))]
+ Commands::Resume { args } => cmd_resume::run(args),
Commands::Derive { source } => cmd_derive::run(source, cli.pretty),
Commands::Incept { args } => cmd_incept::run(args),
diff --git a/crates/path-cli/tests/resume.rs b/crates/path-cli/tests/resume.rs
new file mode 100644
index 0000000..a053f6a
--- /dev/null
+++ b/crates/path-cli/tests/resume.rs
@@ -0,0 +1,300 @@
+//! Integration tests for `path resume`.
+//!
+//! Tests dispatch through `path_cli::cmd_resume::run_with_strategy`
+//! with a `RecordingExec` strategy so the would-be `execvp` becomes a
+//! captured `(binary, args, cwd)` tuple. Each test isolates `$HOME`,
+//! `$TOOLPATH_CONFIG_DIR`, and `$PATH` via RAII guards under a shared
+//! lock.
+
+#![cfg(not(target_os = "emscripten"))]
+
+use path_cli::cmd_resume::{run_with_strategy, HarnessArg, RecordingExec, ResumeArgs};
+
+mod support;
+use support::*;
+
+// ── Per-harness positive cases ──────────────────────────────────────
+
+#[test]
+fn file_input_explicit_claude_projects_and_records_exec() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::with_binary("claude");
+ let cwd = tempfile::tempdir().unwrap();
+
+ let path = make_convo_path("agent:claude-code", "claude-code://resume-claude-int");
+ let doc_file = write_path_to_temp(cwd.path(), path);
+
+ let recorder = RecordingExec::default();
+ run_with_strategy(args_explicit(doc_file, cwd.path(), HarnessArg::Claude), &recorder).unwrap();
+
+ let cap = recorder.captured();
+ assert_eq!(cap.binary, "claude");
+ assert_eq!(cap.args[0], "-r");
+ assert!(!cap.args[1].is_empty(), "session id should be non-empty");
+ assert_eq!(cap.cwd, std::fs::canonicalize(cwd.path()).unwrap());
+
+ // Side effect: a JSONL was written under HOME/.claude/projects.
+ let projects = std::env::var_os("HOME")
+ .map(|h| std::path::PathBuf::from(h).join(".claude/projects"))
+ .unwrap();
+ assert!(projects.exists(), "claude projects dir not created");
+ assert!(
+ dir_contains_file_with_ext(&projects, "jsonl"),
+ "no JSONL written under claude projects"
+ );
+}
+
+#[test]
+fn file_input_explicit_gemini_projects_and_records_exec() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::with_binary("gemini");
+ let cwd = tempfile::tempdir().unwrap();
+
+ let path = make_convo_path("agent:gemini-cli", "gemini-cli://resume-gemini-int");
+ let doc_file = write_path_to_temp(cwd.path(), path);
+
+ let recorder = RecordingExec::default();
+ run_with_strategy(args_explicit(doc_file, cwd.path(), HarnessArg::Gemini), &recorder).unwrap();
+
+ let cap = recorder.captured();
+ assert_eq!(cap.binary, "gemini");
+ assert_eq!(cap.args[0], "--resume");
+ assert!(!cap.args[1].is_empty());
+
+ let tmp_root = std::env::var_os("HOME")
+ .map(|h| std::path::PathBuf::from(h).join(".gemini/tmp"))
+ .unwrap();
+ assert!(tmp_root.exists(), "gemini tmp dir not created");
+}
+
+#[test]
+fn file_input_explicit_codex_projects_and_records_exec() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::with_binary("codex");
+ let cwd = tempfile::tempdir().unwrap();
+
+ let path = make_convo_path("agent:codex", "codex://resume-codex-int");
+ let doc_file = write_path_to_temp(cwd.path(), path);
+
+ let recorder = RecordingExec::default();
+ run_with_strategy(args_explicit(doc_file, cwd.path(), HarnessArg::Codex), &recorder).unwrap();
+
+ let cap = recorder.captured();
+ assert_eq!(cap.binary, "codex");
+ assert_eq!(cap.args[0], "resume");
+ assert!(!cap.args[1].is_empty());
+
+ let sessions = std::env::var_os("HOME")
+ .map(|h| std::path::PathBuf::from(h).join(".codex/sessions"))
+ .unwrap();
+ assert!(sessions.exists(), "codex sessions dir not created");
+}
+
+#[test]
+fn file_input_explicit_opencode_projects_and_records_exec() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::with_binary("opencode");
+ let cwd = tempfile::tempdir().unwrap();
+
+ // Pre-create the opencode db with the canonical schema. (Schema DDL
+ // copied from cmd_export's existing opencode test until/unless
+ // toolpath-opencode exposes a public bootstrap helper.)
+ let resolver = toolpath_opencode::PathResolver::new();
+ let db_path = resolver.db_path().unwrap();
+ std::fs::create_dir_all(db_path.parent().unwrap()).unwrap();
+ {
+ let conn = rusqlite::Connection::open(&db_path).unwrap();
+ conn.execute_batch(
+ r#"
+ CREATE TABLE project (
+ id text PRIMARY KEY, worktree text NOT NULL, vcs text NOT NULL,
+ name text, time_created integer NOT NULL, time_updated integer NOT NULL,
+ time_initialized integer, sandboxes text NOT NULL, commands text
+ );
+ CREATE TABLE session (
+ id text PRIMARY KEY, project_id text NOT NULL, parent_id text,
+ slug text NOT NULL, directory text NOT NULL, title text NOT NULL,
+ version text NOT NULL, share_url text,
+ summary_additions integer, summary_deletions integer,
+ summary_files integer, summary_diffs text, revert text, permission text,
+ time_created integer NOT NULL, time_updated integer NOT NULL,
+ time_compacting integer, time_archived integer, workspace_id text
+ );
+ CREATE TABLE message (
+ id text PRIMARY KEY, session_id text NOT NULL,
+ time_created integer NOT NULL, time_updated integer NOT NULL,
+ data text NOT NULL
+ );
+ CREATE TABLE part (
+ id text PRIMARY KEY, message_id text NOT NULL, session_id text NOT NULL,
+ time_created integer NOT NULL, time_updated integer NOT NULL,
+ data text NOT NULL
+ );
+ "#,
+ )
+ .unwrap();
+ }
+
+ let path = make_convo_path("agent:opencode", "opencode://ses_resume-opencode-int");
+ let doc_file = write_path_to_temp(cwd.path(), path);
+
+ let recorder = RecordingExec::default();
+ run_with_strategy(
+ args_explicit(doc_file, cwd.path(), HarnessArg::Opencode),
+ &recorder,
+ )
+ .unwrap();
+
+ let cap = recorder.captured();
+ assert_eq!(cap.binary, "opencode");
+ assert_eq!(cap.args[0], "--session");
+ assert!(!cap.args[1].is_empty());
+
+ let conn = rusqlite::Connection::open(&db_path).unwrap();
+ let session_count: i64 = conn
+ .query_row("SELECT COUNT(*) FROM session", [], |r| r.get(0))
+ .unwrap();
+ assert_eq!(session_count, 1, "opencode session row not inserted");
+}
+
+#[test]
+fn file_input_explicit_pi_projects_and_records_exec() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::with_binary("pi");
+ let cwd = tempfile::tempdir().unwrap();
+
+ let path = make_convo_path("agent:pi", "pi://resume-pi-int");
+ let doc_file = write_path_to_temp(cwd.path(), path);
+
+ let recorder = RecordingExec::default();
+ run_with_strategy(args_explicit(doc_file, cwd.path(), HarnessArg::Pi), &recorder).unwrap();
+
+ let cap = recorder.captured();
+ assert_eq!(cap.binary, "pi");
+ assert_eq!(cap.args[0], "--session");
+ assert!(!cap.args[1].is_empty());
+
+ let sessions = std::env::var_os("HOME")
+ .map(|h| std::path::PathBuf::from(h).join(".pi/agent/sessions"))
+ .unwrap();
+ assert!(sessions.exists(), "pi sessions dir not created");
+}
+
+// ── Cache-id input ──────────────────────────────────────────────────
+
+#[test]
+fn cache_id_input_loads_and_projects() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::with_binary("claude");
+ let cwd = tempfile::tempdir().unwrap();
+
+ // Seed a cache entry by writing the graph to
+ // $TOOLPATH_CONFIG_DIR/documents/<cache-id>.json directly.
+ let cache_id = "claude-resume-cache-test";
+ let documents = std::path::PathBuf::from(std::env::var_os("TOOLPATH_CONFIG_DIR").unwrap())
+ .join("documents");
+ std::fs::create_dir_all(&documents).unwrap();
+ let graph = toolpath::v1::Graph::from_path(make_convo_path(
+ "agent:claude-code",
+ "claude-code://resume-cache-int",
+ ));
+ std::fs::write(documents.join(format!("{cache_id}.json")), graph.to_json().unwrap()).unwrap();
+
+ let resume_args = ResumeArgs {
+ input: cache_id.to_string(),
+ cwd: Some(cwd.path().to_path_buf()),
+ harness: Some(HarnessArg::Claude),
+ no_cache: false,
+ force: false,
+ url: None,
+ };
+
+ let recorder = RecordingExec::default();
+ run_with_strategy(resume_args, &recorder).unwrap();
+
+ let cap = recorder.captured();
+ assert_eq!(cap.binary, "claude");
+ assert_eq!(cap.args[0], "-r");
+}
+
+// ── Rejection cases ─────────────────────────────────────────────────
+
+#[test]
+fn multi_path_graph_returns_clear_error() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::with_binary("claude");
+ let cwd = tempfile::tempdir().unwrap();
+
+ let p1 = make_convo_path("agent:claude-code", "claude-code://multi-1");
+ let mut p2 = make_convo_path("agent:claude-code", "claude-code://multi-2");
+ p2.path.id = "p2".into();
+
+ let graph = toolpath::v1::Graph {
+ graph: toolpath::v1::GraphIdentity { id: "g1".into() },
+ paths: vec![
+ toolpath::v1::PathOrRef::Path(Box::new(p1)),
+ toolpath::v1::PathOrRef::Path(Box::new(p2)),
+ ],
+ meta: None,
+ };
+ let doc_file = cwd.path().join("multi.json");
+ std::fs::write(&doc_file, graph.to_json().unwrap()).unwrap();
+
+ let recorder = RecordingExec::default();
+ let err = run_with_strategy(
+ args_explicit(doc_file, cwd.path(), HarnessArg::Claude),
+ &recorder,
+ )
+ .unwrap_err();
+ let s = err.to_string();
+ assert!(s.contains("single `Path`"), "actual: {s}");
+ assert!(s.contains("2 paths"), "actual: {s}");
+}
+
+#[test]
+fn agentless_path_returns_clear_error() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::with_binary("claude");
+ let cwd = tempfile::tempdir().unwrap();
+
+ // human:* actor — should be rejected by ensure_path_with_agent.
+ let path = make_convo_path("human:alex", "claude-code://noop");
+ let doc_file = write_path_to_temp(cwd.path(), path);
+
+ let recorder = RecordingExec::default();
+ let err = run_with_strategy(
+ args_explicit(doc_file, cwd.path(), HarnessArg::Claude),
+ &recorder,
+ )
+ .unwrap_err();
+ assert!(err.to_string().contains("no agent session"));
+}
+
+#[test]
+fn explicit_harness_not_on_path_errors() {
+ let _env = env_lock();
+ let _home = ScopedHome::new();
+ let _path = ScopedPath::empty();
+ let cwd = tempfile::tempdir().unwrap();
+
+ let path = make_convo_path("agent:claude-code", "claude-code://no-binary");
+ let doc_file = write_path_to_temp(cwd.path(), path);
+
+ let recorder = RecordingExec::default();
+ let err = run_with_strategy(
+ args_explicit(doc_file, cwd.path(), HarnessArg::Claude),
+ &recorder,
+ )
+ .unwrap_err();
+ let s = err.to_string();
+ assert!(s.contains("isn't on PATH"), "actual: {s}");
+ assert!(s.contains("claude"), "actual: {s}");
+}
diff --git a/crates/path-cli/tests/support/mod.rs b/crates/path-cli/tests/support/mod.rs
new file mode 100644
index 0000000..2c9d460
--- /dev/null
+++ b/crates/path-cli/tests/support/mod.rs
@@ -0,0 +1,206 @@
+//! Shared helpers for `path resume` integration tests.
+//!
+//! These are NOT integration-test entry points — they're a support
+//! module imported by `tests/resume.rs`. Lives under `tests/` so it
+//! doesn't leak into the production library API.
+
+#![allow(dead_code)]
+
+use std::collections::HashMap;
+use std::ffi::OsString;
+use std::path::{Path, PathBuf};
+use std::sync::{Mutex, OnceLock};
+
+use path_cli::cmd_resume::{HarnessArg, ResumeArgs};
+
+/// Process-wide lock for tests that mutate `$HOME`, `$PATH`, or
+/// `$TOOLPATH_CONFIG_DIR`. Integration tests under `tests/resume.rs`
+/// can't reach the library's internal `crate::config::TEST_ENV_LOCK`,
+/// so we use a separate lock here. Library tests never hold this lock,
+/// but they now save and restore env vars properly (see commit
+/// 23deeb2), so the integration suite remains self-isolating.
+pub fn env_lock() -> std::sync::MutexGuard<'static, ()> {
+ static LOCK: OnceLock<Mutex<()>> = OnceLock::new();
+ LOCK.get_or_init(|| Mutex::new(()))
+ .lock()
+ .unwrap_or_else(|e| e.into_inner())
+}
+
+/// RAII guard that pins `$HOME` and `$TOOLPATH_CONFIG_DIR` to a tempdir.
+pub struct ScopedHome {
+ _td: tempfile::TempDir,
+ prev_home: Option<OsString>,
+ prev_config: Option<OsString>,
+}
+
+impl ScopedHome {
+ pub fn new() -> Self {
+ let td = tempfile::tempdir().unwrap();
+ let prev_home = std::env::var_os("HOME");
+ let prev_config = std::env::var_os("TOOLPATH_CONFIG_DIR");
+ unsafe {
+ std::env::set_var("HOME", td.path());
+ std::env::set_var("TOOLPATH_CONFIG_DIR", td.path().join(".toolpath"));
+ }
+ Self { _td: td, prev_home, prev_config }
+ }
+
+ pub fn home_dir(&self) -> PathBuf {
+ PathBuf::from(self._td.path())
+ }
+}
+
+impl Drop for ScopedHome {
+ fn drop(&mut self) {
+ unsafe {
+ match &self.prev_home {
+ Some(v) => std::env::set_var("HOME", v),
+ None => std::env::remove_var("HOME"),
+ }
+ match &self.prev_config {
+ Some(v) => std::env::set_var("TOOLPATH_CONFIG_DIR", v),
+ None => std::env::remove_var("TOOLPATH_CONFIG_DIR"),
+ }
+ }
+ }
+}
+
+/// RAII guard that prepends a tempdir of fake binaries to `$PATH`.
+pub struct ScopedPath {
+ _td: tempfile::TempDir,
+ prev: Option<OsString>,
+}
+
+impl ScopedPath {
+ pub fn with_binary(name: &str) -> Self {
+ Self::with_binaries(&[name])
+ }
+
+ pub fn with_binaries(names: &[&str]) -> Self {
+ let td = tempfile::tempdir().unwrap();
+ for n in names {
+ let p = td.path().join(n);
+ std::fs::write(&p, "#!/bin/sh\nexit 0\n").unwrap();
+ #[cfg(unix)]
+ {
+ use std::os::unix::fs::PermissionsExt;
+ let mut perm = std::fs::metadata(&p).unwrap().permissions();
+ perm.set_mode(0o755);
+ std::fs::set_permissions(&p, perm).unwrap();
+ }
+ }
+ let prev = std::env::var_os("PATH");
+ let new_path = std::env::join_paths(
+ std::iter::once(td.path().to_path_buf())
+ .chain(std::env::split_paths(&prev.clone().unwrap_or_default())),
+ )
+ .unwrap();
+ unsafe {
+ std::env::set_var("PATH", new_path);
+ }
+ Self { _td: td, prev }
+ }
+
+ pub fn empty() -> Self {
+ let td = tempfile::tempdir().unwrap();
+ let prev = std::env::var_os("PATH");
+ unsafe {
+ std::env::set_var("PATH", td.path());
+ }
+ Self { _td: td, prev }
+ }
+}
+
+impl Drop for ScopedPath {
+ fn drop(&mut self) {
+ unsafe {
+ match &self.prev {
+ Some(v) => std::env::set_var("PATH", v),
+ None => std::env::remove_var("PATH"),
+ }
+ }
+ }
+}
+
+/// Build a minimal `Path` whose single step has the given `actor`
+/// and a `conversation.append` artifact keyed `<harness>://<session-id>`.
+/// The artifact key drives the harness projector's session-id extraction;
+/// the actor satisfies `ensure_path_with_agent`.
+pub fn make_convo_path(actor: &str, artifact_key: &str) -> toolpath::v1::Path {
+ let mut extra = HashMap::new();
+ extra.insert("role".to_string(), serde_json::json!("user"));
+ extra.insert("text".to_string(), serde_json::json!("hello"));
+ let step = toolpath::v1::Step {
+ step: toolpath::v1::StepIdentity {
+ id: "s1".to_string(),
+ parents: vec![],
+ actor: actor.to_string(),
+ timestamp: "2026-01-01T00:00:00Z".to_string(),
+ },
+ change: {
+ let mut m = HashMap::new();
+ m.insert(
+ artifact_key.to_string(),
+ toolpath::v1::ArtifactChange {
+ raw: None,
+ structural: Some(toolpath::v1::StructuralChange {
+ change_type: "conversation.append".to_string(),
+ extra,
+ }),
+ },
+ );
+ m
+ },
+ meta: None,
+ };
+ toolpath::v1::Path {
+ path: toolpath::v1::PathIdentity {
+ id: "p1".to_string(),
+ base: None,
+ head: "s1".to_string(),
+ graph_ref: None,
+ },
+ steps: vec![step],
+ meta: None,
+ }
+}
+
+/// Convenience: write a single-path graph as JSON to `dir/doc.json`.
+pub fn write_path_to_temp(dir: &Path, path: toolpath::v1::Path) -> PathBuf {
+ let graph = toolpath::v1::Graph::from_path(path);
+ let p = dir.join("doc.json");
+ std::fs::write(&p, graph.to_json().unwrap()).unwrap();
+ p
+}
+
+/// Construct `ResumeArgs` for a file-input + explicit-harness test.
+pub fn args_explicit(input: PathBuf, cwd: &Path, harness: HarnessArg) -> ResumeArgs {
+ ResumeArgs {
+ input: input.to_string_lossy().to_string(),
+ cwd: Some(cwd.to_path_buf()),
+ harness: Some(harness),
+ no_cache: false,
+ force: false,
+ url: None,
+ }
+}
+
+/// Recursively walk `root` looking for a file with the given extension.
+pub fn dir_contains_file_with_ext(root: &Path, ext: &str) -> bool {
+ fn walk(p: &Path, ext: &str) -> bool {
+ if !p.exists() {
+ return false;
+ }
+ if p.is_dir() {
+ for e in std::fs::read_dir(p).unwrap() {
+ if walk(&e.unwrap().path(), ext) {
+ return true;
+ }
+ }
+ false
+ } else {
+ p.extension().and_then(|s| s.to_str()) == Some(ext)
+ }
+ }
+ walk(root, ext)
+}
diff --git a/crates/toolpath-cli/Cargo.toml b/crates/toolpath-cli/Cargo.toml
index 1fd0e37..7904a2a 100644
--- a/crates/toolpath-cli/Cargo.toml
+++ b/crates/toolpath-cli/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "toolpath-cli"
-version = "0.8.0"
+version = "0.9.0"
edition = "2024"
license = "Apache-2.0"
repository = "https://github.com/empathic/toolpath"
@@ -14,7 +14,7 @@ name = "path"
path = "src/main.rs"
[dependencies]
-path-cli = { path = "../path-cli", version = "0.8.0" }
+path-cli = { path = "../path-cli", version = "0.9.0" }
anyhow = "1.0"
[workspace]
diff --git a/crates/toolpath-gemini/src/project.rs b/crates/toolpath-gemini/src/project.rs
index fbd2529..be8768f 100644
--- a/crates/toolpath-gemini/src/project.rs
+++ b/crates/toolpath-gemini/src/project.rs
@@ -916,10 +916,10 @@ mod tests {
.unwrap();
let msg = &convo.main.messages[0];
assert!(
- msg.extra.get("claude").is_none(),
+ !msg.extra.contains_key("claude"),
"claude namespace should not leak onto Gemini messages"
);
- assert!(msg.extra.get("codex").is_none());
+ assert!(!msg.extra.contains_key("codex"));
}
#[test]
diff --git a/site/_data/crates.json b/site/_data/crates.json
index 10469b8..563444d 100644
--- a/site/_data/crates.json
+++ b/site/_data/crates.json
@@ -97,7 +97,7 @@
},
{
"name": "path-cli",
- "version": "0.8.0",
+ "version": "0.9.0",
"description": "Unified CLI (binary: path)",
"docs": "https://docs.rs/path-cli",
"crate": "https://crates.io/crates/path-cli",
@@ -105,7 +105,7 @@
},
{
"name": "toolpath-cli",
- "version": "0.8.0",
+ "version": "0.9.0",
"description": "Deprecated alias for path-cli",
"docs": "https://docs.rs/toolpath-cli",
"crate": "https://crates.io/crates/toolpath-cli",
]