Add Portable AI Kit CLI scaffold #16

Draft
mdheller wants to merge 15 commits into main from work/portable-ai-kit

Conversation


@mdheller mdheller commented May 4, 2026

Summary

Adds the first SourceOS Portable AI Kit implementation slice.

This PR now introduces:

  • docs/integration/portable-ai-kit.md as the product/integration spec;
  • sourceosctl/commands/portable_ai.py with profiles, hardened preflight, prepare, start-plan, inspect, and evidence inspection helpers;
  • sourceosctl/commands/portable_ai_cli.py as the command-group parser;
  • sourceosctl/commands/portable_ai_byom.py for BYOM local model hash verification and manifest materialization;
  • bin/sourceosctl plugin routing for sourceosctl portable-ai ...;
  • bin/sourceos-portable-ai as a standalone compatibility entrypoint reusing the same parser;
  • tests/test_portable_ai_cli.py coverage for profiles, preflight, benchmark temp-file cleanup, dry-run prepare, guarded execute, BYOM dry-run hashing, BYOM policy gate, BYOM manifest/evidence writing, start-plan, inspect, and evidence inspection;
  • README demo path, BYOM verification path, and Portable AI Kit defaults.

Preflight hardening included

portable-ai preflight emits structured facts for:

  • disk/free-space posture;
  • mount source, filesystem type, read-only state, and large-file suitability;
  • Linux findmnt/lsblk and macOS df/mount/diskutil best-effort block details;
  • removable-media confidence;
  • host platform, CPU count, machine architecture, Python version, and RAM when detectable;
  • runtime binary detection for Ollama-compatible, llama.cpp-style, and Python surfaces;
  • opt-in temporary read/write benchmark with cleanup evidence.
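The host-platform facts above can be sketched with the standard library alone. This is a hedged illustration, not the shipped `portable_ai.py` code; the field names and the `collect_host_facts` helper are assumptions for demonstration.

```python
# Illustrative sketch (NOT the actual sourceosctl implementation) of
# gathering read-only host facts like those listed above.
import json
import os
import platform
import shutil

def collect_host_facts(target: str = "/tmp") -> dict:
    """Collect best-effort, non-mutating facts about the host and a target path."""
    usage = shutil.disk_usage(target)
    return {
        "platform": platform.system(),           # e.g. "Linux" or "Darwin"
        "machine": platform.machine(),           # e.g. "x86_64", "arm64"
        "cpu_count": os.cpu_count(),
        "python_version": platform.python_version(),
        "disk_free_bytes": usage.free,
        "disk_total_bytes": usage.total,
    }

print(json.dumps(collect_host_facts(), indent=2))
```

Everything here is read-only, which matches the preflight contract: no benchmark file is written unless explicitly requested.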

The benchmark is opt-in via --benchmark. Without that flag, preflight remains non-mutating.
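A write/read benchmark of this shape could look like the following. This is an assumed sketch of the behavior described (temp file, round-trip check, guaranteed cleanup), not the PR's actual implementation; the function name and returned fields are illustrative.

```python
# Hedged sketch of an opt-in temp-file write/read benchmark that
# cleans up after itself and reports cleanup evidence.
import os
import tempfile
import time

def benchmark_rw(target_dir: str, size_bytes: int = 1 << 20) -> dict:
    payload = os.urandom(size_bytes)
    fd, path = tempfile.mkstemp(prefix="sourceos-bench-", dir=target_dir)
    try:
        t0 = time.perf_counter()
        with os.fdopen(fd, "wb") as fh:
            fh.write(payload)
            fh.flush()
            os.fsync(fh.fileno())      # force the write to hit the medium
        write_s = time.perf_counter() - t0
        t0 = time.perf_counter()
        with open(path, "rb") as fh:
            read_back = fh.read()
        read_s = time.perf_counter() - t0
        ok = read_back == payload
    finally:
        os.unlink(path)                # cleanup: the temp file must not survive
    return {
        "bytes": size_bytes,
        "write_seconds": write_s,
        "read_seconds": read_s,
        "roundtrip_ok": ok,
        "temp_file_removed": not os.path.exists(path),
    }
```

The `finally` block is the point: even if the read fails, the temp file is removed, and the result records that removal as evidence.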

BYOM verification included

portable-ai byom verify now supports local model file verification without downloads or provider calls:

python3 bin/sourceosctl portable-ai byom verify /tmp/SOURCEOS_AI ./models/example.gguf --name example
python3 bin/sourceosctl portable-ai byom verify /tmp/SOURCEOS_AI ./models/example.gguf --name example --execute --policy-ok --evidence-out ./byom-evidence.json

The command computes SHA-256 and file size, generates a SourceOS ModelCarryPack manifest, records license/provenance posture, and can optionally copy the file into models/blobs with --copy. It does not download a model, start a runtime, grant network, grant tool use, or store prompt bodies.

Product target

Match the immediately graspable portable-USB local-AI experience of small projects like Portable-AI-USB, while exceeding them on SourceOS-grade governance:

  • signed/pinned manifests rather than unmanaged downloads;
  • prompt egress denied by default;
  • host writes denied or scoped by profile;
  • dry-run by default;
  • materialization gated by --execute --policy-ok;
  • secret-free evidence records;
  • Agent Machine activation as the runtime authority.
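The dry-run-by-default posture above hinges on the double gate: nothing is materialized unless both --execute and --policy-ok are passed. The flag names come from this PR; the parser wiring below is an assumed minimal sketch, not the actual portable_ai_cli.py code.

```python
# Hedged sketch of the --execute / --policy-ok double gate:
# anything short of both flags stays a dry run.
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="portable-ai")
    parser.add_argument("--execute", action="store_true",
                        help="perform changes instead of the default dry-run")
    parser.add_argument("--policy-ok", action="store_true",
                        help="operator attests the policy gate is satisfied")
    return parser

def should_materialize(args: argparse.Namespace) -> bool:
    # Both gates must be explicitly set; any other combination is a dry run.
    return bool(args.execute and args.policy_ok)
```

Keeping the gate as a pure function of the parsed namespace makes the policy easy to unit-test, which matches the guarded-execute coverage listed in tests/test_portable_ai_cli.py.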

Validation notes

Connector-visible workflow/status results for the current head SHA are empty, so CI is not confirmed through this path. Expected local validation:

python3 -m unittest discover -s tests -v
make validate
python3 bin/sourceosctl portable-ai profiles
python3 bin/sourceosctl portable-ai preflight /tmp/SOURCEOS_AI
python3 bin/sourceosctl portable-ai preflight /tmp/SOURCEOS_AI --benchmark
python3 bin/sourceosctl portable-ai prepare /tmp/SOURCEOS_AI --profile tiny-router --dry-run
python3 bin/sourceosctl portable-ai byom verify /tmp/SOURCEOS_AI ./models/example.gguf --name example

Follow-up acceptance criteria

  • Add Agent Machine activation handoff once runtime receipts land.
  • Add runtime start/stop plan or launcher handoff for Ollama-compatible local provider.
  • Add release/package install path for non-checkout users.
  • Add signed model catalog entries after hash/license review.
