# Untether · codex-opencode-pi

Install by cloning the upstream repo:

```shell
git clone https://github.com/littlebearapps/untether
```

Or, for Claude Code, install directly into `~/.claude/skills/`:

```shell
T=$(mktemp -d) && git clone --depth=1 https://github.com/littlebearapps/untether "$T" \
  && mkdir -p ~/.claude/skills \
  && cp -r "$T/.claude/skills/codex-opencode-pi" ~/.claude/skills/littlebearapps-untether-codex-opencode-pi \
  && rm -rf "$T"
```

Manifest: `.claude/skills/codex-opencode-pi/SKILL.md`
# Codex, OpenCode, and Pi Runner Protocols

These three engines are non-interactive only: no control channel, no permission prompts. They extend `JsonlSubprocessRunner` directly (unlike `ClaudeRunner`, which overrides `run_impl`).
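A minimal sketch of that split, assuming a template-method design; the class and method names here (`run_impl`, `parse_line`) are illustrative, not Untether's actual API:

```python
import json


class JsonlSubprocessRunner:
    """Hypothetical base: feed each JSONL stdout line to parse_line()."""

    def run_impl(self, lines):
        events = []
        for line in lines:
            if line.strip():  # skip blank lines between JSON records
                events.append(self.parse_line(json.loads(line)))
        return events

    def parse_line(self, obj):  # each engine overrides only this hook
        raise NotImplementedError


class CodexRunner(JsonlSubprocessRunner):
    def parse_line(self, obj):
        return ("codex", obj.get("type"))
```

Under this assumption, the base class owns process handling and line framing, and each engine only maps its own event shapes.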
## Quick comparison

| Aspect | Codex | OpenCode | Pi |
|---|---|---|---|
| CLI | `codex exec --json` | `opencode run --format json` | `pi --print --mode json` |
| Event model | Turn-based (items in turns) | Step-based (tools in steps) | Agent-based (messages + tools) |
| Resume line | `--session <thread_id> -` | `--session ses_XXX` | `--session <token>` |
| Resume token | `thread_id` (UUID) | `ses_XXXX` (26+ chars) | Short UUID (8 chars) or path |
| Final answer | `agent_message` item | Accumulated events | Assistant message content |
| Error signal | `error` item | `status="error"` event / missing finish | `stopReason` in `message_end` |
## Codex

### Key files

| File | Purpose |
|---|---|
| `src/untether/runners/codex.py` | Runner implementation |
| `src/untether/schemas/codex.py` | msgspec structs for Codex events |
| `docs/reference/runners/codex/` | JSONL event shapes and event mapping spec |
### CLI invocation

```shell
codex exec --json --skip-git-repo-check --color=never \
  [--model MODEL] [--session THREAD_ID] [-]
```

- Prompt on stdin (trailing `-` means read stdin)
- Resume: `--session <thread_id> -`
- `--skip-git-repo-check --color=never` for clean output
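The invocation rules above can be collected into a small argv builder; the flag names come from this doc, while the helper itself is a hypothetical sketch:

```python
def codex_argv(model=None, thread_id=None):
    """Build the codex exec command line described above (illustrative)."""
    argv = ["codex", "exec", "--json", "--skip-git-repo-check", "--color=never"]
    if model:
        argv += ["--model", model]
    if thread_id:  # resume an existing thread
        argv += ["--session", thread_id]
    argv.append("-")  # trailing "-": read the prompt from stdin
    return argv
```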
### JSONL events

| Event | Untether mapping |
|---|---|
| | |
| | |
| | |
| | |
| | |
| | |
| (transient) | Progress note (reconnect handling) |
### Item types

| Item type | ActionKind | Notes |
|---|---|---|
| `command_execution` | | ok = `status=="completed" && exit_code==0` |
| | | Title: |
| | | |
| `web_search` | | Title: query |
| | | , |
| `reasoning` | | Reasoning text |
| `agent_message` | (not emitted) | Stored as final answer candidate |
| `error` (item) | | Non-fatal error |
### Final answer selection

Multiple `agent_message` items may appear. Selection:

- Prefer the item with `phase == "final_answer"`
- Fall back to the last unnamed `agent_message`
- Used in `CompletedEvent.answer`
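The selection rule above can be sketched as follows; the event shape (dicts with `type`, `phase`, `text` keys) is an assumption for illustration:

```python
def pick_final_answer(items):
    """Prefer phase == "final_answer"; else the last agent_message."""
    final = [i for i in items
             if i.get("type") == "agent_message"
             and i.get("phase") == "final_answer"]
    if final:
        return final[-1]["text"]
    messages = [i for i in items if i.get("type") == "agent_message"]
    return messages[-1]["text"] if messages else None
```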
### Config keys

```toml
[codex]
profile = "Codex"   # Codex profile name
extra_args = []     # additional CLI flags
```
## OpenCode

### Key files

| File | Purpose |
|---|---|
| `src/untether/runners/opencode.py` | Runner implementation |
| `src/untether/schemas/opencode.py` | msgspec structs for OpenCode events |
| `docs/reference/runners/opencode/` | Runner spec, JSONL event shapes, event mapping spec |
### CLI invocation

```shell
opencode run --format json [--session SESSION_ID] [--model MODEL] -- <prompt>
```

- Prompt as positional arg after `--`
- Resume: `--session ses_XXX`
- Session IDs: `ses_` prefix + 20+ chars
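A validator matching the session-ID shape described above could look like this; the helper and the exact character class are assumptions (the doc only specifies the `ses_` prefix and a minimum length):

```python
import re

# Assumed shape: "ses_" followed by at least 20 alphanumeric characters.
SESSION_ID_RE = re.compile(r"^ses_[A-Za-z0-9]{20,}$")


def is_opencode_session_id(token: str) -> bool:
    return bool(SESSION_ID_RE.match(token))
```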
### JSONL events

| Event | Untether mapping |
|---|---|
| (first, with session ID) | |
| `step_finish` (status="completed") | |
| `step_finish` (status="error") | |
| | Accumulated as final answer (no action) |
| (reason="stop") | |
| | |
### Tool mapping

| Tool | ActionKind |
|---|---|
| , | |
| , , | |
| , , | |
| , , , | |
| , | |
| | |
| (other) | |
### Not yet implemented

Usage accumulation: OpenCode's `step_finish` may include token/cost data, but the runner does not currently extract it. `CompletedEvent.usage` is not populated.
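If extraction were added, it could look roughly like this; the `step_finish` payload keys used here (`tokens`, `cost`) are assumptions for illustration, not OpenCode's documented schema:

```python
def accumulate_usage(events):
    """Sum hypothetical token/cost fields across step_finish events."""
    usage = {"input_tokens": 0, "output_tokens": 0, "cost": 0.0}
    for ev in events:
        if ev.get("type") != "step_finish":
            continue
        tokens = ev.get("tokens") or {}
        usage["input_tokens"] += tokens.get("input", 0)
        usage["output_tokens"] += tokens.get("output", 0)
        usage["cost"] += ev.get("cost", 0.0)
    return usage
```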
### Config keys

```toml
[opencode]
model = "claude-sonnet-4-5-20250929"
```
## Pi

### Key files

| File | Purpose |
|---|---|
| `src/untether/runners/pi.py` | Runner implementation |
| `src/untether/schemas/pi.py` | msgspec structs for Pi events |
| `docs/reference/runners/pi/` | Runner spec, JSONL event shapes, event mapping spec |
### CLI invocation

```shell
pi --print --mode json [--session SESSION_PATH] \
  [--provider PROVIDER] [--model MODEL] <prompt>
```

- Prompt as positional arg (prefixed with a space if it starts with `-`)
- Resume: `--session <token>` (short ID or full path)
- Minimum version: 0.45.1
- Environment: `NO_COLOR=1`, `CI=1` (set by runner)
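The prompt-escaping rule above is tiny but easy to get wrong; a sketch (the helper name is illustrative):

```python
def pi_prompt_arg(prompt: str) -> str:
    """Prefix a leading "-" with a space so pi doesn't parse it as a flag."""
    return " " + prompt if prompt.startswith("-") else prompt
```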
### JSONL events

| Event | Untether mapping |
|---|---|
| `session` (header) | Session ID extraction, possible ID promotion |
| | |
| | |
| | |
| `message_end` (assistant) | Final answer + usage stored |
| `agent_end` | `CompletedEvent` |
### Session ID promotion

Pi has a unique resume mechanism:

- For new runs, Untether generates a `.jsonl` session file path.
- If a `session` header arrives with a UUID, the resume token is promoted to the 8-char short ID.
- The `allow_id_promotion` flag ensures this only happens once.
- This gives a user-friendly resume token instead of a long path.

Session path format: `~/.pi/agent/sessions/--<sanitized-cwd>--/<date>-<uuid>.jsonl`
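The promotion mechanism above can be modeled as a small state machine; the attribute names (`resume_token`, `allow_id_promotion`) follow the doc, but the class itself is a hypothetical sketch:

```python
class PiSession:
    def __init__(self, path: str):
        self.resume_token = path        # new runs start with a file path
        self.allow_id_promotion = True  # promotion may happen only once

    def on_session_header(self, uuid: str) -> None:
        if self.allow_id_promotion:
            self.resume_token = uuid[:8]  # promote to the 8-char short ID
            self.allow_id_promotion = False
```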
### Tool mapping

| Tool | ActionKind | Title source |
|---|---|---|
| | | |
| , | | |
| , , , | | or |
| (other) | | tool name |
### Error detection

- `stopReason` in `message_end`: `"error"` or `"aborted"` -> `ok=False`
- No `agent_end` received -> `CompletedEvent(ok=False, error="stream ended...")`
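Both failure paths above can be checked in one pass; the event and field names follow the doc, while the helper itself is a hypothetical sketch:

```python
def pi_outcome(events):
    """Return (ok, error) per the two error rules described above."""
    saw_agent_end = False
    ok = True
    for ev in events:
        if ev.get("type") == "message_end" and ev.get("stopReason") in ("error", "aborted"):
            ok = False
        if ev.get("type") == "agent_end":
            saw_agent_end = True
    if not saw_agent_end:
        return (False, "stream ended without agent_end")
    return (ok, None)
```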
### Config keys

```toml
[pi]
provider = "anthropic"   # or "openai", "google", etc.
model = "claude-sonnet-4-5-20250929"
extra_args = []
```
## Adding a new engine

To add a new engine runner:

- Create `src/untether/runners/myengine.py`
- Define schemas in `src/untether/schemas/myengine.py`
- Implement `MyEngineRunner(JsonlSubprocessRunner)` with required template methods
- Export `BACKEND = EngineBackend(id="myengine", build_runner=..., cli_cmd="myengine")`
- Register in `pyproject.toml` entry points: `myengine = "untether.runners.myengine:BACKEND"`
- Add reference docs in `docs/reference/runners/myengine/`
- Add tests mirroring existing `tests/test_*_runner.py` patterns
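The steps above can be condensed into a minimal skeleton. This is a self-contained sketch: the real `JsonlSubprocessRunner` and `EngineBackend` live in untether and are stubbed here, and the runner's method name (`build_argv`) is an assumption:

```python
from dataclasses import dataclass
from typing import Callable


class JsonlSubprocessRunner:  # stub standing in for untether's real base class
    pass


@dataclass
class EngineBackend:  # stub mirroring the BACKEND export shape
    id: str
    build_runner: Callable
    cli_cmd: str


class MyEngineRunner(JsonlSubprocessRunner):
    def build_argv(self, prompt: str) -> list:  # assumed template method
        return ["myengine", "run", "--json", prompt]


BACKEND = EngineBackend(id="myengine", build_runner=MyEngineRunner, cli_cmd="myengine")
```

The entry-point registration in `pyproject.toml` then points at this `BACKEND` object so untether can discover the engine by name.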