Claude-skill-registry ai-orchestration

Multi-model AI collaboration via orchestrator MCP. Use when seeking second opinions, debugging complex issues, building consensus on architectural decisions, conducting code reviews, or needing external validation on analysis.

Install

source · Clone the upstream repo:

    git clone https://github.com/majiayu000/claude-skill-registry

Claude Code · Install into ~/.claude/skills/:

    T=$(mktemp -d) && git clone --depth=1 https://github.com/majiayu000/claude-skill-registry "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/data/ai-orchestration" ~/.claude/skills/majiayu000-claude-skill-registry-ai-orchestration && rm -rf "$T"

manifest: skills/data/ai-orchestration/SKILL.md
source content

AI CLI Orchestration

Query external AI models (claude, codex, gemini) for second opinions, debugging, consensus building, and expert validation.

Tools Overview

Tool      | Mode        | Description
----------|-------------|---------------------------------------------
ai_call   | Synchronous | Call AI and wait for result
ai_spawn  | Async       | Start AI in background, get job ID
ai_fetch  | Async       | Get result from spawned AI (with timeout)
ai_list   | Utility     | List all running/completed AI jobs
ai_review | Convenience | Spawn all 3 AIs in parallel with same prompt
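
For a single quick opinion, the synchronous tool is the simplest path. A minimal sketch (the files parameter follows the ai_review example below; exact result shape may vary):

```
# Blocking call: returns only when codex has finished
result = ai_call(
    cli="codex",
    prompt="Review this module for injection risks",
    files=["src/auth.py"],
)
```

Prefer ai_call for one-off questions; switch to ai_spawn + ai_fetch as soon as you want more than one model working at once.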

Role Hierarchy

CLI    | Role        | Mode      | Capabilities
-------|-------------|-----------|------------------------------------
claude | Worker/Peer | Full      | Can execute any tool/command
codex  | Reviewer    | Read-only | Code review, analysis, suggestions
gemini | Researcher  | Read-only | Web search, documentation lookup
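
The table above implies a simple routing rule: execution work goes to claude, review work to codex, research work to gemini. A sketch (prompts are illustrative):

```
# Read-only CLIs get analysis work; only claude gets tasks that modify files
ai_spawn(cli="codex",  prompt="Review the diff in src/parser.py for bugs")      # Reviewer
ai_spawn(cli="gemini", prompt="Research current best practices for parsing")    # Researcher
ai_spawn(cli="claude", prompt="Apply the agreed fix to src/parser.py")          # Worker/Peer
```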

Parallel Execution (Recommended)

# Spawn all 3 models in parallel
claude_job = ai_spawn(cli="claude", prompt="Analyze this code for bugs...")
codex_job = ai_spawn(cli="codex", prompt="Review this code for patterns...")
gemini_job = ai_spawn(cli="gemini", prompt="Research best practices for...")

# All running simultaneously! Fetch results:
claude_result = ai_fetch(job_id=claude_job.job_id, timeout=120)
codex_result = ai_fetch(job_id=codex_job.job_id, timeout=120)
gemini_result = ai_fetch(job_id=gemini_job.job_id, timeout=120)

# Total time = slowest model (~60s) instead of sum (~180s)

Or use ai_review for convenience:

review = ai_review(prompt="Analyze this architecture decision...", files=["src/"])
claude_result = ai_fetch(job_id=review.jobs["claude"].job_id, timeout=120)
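
Since ai_review spawns one job per CLI, all three results can be collected in a loop (assuming jobs is a mapping keyed by CLI name, as the line above implies):

```
review = ai_review(prompt="Analyze this architecture decision...", files=["src/"])

# Fetch every model's answer, keyed by CLI name
results = {}
for cli, job in review.jobs.items():
    results[cli] = ai_fetch(job_id=job.job_id, timeout=120)
```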

When to Use External Models

Do use when: Stuck on complex bugs, architectural decisions with tradeoffs, need validation before major refactoring, security-sensitive code, want diverse perspectives

Don't use when: The task is simple, you're already confident in the approach, or you're just executing a known solution

Tips

  • Use parallel for multi-model: ai_spawn + ai_fetch is 3x faster than sequential
  • Be specific: Include file paths, error messages, and context
  • Use the appropriate CLI: codex for code review, gemini for web search
  • Delegate complex work: Use sub-agents for structured analysis
  • Remember read-only: codex and gemini cannot execute commands or modify files
  • Include files: Use the files parameter to provide code context
  • Monitor jobs: Use ai_list() to check the status of all running jobs
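
The last two tips combine into a simple polling pattern. A sketch only: the shape of ai_list()'s return value and the status field are assumptions, not documented API:

```
# Spawn with file context, then poll until no jobs are still running
job = ai_spawn(cli="codex", prompt="Review for race conditions", files=["src/worker.py"])

while any(j.status == "running" for j in ai_list()):   # status field assumed
    time.sleep(5)

result = ai_fetch(job_id=job.job_id, timeout=120)
```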