Openclaw-prompts-and-skills model-usage
Use the CodexBar CLI's local cost logs to summarize per-model usage for Codex or Claude, including the current (most recent) model or a full model breakdown. Trigger when asked for model-level usage/cost data from CodexBar, or when you need a scriptable per-model summary from CodexBar cost JSON.
Install
Source · Clone the upstream repo
git clone https://github.com/seedprod/openclaw-prompts-and-skills
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/seedprod/openclaw-prompts-and-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/.claude/skills/model-usage" ~/.claude/skills/seedprod-openclaw-prompts-and-skills-model-usage && rm -rf "$T"
OpenClaw · Install into ~/.openclaw/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/seedprod/openclaw-prompts-and-skills "$T" && mkdir -p ~/.openclaw/skills && cp -r "$T/.claude/skills/model-usage" ~/.openclaw/skills/seedprod-openclaw-prompts-and-skills-model-usage && rm -rf "$T"
Manifest: .claude/skills/model-usage/SKILL.md
Model usage
Overview
Get per-model usage cost from CodexBar's local cost logs. Supports "current model" (most recent daily entry) or "all models" summaries for Codex or Claude.
TODO: add Linux CLI support guidance once CodexBar CLI install path is documented for Linux.
Quick start
- Fetch cost JSON via CodexBar CLI or pass a JSON file.
- Use the bundled script to summarize by model.
python {baseDir}/scripts/model_usage.py --provider codex --mode current
python {baseDir}/scripts/model_usage.py --provider codex --mode all
python {baseDir}/scripts/model_usage.py --provider claude --mode all --format json --pretty
Current model logic
- Uses the most recent daily row with `modelBreakdowns`.
- Picks the model with the highest cost in that row.
- Falls back to the last entry in `modelsUsed` when breakdowns are missing.
- Override with `--model <name>` when you need a specific model.
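The selection rules above can be sketched as follows. This is a minimal illustration, not the skill's actual implementation: the field names `modelBreakdowns` (model → cost) and `modelsUsed` come from the bullets above, but the exact JSON shape of the daily rows is an assumption.

```python
def pick_current_model(daily_rows):
    """Pick the 'current' model from a list of daily cost rows (oldest first)."""
    # Walk backwards from the most recent row to find one with breakdowns,
    # then pick the model with the highest cost in that row.
    for row in reversed(daily_rows):
        breakdowns = row.get("modelBreakdowns") or {}
        if breakdowns:
            return max(breakdowns, key=breakdowns.get)
    # Fallback when breakdowns are missing: last entry in modelsUsed.
    for row in reversed(daily_rows):
        models = row.get("modelsUsed") or []
        if models:
            return models[-1]
    return None

rows = [
    {"date": "2024-05-01", "modelBreakdowns": {"gpt-4o": 0.12}},
    {"date": "2024-05-02", "modelBreakdowns": {"o3": 1.50, "gpt-4o": 0.30}},
]
print(pick_current_model(rows))  # o3
```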
Inputs
- Default: runs `codexbar cost --format json --provider <codex|claude>`.
- File or stdin:
codexbar cost --provider codex --format json > /tmp/cost.json
python {baseDir}/scripts/model_usage.py --input /tmp/cost.json --mode all
cat /tmp/cost.json | python {baseDir}/scripts/model_usage.py --input - --mode current
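The `--input` handling shown above (a file path, or `-` for stdin) can be sketched like this; the helper name `load_cost_json` is illustrative, and the sketch assumes the payload is a single JSON document.

```python
import json
import sys

def load_cost_json(path):
    """Load cost JSON from a file path, or from stdin when path is '-'."""
    if path == "-":
        return json.load(sys.stdin)
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```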
Output
- Text (default) or JSON (`--format json --pretty`).
- Values are cost-only per model; tokens are not split by model in CodexBar output.
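Since the output is cost-only per model, an "all models" summary reduces to summing each model's cost across daily rows. A minimal sketch, again assuming the illustrative `modelBreakdowns` (model → cost) row shape:

```python
from collections import defaultdict

def summarize_all(daily_rows):
    """Aggregate cost per model across all daily rows (cost-only, no tokens)."""
    totals = defaultdict(float)
    for row in daily_rows:
        for model, cost in (row.get("modelBreakdowns") or {}).items():
            totals[model] += cost
    return dict(totals)
```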
References
- Read `references/codexbar-cli.md` for CLI flags and cost JSON fields.