# dotfile-audit

Source: `.claude/skills/dot-file-audit/SKILL.md` in [harperreed/dotfiles](https://github.com/harperreed/dotfiles).

To install just this skill:

```shell
T=$(mktemp -d) && git clone --depth=1 https://github.com/harperreed/dotfiles "$T" && mkdir -p ~/.claude/skills && cp -r "$T/.claude/skills/dot-file-audit" ~/.claude/skills/harperreed-dotfiles-dotfile-audit && rm -rf "$T"
```
Audit and fix pipeline DOT files. Catches structural issues, missing failure paths, spec gaps, and validation warnings.
## Checklist
- Read the DOT file
- Run structural audit (see Structural Checks below)
- If a spec/design doc exists, run spec reconciliation (see Spec Reconciliation below)
- Run `dotfile validate <file>.dot` — capture output
- Fix all errors and warnings
- Re-validate until clean (zero errors, zero warnings)
- Present a summary of what was found and fixed
## Structural Checks

Go through each check. Print results as a checklist.
### Graph Attributes

- [ ] Has exactly one start node (shape=Mdiamond)
- [ ] Has exactly one exit node (shape=Msquare)
- [ ] Has `goal` attribute in graph block
- [ ] Has `model_stylesheet` in graph block
- [ ] Has `default_max_retry` in graph block
- [ ] Has `retry_target` in graph block
- [ ] Has `rankdir` in graph block
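
A graph block that passes these checks might look like the following sketch (the attribute values, stylesheet path, and node names are illustrative, not taken from any real pipeline):

```dot
digraph pipeline {
    // Required graph-level attributes (example values)
    goal="Implement the feature with all tests passing"
    model_stylesheet="styles/models.css"   // hypothetical path
    default_max_retry=2
    retry_target=plan
    rankdir=LR

    start [shape=Mdiamond]   // exactly one start node
    done  [shape=Msquare]    // exactly one exit node
}
```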
### Failure Paths (the most common issue)
For every box (codergen) node, verify it has at least one of:

- `goal_gate=true` AND `retry_target` set
- An outgoing edge with `condition="outcome=fail"` (or `outcome=FAIL`)
- [x] plan — goal_gate=true, retry_target=plan
- [x] setup — goal_gate=true, retry_target=setup
- [ ] implement_feature — NO FAILURE PATH — fix needed
- [x] verify_feature — has fail edge to implement_feature
Fix: For nodes missing a failure path, add `goal_gate=true` and `retry_target` pointing to the appropriate retry node (usually itself or the preceding implement node).
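
A minimal sketch of both failure-path mechanisms (node names are illustrative):

```dot
// Gate + retry target: retries itself on failure
implement_feature [shape=box, goal_gate=true, retry_target=implement_feature]
verify_feature    [shape=box]
// Alternative/complement: an explicit fail edge back to the implement node
verify_feature -> implement_feature [condition="outcome=fail"]
```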
### Human Gates

For every hexagon node, verify:

- Has `type="wait.human"` attribute (shape alone is insufficient)
- Has outgoing edges with accelerator key labels (`[A] Approve`, `[R] Revise`, etc.)
- Has `condition` on each outgoing edge
- Happy path edge has `weight=2`
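
For example, a gate satisfying all four checks could be sketched as follows (the `choice=...` condition values are an assumption for illustration, not verified engine syntax):

```dot
review [shape=hexagon, type="wait.human", label="Review the plan"]
review -> build  [label="[A] Approve", condition="choice=approve", weight=2]  // happy path
review -> replan [label="[R] Revise",  condition="choice=revise"]
```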
### Edge Conditions

For every diamond (conditional) node, verify:

- Has at least two outgoing edges
- Outgoing edges have `condition` attributes
- Success path has `weight=2`
- Uses valid condition syntax: `condition="outcome=success"`, `condition="outcome=fail"`
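
A conditional node meeting all four checks, as a sketch (node names are illustrative):

```dot
tests_pass [shape=diamond, label="Tests pass?"]
tests_pass -> commit            [condition="outcome=success", weight=2]  // success path
tests_pass -> implement_feature [condition="outcome=fail"]
```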
### Parallel Structure

If component (fan-out) nodes exist, verify:

- Each fan-out has a matching tripleoctagon (fan-in) node
- All parallel tracks connect from the fan-out and reconnect at the fan-in
- No edges skip the fan-in (that would break synchronization)
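
A balanced fan-out/fan-in pair, sketched with illustrative track names:

```dot
split [shape=component]        // fan-out
join  [shape=tripleoctagon]    // matching fan-in
split -> track_api -> join
split -> track_ui  -> join     // every track reconnects; nothing skips the fan-in
```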
### Prompt Quality (spot check)

Spot-check at least 3 box node prompts for:

- References `run.working_dir`
run.working_dir - Mentions specific tech stack (language, framework, deps)
- Includes TDD instruction ("Write failing test FIRST")
- Includes quality tool instructions (linter, formatter)
- Includes commit instruction with conventional commit format
- Is self-contained (doesn't reference other node outputs)
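
A prompt covering all six points might read like this sketch (the tech stack, wording, and the `{run.working_dir}` templating syntax are illustrative assumptions):

```dot
implement_auth [shape=box, prompt="Work in {run.working_dir}. Stack: Python 3.12, FastAPI, pytest. Write a failing test FIRST, then implement the login endpoint. Run ruff and mypy until clean. Commit with a conventional commit message (feat: ...). All context you need is in this prompt; do not rely on other nodes' output."]
```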
### Node Attributes

Check for invented attributes that the pipeline engine doesn't support:

- `join_policy`, `error_policy`, `max_parallel`, `default_choice` — these don't exist
- `tools`, `dependencies`, `inputs`, `outputs` — not valid pipeline attributes
- Any attribute not in the dotfile-from-spec reference tables
## Spec Reconciliation

If a spec or design doc is available, cross-reference:
- **Phases → Nodes:** Every phase listed in the spec has a corresponding DOT node.
- **Components → Implement + Test + Commit:** Every component has implement, test/verify, and commit nodes.
- **Tech Stack → Prompts:** Node prompts reference the correct language, framework, and dependencies.
- **Quality Gates → Verify Prompts:** Every quality tool (linter, formatter, type checker) appears in verify node prompts.
- **Testing Frameworks → Test Prompts:** Each test framework is referenced in the appropriate test node prompts.
- **Parallelism → Fan-out/Fan-in:** Parallel workstreams from the spec have matching component/tripleoctagon nodes.
- **Models → Stylesheet + Classes:** `model_stylesheet` reflects the spec's model preferences. Nodes have correct class assignments.
- **Human Gates:** Gates match what the user requested (or no gates if they chose headless).
Print reconciliation results:

```
Phases: 8/8 covered
Components: 5/5 have implement+test+commit
Tech Stack: 3/3 prompts checked — all correct
Quality: ruff [x] mypy [x] biome [x]
Testing: pytest [x] vitest [x] playwright [x]
Parallelism: fan-out [x] fan-in [x] tracks match [x]
Models: stylesheet [x] classes [x]
Gates: 0 requested, 0 present [x]
```
## Pipeline Validation

After structural fixes, run:

```shell
dotfile validate <file>.dot
```
Errors must be fixed — the pipeline won't run with errors.

Warnings should be fixed — they indicate potential runtime failures:

- "no outgoing fail edge" → add `goal_gate=true` + `retry_target`, or an explicit fail edge
- "unreachable node" → fix edges or remove the node
- "goal_gate without retry_target" → add `retry_target`

Target: zero errors AND zero warnings.
## Common Issues and Fixes
| Issue | Fix |
|---|---|
| Box node with no failure path | Add `goal_gate=true` + `retry_target` |
| Hexagon without `type="wait.human"` | Add the attribute |
| Diamond with only one outgoing edge | Add the missing success or fail edge |
| Prompts missing `run.working_dir` | Add the reference to every build/verify prompt |
| Invented attributes | Remove them — only use attributes from the dotfile-from-spec reference |
| Fan-out without matching fan-in | Add fan-in node and connect parallel tracks |
| Graph attribute outside graph block | Move it into the graph block |
| Conditions using wrong case | Use `outcome=success` / `outcome=fail` (case-insensitive, but be consistent) |
| Missing commit nodes | Add explicit commit node after each implement+verify phase |
| Prompts not self-contained | Expand prompts to include all context — they execute in isolation |