Dotfiles dotfile-audit

install
source · Clone the upstream repo
git clone https://github.com/harperreed/dotfiles
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/harperreed/dotfiles "$T" && mkdir -p ~/.claude/skills && cp -r "$T/.claude/skills/dot-file-audit" ~/.claude/skills/harperreed-dotfiles-dotfile-audit && rm -rf "$T"
manifest: .claude/skills/dot-file-audit/SKILL.md
source content

dotfile-audit

Audit and fix pipeline DOT files. Catches structural issues, missing failure paths, spec gaps, and validation warnings.

Checklist

  1. Read the DOT file
  2. Run structural audit (see Structural Checks below)
  3. If a spec/design doc exists, run spec reconciliation (see Spec Reconciliation below)
  4. Run `dotfile validate <file>.dot` and capture the output
  5. Fix all errors and warnings
  6. Re-validate until clean (zero errors, zero warnings)
  7. Present summary of what was found and fixed

Structural Checks

Go through each check. Print results as a checklist.

Graph Attributes

[ ] Has exactly one start node (shape=Mdiamond)
[ ] Has exactly one exit node (shape=Msquare)
[ ] Has `goal` attribute in graph block
[ ] Has `model_stylesheet` in graph block
[ ] Has `default_max_retry` in graph block
[ ] Has `retry_target` in graph block
[ ] Has `rankdir` in graph block
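
A graph block that passes all of these checks might look like the following (a minimal sketch; the attribute values and stylesheet filename are illustrative assumptions, not from the reference):

```dot
digraph pipeline {
    // Graph-level attributes the audit expects (example values)
    graph [
        goal="Ship the feature with all tests passing",
        model_stylesheet="model_styles.json",  // hypothetical filename
        default_max_retry=3,
        retry_target="plan",
        rankdir=LR
    ];

    start [shape=Mdiamond];  // exactly one start node
    done  [shape=Msquare];   // exactly one exit node
}
```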

Failure Paths (the most common issue)

For every `box` (codergen) node, verify it has at least one of:

  • `goal_gate=true` AND `retry_target` set
  • An outgoing edge with `condition="outcome=fail"` (or `outcome=FAIL`)

[x] plan — goal_gate=true, retry_target=plan
[x] setup — goal_gate=true, retry_target=setup
[ ] implement_feature — NO FAILURE PATH — fix needed
[x] verify_feature — has fail edge to implement_feature

Fix: For nodes missing a failure path, add `goal_gate=true` and `retry_target` pointing to the appropriate retry node (usually itself or the preceding implement node).
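
Applied to the `implement_feature` node flagged above, the fix might look like this (a sketch; the prompt text is elided, and node names are taken from the example checklist):

```dot
digraph fixed_fragment {
    // Option 1: gate on the goal and retry the node itself on failure
    implement_feature [shape=box, goal_gate=true, retry_target="implement_feature"];

    // Option 2: an explicit fail edge routed back from the verify node
    verify_feature -> implement_feature [condition="outcome=fail"];
}
```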

Human Gates

For every `hexagon` node, verify:

  • Has `type="wait.human"` attribute (shape alone is insufficient)
  • Has outgoing edges with accelerator key labels (`[A] Approve`, `[R] Revise`, etc.)
  • Has `condition` on each outgoing edge
  • Happy path edge has `weight=2`
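
A hexagon gate satisfying these checks might be sketched as follows (node names, labels, and the `outcome=` values for human choices are illustrative assumptions):

```dot
digraph gate_fragment {
    approve_plan [shape=hexagon, type="wait.human", label="Approve the plan?"];

    // Happy path carries weight=2; each edge has a condition and an accelerator label
    approve_plan -> implement [label="[A] Approve", condition="outcome=approve", weight=2];
    approve_plan -> plan      [label="[R] Revise",  condition="outcome=revise"];
}
```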

Edge Conditions

For every `diamond` (conditional) node, verify:

  • Has at least two outgoing edges
  • Outgoing edges have `condition` attributes
  • Success path has `weight=2`
  • Uses valid condition syntax: `condition="outcome=success"`, `condition="outcome=fail"`
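
For example, a conditional that passes all four checks (node names are illustrative):

```dot
digraph conditional_fragment {
    tests_pass [shape=diamond, label="Tests pass?"];

    tests_pass -> commit            [condition="outcome=success", weight=2];  // success path weighted
    tests_pass -> implement_feature [condition="outcome=fail"];
}
```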

Parallel Structure

If `component` (fan-out) nodes exist, verify:

  • Each fan-out has a matching `tripleoctagon` (fan-in) node
  • All parallel tracks connect from the fan-out and reconnect at the fan-in
  • No edges skip the fan-in (skipping would break synchronization)
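
A balanced fan-out/fan-in pair might look like this (track names are illustrative):

```dot
digraph parallel_fragment {
    fan_out [shape=component];      // fan-out into parallel tracks
    fan_in  [shape=tripleoctagon];  // matching fan-in

    // Every track leaves the fan-out and reconnects at the fan-in; no edge skips it
    fan_out -> build_api -> fan_in;
    fan_out -> build_ui  -> fan_in;
}
```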

Prompt Quality (spot check)

Spot-check at least 3 `box` node prompts for:

  • References `run.working_dir`
  • Mentions the specific tech stack (language, framework, dependencies)
  • Includes a TDD instruction ("Write failing test FIRST")
  • Includes quality tool instructions (linter, formatter)
  • Includes a commit instruction with conventional commit format
  • Is self-contained (doesn't reference other nodes' outputs)
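
A prompt that would pass the spot check might read like this (the stack, tools, and endpoint named here are entirely illustrative):

```dot
digraph prompt_fragment {
    implement_auth [shape=box, prompt="Work in run.working_dir. \
Stack: Python 3.12, FastAPI, SQLAlchemy. \
Write a failing pytest test FIRST, then implement the /login endpoint. \
Run ruff and mypy; fix everything they report. \
Commit with a conventional commit message, e.g. feat(auth): add login endpoint."];
}
```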

Node Attributes

Check for invented attributes that the pipeline engine doesn't support:

  • `join_policy`, `error_policy`, `max_parallel`, `default_choice` (these don't exist)
  • `tools`, `dependencies`, `inputs`, `outputs` (not valid pipeline attributes)
  • Any attribute not in the dotfile-from-spec reference tables

Spec Reconciliation

If a spec or design doc is available, cross-reference:

Phases → Nodes: Every phase listed in the spec has a corresponding DOT node.

Components → Implement + Test + Commit: Every component has implement, test/verify, and commit nodes.

Tech Stack → Prompts: Node prompts reference the correct language, framework, and dependencies.

Quality Gates → Verify Prompts: Every quality tool (linter, formatter, type checker) appears in verify node prompts.

Testing Frameworks → Test Prompts: Each test framework is referenced in the appropriate test node prompts.

Parallelism → Fan-out/Fan-in: Parallel workstreams from the spec have matching `component`/`tripleoctagon` nodes.

Models → Stylesheet + Classes: `model_stylesheet` reflects the spec's model preferences, and nodes have the correct `class` assignments.

Human Gates: Gates match what the user requested (or no gates if they chose headless).

Print reconciliation results:

Phases:      8/8 covered
Components:  5/5 have implement+test+commit
Tech Stack:  3/3 prompts checked — all correct
Quality:     ruff [x] mypy [x] biome [x]
Testing:     pytest [x] vitest [x] playwright [x]
Parallelism: fan-out [x] fan-in [x] tracks match [x]
Models:      stylesheet [x] classes [x]
Gates:       0 requested, 0 present [x]

Pipeline Validation

After structural fixes, run:

dotfile validate <file>.dot

Errors must be fixed — the pipeline won't run with errors.

Warnings should be fixed — they indicate potential runtime failures:

  • "no outgoing fail edge" → add `goal_gate=true` + `retry_target`, or an explicit fail edge
  • "unreachable node" → fix the edges or remove the node
  • "goal_gate without retry_target" → add `retry_target`

Target: zero errors AND zero warnings.

Common Issues and Fixes

| Issue | Fix |
| --- | --- |
| Box node with no failure path | Add `goal_gate=true, retry_target="node_name"` |
| Hexagon without `type="wait.human"` | Add the `type="wait.human"` attribute |
| Diamond with only one outgoing edge | Add the missing success or fail edge |
| Prompts missing `run.working_dir` | Add a `run.working_dir` reference to every build/verify prompt |
| Invented attributes | Remove them; only use attributes from the dotfile-from-spec reference |
| Fan-out without matching fan-in | Add a `tripleoctagon` fan-in node and connect the parallel tracks |
| `model_stylesheet` outside graph block | Move it into the `graph [ ... ]` block |
| Conditions using wrong case | Use `outcome=success` / `outcome=fail` (case-insensitive, but be consistent) |
| Missing commit nodes | Add an explicit commit node after each implement+verify phase |
| Prompts not self-contained | Expand prompts to include all context; they execute in isolation |