Cortex cortex-consolidate
Run memory maintenance — decay old memories, compress stale content, consolidate episodic memories into semantic knowledge, and run sleep-like replay. Use when the user says 'clean up memories', 'consolidate', 'run maintenance', 'compress old memories', 'memory cleanup', or periodically to keep the memory system healthy. Also use after importing many memories or at the end of a long session.
git clone https://github.com/cdeust/Cortex
T=$(mktemp -d) && git clone --depth=1 https://github.com/cdeust/Cortex "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/cortex-consolidate" ~/.claude/skills/cdeust-cortex-cortex-consolidate && rm -rf "$T"
skills/cortex-consolidate/SKILL.md
Consolidate — Memory Maintenance and Evolution
Keywords
consolidate, maintenance, cleanup, compress, decay, merge, evolve, sleep, replay, consolidation, memory health, prune, clean up, optimize memories, reduce noise
Overview
Run the full memory maintenance pipeline — modeled after biological memory consolidation. This includes heat decay (cooling unused memories), compression (full text to gist to tags), CLS consolidation (episodic to semantic), causal graph discovery, and sleep-like replay that strengthens important memory clusters.
Use this skill when: After a long session, after bulk imports, periodically (weekly), or when memory_stats shows too many hot memories or high noise.
Workflow
Step 1: Run Full Consolidation
cortex:consolidate({})
This runs the complete pipeline:
- Decay cycle — Cool memories by heat * decay_factor. Memories below cold threshold (0.05) become candidates for compression
- Compression — Old memories compress through stages: full text (7+ days) to gist, gist (30+ days) to tags
- CLS consolidation — Frequently-accessed episodic memories promote to semantic store (like hippocampal-to-cortical transfer)
- Causal discovery — PC Algorithm runs on entity co-occurrences to discover causal relationships
- Sleep compute — Dream-like replay strengthens clusters, summarizes related memories, and re-embeds compressed content
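The decay and compression stages above can be sketched in a few lines. This is a hypothetical illustration, not Cortex's implementation: the 0.05 cold threshold and the 7-day/30-day compression stages come from the description above, while `decay_factor` and the data layout are invented for the example.

```python
from dataclasses import dataclass

COLD_THRESHOLD = 0.05  # memories below this heat become compression candidates

@dataclass
class Memory:
    heat: float
    age_days: int
    level: str = "full"  # compression stages: "full" -> "gist" -> "tags"

def decay_cycle(memories, decay_factor=0.9):
    """Cool every memory by heat * decay_factor; return cold candidates."""
    candidates = []
    for m in memories:
        m.heat *= decay_factor
        if m.heat < COLD_THRESHOLD:
            candidates.append(m)
    return candidates

def compress(m: Memory) -> Memory:
    """Advance a cold memory one compression stage based on its age."""
    if m.level == "full" and m.age_days >= 7:
        m.level = "gist"       # full text -> gist after 7+ days
    elif m.level == "gist" and m.age_days >= 30:
        m.level = "tags"       # gist -> tags after 30+ days
    return m
```

The point of the two-stage pipeline is that decay decides *which* memories to touch, while age decides *how far* to compress them.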
Step 2: Review Results
The response includes:
- memories_decayed — how many memories cooled down
- memories_compressed — how many were compressed (and to what level)
- memories_consolidated — how many were promoted from episodic to semantic
- causal_edges_discovered — new causal relationships found
- replay_clusters — memory clusters that were replayed and strengthened
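A quick way to review a run is to print the counters in one line. The field names below are the ones listed above; the values are invented for illustration, and the exact response shape may differ.

```python
# Hypothetical consolidation result; values are illustrative only.
result = {
    "memories_decayed": 42,
    "memories_compressed": 7,
    "memories_consolidated": 3,
    "causal_edges_discovered": 2,
    "replay_clusters": 1,
}

FIELDS = (
    "memories_decayed", "memories_compressed", "memories_consolidated",
    "causal_edges_discovered", "replay_clusters",
)

def summarize(result: dict) -> str:
    """One-line summary of a consolidation run, defaulting missing fields to 0."""
    return ", ".join(f"{k}={result.get(k, 0)}" for k in FIELDS)
```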
Step 3: Selective Operations
For targeted maintenance instead of the full pipeline:
Forget specific memories:
cortex:forget({ "memory_id": <id>, "hard": false })
Soft delete (sets heat to 0) by default. Use "hard": true for permanent deletion. Protected memories require an explicit "force": true.
Save checkpoint before risky operations:
cortex:checkpoint({ "action": "save", "label": "before-consolidation" })
Restore if something went wrong:
cortex:checkpoint({ "action": "restore", "label": "before-consolidation" })
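The save/restore contract can be illustrated with an in-memory sketch. Cortex presumably persists checkpoints rather than holding them in a dict; this only shows the labeled-snapshot semantics of the two calls above.

```python
import copy

_checkpoints: dict[str, dict] = {}  # label -> snapshot (illustrative storage)

def checkpoint(store: dict, action: str, label: str) -> dict:
    """Save a deep-copied snapshot under a label, or restore one."""
    if action == "save":
        _checkpoints[label] = copy.deepcopy(store)
        return store
    if action == "restore":
        return copy.deepcopy(_checkpoints[label])
    raise ValueError(f"unknown action: {action}")
```

Deep copies matter here: without them, later mutations of the live store would silently corrupt the saved snapshot.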
Tips
- Don't over-consolidate: Running too frequently prevents memories from naturally developing heat signals. Weekly is usually sufficient.
- Check stats first: Run cortex:memory_stats before consolidating to understand what needs maintenance
- Checkpoint before bulk operations: Always save a checkpoint before consolidation if you have critical memories
- After backfill: Always consolidate after cortex:backfill_memories to process the imported memories through the full pipeline
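The scheduling advice in the tips can be condensed into a small helper. This is a hypothetical heuristic: the weekly cadence comes from the tips, while the hot-memory ratio threshold is an invented stand-in for whatever "too many hot memories" means in your stats.

```python
from datetime import datetime, timedelta

def should_consolidate(last_run: datetime, hot_count: int, total: int,
                       now: datetime, hot_ratio_threshold: float = 0.5) -> bool:
    """Run early if the store looks noisy, otherwise at most weekly."""
    if total and hot_count / total > hot_ratio_threshold:
        return True                                # too many hot memories
    return now - last_run >= timedelta(days=7)     # weekly cadence otherwise
```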