AutoResearchClaw meta-analysis

Statistical methods for combining results across multiple studies. Use when aggregating cross-study or cross-experiment results.

install
source · Clone the upstream repo
git clone https://github.com/aiming-lab/AutoResearchClaw
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/aiming-lab/AutoResearchClaw "$T" && mkdir -p ~/.claude/skills && cp -r "$T/researchclaw/skills/builtin/experiment/meta-analysis" ~/.claude/skills/aiming-lab-autoresearchclaw-meta-analysis && rm -rf "$T"
manifest: researchclaw/skills/builtin/experiment/meta-analysis/SKILL.md
source content

Meta-Analysis Best Practices

When comparing or aggregating results across studies or experiments:

  1. Report effect sizes, not just p-values
  2. Use standardized metrics for cross-study comparison
  3. Account for heterogeneity (different setups, datasets, seeds)
  4. Report confidence intervals alongside point estimates
  5. Use forest plots to visualize cross-study comparisons
  6. Identify and discuss outliers or inconsistent results
  7. Consider publication bias when interpreting aggregate results
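Points 1–4 can be sketched with a standard random-effects pooling procedure. The example below is a minimal, self-contained illustration (not part of this skill's code) using the DerSimonian–Laird estimator: it pools hypothetical per-study effect sizes, reports a 95% confidence interval, and quantifies heterogeneity via Cochran's Q, tau², and I². All input numbers and names are made up for illustration.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effects under a random-effects model.

    effects   -- standardized effect sizes, one per study
    variances -- their sampling variances
    Returns (pooled_effect, ci_low, ci_high, tau2, i2).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fe = sum(wi * e for wi, e in zip(w, effects)) / sum(w)  # fixed-effect mean
    # Cochran's Q measures observed between-study dispersion
    q = sum(wi * (e - fe) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    i2 = max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0     # fraction of variance from heterogeneity
    # random-effects weights fold tau^2 into each study's variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2, i2

# Hypothetical effect sizes and variances from four studies
effects = [0.30, 0.45, 0.12, 0.50]
variances = [0.02, 0.03, 0.015, 0.025]
pooled, lo, hi, tau2, i2 = dersimonian_laird(effects, variances)
print(f"pooled={pooled:.3f}  95% CI=({lo:.3f}, {hi:.3f})  tau2={tau2:.4f}  I2={i2:.1%}")
```

The per-study effects with their intervals, plus the pooled estimate, are exactly the inputs a forest plot (point 5) visualizes; a large I² flags the heterogeneity discussed in point 3.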