Gsd-skill-creator science-communication

Principles and practices of communicating science to diverse audiences. Covers the CER (claims-evidence-reasoning) framework, lab report structure, peer review, scientific argumentation, audience adaptation, the baloney detection kit for evaluating sources, and the art of making complex ideas accessible without sacrificing accuracy. Use when writing, presenting, evaluating, or teaching science communication at any level.

install
source · Clone the upstream repo
git clone https://github.com/Tibsfox/gsd-skill-creator
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/Tibsfox/gsd-skill-creator "$T" && mkdir -p ~/.claude/skills && cp -r "$T/examples/skills/science/science-communication" ~/.claude/skills/tibsfox-gsd-skill-creator-science-communication && rm -rf "$T"
manifest: examples/skills/science/science-communication/SKILL.md
source content

Science Communication

Science that cannot be communicated cannot contribute to collective knowledge. Communication is not a cosmetic addition to the scientific process -- it is an integral part of it. A discovery that is not shared, a method that is not documented in enough detail to replicate, a finding that is not subjected to peer scrutiny, is not yet fully scientific. This skill covers the full spectrum of science communication: formal scientific writing, public-facing explanation, argumentation, source evaluation, and the discipline of adapting complex ideas to different audiences without losing accuracy.

Agent affinity: sagan (public communication, narrative), pestalozzi (pedagogical adaptation)

Concept IDs: sci-claims-evidence-reasoning, sci-lab-reports, sci-peer-review, sci-evaluating-sources

The Communication Spectrum

Context | Audience | Primary goal | Key constraint
Lab report | Teacher / peers | Document methods and findings for evaluation | Completeness and replicability
Research paper | Scientific community | Advance collective knowledge | Rigor and peer review
Conference talk | Specialists | Share findings with immediate feedback | Time limit, visual clarity
Public article | General audience | Inform and engage | Accessibility without inaccuracy
Science journalism | News readers | Convey significance of findings | Brevity, accuracy, avoiding sensationalism
Policy brief | Decision-makers | Inform policy with evidence | Clarity, relevance, separation of evidence from recommendation
Classroom explanation | Students | Build understanding | Level-appropriate, connected to prior knowledge

Each context requires different vocabulary, detail level, and framing. The science does not change. The communication does.

The Claims-Evidence-Reasoning (CER) Framework

CER is the foundational structure of scientific argumentation, applicable at every level from elementary school through peer-reviewed papers.

Claim: A statement that answers the question. "Blue light reduces stem elongation compared to red light."

Evidence: Specific data or observations that support the claim. "Plants grown under blue light averaged 8.2 cm after 14 days (SD = 3.6, n = 30), compared to 12.1 cm under red light (SD = 3.6, n = 30). The difference was statistically significant (t(58) = 4.2, p < 0.001, d = 1.1)."

Reasoning: The scientific principle that connects the evidence to the claim. "Blue light activates cryptochrome photoreceptors, which suppress the production of auxins responsible for cell elongation. The observed height difference is consistent with this photomorphogenic response."

Why CER matters: It makes the logical structure of scientific argument visible. A claim without evidence is an assertion. Evidence without reasoning is data. Reasoning without evidence is speculation. CER requires all three.
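The statistics in the evidence example above can be recomputed from the summary values. The sketch below is an illustration of the two-sample t statistic and Cohen's d, assuming equal group sizes and a common SD of 3.6 cm; it is not part of the CER framework itself:

```python
import math

# Summary statistics from the example CER evidence (illustrative values):
# group means in cm, an assumed common SD of 3.6 cm, and n = 30 per group.
mean_red, mean_blue, sd, n = 12.1, 8.2, 3.6, 30

diff = mean_red - mean_blue                # mean height difference in cm
pooled_sd = sd                             # equal SDs and n, so pooled SD = sd
se = pooled_sd * math.sqrt(1 / n + 1 / n)  # standard error of the difference

t = diff / se          # two-sample t statistic, df = 2n - 2 = 58
d = diff / pooled_sd   # Cohen's d effect size

print(f"t({2 * n - 2}) = {t:.1f}, d = {d:.1f}")  # t(58) = 4.2, d = 1.1
```

Recomputing reported statistics like this is itself a useful source-evaluation habit: if the quoted means, SDs, and test statistics do not agree with each other, something in the evidence is wrong.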

CER Quality Rubric

Component | Weak | Strong
Claim | Vague, untestable, or opinion | Specific, testable, directly answers the question
Evidence | Anecdotal, cherry-picked, imprecise | Quantitative, systematic, includes uncertainty
Reasoning | Missing, circular, or invokes authority | Connects evidence to claim via scientific principle, addresses alternative explanations
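The rubric's most basic check, that all three components are actually present, can be mechanized. A minimal sketch (the class and method names are invented for illustration):

```python
from dataclasses import dataclass


@dataclass
class CERArgument:
    claim: str
    evidence: str
    reasoning: str

    def missing_parts(self) -> list[str]:
        # A claim without evidence is an assertion; evidence without
        # reasoning is just data. Flag any empty component.
        parts = {"claim": self.claim, "evidence": self.evidence,
                 "reasoning": self.reasoning}
        return [name for name, text in parts.items() if not text.strip()]


arg = CERArgument(
    claim="Blue light reduces stem elongation compared to red light.",
    evidence="",  # data not yet collected
    reasoning="Cryptochromes suppress auxin-driven cell elongation.",
)
print(arg.missing_parts())  # ['evidence']
```

Presence is only the weakest rubric row, of course; judging whether evidence is systematic or reasoning addresses alternatives still requires a human reader.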

Lab Report Structure

The lab report is the standard format for documenting scientific work. Its sections map to the stages of the scientific method:

Title

Descriptive, specific, and informative. "The Effect of Light Wavelength on Arabidopsis Seedling Stem Elongation" is good. "Plant Experiment" is not.

Introduction

  • What question is being investigated and why it matters
  • Relevant background (what is already known)
  • The specific hypothesis being tested
  • Length: 1-3 paragraphs for a student lab, 1-2 pages for a research paper

Methods

  • What was done, in enough detail that another person could repeat the experiment
  • Materials and equipment listed with specifications
  • Procedure described in past tense, third person
  • Statistical methods specified
  • The replication test: If you handed this methods section to a stranger, could they run the experiment without asking you any questions?

Results

  • Data presented in tables and graphs with appropriate statistics
  • Patterns described in text, with reference to specific figures and tables
  • No interpretation in Results. State what was observed, not what it means. "Group A was significantly taller than Group B (p < 0.001)" is a result. "This suggests that the treatment works" is interpretation and belongs in Discussion.

Discussion

  • What the results mean in the context of the hypothesis
  • Whether the hypothesis was supported or refuted
  • Alternative explanations for the observed results
  • Limitations of the study
  • Suggestions for future work
  • Connection to broader scientific understanding

References

  • Every claim of prior knowledge cited with a specific source
  • Consistent citation format (APA, ACS, or as specified)

Peer Review

Peer review is the process by which scientific work is evaluated by independent experts before publication. It is the primary quality control mechanism for scientific knowledge.

How Peer Review Works

  1. Author submits a manuscript to a journal.
  2. Editor assesses whether the manuscript is appropriate for the journal and sends it to 2-3 independent reviewers (experts in the field).
  3. Reviewers evaluate methodology, analysis, conclusions, and significance.
  4. Reviewers recommend: accept, minor revision, major revision, or reject.
  5. Editor makes a decision based on reviewer recommendations.
  6. Author revises and resubmits (often multiple rounds).

What Peer Review Can Do

  • Identify methodological flaws before publication
  • Catch errors in analysis or reasoning
  • Suggest additional experiments or analyses
  • Improve clarity and completeness of the manuscript

What Peer Review Cannot Do

  • Guarantee correctness (reviewers may miss errors)
  • Detect fraud (deliberate data fabrication is hard to catch from the manuscript alone)
  • Assess replicability (that requires independent replication, not review)
  • Prevent publication bias (journals preferentially publish positive results)

Teaching Peer Review

Students learn scientific argumentation by reviewing each other's work. A structured peer review exercise:

  1. Provide students with a CER rubric
  2. Each student reviews one classmate's lab report
  3. Review is constructive: identify strengths first, then weaknesses with specific suggestions
  4. The reviewed student revises based on feedback
  5. Class discussion: what did reviewers catch? What did they miss?

Audience Adaptation

The same scientific finding must be communicated differently to different audiences. The content stays accurate; the framing, vocabulary, and detail level change.

Adaptation Framework

Audience | Vocabulary | Detail | Framing | What to include | What to omit
Expert | Technical jargon assumed | Full methodological detail | "Here is what we found" | Statistical details, methods, limitations | Basic background (they know it)
Student | Jargon introduced with definitions | Moderate detail, scaffolded | "Here is how we know" | Worked examples, diagrams, analogies | Cutting-edge nuances beyond their level
General public | Everyday language, analogies | Essential findings only | "Here is why this matters to you" | Impact, context, wonder | P-values, technical methods
Policymaker | Clear, non-technical | Summary with confidence levels | "Here is what the evidence says" | Policy-relevant implications, uncertainty ranges | Individual studies (cite consensus)
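One way to operationalize the framework is a lookup of style profiles keyed by audience. This is a hypothetical sketch; the values come from the table, but the structure and names are invented:

```python
# Hypothetical audience profiles distilled from the adaptation framework table.
ADAPTATION_PROFILES = {
    "expert": {
        "vocabulary": "technical jargon assumed",
        "framing": "here is what we found",
        "omit": "basic background",
    },
    "student": {
        "vocabulary": "jargon introduced with definitions",
        "framing": "here is how we know",
        "omit": "cutting-edge nuances",
    },
    "general public": {
        "vocabulary": "everyday language, analogies",
        "framing": "here is why this matters to you",
        "omit": "p-values, technical methods",
    },
    "policymaker": {
        "vocabulary": "clear, non-technical",
        "framing": "here is what the evidence says",
        "omit": "individual studies (cite consensus)",
    },
}


def framing_for(audience: str) -> str:
    """Return the framing line for an audience; default to the student profile."""
    profile = ADAPTATION_PROFILES.get(audience, ADAPTATION_PROFILES["student"])
    return profile["framing"]
```

The design point the table makes survives the translation: the finding is one record, and only the presentation fields vary by audience.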

The Analogy Discipline

Analogies make abstract concepts accessible. But every analogy eventually breaks down. Good science communication uses analogies AND flags where they fail:

"DNA is like a blueprint for building an organism. But unlike a building blueprint, DNA is also the construction crew -- it encodes the instructions AND the machinery that reads them. And unlike a blueprint, DNA can copy itself, with occasional errors that are the raw material for evolution."

The initial analogy gets the reader in. The corrections build real understanding.

Evaluating Scientific Sources

The Baloney Detection Kit (Sagan, 1995)

Carl Sagan's framework for evaluating claims:

  1. Independent confirmation. Has the finding been replicated by independent researchers?
  2. Substantive debate. Are knowledgeable people arguing about the details, or is there broad consensus?
  3. Authority is not evidence. Expert credentials lend credibility to a claim but do not prove it. Evidence proves it.
  4. Multiple hypotheses. Have alternative explanations been considered and tested?
  5. Quantification. Are claims backed by numbers, or only by anecdotes?
  6. Falsifiability. Can the claim be proven wrong? If not, it is not a scientific claim.
  7. Occam's razor. Among competing explanations that fit the data equally well, the simplest is preferred.
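The kit can be applied as a literal checklist. The sketch below condenses the seven questions into yes/no checks; the scoring mechanism is an invented illustration, not part of Sagan's original formulation:

```python
# Sagan's seven questions, phrased as yes/no checks (wording condensed).
BALONEY_CHECKS = [
    "Independently confirmed by other researchers?",
    "Subject to substantive debate by knowledgeable people?",
    "Supported by evidence rather than authority alone?",
    "Alternative hypotheses considered and tested?",
    "Claims quantified rather than anecdotal?",
    "Falsifiable in principle?",
    "Simplest explanation among those that fit the data?",
]


def failed_checks(answers: list[bool]) -> list[str]:
    """Return the checks a claim fails; answers align with BALONEY_CHECKS."""
    return [check for check, ok in zip(BALONEY_CHECKS, answers) if not ok]


# A claim resting on authority alone and unfalsifiable fails two checks.
verdict = failed_checks([True, True, False, True, True, False, True])
```

A failed check does not prove a claim false; it marks where scrutiny should be applied before the claim is accepted.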

Source Quality Hierarchy

Source | Reliability | Why
Peer-reviewed journal article | Highest | Reviewed by experts before publication
Government agency report (NOAA, CDC, NASA) | High | Institutional review, large datasets
University press release | Medium | Summarizes peer-reviewed work but may oversimplify
Science journalism (reputable outlets) | Medium | Interpreted by journalists, varying quality
Wikipedia | Medium (for overview) | Community-edited, check the cited sources
Social media, blogs, YouTube | Low (without verification) | No review process, may be accurate or wildly wrong
"My friend said..." | Not scientific | Anecdote, not evidence

Common Communication Mistakes

Mistake | Problem | Fix
"Science proves..." | Science provides evidence, not proof | "Evidence strongly supports..."
Reporting only p-values | Statistical significance without effect size is incomplete | Always report effect size alongside significance
Correlation presented as causation | "X is associated with Y" becomes "X causes Y" in the headline | Explicitly state the study type and what it can/cannot show
False balance | Presenting a fringe view as equal to scientific consensus | Report the weight of evidence, not just the existence of disagreement
Missing uncertainty | "The temperature will rise 3 degrees" without a range | "The best estimate is 3 degrees (range: 1.5 to 4.5)"
Jargon without definition | Technical terms exclude the audience | Define terms on first use, or use everyday language

Cross-References

  • sagan agent: Public science communication specialist. Draws on this skill for communication principles and narrative techniques.
  • pestalozzi agent: Pedagogical communication. Adapts scientific content for student audiences using this skill's adaptation framework.
  • feynman-s agent: Evaluates whether communicated science maintains methodological accuracy.
  • darwin agent: Synthesizes specialist outputs into user-facing communication.
  • scientific-method skill: The inquiry process that science communication documents and shares.
  • data-analysis-sci skill: The statistical methods whose results must be communicated accurately.

References

  • Sagan, C. (1995). The Demon-Haunted World: Science as a Candle in the Dark. Random House.
  • McNeill, K. L., & Krajcik, J. (2012). Supporting Grade 5-8 Students in Constructing Explanations in Science. Pearson.
  • Olson, R. (2009). Don't Be Such a Scientist. Island Press.
  • National Research Council. (2012). A Framework for K-12 Science Education. National Academies Press.
  • Fischhoff, B., & Scheufele, D. A. (2013). "The science of science communication." PNAS, 110(Supplement 3), 14031-14032.