awesome-claude-corporate-skills · user-research-synthesizer

Synthesize user research findings from interviews, surveys, and analytics. Create insight reports, customer journey maps, and actionable recommendations based on research data and qualitative findings.

Install

Source · Clone the upstream repo:
git clone https://github.com/w95/awesome-claude-corporate-skills

Claude Code · Install into ~/.claude/skills/:
T=$(mktemp -d) && git clone --depth=1 https://github.com/w95/awesome-claude-corporate-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/09-product-management/user-research-synthesizer" ~/.claude/skills/w95-awesome-claude-corporate-skills-user-research-synthesizer && rm -rf "$T"

Manifest: 09-product-management/user-research-synthesizer/SKILL.md

User Research Synthesizer

Overview

The User Research Synthesizer skill enables product managers to extract meaningful insights from multiple research sources, identify patterns, and translate findings into actionable product recommendations. It bridges qualitative and quantitative research.

When to Use This Skill

  • Consolidating findings from customer research
  • Identifying user needs and pain points
  • Mapping customer journey and touchpoints
  • Discovering feature opportunities
  • Validating product assumptions
  • Creating evidence-based recommendations
  • Communicating research insights to stakeholders

Research Synthesis Methodology

Multi-Source Research Consolidation

Research Sources Integration:

Qualitative Research (What users think and feel)

  • Customer interviews (depth, stories, motivations)
  • Usability testing (observed behavior, pain points)
  • Focus groups (broader perspectives, discussion)
  • Open-ended surveys (rich feedback, themes)

Quantitative Research (How many users experience something)

  • Usage analytics (feature adoption, engagement patterns)
  • Surveys (scale of opinions, segment differences)
  • Customer feedback (NPS comments, satisfaction scores)
  • A/B testing results (statistical validation)

Behavioral Signals (What users actually do)

  • Product analytics (feature usage, drop-off points)
  • Support tickets (problems experienced)
  • Feature request trends (demand signals)
  • Churn analysis (why users leave)

Synthesis Process Framework

Step 1: Data Collection

  • Compile all research raw data
  • Organize by source and date
  • Ensure consistent note-taking format
  • Document research context and sample size

Step 2: Individual Source Analysis

  • Interview transcription and tagging
  • Survey data cleaning and basic analysis
  • Analytics report generation
  • Support ticket categorization

Step 3: Cross-Source Pattern Identification

  • Identify themes appearing in multiple sources
  • Note conflicting findings (important!)
  • Look for statistical validation of qualitative themes
  • Assess confidence in findings

Step 4: Insight Development

  • Define key findings (pattern + implication)
  • Prioritize by frequency, impact, and confidence
  • Generate actionable recommendations
  • Identify areas needing more research

Step 5: Communication and Action

  • Create insight report with visuals
  • Present findings to stakeholders
  • Translate insights to feature opportunities
  • Plan follow-up research if needed
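
To make Step 3 concrete, here is a minimal sketch of cross-source triangulation, assuming theme mentions have already been tagged per source in Step 2 (all data and names here are hypothetical):

```python
from collections import defaultdict

# Hypothetical tagged mentions from Step 2: (theme, source_type) pairs.
mentions = [
    ("status_visibility", "interviews"),
    ("status_visibility", "support_tickets"),
    ("status_visibility", "analytics"),
    ("tool_integration", "interviews"),
    ("dark_mode", "surveys"),
]

sources_per_theme = defaultdict(set)
for theme, source in mentions:
    sources_per_theme[theme].add(source)

# A theme corroborated by independent source types earns higher confidence.
for theme, sources in sorted(sources_per_theme.items()):
    level = "high" if len(sources) >= 3 else "medium" if len(sources) == 2 else "low"
    print(f"{theme}: {len(sources)} source type(s) -> {level} confidence")
```

Themes that surface in only one source type fall into the "assess confidence" bucket rather than being dropped outright.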

Interview Synthesis Framework

Interview Data Organization

Research Report Header:

  • Research objective: [What question we're answering]
  • Methodology: [In-depth interviews, format]
  • Sample size: [Number of interviews conducted]
  • Participant profile: [Who we interviewed]
  • Interview dates: [Date range]
  • Duration per interview: [Minutes]

Codebook Development (Interview Themes)

Theme 1: Time Management Challenges

  • Definition: Users struggle to balance multiple competing tasks
  • Keywords: "priority", "urgent", "deadline", "overwhelmed"
  • Frequency: Mentioned in 8 of 12 interviews (67%)
  • Representative quote: "I'm constantly torn between what's urgent and what's important"
  • Subcategories:
    • Task prioritization difficulty
    • Meeting overload
    • Calendar conflicts
    • Notification fatigue

Theme 2: Team Communication Friction

  • Definition: Lack of clarity about project status leads to miscommunication
  • Keywords: "status meeting", "didn't know", "confusion", "alignment"
  • Frequency: Mentioned in 10 of 12 interviews (83%)
  • Representative quote: "We spend 3 hours in sync meetings that could be 20-minute updates"
  • Subcategories:
    • Status synchronization
    • Asynchronous vs. synchronous balance
    • Timezone challenges
    • Document sprawl

Theme 3: Tool Integration Complexity

  • Definition: Users juggle multiple tools and integrations are fragmented
  • Keywords: "switching between", "copy-paste", "sync", "scattered"
  • Frequency: Mentioned in 7 of 12 interviews (58%)
  • Representative quote: "I'm constantly copying information between 5 different tools"
  • Subcategories:
    • Data duplication
    • Switching costs
    • Integration breaks
    • Learning curve

Interview Analysis Table

Theme             Frequency    Strength     Confidence  Severity  Opportunity
Time Management   8/12 (67%)   Strong       High        High      High
Communication     10/12 (83%)  Very Strong  Very High   Critical  Critical
Tool Integration  7/12 (58%)   Medium       Medium      Medium    Medium
Feature X         3/12 (25%)   Weak         Low         Low       Low
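
A minimal keyword-tagging sketch that produces frequency counts like those in the table above; the transcripts and keyword lists are illustrative, and a human coder should still review every match:

```python
# Hypothetical transcripts; in practice, load one string per interview.
transcripts = [
    "we're overwhelmed by deadlines and the weekly status meeting",
    "there's constant confusion about alignment across teams",
    "i keep switching between tools and copy-paste everything",
]

codebook = {
    "Time Management": ["priority", "urgent", "deadline", "overwhelmed"],
    "Communication": ["status meeting", "didn't know", "confusion", "alignment"],
    "Tool Integration": ["switching between", "copy-paste", "sync", "scattered"],
}

n = len(transcripts)
for theme, keywords in codebook.items():
    hits = sum(1 for t in transcripts if any(k in t.lower() for k in keywords))
    print(f"{theme}: {hits}/{n} interviews ({hits / n:.0%})")
```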

Insight Extraction Template

Insight 1: Status Synchronization is Critical

  • Supporting data:
    • 10 of 12 interviewed users mentioned status visibility
    • Average 3.5 hours per week in status update meetings
    • Product analytics: 60% of logins are to check team status
  • Customer quote: "We could cut meeting time in half if people knew what everyone was working on without asking"
  • Business implication:
    • Core workflow pain point affecting majority of users
    • Opportunity to reduce meeting time (productivity gain)
    • Potential for product differentiation
  • Recommended action:
    • Prioritize real-time status visibility feature
    • Design activity feeds and notifications
    • Validate with prototype testing

Customer Journey Mapping

Journey Definition

Journey name: [Scenario being mapped]
Duration: [Timeframe of journey]
Persona: [Who goes on this journey]
Objective: [What they're trying to achieve]

Journey Stage Framework

Stage 1: Awareness (User recognizes problem)

  • Situation: [Context user is in]
  • Tasks: [What user is trying to do]
  • Touchpoints: [Where they interact with you]
    • Discovery channels: Search, recommendations, word-of-mouth
    • Awareness content: Blog, social media, product hunt
    • Touchpoint analysis: which are most effective?
  • Pain points: [Friction they experience]
    • Unclear value proposition
    • Hard to find solution
    • Trust/credibility questions
  • Emotions: [How they feel]
    • Frustration with current approach
    • Hopeful about potential solution
    • Skeptical of new tools
  • Opportunities: [How to improve experience]
    • Clearer positioning
    • SEO optimization
    • Social proof and testimonials

Stage 2: Consideration (Evaluating solutions)

  • Situation: [Research and evaluation happening]
  • Tasks: [Specific research they do]
    • Reading reviews and comparisons
    • Requesting demos
    • Talking to customers
    • Free trial signup
  • Touchpoints: [Interaction points during evaluation]
    • Product website and pricing
    • G2/Capterra reviews
    • Customer testimonials
    • Sales conversations
    • Free trial experience
  • Pain points: [Friction in evaluation process]
    • Demo scheduling difficulty
    • Incomplete feature information
    • Pricing not transparent
    • Trial too limited to evaluate
  • Emotions: [Emotional state during evaluation]
    • Cautious optimism
    • Comparison anxiety
    • Risk aversion
    • Decision fatigue
  • Opportunities: [Improve conversion]
    • Streamline demo process
    • Transparent feature comparison
    • Social proof (reviews, case studies)
    • Longer/fuller trials

Stage 3: Purchase (Buying and setup)

  • Situation: [Decision made, implementing]
  • Tasks: [Getting product set up and running]
    • Signing contracts/purchasing
    • Data migration/setup
    • Team onboarding
    • Initial configuration
  • Touchpoints: [Implementation interactions]
    • Sales conversations
    • Implementation team
    • Setup documentation
    • Support tickets
    • Onboarding webinars
  • Pain points: [Friction during implementation]
    • Slow contract negotiation
    • Data migration complexity
    • Poor onboarding materials
    • Lack of implementation support
    • Integration/setup issues
  • Emotions: [State during implementation]
    • Excitement about new tool
    • Implementation anxiety
    • Concerns about team adoption
    • Pressure to show ROI quickly
  • Opportunities: [Improve onboarding]
    • Streamlined contracts
    • Assisted data migration
    • Better onboarding content
    • Dedicated support during setup
    • Quick-win focused training

Stage 4: Adoption (Using product daily)

  • Situation: [Product in use, teams learning]
  • Tasks: [Daily product usage]
    • Learning features
    • Using core workflows
    • Teaching team members
    • Integrating with other tools
  • Touchpoints: [Regular product interactions]
    • In-product tutorials
    • Documentation
    • Support tickets
    • In-app onboarding
    • Customer community
  • Pain points: [Usage friction]
    • Feature discovery (didn't know feature existed)
    • Workflow inefficiency
    • Integration issues
    • Team resistance to change
    • Poor in-app guidance
  • Emotions: [During adoption phase]
    • Learning curve frustration
    • Excitement about productivity gains
    • Team adoption pressure
    • ROI justification stress
  • Opportunities: [Improve adoption]
    • Better in-app guidance
    • Progress tracking/milestones
    • Success metrics dashboard
    • Community and peer learning
    • Personalized recommendations

Stage 5: Retention (Ongoing use, renewal)

  • Situation: [Regular, habitual product usage]
  • Tasks: [Ongoing product usage]
    • Daily workflows using product
    • Monitoring dashboards
    • Managing team collaboration
    • Optimizing usage
  • Touchpoints: [Regular interactions]
    • Product usage itself
    • Support for questions
    • Feature updates
    • Renewal conversations
    • Usage analytics
  • Pain points: [Retention friction]
    • Competing tools introduced
    • Feature requests not addressed
    • Performance degradation
    • Team expansion costs
    • Better alternatives appear
  • Emotions: [Ongoing satisfaction state]
    • Routine/habitual use
    • Pride in productivity
    • Occasional frustration
    • Switching cost anxiety
  • Opportunities: [Improve retention and expansion]
    • New feature education
    • Advanced use case training
    • ROI reporting
    • VIP support
    • Expansion pricing for teams

Visual Journey Map Template

STAGE:        Awareness     Consideration    Purchase     Adoption     Retention
              ─────────────────────────────────────────────────────────────────
Touchpoints:  Blog post     Demo call        Contract     Onboarding   Updates
              Review site   Trial signup     Setup        In-product   Support
              Social        Comparison       Support      Community    Account review

Pain Points:  How to find?  Feature info     Data         Learning     Competing tools
                            missing          migration    curve        Better alt?
                            Unclear ROI      complexity

Emotions:     Hopeful       Cautious         Excited +    Learning +   Satisfied
              Skeptical     Anxious          Anxious      Frustrated   Routine

Opp Score:    Medium        High             Critical     High         Medium
              (improve      (streamline      (reduce      (ease        (increase
              awareness)    evaluation)      friction)    adoption)    loyalty)
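
If the journey map is kept as data rather than only as a slide, a small structure like the sketch below (field names are illustrative) makes it easy to version, diff, and render:

```python
from dataclasses import dataclass, field

@dataclass
class JourneyStage:
    name: str
    touchpoints: list[str] = field(default_factory=list)
    pain_points: list[str] = field(default_factory=list)
    emotions: list[str] = field(default_factory=list)
    opportunity_score: str = "Medium"  # Low / Medium / High / Critical

journey = [
    JourneyStage("Awareness", ["Blog post", "Review site"],
                 ["Hard to find solution"], ["Hopeful", "Skeptical"], "Medium"),
    JourneyStage("Purchase", ["Contract", "Setup"],
                 ["Data migration complexity"], ["Excited", "Anxious"], "Critical"),
]

for stage in journey:
    print(f"{stage.name:<12} opp={stage.opportunity_score:<8} "
          f"pains: {', '.join(stage.pain_points)}")
```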

Survey Analysis and Insights

Survey Data Cleaning

Step 1: Check for Data Quality

  • Incomplete responses: Remove if <80% complete
  • Speeders: Remove responses completed in less than one-third of the median completion time
  • Duplicates: Check for IP duplicates, same ID multiple times
  • Pattern responses: Remove obvious spam (all 5s or all 1s)
  • Language: Translate or exclude non-English responses
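
A minimal pandas sketch of these quality filters, assuming a response export with `completion_pct`, `duration_sec`, `ip`, and `q_`-prefixed rating columns (all hypothetical names):

```python
import pandas as pd

df = pd.read_csv("survey_responses.csv")  # hypothetical export
rating_cols = [c for c in df.columns if c.startswith("q_")]  # assumed naming

df = df[df["completion_pct"] >= 80]                              # incomplete responses
df = df[df["duration_sec"] >= df["duration_sec"].median() / 3]   # speeders
df = df.drop_duplicates(subset=["ip"])                           # duplicate submissions
df = df[df[rating_cols].nunique(axis=1) > 1]                     # straight-liners (all 5s / all 1s)
```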

Step 2: Segment Analysis

  • Company size: Analyze enterprise vs. SMB separately
  • Use case: Different patterns by industry or use case
  • Stage: New users vs. experienced users
  • Role: Manager vs. individual contributor
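
Continuing the sketch above, segment cuts are one groupby away (the segment column and its values are assumed):

```python
# Mean rating per question, split by segment; large gaps justify separate analysis.
by_segment = df.groupby("company_size")[rating_cols].mean().round(2)
print(by_segment)

# Flag questions where enterprise and SMB diverge by more than half a point.
gap = (by_segment.loc["enterprise"] - by_segment.loc["smb"]).abs()
print(gap[gap > 0.5])
```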

Finding Correlation in Survey Data

Question 1: Feature Importance Ranking (Rate importance 1-5)

  • Feature A: 4.2 average rating
  • Feature B: 3.8 average rating
  • Feature C: 3.1 average rating
  • Feature D: 2.4 average rating

Question 2: Pain Point Frequency (Rate frequency of problem)

  • Coordination difficulty: 4.1 average
  • Integration issues: 3.6 average
  • Learning curve: 2.9 average

Correlation finding:

  • Users reporting high coordination difficulty also rated collaboration features highest
  • Suggests product should emphasize coordination/collaboration
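
A sketch of that correlation check, reusing the cleaned DataFrame from above (column names hypothetical):

```python
# Pearson correlation between a pain-point rating and a feature-importance rating.
r = df["pain_coordination"].corr(df["importance_collaboration"])
print(f"r = {r:.2f}")  # a clearly positive r supports emphasizing collaboration
```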

NPS Analysis

Net Promoter Score (NPS) question: "How likely are you to recommend us to a colleague?" (0-10 scale)

Segmentation:

  • Promoters (9-10): 35% (satisfied, growth drivers)
  • Passives (7-8): 40% (satisfied but could leave)
  • Detractors (0-6): 25% (dissatisfied, churn risk)

NPS = 35% (promoters) − 25% (detractors) = +10 (benchmark: industry average is 20-30)
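
The same arithmetic from raw 0-10 scores, as a minimal sketch:

```python
def nps(scores: list[int]) -> int:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Matches the segmentation above: 35% promoters, 40% passives, 25% detractors.
scores = [10] * 35 + [7] * 40 + [5] * 25
print(nps(scores))  # 10
```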

Detractor Analysis:

  • Common pain: "Integration broken with X tool"
  • Common pain: "Customer support slow to respond"
  • Common pain: "Pricing increased too much"

Recommendation: prioritize in this order:

  1. Integration stability (fix broken integrations)
  2. Support improvements (faster response)
  3. Transparent pricing (communicate value)

Competitive User Experience Analysis

Competitive Feature Comparison

Feature                  Our Product  Competitor A        Competitor B        Competitor C
Real-time collaboration  ✓            ✓                   ✓                   -
Mobile app               Mobile web   Native iOS/Android  Native iOS/Android  Web only
Integrations             15           50                  200+                8
Pricing                  $10/user     $8/user             $15/user            $5/user
Ease of use              Good         Excellent           Good                Okay
Support                  Email/chat   24/7 phone + chat   Community only      Email
Uptime                   99.9%        99.95%              99.5%               99.8%

Key findings:

  • Competitors have more integrations (weakness)
  • Our ease of use is competitive (strength)
  • Mobile experience varies (opportunity to differentiate)
  • Pricing in middle range (not a differentiator)

Usage Analytics Insights

Feature Adoption Metrics

Feature: Real-time notifications

  • Users with feature enabled: 65%
  • Daily active use: 40% of enabled users
  • Weekly active use: 60% of enabled users
  • Average uses per day: 4.2
  • Activation time: 2 days to first use
  • Retention: 70% still using after 30 days

Analysis:

  • Good adoption, but lower daily active use suggests the feature is not critical for all users
  • Fast activation shows clear value proposition
  • 30-day retention is solid, but leaves room for improvement
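
A sketch of deriving these metrics from a raw event log; the file, the schema (`user_id`, `ts`), and the last-event proxy for retention are all assumptions:

```python
import pandas as pd

events = pd.read_csv("notification_events.csv", parse_dates=["ts"])

first_use = events.groupby("user_id")["ts"].min()
last_use = events.groupby("user_id")["ts"].max()

# Proxy for 30-day retention: users whose last event is 30+ days after their first.
retained = (last_use - first_use >= pd.Timedelta(days=30)).mean()
print(f"Users with feature events: {events['user_id'].nunique()}")
print(f"30-day retention (proxy): {retained:.0%}")
```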

Engagement Funnel Analysis

Week 1 (Onboarding)

  • Sign up: 1,000
  • First login: 950 (95%)
  • Complete setup: 850 (85%)
  • Create first project: 720 (72%)
  • Invite team member: 450 (45%)
  • Drop-off point: creating the first project to inviting a team member (72% → 45%, a 27-point drop)

Action: Improve team invitation workflow

  • Current friction: Difficult to find invite option
  • Solution: Add prominent invite call-to-action after setup
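
The funnel arithmetic itself is simple; this sketch flags the largest step-to-step drop using the Week 1 numbers above:

```python
funnel = [
    ("Sign up", 1000),
    ("First login", 950),
    ("Complete setup", 850),
    ("Create first project", 720),
    ("Invite team member", 450),
]

total = funnel[0][1]
for step, n in funnel:
    print(f"{step:<22} {n:>5} ({n / total:.0%})")

# Largest absolute drop between consecutive steps.
(a, b) = max(zip(funnel, funnel[1:]), key=lambda p: p[0][1] - p[1][1])
print(f"Biggest drop: {a[0]} -> {b[0]} ({(a[1] - b[1]) / total:.0%} of signups lost)")
```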

Week 2-4 (Early Adoption)

  • Weekly active users: 600 (60% of initial)
  • Average sessions/week: 3.2
  • Average session duration: 18 minutes
  • Features used per session: 2.5
  • Drop-off point: between weeks 1 and 2 (weekly actives fall to 60% of signups)

Action: Improve week 2 engagement

  • Many users not returning after initial setup
  • Implement re-engagement email campaign
  • Add guided feature tours

Insight Prioritization Framework

Insight Impact and Confidence Matrix

                  High Confidence          Low Confidence
                ┌────────────────────────┬────────────────────────┐
High Impact     │ [CRITICAL]             │ [STRATEGIC]            │
                │ Act immediately        │ Plan carefully         │
                ├────────────────────────┼────────────────────────┤
Low Impact      │ [MONITOR]              │ [LEARN MORE]           │
                │ Watch for changes      │ Need more data         │
                │                        │ before action          │
                └────────────────────────┴────────────────────────┘

Critical Quadrant (High impact, High confidence)

  • Real-time status visibility reduces meeting time by 30%
  • User: "We spend 3 hours weekly updating status that could be automated"
  • Data: 83% of users mention status meetings, avg 3.5 hours/week
  • Action: Prioritize activity feed feature

Strategic Quadrant (High impact, Lower confidence)

  • AI-powered recommendations could increase feature adoption
  • User feedback: Mentions of "didn't know feature existed"
  • Data: Only 40% of users discover advanced features
  • Action: Run prototype test, validate value before full commitment

Monitor Quadrant (Lower impact, High confidence)

  • Dark mode is frequently requested
  • Data: 65% of users request feature in survey
  • Action: Add to roadmap as quick-win, not critical path

Learn More Quadrant (Lower impact, Lower confidence)

  • Possible export format demand
  • Only 3 users mentioned it; unclear whether the need is widespread
  • Action: Follow up with survey on export formats
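
A minimal sketch that maps two 0-1 scores onto these quadrants (thresholds and example scores are illustrative):

```python
def quadrant(impact: float, confidence: float) -> str:
    """Map impact/confidence scores (0-1) to an action quadrant."""
    if impact >= 0.5:
        return "CRITICAL: act immediately" if confidence >= 0.5 else "STRATEGIC: plan carefully"
    return "MONITOR: watch for changes" if confidence >= 0.5 else "LEARN MORE: need more data"

insights = {
    "Real-time status visibility": (0.9, 0.9),
    "AI-powered recommendations": (0.8, 0.4),
    "Dark mode": (0.3, 0.9),
    "New export formats": (0.2, 0.2),
}
for name, (impact, confidence) in insights.items():
    print(f"{name}: {quadrant(impact, confidence)}")
```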

Actionable Recommendations Framework

Recommendation Development Template

Insight: [Core finding]

Supporting evidence:

  • Qualitative: [Interview quotes, observations]
  • Quantitative: [Survey %, Analytics data]
  • Behavioral: [Usage patterns, support tickets]

Implication: [What this means for product]

Recommended action: [Specific, prioritized action]

  • Primary action: [Top priority]
  • Secondary actions: [Supporting initiatives]
  • Success metrics: [How we'll know it worked]
  • Timeline: [When to implement]
  • Resources needed: [Time, people, budget]

Risk/considerations:

  • Potential unintended consequences
  • Resource constraints
  • Alignment with strategy

Research Insights Reporting

Executive Summary Report

Research Overview:

  • Objective: [What question did we answer]
  • Methodology: [e.g., 8 interviews, a 250-person survey, 2 weeks of analytics]
  • Key finding: [1-sentence core insight]

Key Findings: (3-5 top insights with supporting data)

Recommendations: (Prioritized action items with impact)

Next steps: (Follow-up research, validation needed)


Research Synthesis Checklist

  • All research data collected and organized
  • Interviews transcribed and coded by theme
  • Survey data cleaned and analyzed
  • Analytics reports generated
  • Patterns identified across multiple sources
  • Conflicting findings noted and explained
  • Insights mapped to user journey stages
  • Competitive analysis completed
  • Key recommendations developed
  • Evidence compiled for each insight
  • Prioritization completed (impact vs. confidence)
  • Stakeholder presentation prepared
  • Research shared in accessible format
  • Action items assigned and tracked

Output Deliverables

  1. Research Synthesis Report - All findings compiled, 10-15 pages
  2. Customer Journey Map - Visual journey with pain points and opportunities
  3. Insight Brief - Top 5-7 insights with supporting data
  4. Actionable Recommendations - Prioritized product/UX improvements
  5. Competitive Analysis - Feature comparison and market positioning
  6. Interview Codebook - Themes with frequencies and quotes
  7. Survey Analysis - Key findings and segment analysis
  8. Visualization Deck - Charts, maps, matrices for presentations
  9. Raw Research Archive - Interviews, survey data, analytics reports