# claude-skill-registry · analytics-plan

Define an analytics tracking plan for features and initiatives.

## Install

Clone the upstream repo:

```shell
git clone https://github.com/majiayu000/claude-skill-registry
```

Or, for Claude Code, install into `~/.claude/skills/`:

```shell
T=$(mktemp -d) && git clone --depth=1 https://github.com/majiayu000/claude-skill-registry "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/data/analytics-plan" ~/.claude/skills/majiayu000-claude-skill-registry-analytics-plan && rm -rf "$T"
```

Manifest: `skills/data/analytics-plan/SKILL.md`

## Source Content

# Analytics Plan

Create comprehensive analytics tracking plans to measure feature success.

## When to Use

- Before implementing a new feature (define what to track)
- When launching an experiment
- When setting up product analytics
- When defining success metrics

## Used By

- Data Analyst (primary owner)
- Growth Marketer (growth metrics)
- Product Manager (success metrics)
- Full-Stack Engineer (implementation)

## Analytics Plan Template

# Analytics Plan: [Feature/Initiative Name]

**Author**: [Name]
**Date**: [Date]
**Status**: Draft | Approved | Implemented

---

## Overview

### Feature Description
[Brief description of the feature]

### Business Questions
What decisions will this data inform?

1. [Question 1]
2. [Question 2]
3. [Question 3]

### Success Criteria
How will we know if this feature is successful?

- **Primary Metric**: [Metric] - Target: [X]
- **Secondary Metric**: [Metric] - Target: [X]
- **Guardrail Metric**: [Metric] - Should not decrease by [X%]

---

## Event Tracking

### Core Events

| Event Name | Trigger | Properties | Priority |
|------------|---------|------------|----------|
| `[event_name]` | [When fired] | [Key properties] | P1 |
| `[event_name]` | [When fired] | [Key properties] | P1 |
| `[event_name]` | [When fired] | [Key properties] | P2 |

### Event Specifications

#### `feature_viewed`
**Trigger**: When user views the feature for the first time in session

**Properties**:
| Property | Type | Required | Description |
|----------|------|----------|-------------|
| `source` | string | Yes | Where user came from |
| `variant` | string | No | A/B test variant |
| `user_tier` | string | Yes | Free/Pro/Enterprise |

**Example**:
```json
{
  "event": "feature_viewed",
  "properties": {
    "source": "navigation",
    "variant": "control",
    "user_tier": "pro"
  }
}
```

#### `feature_action_completed`
**Trigger**: When user completes the primary action

**Properties**:
| Property | Type | Required | Description |
|----------|------|----------|-------------|
| `action_type` | string | Yes | Type of action |
| `time_to_complete` | number | Yes | Seconds from start |
| `success` | boolean | Yes | Action succeeded |

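The property tables above can be enforced at the point of tracking. Below is a minimal Python sketch of a hypothetical `track` helper (not tied to any particular analytics SDK); the event specs mirror the two tables above:

```python
# Hypothetical wrapper that validates an event against its spec
# before handing it to an analytics SDK. Specs mirror the property
# tables above; names and fields are illustrative.

EVENT_SPECS = {
    "feature_viewed": {
        "required": {"source", "user_tier"},
        "optional": {"variant"},
    },
    "feature_action_completed": {
        "required": {"action_type", "time_to_complete", "success"},
        "optional": set(),
    },
}

def track(event: str, properties: dict) -> dict:
    """Validate an event payload and return it, or raise ValueError."""
    spec = EVENT_SPECS.get(event)
    if spec is None:
        raise ValueError(f"unknown event: {event}")
    missing = spec["required"] - properties.keys()
    if missing:
        raise ValueError(f"{event} missing required properties: {sorted(missing)}")
    unexpected = properties.keys() - spec["required"] - spec["optional"]
    if unexpected:
        raise ValueError(f"{event} has unexpected properties: {sorted(unexpected)}")
    return {"event": event, "properties": properties}
```

Failing fast here keeps bad payloads out of the warehouse, which is cheaper than cleaning them downstream.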
---

## Funnel Definition

### Primary Funnel: [Feature Adoption]

```
Step 1: feature_viewed
    ↓ [Target: 80%]
Step 2: feature_started
    ↓ [Target: 60%]
Step 3: feature_completed
    ↓ [Target: 40%]
Step 4: feature_repeated (within 7 days)
```

### Funnel Analysis Questions

- Where is the biggest drop-off?
- How does drop-off vary by user segment?
- What's the time between steps?
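The funnel can be computed directly from raw events. A Python sketch, assuming events arrive as `(user_id, event_name)` pairs (an illustrative input format, not a specific tool's export):

```python
# Count users who reach each funnel step, requiring that they also
# reached every prior step. Step names follow the funnel above.

FUNNEL = ["feature_viewed", "feature_started",
          "feature_completed", "feature_repeated"]

def funnel_conversion(events):
    """Return [(step, users_reaching_step_and_all_prior_steps), ...]."""
    users_by_step = {step: set() for step in FUNNEL}
    for user_id, event in events:
        if event in users_by_step:
            users_by_step[event].add(user_id)
    reached = None
    report = []
    for step in FUNNEL:
        # Intersect with prior steps so out-of-order users don't inflate counts.
        reached = users_by_step[step] if reached is None else reached & users_by_step[step]
        report.append((step, len(reached)))
    return report
```

Dividing each count by the previous step's count gives the step-over-step conversion to compare against the targets above.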

---

## User Properties

| Property | Type | Description | When Updated |
|----------|------|-------------|--------------|
| `has_used_feature` | boolean | User has ever used feature | On first use |
| `feature_usage_count` | number | Times user used feature | On each use |
| `first_feature_use` | timestamp | When first used | On first use |
| `last_feature_use` | timestamp | Most recent use | On each use |
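The update rules in the table can be sketched as a small Python function. An in-memory dict stands in for whatever user-profile store or analytics-tool API actually holds these properties:

```python
# Apply the user-property update rules from the table above on each
# feature use. `profile` is a plain dict standing in for a real
# user-profile store; timestamps are ISO-8601 strings for illustration.

def apply_feature_use(profile: dict, timestamp: str) -> dict:
    """Update user properties for one feature-use event."""
    if not profile.get("has_used_feature"):
        # First use: set the one-time properties.
        profile["has_used_feature"] = True
        profile["first_feature_use"] = timestamp
    # Every use: bump the counter and refresh recency.
    profile["feature_usage_count"] = profile.get("feature_usage_count", 0) + 1
    profile["last_feature_use"] = timestamp
    return profile
```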

---

## Segments

### Key Segments to Analyze

| Segment | Definition | Why Important |
|---------|------------|---------------|
| New Users | `account_age < 7 days` | Adoption patterns |
| Power Users | `feature_usage > 10/week` | Success indicators |
| At-Risk | `no_activity > 14 days` | Retention insights |
| By Plan | `plan_type = [free/pro/enterprise]` | Monetization |

---

## Dashboard Requirements

### Overview Dashboard

**Purpose**: Daily monitoring of feature health

**Metrics to Include**:

- Daily Active Users (DAU)
- Feature adoption rate
- Primary action completion rate
- Error rate

**Filters**:

- Date range
- User segment
- Platform

### Deep Dive Dashboard

**Purpose**: Understanding patterns and opportunities

**Charts to Include**:

- Funnel visualization
- Cohort retention
- Time-based trends
- Segment comparison

---

## Experiment Plan (if applicable)

### Hypothesis

[Change] will lead to [X% improvement] in [metric] because [reason].

### Test Setup

- **Control**: [Current experience]
- **Variant**: [New experience]
- **Allocation**: [50/50 or other]
- **Duration**: [X weeks]
- **Sample Size Needed**: [X users per variant]

### Success Metrics

| Metric | Baseline | MDE (minimum detectable effect) | Direction |
|--------|----------|---------------------------------|-----------|
| Primary: [metric] | [X%] | [Y%] | Increase |
| Secondary: [metric] | [X] | [Y] | Increase |
| Guardrail: [metric] | [X%] | [Y%] | No decrease |
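The sample size per variant can be estimated with the standard two-proportion formula. A Python sketch, assuming a two-sided significance level of 0.05 (z ≈ 1.96) and 80% power (z ≈ 0.84), with the MDE given as an absolute lift over the baseline rate:

```python
import math

def sample_size_per_variant(baseline: float, mde: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Users needed per variant to detect an absolute lift of `mde`
    over `baseline`, at ~5% significance (two-sided) and ~80% power.
    Standard two-proportion approximation; rates are fractions (0.10 = 10%)."""
    p1, p2 = baseline, baseline + mde
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2) * variance / (mde ** 2)
    return math.ceil(n)
```

For example, detecting a 2-point lift on a 10% baseline needs roughly 3,800 users per variant; halving the MDE roughly quadruples the requirement.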

### Analysis Plan

- Primary analysis at [X] days
- Segment analysis by [dimensions]
- Document learnings regardless of outcome

---

## Implementation Checklist

### Before Development

- [ ] Analytics plan reviewed by data/product
- [ ] Event names follow naming convention
- [ ] Success metrics approved

### During Development

- [ ] Events implemented with correct properties
- [ ] Events fire at correct times
- [ ] Properties populated correctly

### Before Launch

- [ ] Events tested in staging
- [ ] Dashboard created
- [ ] Baseline metrics captured
- [ ] Alert thresholds set

### After Launch

- [ ] Verify data flowing correctly
- [ ] Check for data quality issues
- [ ] Monitor metrics daily for first week

---

## Data Quality Checks

| Check | Query/Method | Expected |
|-------|--------------|----------|
| Events firing | Count by day | > 0 after launch |
| Required properties | Null check | No nulls |
| Property values | Distinct values | Expected options |
| User join rate | `user_id` present | 100% |
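These checks can be scripted against a batch of raw event rows. A Python sketch; the field names follow the property guidelines in this document, and the row format (a list of dicts) is illustrative:

```python
# Run the null-check and user-join-rate checks from the table above
# over a batch of event rows. Rows are plain dicts for illustration;
# in practice these would come from a warehouse query.

def quality_report(rows, required=("timestamp", "user_id", "platform")):
    """Return event count, null counts per required field, and join rate."""
    nulls = {field: 0 for field in required}
    joined = 0
    for row in rows:
        for field in required:
            if row.get(field) is None:
                nulls[field] += 1
        if row.get("user_id") is not None:
            joined += 1
    join_rate = joined / len(rows) if rows else 0.0
    return {"events": len(rows), "null_counts": nulls, "user_join_rate": join_rate}
```

Alerting when `events` is 0 after launch, any null count is nonzero, or `user_join_rate` drops below 1.0 covers the first, second, and fourth rows of the table.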

---

## Event Naming Convention

### Format

```
[object]_[action]
```


### Objects (nouns)
- `page` - Page views
- `button` - Button interactions
- `form` - Form interactions
- `feature` - Feature usage
- `subscription` - Subscription events
- `user` - User lifecycle

### Actions (past tense verbs)
- `viewed` - Something was seen
- `clicked` - Something was clicked
- `submitted` - Form was submitted
- `started` - Process began
- `completed` - Process finished
- `failed` - Something went wrong

### Examples

```
page_viewed
button_clicked
form_submitted
feature_started
feature_completed
subscription_upgraded
user_signed_up
```
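A quick way to keep names honest is a small validator, for example in CI or code review tooling. A Python sketch built from the object and action lists above; the action set is extended with the verbs that appear only in the examples (`upgraded`, `signed_up`), which is an assumption of this sketch:

```python
import re

# Objects and actions from the naming convention above; `upgraded`
# and `signed_up` are added because they appear in the examples.
OBJECTS = {"page", "button", "form", "feature", "subscription", "user"}
ACTIONS = {"viewed", "clicked", "submitted", "started", "completed",
           "failed", "upgraded", "signed_up"}

def is_valid_event_name(name: str) -> bool:
    """Check `object_action` shape: snake_case, known object, known action."""
    if not re.fullmatch(r"[a-z]+(_[a-z]+)+", name):
        return False
    obj, _, action = name.partition("_")
    return obj in OBJECTS and action in ACTIONS
```

A strict allowlist like this forces any new object or action through a review step rather than into the warehouse by accident.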


---

## Property Guidelines

### Always Include
- `timestamp` - When event occurred
- `user_id` - Logged-in user identifier
- `session_id` - Session identifier
- `platform` - web/ios/android
- `page` - Current page/screen

### Contextual Properties
- `source` - What triggered the action
- `variant` - A/B test variant
- `value` - Numeric value if applicable
- `error_type` - For error events

### Naming Rules
- Use `snake_case`
- Be descriptive but concise
- Use consistent naming across events
- Document allowed values for enums

---

## Metrics Definitions

### Common Metrics

**Daily Active Users (DAU)**

```
Count of unique users with any event in past 24 hours
```

**Activation Rate**

```
(Users who completed key action) / (Users who signed up) × 100
```

**Retention Rate (Day N)**

```
(Users active on day N) / (Users who signed up N days ago) × 100
```

**Feature Adoption**

```
(Users who used feature) / (Total users) × 100
```

**Conversion Rate**

```
(Users who completed goal) / (Users who started flow) × 100
```
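These ratio definitions translate directly into code. A Python sketch, assuming the user counts have already been aggregated upstream (e.g. by a warehouse query); the zero-denominator guards are a convention of this sketch:

```python
def activation_rate(activated: int, signed_up: int) -> float:
    """(Users who completed key action) / (Users who signed up) x 100."""
    return 100.0 * activated / signed_up if signed_up else 0.0

def retention_rate(active_day_n: int, cohort_size: int) -> float:
    """(Users active on day N) / (Users who signed up N days ago) x 100."""
    return 100.0 * active_day_n / cohort_size if cohort_size else 0.0

def conversion_rate(completed: int, started: int) -> float:
    """(Users who completed goal) / (Users who started flow) x 100."""
    return 100.0 * completed / started if started else 0.0
```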


---

## Quick Reference

### Before Feature Launch
1. Define success metrics
2. Create event tracking plan
3. Implement events
4. Test in staging
5. Set up dashboard
6. Capture baseline

### After Feature Launch
1. Verify data quality
2. Monitor daily
3. Analyze after 1 week
4. Deep dive after 1 month
5. Document learnings