claude-skill-registry / blockchain-rpc-provider-research

Systematic workflow for researching and validating blockchain RPC providers. Use when evaluating RPC providers for historical data collection, rate limits, archive access, compute unit costs, or timeline estimation for large-scale blockchain data backfills.

Install by cloning the full registry, or copy just this skill into `~/.claude/skills` with the one-liner:

```sh
git clone https://github.com/majiayu000/claude-skill-registry
```

```sh
T=$(mktemp -d) && git clone --depth=1 https://github.com/majiayu000/claude-skill-registry "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/data/blockchain-rpc-provider-research" ~/.claude/skills/majiayu000-claude-skill-registry-blockchain-rpc-provider-research && rm -rf "$T"
```
`skills/data/blockchain-rpc-provider-research/SKILL.md`

# Blockchain RPC Provider Research

## Overview
This skill provides a systematic, empirically-validated workflow for researching blockchain RPC providers before committing to large-scale data collection projects. Use when selecting an RPC provider for historical blockchain data backfill, evaluating rate limits, comparing free tier options, or estimating collection timelines.
Key principle: Never trust documented rate limits—always validate empirically with POC testing.
## Investigation Workflow

This skill follows a 5-step workflow. Each step builds on the previous:

1. Research Official Documentation - Survey provider docs, pricing, archive access
2. Calculate Theoretical Timeline - Compute expected collection time from documented limits
3. Empirical Validation with POC Testing - Test actual rate limits (CRITICAL STEP)
4. Create Comparison Matrix - Build side-by-side provider comparison
5. Document Findings and Make Recommendation - Write comprehensive analysis report
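As an illustration of Step 4, a comparison matrix can be rendered directly from the measurements gathered in the earlier steps. This is a minimal sketch; the column names are illustrative, not the schema used by `references/rpc-comparison-template.md`:

```python
def build_comparison_matrix(providers: list) -> str:
    """Render a markdown comparison table from per-provider measurements.

    Each dict holds fields gathered in Steps 1-3; missing fields render
    as "?" so gaps in research are visible in the matrix.
    """
    cols = ["name", "documented_rps", "validated_rps", "archive_access",
            "est_days_13m_blocks"]
    header = "| " + " | ".join(cols) + " |"
    rule = "|" + "|".join("---" for _ in cols) + "|"
    rows = ["| " + " | ".join(str(p.get(c, "?")) for c in cols) + " |"
            for p in providers]
    return "\n".join([header, rule, *rows])
```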
Detailed workflow: See `references/workflow-steps.md` for the complete step-by-step guide with code examples, questions to answer, and success criteria.
Quick start: For immediate testing, jump to Step 3 (Empirical Validation) using `scripts/test_rpc_rate_limits.py`.
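For orientation, the core of an empirical rate-limit test looks like the sketch below. This is a simplified stand-in for `scripts/test_rpc_rate_limits.py`, not its actual contents; the injectable `call` parameter is a testing convenience, not an assumed feature of the bundled script:

```python
import json
import time
import urllib.request


def _post_once(rpc_url: str, payload: bytes) -> bool:
    """POST one JSON-RPC request; True on HTTP 200, False on any failure."""
    req = urllib.request.Request(rpc_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except OSError:
        return False


def measure_sustained_rps(rpc_url: str, n_requests: int = 100,
                          call=_post_once) -> float:
    """Fire sequential eth_blockNumber calls and return the achieved RPS.

    Sequential requests expose the *sustained* rate that governs a long
    backfill; a short burst can look far faster than what the provider
    actually allows over hours of collection. Failed requests count
    against the achieved rate.
    """
    payload = json.dumps({"jsonrpc": "2.0", "method": "eth_blockNumber",
                          "params": [], "id": 1}).encode()
    start = time.monotonic()
    ok = sum(call(rpc_url, payload) for _ in range(n_requests))
    elapsed = time.monotonic() - start
    return ok / elapsed
```

Run it with well over 50 requests: small samples sit inside the provider's burst window and overstate the sustainable rate.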
## Rate Limiting Best Practices

When implementing the selected provider, use conservative targeting (80-90% of the empirically validated rate) with monitoring and a fallback strategy.
Full guide: See `references/rate-limiting-guide.md` for detailed monitoring requirements, fallback strategies, and safety margins.
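A minimal sketch of conservative targeting (a hypothetical helper, not part of the bundled scripts): pace every request at a fraction of the rate you validated in Step 3.

```python
import time


class ConservativeThrottle:
    """Pace requests at a fraction (default 85%) of the validated rate.

    validated_rps is the number measured empirically in Step 3, never
    the provider's documented limit.
    """

    def __init__(self, validated_rps: float, target_fraction: float = 0.85):
        self.min_interval = 1.0 / (validated_rps * target_fraction)
        self._last = 0.0

    def wait(self) -> None:
        """Block until at least min_interval has passed since the last call."""
        now = time.monotonic()
        remaining = self._last + self.min_interval - now
        if remaining > 0:
            time.sleep(remaining)
        self._last = time.monotonic()
```

Call `throttle.wait()` before each RPC request; pairing it with a counter of 429 responses gives the monitoring signal the guide calls for.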
## Common Pitfalls

Critical mistakes to avoid: trusting documented burst limits (always validate empirically), testing with fewer than 50 blocks, parallel fetching on free tiers, ignoring compute unit costs, and forgetting archive access restrictions.
Full guide: See `references/common-pitfalls.md` for detailed anti-patterns with real-world examples (e.g., the LlamaRPC 50 RPS → 1.37 RPS case).
## Scripts

- `calculate_timeline.py` - Calculate collection timeline from rate limits (RPS or compute units)
- `test_rpc_rate_limits.py` - Empirical rate limit testing template
Usage guide: See `scripts/README.md` for detailed usage examples, configuration options, and success criteria.
## References

### Workflow Documentation

- `references/workflow-steps.md` - Complete 5-step workflow with detailed guidance for each step
- `references/rate-limiting-guide.md` - Best practices for conservative rate targeting and monitoring
- `references/common-pitfalls.md` - Anti-patterns to avoid with real-world examples
- `references/example-workflow.md` - Complete case study: 13M Ethereum blocks RPC selection
### Data References

- `references/validated-providers.md` - Alchemy vs LlamaRPC vs Infura vs QuickNode empirical comparison
- `references/rpc-comparison-template.md` - Template for creating provider comparison matrices
### Scripts

- `scripts/README.md` - Complete usage guide for all scripts
- `scripts/calculate_timeline.py` - Timeline calculator (RPS and compute unit modes)
- `scripts/test_rpc_rate_limits.py` - Empirical rate limit testing template
## Example Workflow

Case study: Selecting an RPC provider for 13M Ethereum blocks → Alchemy chosen at 5.79 RPS (26-day timeline, 4.2x faster than LlamaRPC).
Full walkthrough: See `references/example-workflow.md` for the complete step-by-step case study covering research, calculation, validation, comparison, and the final recommendation.
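The arithmetic behind those figures is easy to reproduce. This is a one-line stand-in for `scripts/calculate_timeline.py` (RPS mode only), not its actual implementation:

```python
def estimate_timeline_days(total_blocks: int, rps: float,
                           requests_per_block: int = 1) -> float:
    """Days to collect total_blocks sequentially at a sustained rate (RPS)."""
    return total_blocks * requests_per_block / rps / 86_400


# 13M blocks at Alchemy's validated 5.79 RPS vs LlamaRPC's 1.37 RPS
alchemy_days = estimate_timeline_days(13_000_000, 5.79)
llama_days = estimate_timeline_days(13_000_000, 1.37)
print(round(alchemy_days), round(llama_days),
      round(llama_days / alchemy_days, 1))
# -> 26 110 4.2
```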
## When to Use This Skill
Invoke this skill when:
- Evaluating blockchain RPC providers for a new project
- Planning historical data backfill timelines
- Comparing free tier vs paid provider options
- Investigating rate limiting issues with current provider
- Estimating collection timelines for multi-million block datasets
- Validating archive node access for historical queries
- Researching compute unit or API credit costs
- Building POC before production implementation
## Related Patterns

This skill pairs well with:

- `blockchain-data-collection-validation` - For validating the complete data pipeline after provider selection
- Project scratch investigations in `scratch/ethereum-collector-poc/` and `scratch/rpc-provider-comparison/`