# claude-code-plugins-plus · linktree-rate-limits

## Install

Clone the upstream repo:

```shell
git clone https://github.com/jeremylongshore/claude-code-plugins-plus-skills
```

Or install the skill directly into `~/.claude/skills/` for Claude Code:

```shell
T=$(mktemp -d) && git clone --depth=1 https://github.com/jeremylongshore/claude-code-plugins-plus-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/plugins/saas-packs/linktree-pack/skills/linktree-rate-limits" ~/.claude/skills/jeremylongshore-claude-code-plugins-plus-linktree-rate-limits && rm -rf "$T"
```

Manifest: `plugins/saas-packs/linktree-pack/skills/linktree-rate-limits/SKILL.md`

## Source content

# Linktree Rate Limits

## Overview

Linktree's API enforces rate limits per OAuth token, with analytics endpoints throttled more aggressively than profile management operations. Agencies managing dozens of creator profiles need to stagger link updates and analytics pulls across accounts to avoid hitting per-token and global IP-based limits. Bulk link reordering and analytics export during campaign launches are the most common rate-limit triggers, especially when synchronizing link performance data with external dashboards on short polling intervals.
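
Staggering per-account work, rather than bursting all profiles at once, is the simplest mitigation for the multi-account scenario above. A minimal sketch (the `staggerAcrossProfiles` helper and its `staggerMs` default are illustrative, not part of any Linktree SDK):

```typescript
// Sketch: run one task per profile sequentially, with a pause between
// profiles, so N accounts behind the same IP never fire simultaneously.
async function staggerAcrossProfiles<T>(
  profileIds: string[],
  task: (profileId: string) => Promise<T>,
  staggerMs = 2_000, // illustrative default; tune to your token budgets
): Promise<T[]> {
  const results: T[] = [];
  for (const [i, id] of profileIds.entries()) {
    if (i > 0) await new Promise(r => setTimeout(r, staggerMs));
    results.push(await task(id));
  }
  return results;
}
```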

## Rate Limit Reference

| Endpoint | Limit | Window | Scope |
| --- | --- | --- | --- |
| Profile read/update | 60 req | 1 minute | Per OAuth token |
| Link create/update/delete | 30 req | 1 minute | Per OAuth token |
| Analytics summary | 20 req | 1 minute | Per OAuth token |
| Analytics detailed (per-link) | 10 req | 1 minute | Per OAuth token |
| Webhook management | 10 req | 1 minute | Per OAuth token |
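
One way to keep client-side budgets aligned with this table is a category-to-budget lookup set slightly below each documented limit for headroom. A sketch (the category names and budget values are this document's choices, not Linktree constants):

```typescript
// Sketch: per-minute budgets by endpoint category, mirroring the reference
// table above but leaving headroom below each documented limit.
type EndpointCategory =
  | "profile" | "link" | "analyticsSummary" | "analyticsDetail" | "webhook";

const BUDGETS: Record<EndpointCategory, number> = {
  profile: 50,          // documented limit: 60/min
  link: 25,             // documented limit: 30/min
  analyticsSummary: 15, // documented limit: 20/min
  analyticsDetail: 8,   // documented limit: 10/min
  webhook: 8,           // documented limit: 10/min
};

function budgetFor(category: EndpointCategory): number {
  return BUDGETS[category];
}
```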

## Rate Limiter Implementation

```typescript
class LinktreeRateLimiter {
  private tokens: number;
  private lastRefill: number;
  private readonly max: number;
  private readonly refillRate: number; // tokens per millisecond
  private queue: Array<{ resolve: () => void }> = [];

  constructor(maxPerMinute: number) {
    this.max = maxPerMinute;
    this.tokens = maxPerMinute;
    this.lastRefill = Date.now();
    this.refillRate = maxPerMinute / 60_000;
  }

  async acquire(): Promise<void> {
    this.refill();
    if (this.tokens >= 1) { this.tokens -= 1; return; }
    // No token available: queue the caller and schedule a drain, so waiters
    // are released even if no further acquire() calls ever arrive.
    return new Promise(resolve => {
      this.queue.push({ resolve });
      setTimeout(() => this.refill(), Math.ceil(60_000 / this.max));
    });
  }

  private refill() {
    const now = Date.now();
    this.tokens = Math.min(this.max, this.tokens + (now - this.lastRefill) * this.refillRate);
    this.lastRefill = now;
    // Hand freshly accrued tokens to queued callers in FIFO order.
    while (this.tokens >= 1 && this.queue.length) {
      this.tokens -= 1;
      this.queue.shift()!.resolve();
    }
  }
}

// Budgets sit below the documented limits (30/min writes, 10/min detailed
// analytics) to leave headroom for other clients sharing the token.
const linkLimiter = new LinktreeRateLimiter(25);
const analyticsLimiter = new LinktreeRateLimiter(8);
```
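
The refill math above works out to one token every `60_000 / maxPerMinute` milliseconds. A self-contained sketch of the same arithmetic, useful for sanity-checking a budget before deploying it (the `tokensAfter` helper is illustrative):

```typescript
// Sketch: tokens available after `elapsedMs` of idle time, starting from
// `current` tokens — the same formula LinktreeRateLimiter.refill() uses.
function tokensAfter(maxPerMinute: number, elapsedMs: number, current = 0): number {
  const refillRate = maxPerMinute / 60_000; // tokens per millisecond
  return Math.min(maxPerMinute, current + elapsedMs * refillRate);
}
```

With a 30/min budget, one token accrues every 2 seconds, and a fully drained bucket is full again after 60 seconds.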

## Retry Strategy

```typescript
async function linktreeRetry<T>(
  limiter: LinktreeRateLimiter, fn: () => Promise<Response>, maxRetries = 3
): Promise<T> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    await limiter.acquire();
    const res = await fn();
    if (res.ok) return res.json() as Promise<T>;
    if (res.status === 429) {
      // Honor Retry-After (delta-seconds); fall back to 30 s when the header
      // is missing or not numeric, and add jitter to de-synchronize retries.
      const retryAfter = parseInt(res.headers.get("Retry-After") || "30", 10);
      const waitMs = Number.isNaN(retryAfter) ? 30_000 : retryAfter * 1000;
      const jitter = Math.random() * 2000;
      await new Promise(r => setTimeout(r, waitMs + jitter));
      continue;
    }
    if (res.status >= 500 && attempt < maxRetries) {
      // Exponential backoff for transient server errors: 1.5 s, 3 s, 6 s.
      await new Promise(r => setTimeout(r, Math.pow(2, attempt) * 1500));
      continue;
    }
    throw new Error(`Linktree API ${res.status}: ${await res.text()}`);
  }
  throw new Error("Max retries exceeded");
}
```
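
Per RFC 9110, `Retry-After` may be either delta-seconds or an HTTP-date; the loop above only parses the numeric form. A hedged helper covering both (the `retryAfterMs` name is this document's, and the 30 s fallback mirrors the loop's default):

```typescript
// Sketch: normalize a Retry-After header value to a wait in milliseconds.
// Handles delta-seconds ("120"), HTTP-dates, and missing/garbled values.
function retryAfterMs(header: string | null, fallbackSec = 30): number {
  if (!header) return fallbackSec * 1000;
  const secs = Number(header);
  if (Number.isFinite(secs)) return Math.max(0, secs) * 1000;
  const date = Date.parse(header);
  if (!Number.isNaN(date)) return Math.max(0, date - Date.now());
  return fallbackSec * 1000;
}
```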

## Batch Processing

```typescript
// `BASE` (API base URL) and `headers` (auth + content-type) are assumed to be
// configured elsewhere in the integration.
async function batchUpdateLinks(profileId: string, links: any[], batchSize = 5) {
  const results: any[] = [];
  for (let i = 0; i < links.length; i += batchSize) {
    const batch = links.slice(i, i + batchSize);
    const batchResults = await Promise.all(
      batch.map(link => linktreeRetry(linkLimiter, () =>
        fetch(`${BASE}/api/v1/profiles/${profileId}/links/${link.id}`, {
          method: "PATCH", headers,
          body: JSON.stringify({ title: link.title, url: link.url }),
        })
      ))
    );
    results.push(...batchResults);
    // Pause between batches so 5-write bursts stay under the 30/min write limit.
    if (i + batchSize < links.length) await new Promise(r => setTimeout(r, 10_000));
  }
  return results;
}
```

## Error Handling

| Issue | Cause | Fix |
| --- | --- | --- |
| 429 on link updates | Exceeded 30 writes/min per token | Reduce batch concurrency to 3 |
| 429 on analytics | Polling per-link stats too frequently | Cache analytics, refresh every 5 min |
| 401 token expired | OAuth token TTL exceeded | Refresh token before batch operations |
| 404 on link delete | Link already removed or archived | Skip gracefully, log warning |
| IP-level 429 | Multiple tokens from same IP | Spread requests across proxy endpoints |
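
The "cache analytics" fix from the table can be as small as a TTL map keyed by profile or link ID. A sketch (the `AnalyticsCache` name and fetcher signature are illustrative):

```typescript
// Sketch: cache analytics responses for a TTL (default 5 min) so dashboard
// polling doesn't burn the 10/min per-link analytics budget.
class AnalyticsCache<T> {
  private entries = new Map<string, { value: T; expires: number }>();
  constructor(private ttlMs = 5 * 60_000) {}

  async get(key: string, fetcher: () => Promise<T>): Promise<T> {
    const hit = this.entries.get(key);
    if (hit && hit.expires > Date.now()) return hit.value; // fresh: skip the API
    const value = await fetcher();
    this.entries.set(key, { value, expires: Date.now() + this.ttlMs });
    return value;
  }
}
```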

## Next Steps

See linktree-performance-tuning.