Awesome-omni-skill async-jobs

Async job processing patterns for background tasks, Celery workflows, task scheduling, retry strategies, and distributed task execution. Use when implementing background job processing, task queues, or scheduled task systems.

Install

Source · Clone the upstream repo:

git clone https://github.com/diegosouzapw/awesome-omni-skill

Claude Code · Install into ~/.claude/skills/:

T=$(mktemp -d) && git clone --depth=1 https://github.com/diegosouzapw/awesome-omni-skill "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/development/async-jobs" ~/.claude/skills/diegosouzapw-awesome-omni-skill-async-jobs-a6f10d && rm -rf "$T"

Manifest: skills/development/async-jobs/SKILL.md

Source content

Async Jobs

Patterns for background task processing with Celery, ARQ, and Redis. Covers task queues, canvas workflows, scheduling, retry strategies, rate limiting, and production monitoring. Each category has individual rule files in references/, loaded on-demand.

Quick Reference

| Category | Rules | Impact | When to Use |
| --- | --- | --- | --- |
| Configuration | celery-config | HIGH | Celery app setup, broker, serialization, worker tuning |
| Task Routing | task-routing | HIGH | Priority queues, multi-queue workers, dynamic routing |
| Canvas Workflows | canvas-workflows | HIGH | Chain, group, chord, nested workflows |
| Retry Strategies | retry-strategies | HIGH | Exponential backoff, idempotency, dead letter queues |
| Scheduling | scheduled-tasks | MEDIUM | Celery Beat, crontab, database-backed schedules |
| Monitoring | monitoring-health | MEDIUM | Flower, custom events, health checks, metrics |
| Result Backends | result-backends | MEDIUM | Redis results, custom states, progress tracking |
| ARQ Patterns | arq-patterns | MEDIUM | Async Redis Queue for FastAPI, lightweight jobs |
| Temporal Workflows | temporal-workflows | HIGH | Durable workflow definitions, sagas, signals, queries |
| Temporal Activities | temporal-activities | HIGH | Activity patterns, workers, heartbeats, testing |

Total: 10 rules across 10 categories

Quick Start

# Celery task with retry
from celery import shared_task

@shared_task(
    bind=True,
    max_retries=3,
    autoretry_for=(ConnectionError, TimeoutError),
    retry_backoff=True,
)
def process_order(self, order_id: str) -> dict:
    result = do_processing(order_id)  # do_processing is app-specific
    return {"order_id": order_id, "status": "completed", "result": result}

# ARQ task with FastAPI
from arq.connections import ArqRedis
from fastapi import APIRouter, Depends

router = APIRouter()

async def generate_report(ctx: dict, report_id: str) -> dict:
    data = await ctx["db"].fetch_report_data(report_id)
    pdf = await render_pdf(data)
    return {"report_id": report_id, "size": len(pdf)}

@router.post("/api/v1/reports")
async def create_report(data: ReportRequest, arq: ArqRedis = Depends(get_arq_pool)):
    job = await arq.enqueue_job("generate_report", data.report_id)
    return {"job_id": job.job_id}

Configuration

Production Celery app configuration with secure defaults and worker tuning.

Key Patterns

  • JSON serialization with task_serializer="json" for safety
  • Late acknowledgment with task_acks_late=True to prevent task loss on crash
  • Time limits with both task_time_limit (hard) and task_soft_time_limit (soft)
  • Fair distribution with worker_prefetch_multiplier=1
  • Reject on lost with task_reject_on_worker_lost=True

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Serializer | JSON (never pickle) |
| Ack mode | Late ack (task_acks_late=True) |
| Prefetch | 1 for fair, 4-8 for throughput |
| Time limit | soft < hard (e.g., 540/600) |
| Timezone | UTC always |
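
The decisions above can be collected into one settings mapping, a sketch that a real app would apply with `celery_app.conf.update(**CELERY_SETTINGS)` (the setting names are standard Celery options; only the grouping is illustrative):

```python
# Production-leaning Celery settings reflecting the decisions above.
CELERY_SETTINGS = {
    # Serialization: JSON only, never pickle
    "task_serializer": "json",
    "result_serializer": "json",
    "accept_content": ["json"],
    # Reliability: ack after the task runs, requeue if the worker dies
    "task_acks_late": True,
    "task_reject_on_worker_lost": True,
    # Time limits: soft raises SoftTimeLimitExceeded before the hard kill
    "task_soft_time_limit": 540,
    "task_time_limit": 600,
    # Fair dispatch: each worker process prefetches one task at a time
    "worker_prefetch_multiplier": 1,
    # Timezone: UTC always
    "timezone": "UTC",
    "enable_utc": True,
}
```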

Task Routing

Priority queue configuration with multi-queue workers and dynamic routing.

Key Patterns

  • Named queues for critical/high/default/low/bulk separation
  • Redis priority with queue_order_strategy: "priority" and 0-9 levels
  • Task router classes for dynamic routing based on task attributes
  • Per-queue workers with tuned concurrency and prefetch settings
  • Content-based routing for dynamic workflow dispatch

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Queue count | 3-5 (critical/high/default/low/bulk) |
| Priority levels | 0-9 (x-max-priority on RabbitMQ; queue_order_strategy on Redis) |
| Worker assignment | Dedicated workers per queue |
| Prefetch | 1 for critical, 4-8 for bulk |
| Routing | Router class for 5+ routing rules |
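
A minimal sketch of dynamic routing (the task names like "orders.charge" are hypothetical). Celery accepts either a dict or a callable in app.conf.task_routes; the callable receives the task name and returns routing options, or None to fall through to the default queue:

```python
# Explicit routing table for known hot paths (names are illustrative).
TASK_ROUTES = {
    "orders.charge": {"queue": "critical", "priority": 9},
    "emails.send": {"queue": "default", "priority": 5},
    "analytics.rollup": {"queue": "bulk", "priority": 1},
}

def route_task(name, args, kwargs, options, task=None, **kw):
    """Route by explicit table first, then by name-prefix convention."""
    if name in TASK_ROUTES:
        return TASK_ROUTES[name]
    if name.startswith("reports."):
        return {"queue": "low", "priority": 2}
    return None  # fall through to Celery's default queue
```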

Canvas Workflows

Celery canvas primitives for sequential, parallel, and fan-in/fan-out workflows.

Key Patterns

  • Chain for sequential ETL pipelines with result passing
  • Group for parallel execution of independent tasks
  • Chord for fan-out/fan-in with aggregation callback
  • Immutable signatures (si()) for steps that ignore input
  • Nested workflows combining groups inside chains
  • Link error callbacks for workflow-level error handling

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Sequential | Chain with s() |
| Parallel | Group for independent tasks |
| Fan-in | Chord (all must succeed for callback) |
| Ignore input | Use si() immutable signature |
| Error in chain | Reject stops chain, retry continues |
| Partial failures | Return error dict in chord tasks |
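
The "return error dict in chord tasks" decision can be sketched in plain Python, showing the data flow a chord like chord([process.s(x) for x in items])(aggregate.s()) would produce (no broker involved here; process/aggregate are illustrative):

```python
def process(item: int) -> dict:
    """Per-item chord header task: return an error dict instead of
    raising, so one bad item does not abort the whole chord."""
    if item < 0:
        return {"item": item, "error": "negative input"}
    return {"item": item, "result": item * 2}

def aggregate(results: list) -> dict:
    """Chord callback: receives the list of all header results."""
    ok = [r for r in results if "error" not in r]
    failed = [r for r in results if "error" in r]
    return {"completed": len(ok), "failed": len(failed)}

# Fan-out over items, then fan-in to the aggregation callback.
summary = aggregate([process(x) for x in [1, 2, -3]])
```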

Retry Strategies

Retry patterns with exponential backoff, idempotency, and dead letter queues.

Key Patterns

  • Exponential backoff with retry_backoff=True and retry_backoff_max
  • Jitter with retry_jitter=True to prevent thundering herd
  • Idempotency keys in Redis to prevent duplicate processing
  • Dead letter queues for failed tasks requiring manual review
  • Task locking to prevent concurrent execution of singleton tasks
  • Base task classes with shared retry configuration

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Retry delay | Exponential backoff with jitter |
| Max retries | 3-5 for transient, 0 for permanent |
| Idempotency | Redis key with TTL |
| Failed tasks | DLQ for manual review |
| Singleton | Redis lock with TTL |
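
The delay schedule behind retry_backoff plus retry_jitter can be sketched as a small function (base and cap values are illustrative assumptions): exponential growth capped at a maximum, with full jitter so a crowd of tasks that failed together does not retry in lockstep:

```python
import random

def retry_delay(retries: int, base: float = 1.0, cap: float = 600.0) -> float:
    """Delay in seconds before retry number `retries` (0-based):
    exponential backoff capped at `cap`, with full jitter."""
    exp = min(cap, base * (2 ** retries))
    return random.uniform(0, exp)  # full jitter in [0, exp]
```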

Scheduling

Celery Beat periodic task configuration with crontab, database-backed schedules, and overlap prevention.

Key Patterns

  • Crontab for time-based schedules (daily, weekly, monthly)
  • Interval for fixed-frequency tasks (every N seconds)
  • Database scheduler with django-celery-beat for dynamic schedules
  • Schedule locks to prevent overlapping long-running scheduled tasks
  • Adaptive polling with self-rescheduling tasks

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Schedule type | Crontab for time-based, interval for frequency |
| Dynamic | Database scheduler (django-celery-beat) |
| Overlap | Redis lock with timeout |
| Beat process | Separate process (not embedded) |
| Timezone | UTC always |
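
An interval entry can be sketched as a beat_schedule mapping (the task name is hypothetical). Celery Beat reads this from app.conf.beat_schedule; a float schedule means "every N seconds", while time-based entries would use celery.schedules.crontab instead:

```python
# A sketch of a beat_schedule mapping; "orders.poll_pending" is a
# hypothetical task name.
BEAT_SCHEDULE = {
    "poll-pending-orders": {
        "task": "orders.poll_pending",
        "schedule": 30.0,  # every 30 seconds (use crontab() for time-based)
        # options are passed to apply_async; expires skips stale runs
        # if Beat or the workers fall behind
        "options": {"queue": "default", "expires": 25.0},
    },
}
```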

Monitoring

Production monitoring with Flower, custom signals, health checks, and Prometheus metrics.

Key Patterns

  • Flower dashboard for real-time task monitoring
  • Celery signals (task_prerun, task_postrun, task_failure) for metrics
  • Health check endpoint verifying broker connection and active workers
  • Queue depth monitoring for autoscaling decisions
  • Beat monitoring for scheduled task dispatch tracking

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Dashboard | Flower with persistent storage |
| Metrics | Prometheus via celery signals |
| Health | Broker + worker + queue depth |
| Alerting | Signal on task_failure |
| Autoscale | Queue depth > threshold |
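
The queue-depth autoscaling decision can be sketched as a pure function (the per-worker capacity and worker cap are illustrative assumptions, not recommendations from this skill):

```python
def desired_workers(queue_depth: int, per_worker: int = 100,
                    max_workers: int = 20) -> int:
    """Scale so each worker handles at most `per_worker` queued tasks,
    keeping at least one worker and at most `max_workers`."""
    needed = -(-queue_depth // per_worker)  # ceiling division
    return max(1, min(max_workers, needed))
```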

Result Backends

Task result storage, custom states, and progress tracking patterns.

Key Patterns

  • Redis backend for task status and small results
  • Custom task states (VALIDATING, PROCESSING, UPLOADING) for progress
  • update_state() for real-time progress reporting
  • S3/database for large result storage (never Redis)
  • AsyncResult for querying task state and progress

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Status storage | Redis result backend |
| Large results | S3 or database (never Redis) |
| Progress | Custom states with update_state() |
| Result query | AsyncResult with state checks |
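
The shape of custom-state progress tracking can be sketched without a broker; ProgressTask below is a stand-in for a bound Celery task, whose real self.update_state(state=..., meta=...) writes to the result backend (the state names are the illustrative ones from above):

```python
class ProgressTask:
    """Minimal stand-in for a bound Celery task's update_state()."""
    def __init__(self):
        self.state = "PENDING"
        self.meta = {}

    def update_state(self, state: str, meta: dict) -> None:
        # In Celery this persists to the result backend; here we record it.
        self.state, self.meta = state, meta

def process_upload(task: ProgressTask, chunks: int) -> dict:
    task.update_state("VALIDATING", {"progress": 0})
    for i in range(chunks):
        task.update_state("PROCESSING",
                          {"progress": int(100 * (i + 1) / chunks)})
    task.update_state("UPLOADING", {"progress": 100})
    return {"status": "completed"}
```

A client would poll the same states via AsyncResult(task_id).state and .info.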

ARQ Patterns

Lightweight async Redis Queue for FastAPI and simple background tasks.

Key Patterns

  • Native async/await with arq for FastAPI integration
  • Worker lifecycle with startup/shutdown hooks for resource management
  • Job enqueue from FastAPI routes with enqueue_job()
  • Job status tracking with Job.status() and Job.result()
  • Delayed tasks with _defer_by=timedelta() for deferred execution

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Simple async | ARQ (native async) |
| Complex workflows | Celery (chains, chords) |
| In-process quick | FastAPI BackgroundTasks |
| LLM workflows | LangGraph (not Celery) |
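
The worker-lifecycle pattern can be sketched as an arq worker module; arq's worker reads the WorkerSettings class attributes. The task body and create_db_pool are hypothetical placeholders:

```python
from datetime import timedelta

async def generate_report(ctx: dict, report_id: str) -> dict:
    # ctx carries resources created in the startup hook
    data = await ctx["db"].fetch_report_data(report_id)
    return {"report_id": report_id, "rows": len(data)}

async def startup(ctx: dict) -> None:
    ctx["db"] = await create_db_pool()  # hypothetical pool factory

async def shutdown(ctx: dict) -> None:
    await ctx["db"].close()

class WorkerSettings:
    functions = [generate_report]
    on_startup = startup
    on_shutdown = shutdown
    max_tries = 3                        # retry failed jobs up to 3 times
    job_timeout = timedelta(minutes=10)  # hard per-job limit
```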

Tool Selection

| Tool | Best For | Complexity |
| --- | --- | --- |
| ARQ | FastAPI, simple async jobs | Low |
| Celery | Complex workflows, enterprise | High |
| RQ | Simple Redis queues | Low |
| Dramatiq | Reliable messaging | Medium |
| FastAPI BackgroundTasks | In-process quick tasks | Minimal |

Anti-Patterns (FORBIDDEN)

# NEVER run long tasks synchronously in request handlers
@router.post("/api/v1/reports")
async def create_report(data: ReportRequest):
    pdf = await generate_pdf(data)  # Blocks for minutes!

# NEVER block on results inside tasks (causes deadlock)
@celery_app.task
def bad_task():
    result = other_task.delay()
    return result.get()  # Blocks worker!

# NEVER store large results in Redis
@shared_task
def process_file(file_id: str) -> bytes:
    return large_file_bytes  # Store in S3/DB instead!

# NEVER skip idempotency for retried tasks
@celery_app.task(max_retries=3)
def create_order(order):
    Order.create(order)  # Creates duplicates on retry!

# NEVER use BackgroundTasks for distributed work
background_tasks.add_task(long_running_job)  # Lost if server restarts

# NEVER ignore task acknowledgment settings
celery_app.conf.task_acks_late = False  # Default loses tasks on crash

# ALWAYS use immutable signatures in chords
chord([task.s(x) for x in items], callback.si())  # si() prevents arg pollution

Temporal Workflows

Durable execution engine for reliable distributed applications with Temporal.io.

Key Patterns

  • Workflow definitions with @workflow.defn and deterministic code
  • Saga pattern with compensation for multi-step transactions
  • Signals and queries for external interaction with running workflows
  • Timers with workflow.wait_condition() for human-in-the-loop
  • Parallel activities via asyncio.gather inside workflows

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Workflow ID | Business-meaningful, idempotent |
| Determinism | Use workflow.random(), workflow.now() |
| I/O | Always via activities, never directly |
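
The saga pattern's compensation logic can be sketched in plain Python; inside a real Temporal workflow each step would be an activity executed via workflow.execute_activity, but the unwind order is the same:

```python
def run_saga(steps):
    """Run (action, compensation) pairs in order; on failure, run the
    compensations for completed steps in reverse order, then re-raise."""
    done = []
    try:
        for action, compensate in steps:
            action()
            done.append(compensate)
    except Exception:
        for compensate in reversed(done):
            compensate()
        raise
```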

Temporal Activities

Activity and worker patterns for Temporal.io I/O operations.

Key Patterns

  • Activity definitions with @activity.defn for all I/O
  • Heartbeating for long-running activities (> 60s)
  • Error classification with ApplicationError(non_retryable=True) for business errors
  • Worker configuration with dedicated task queues
  • Testing with WorkflowEnvironment.start_local()

Key Decisions

| Decision | Recommendation |
| --- | --- |
| Activity timeout | start_to_close for most cases |
| Error handling | Non-retryable for business errors |
| Testing | WorkflowEnvironment for integration tests |
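
The error-classification decision can be sketched without the SDK; BusinessError is an illustrative exception class, and in a real activity the non-retryable branch would raise temporalio's ApplicationError(..., non_retryable=True):

```python
class BusinessError(Exception):
    """Invalid input, duplicate order, etc.: retrying cannot help."""

def classify(exc: Exception) -> str:
    """Decide whether the activity's failure should be retried."""
    if isinstance(exc, BusinessError):
        return "non_retryable"   # surface to the workflow immediately
    if isinstance(exc, (ConnectionError, TimeoutError)):
        return "retryable"       # transient infrastructure failure
    return "retryable"           # default: assume transient
```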

Related Skills

  • ork:python-backend - FastAPI, asyncio, SQLAlchemy patterns
  • ork:langgraph - LangGraph workflow patterns (use for LLM workflows, not Celery)
  • ork:distributed-systems - Resilience patterns, circuit breakers
  • ork:monitoring-observability - Metrics and alerting

Capability Details

celery-config

Keywords: celery, configuration, broker, worker, setup

Solves:

  • Production Celery app configuration
  • Broker and backend setup
  • Worker tuning and time limits

task-routing

Keywords: priority, queue, routing, high priority, worker

Solves:

  • Premium user task prioritization
  • Multi-queue worker deployment
  • Dynamic task routing

canvas-workflows

Keywords: chain, group, chord, signature, canvas, workflow, pipeline

Solves:

  • Complex multi-step task pipelines
  • Parallel task execution with aggregation
  • Sequential task dependencies

retry-strategies

Keywords: retry, backoff, idempotency, dead letter, resilience

Solves:

  • Exponential backoff with jitter
  • Duplicate prevention for retried tasks
  • Failed task handling with DLQ

scheduled-tasks

Keywords: periodic, scheduled, cron, celery beat, interval

Solves:

  • Run tasks on schedule (crontab)
  • Dynamic schedule management
  • Overlap prevention for long tasks

monitoring-health

Keywords: flower, monitoring, health check, metrics, alerting

Solves:

  • Production task monitoring dashboard
  • Worker health checks
  • Queue depth autoscaling

result-backends

Keywords: result, state, progress, AsyncResult, status

Solves:

  • Task progress tracking with custom states
  • Result storage strategies
  • Job status API endpoints

arq-patterns

Keywords: arq, async queue, redis queue, fastapi background

Solves:

  • Lightweight async background tasks for FastAPI
  • Simple Redis job queue with async/await
  • Job status tracking