skills.expanso.io · expanso-kafka-to-s3

Stream Kafka topics to S3 with partitioning and batching

Install
source · Clone the upstream repo
git clone https://github.com/expanso-io/skills.expanso.io
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/expanso-io/skills.expanso.io "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/recipes/kafka-to-s3" ~/.claude/skills/expanso-io-skills-expanso-io-expanso-kafka-to-s3 && rm -rf "$T"
manifest: skills/recipes/kafka-to-s3/SKILL.md
Source content

Kafka to S3

Stream data from Kafka topics to S3 buckets with intelligent partitioning, batching, and compression.

Category

data-routing

Quick Start

# Configure environment
export KAFKA_BROKERS=localhost:9092
export AWS_ACCESS_KEY_ID=your-key
export AWS_SECRET_ACCESS_KEY=your-secret
export S3_BUCKET=your-bucket

# Run the pipeline
./run.sh
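Before invoking run.sh, it can help to confirm that every required variable is actually exported. A minimal sketch (the variable list mirrors the Quick Start above; the helper name is illustrative, not part of the skill):

```python
import os

# The four variables the Quick Start exports.
REQUIRED_VARS = [
    "KAFKA_BROKERS",
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "S3_BUCKET",
]

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
    print("Environment looks complete; run ./run.sh")
```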

Pipeline

The pipeline.yaml streams from Kafka to S3 with:

  • Consumer group management
  • Time-based partitioning (hourly/daily)
  • Gzip compression
  • Batching for efficient S3 writes
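The partitioning, batching, and compression steps above can be sketched in Python. This is an illustration of the technique, not the pipeline's actual implementation; the function names, key layout, and NDJSON batch format are assumptions:

```python
import gzip
import json
from datetime import datetime, timezone

def partition_key(topic: str, ts: datetime, granularity: str = "hourly") -> str:
    """Build a time-partitioned S3 object prefix (hourly or daily)."""
    if granularity == "hourly":
        return ts.strftime(f"{topic}/%Y/%m/%d/%H")
    return ts.strftime(f"{topic}/%Y/%m/%d")

def gzip_batch(records: list[dict]) -> bytes:
    """Serialize a batch of records as newline-delimited JSON, then gzip it."""
    payload = "\n".join(json.dumps(r) for r in records).encode("utf-8")
    return gzip.compress(payload)

# Example: one compressed object per batch, keyed under its hourly partition.
ts = datetime(2024, 6, 1, 13, 45, tzinfo=timezone.utc)
key = f"{partition_key('events', ts)}/batch-0001.json.gz"
body = gzip_batch([{"a": 1}, {"b": 2}])
```

Batching many small Kafka messages into one gzipped object keeps S3 request counts (and per-request costs) low, and the time-based prefix lets downstream queries prune by hour or day.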

Requirements

  • Expanso Edge installed (clawhub install expanso)
  • Kafka broker access
  • AWS credentials with S3 write permissions
