Antigravity-awesome-skills jq

Expert jq usage for JSON querying, filtering, transformation, and pipeline integration. Practical patterns for real shell workflows.

install
source · Clone the upstream repo
git clone https://github.com/sickn33/antigravity-awesome-skills
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/sickn33/antigravity-awesome-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/plugins/antigravity-awesome-skills/skills/jq" ~/.claude/skills/sickn33-antigravity-awesome-skills-jq-707a53 && rm -rf "$T"
manifest: plugins/antigravity-awesome-skills/skills/jq/SKILL.md
safety · automated scan (low risk)
This is a pattern-based risk scan, not a security review. Our crawler flagged:
  • makes HTTP requests (curl)
  • references API keys
Always read a skill's source content before installing. Patterns alone don't mean the skill is malicious — but they warrant attention.
source content

jq — JSON Querying and Transformation

Overview

jq is the standard CLI tool for querying and reshaping JSON. This skill covers practical, expert-level usage: filtering deeply nested data, transforming structures, aggregating values, and composing jq into shell pipelines. Every example is copy-paste ready for real workflows.

When to Use This Skill

  • Use when parsing JSON output from APIs, CLI tools (AWS, GitHub, kubectl, docker), or log files
  • Use when transforming JSON structure (rename keys, flatten arrays, group records)
  • Use when the user needs jq inside a bash script or one-liner
  • Use when explaining what a complex jq expression does

How It Works

jq takes a filter expression and applies it to JSON input. Filters compose with pipes (|), and jq handles arrays, objects, strings, numbers, booleans, and null natively.
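Because filters compose, dot-chaining and an explicit pipe are interchangeable; a quick sketch:

```shell
# These two filters are equivalent: .user.email is shorthand for a pipe
echo '{"user":{"email":"a@b.com"}}' | jq '.user | .email'
# "a@b.com"
```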

Basic Selection

# Extract a field
echo '{"name":"alice","age":30}' | jq '.name'
# "alice"

# Nested access
echo '{"user":{"email":"a@b.com"}}' | jq '.user.email'

# Array index
echo '[10, 20, 30]' | jq '.[1]'
# 20

# Array slice
echo '[1,2,3,4,5]' | jq '.[2:4]'
# [3, 4]

# All array elements
echo '[{"id":1},{"id":2}]' | jq '.[]'

Filtering with select

# Keep only matching elements
echo '[{"role":"admin"},{"role":"user"},{"role":"admin"}]' \
  | jq '[.[] | select(.role == "admin")]'

# Numeric comparison
curl -s https://api.github.com/repos/owner/repo/issues \
  | jq '[.[] | select(.comments > 5)]'

# Test a field exists and is non-null
jq '[.[] | select(.email != null)]'

# Combine conditions
jq '[.[] | select(.active == true and .score >= 80)]'
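select also composes with jq's regex builtin test() (Oniguruma syntax), which covers pattern-based filtering the comparisons above cannot:

```shell
# Keep elements whose name starts with "a"
echo '[{"name":"alice"},{"name":"bob"},{"name":"anna"}]' \
  | jq -c '[.[] | select(.name | test("^a"))]'
# [{"name":"alice"},{"name":"anna"}]
```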

Mapping and Transformation

# Extract a field from every array element
echo '[{"name":"alice","age":30},{"name":"bob","age":25}]' \
  | jq '[.[] | .name]'
# ["alice", "bob"]

# Shorthand: map()
jq 'map(.name)'

# Build a new object per element
jq '[.[] | {user: .name, years: .age}]'

# Add a computed field
jq '[.[] | . + {senior: (.age > 28)}]'

# Rename keys
jq '[.[] | {username: .name, email_address: .email}]'
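To rewrite keys across a whole object rather than building a new object per element, to_entries / with_entries is the idiomatic route; a small sketch:

```shell
# Upper-case every key via with_entries (wraps to_entries | map(...) | from_entries)
echo '{"first_name":"alice","last_name":"smith"}' \
  | jq -c 'with_entries(.key |= ascii_upcase)'
# {"FIRST_NAME":"alice","LAST_NAME":"smith"}
```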

Aggregation and Reduce

# Sum all values
echo '[1, 2, 3, 4, 5]' | jq 'add'
# 15

# Sum a field across objects
jq '[.[].price] | add'

# Count elements
jq 'length'

# Max / min
jq 'max_by(.score)'
jq 'min_by(.created_at)'

# reduce: custom accumulator
echo '[1,2,3,4,5]' | jq 'reduce .[] as $x (0; . + $x)'
# 15

# Group by field
jq 'group_by(.department)'

# Count per group
jq 'group_by(.status) | map({status: .[0].status, count: length})'
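A common aggregation the list above implies but does not show is the average; guarding on length avoids a division error on empty input:

```shell
# Average of a field; the if-guard avoids dividing null by 0 on an empty array
echo '[{"score":80},{"score":90},{"score":100}]' \
  | jq '[.[].score] | if length > 0 then add / length else 0 end'
# 90
```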

String Interpolation and Formatting

# String interpolation
jq -r '.[] | "\(.name) is \(.age) years old"'

# Format as CSV (no header)
jq -r '.[] | [.name, .age, .email] | @csv'

# Format as TSV
jq -r '.[] | [.name, .score] | @tsv'

# URL-encode a value
jq -r '.query | @uri'

# Base64 encode
jq -r '.data | @base64'
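@base64 pairs with @base64d for decoding (available since jq 1.6):

```shell
# Round-trip a value through base64
echo '{"data":"hello"}' | jq -r '.data | @base64'
# aGVsbG8=
echo '{"data":"aGVsbG8="}' | jq -r '.data | @base64d'
# hello
```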

Working with Keys and Paths

# List all top-level keys
jq 'keys'

# Check if key exists
jq 'has("email")'

# Delete a key
jq 'del(.password)'

# Delete nested keys from every element
jq '[.[] | del(.internal_id, .raw_payload)]'

# Recursive descent: find all values for a key anywhere in tree
jq '.. | .id? // empty'

# Get all leaf paths
jq '[paths(scalars)]'
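The leaf paths produced by paths(scalars) feed directly into getpath, which reads a value back by path:

```shell
# Enumerate leaf paths, then read one back with getpath
echo '{"a":{"b":1},"c":[{"b":2}]}' | jq -c '[paths(scalars)]'
# [["a","b"],["c",0,"b"]]
echo '{"a":{"b":1},"c":[{"b":2}]}' | jq 'getpath(["c",0,"b"])'
# 2
```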

Conditionals and Error Handling

# if-then-else
jq 'if .score >= 90 then "A" elif .score >= 80 then "B" else "C" end'

# Alternative operator: use fallback if null or false
jq '.nickname // .name'

# try-catch: replace errors with null instead of halting (a bare try skips them)
jq '[.[] | try .nested.value catch null]'

# Suppress null output with // empty
jq '.[] | .optional_field // empty'
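Related to error handling is the -e flag, which maps the last output to an exit status (0 for truthy, 1 for false or null), so jq can drive shell conditionals:

```shell
# -e: exit 0 when the last output is neither false nor null
if echo '{"active":true}' | jq -e '.active' > /dev/null; then
  echo "record is active"
fi
# record is active
```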

Practical Shell Integration

# Read from file
jq '.users' data.json

# Compact output (no whitespace) for further piping
jq -c '.[]' records.json | while IFS= read -r record; do
  echo "Processing: $record"
done

# Pass a shell variable into jq
STATUS="active"
jq --arg s "$STATUS" '[.[] | select(.status == $s)]'

# Pass a number
jq --argjson threshold 42 '[.[] | select(.value > $threshold)]'

# Slurp multiple JSON lines into an array
jq -s '.' records.ndjson

# Multiple files: slurp all into one array
jq -s 'add' file1.json file2.json

# Null-safe pipeline from a command
kubectl get pods -o json | jq '.items[] | {name: .metadata.name, status: .status.phase}'

# GitHub CLI: extract PR numbers
gh pr list --json number,title | jq -r '.[] | "\(.number)\t\(.title)"'

# AWS CLI: list running instance IDs
aws ec2 describe-instances \
  | jq -r '.Reservations[].Instances[] | select(.State.Name=="running") | .InstanceId'

# Docker: show container names and images (.Name includes a leading slash)
docker inspect $(docker ps -q) | jq -r '.[] | "\(.Name)\t\(.Config.Image)"'
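One more shell-integration pattern worth knowing: the * operator deep-merges objects, which combines nicely with -s for layering config files:

```shell
# Deep merge: * recurses into keys present in both objects
echo '{"a":{"x":1},"b":2} {"a":{"y":3}}' | jq -c -s '.[0] * .[1]'
# {"a":{"x":1,"y":3},"b":2}
```

With real files this becomes jq -s '.[0] * .[1]' base.json override.json.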

Advanced Patterns

# Transpose an object of arrays to an array of objects
# Input: {"names":["a","b"],"scores":[10,20]}
jq '[.names, .scores] | transpose | map({name: .[0], score: .[1]})'

# Flatten one level
jq 'flatten(1)'

# Unique by field
jq 'unique_by(.email)'

# Deduplicate names (unique sorts its output, so no separate sort is needed)
jq '[.[] | .name] | unique'

# Walk: apply transformation to every node recursively
jq 'walk(if type == "string" then ascii_downcase else . end)'

# env: read environment variables inside jq
export API_KEY=secret
jq -n 'env.API_KEY'
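jq -n also works the other way around: combined with --arg/--argjson it builds JSON from scratch, which is safer than assembling JSON strings in the shell:

```shell
# Construct a JSON object with no input at all
jq -n -c --arg name "alice" --argjson age 30 '{name: $name, age: $age}'
# {"name":"alice","age":30}
```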

Best Practices

  • Always use -r (raw output) when passing jq results to shell variables or other commands, to strip JSON string quotes
  • Use --arg / --argjson to inject shell variables safely; never interpolate shell variables directly into filter strings
  • Prefer map(f) over [.[] | f] for readability
  • Use -c (compact) for newline-delimited JSON pipelines; omit it for human-readable debugging
  • Test filters interactively with jq -n and literal input before embedding them in scripts
  • Use empty to drop unwanted elements rather than filtering to null
Security & Safety Notes

  • jq is read-only by design: it cannot write files or execute commands
  • Avoid embedding untrusted JSON field values directly into shell commands; always quote or use --arg
Common Pitfalls

  • Problem: jq outputs null instead of the expected value. Solution: Check for typos in key names; use keys to inspect actual field names. Remember JSON is case-sensitive.

  • Problem: Numbers are quoted as strings in the output. Solution: Use --argjson instead of --arg when injecting numeric values.

  • Problem: Filter works in the terminal but fails in a script. Solution: Ensure the filter string uses single quotes in the shell to prevent variable expansion: jq '.field', not jq ".field".

  • Problem: add returns null on an empty array. Solution: Use add // 0 (or add // "") to provide a fallback default.

  • Problem: Streaming large files is slow. Solution: Use jq --stream, or switch to jstream / gron for very large files.

Related Skills

  • @bash-pro — Wrapping jq calls in robust shell scripts
  • @bash-linux — General shell pipeline patterns
  • @github-automation — Using jq with GitHub CLI JSON output

Limitations

  • Use this skill only when the task clearly matches the scope described above.
  • Do not treat the output as a substitute for environment-specific validation, testing, or expert review.
  • Stop and ask for clarification if required inputs, permissions, safety boundaries, or success criteria are missing.