# antigravity-awesome-skills · jq
Expert jq usage for JSON querying, filtering, transformation, and pipeline integration. Practical patterns for real shell workflows.
## install

**source** · Clone the upstream repo:

```bash
git clone https://github.com/sickn33/antigravity-awesome-skills
```

**Claude Code** · Install into `~/.claude/skills/`:

```bash
T=$(mktemp -d) && git clone --depth=1 https://github.com/sickn33/antigravity-awesome-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/plugins/antigravity-awesome-skills/skills/jq" ~/.claude/skills/sickn33-antigravity-awesome-skills-jq-707a53 && rm -rf "$T"
```
**manifest:** `plugins/antigravity-awesome-skills/skills/jq/SKILL.md`

## safety · automated scan (low risk)

This is a pattern-based risk scan, not a security review. Our crawler flagged:

- makes HTTP requests (curl)
- references API keys

Always read a skill's source content before installing. Patterns alone don't mean the skill is malicious — but they warrant attention.
## source content
# jq — JSON Querying and Transformation
## Overview
jq is the standard CLI tool for querying and reshaping JSON. This skill covers practical, expert-level usage: filtering deeply nested data, transforming structures, aggregating values, and composing jq into shell pipelines. Every example is copy-paste ready for real workflows.
## When to Use This Skill

- Use when parsing JSON output from APIs, CLI tools (AWS, GitHub, kubectl, docker), or log files
- Use when transforming JSON structure (rename keys, flatten arrays, group records)
- Use when the user needs `jq` inside a bash script or one-liner
- Use when explaining what a complex `jq` expression does
## How It Works
jq takes a filter expression and applies it to JSON input. Filters compose with pipes (|), and jq handles arrays, objects, strings, numbers, booleans, and null natively.
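As a quick illustration of composition, each stage below transforms the previous stage's output (the input JSON is made up for illustration):

```shell
# Compose stages with |: filter, then project, then sort.
# -c prints compact (single-line) output.
echo '[{"name":"bob","age":25},{"name":"alice","age":30}]' \
  | jq -c 'map(select(.age >= 28)) | map(.name) | sort'
# → ["alice"]
```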
## Basic Selection

```bash
# Extract a field
echo '{"name":"alice","age":30}' | jq '.name'        # "alice"

# Nested access
echo '{"user":{"email":"a@b.com"}}' | jq '.user.email'

# Array index
echo '[10, 20, 30]' | jq '.[1]'                      # 20

# Array slice
echo '[1,2,3,4,5]' | jq '.[2:4]'                     # [3, 4]

# All array elements
echo '[{"id":1},{"id":2}]' | jq '.[]'
```
## Filtering with select

```bash
# Keep only matching elements
echo '[{"role":"admin"},{"role":"user"},{"role":"admin"}]' \
  | jq '[.[] | select(.role == "admin")]'

# Numeric comparison
curl -s https://api.github.com/repos/owner/repo/issues \
  | jq '[.[] | select(.comments > 5)]'

# Test a field exists and is non-null
jq '[.[] | select(.email != null)]'

# Combine conditions
jq '[.[] | select(.active == true and .score >= 80)]'
```
## Mapping and Transformation

```bash
# Extract a field from every array element
echo '[{"name":"alice","age":30},{"name":"bob","age":25}]' \
  | jq '[.[] | .name]'                               # ["alice", "bob"]

# Shorthand: map()
jq 'map(.name)'

# Build a new object per element
jq '[.[] | {user: .name, years: .age}]'

# Add a computed field
jq '[.[] | . + {senior: (.age > 28)}]'

# Rename keys
jq '[.[] | {username: .name, email_address: .email}]'
```
## Aggregation and Reduce

```bash
# Sum all values
echo '[1, 2, 3, 4, 5]' | jq 'add'                    # 15

# Sum a field across objects
jq '[.[].price] | add'

# Count elements
jq 'length'

# Max / min
jq 'max_by(.score)'
jq 'min_by(.created_at)'

# reduce: custom accumulator
echo '[1,2,3,4,5]' | jq 'reduce .[] as $x (0; . + $x)'   # 15

# Group by field
jq 'group_by(.department)'

# Count per group
jq 'group_by(.status) | map({status: .[0].status, count: length})'
```
## String Interpolation and Formatting

```bash
# String interpolation
jq -r '.[] | "\(.name) is \(.age) years old"'

# Format as CSV (no header)
jq -r '.[] | [.name, .age, .email] | @csv'

# Format as TSV
jq -r '.[] | [.name, .score] | @tsv'

# URL-encode a value
jq -r '.query | @uri'

# Base64 encode
jq -r '.data | @base64'
```
## Working with Keys and Paths

```bash
# List all top-level keys
jq 'keys'

# Check if a key exists
jq 'has("email")'

# Delete a key
jq 'del(.password)'

# Delete nested keys from every element
jq '[.[] | del(.internal_id, .raw_payload)]'

# Recursive descent: find all values for a key anywhere in the tree
jq '.. | .id? // empty'

# Get all leaf paths
jq '[paths(scalars)]'
```
## Conditionals and Error Handling

```bash
# if-then-else
jq 'if .score >= 90 then "A" elif .score >= 80 then "B" else "C" end'

# Alternative operator: use fallback if null or false
jq '.nickname // .name'

# try-catch: skip errors instead of halting
jq '[.[] | try .nested.value catch null]'

# Suppress null output with // empty
jq '.[] | .optional_field // empty'
```
## Practical Shell Integration

```bash
# Read from a file
jq '.users' data.json

# Compact output (no whitespace) for further piping
jq -c '.[]' records.json | while IFS= read -r record; do
  echo "Processing: $record"
done

# Pass a shell variable into jq
STATUS="active"
jq --arg s "$STATUS" '[.[] | select(.status == $s)]'

# Pass a number
jq --argjson threshold 42 '[.[] | select(.value > $threshold)]'

# Slurp multiple JSON lines into an array
jq -s '.' records.ndjson

# Multiple files: slurp all into one array
jq -s 'add' file1.json file2.json

# Pipeline from a command
kubectl get pods -o json \
  | jq '.items[] | {name: .metadata.name, status: .status.phase}'

# GitHub CLI: extract PR numbers
gh pr list --json number,title | jq -r '.[] | "\(.number)\t\(.title)"'

# AWS CLI: list running instance IDs
aws ec2 describe-instances \
  | jq -r '.Reservations[].Instances[] | select(.State.Name=="running") | .InstanceId'

# Docker: show container names and images
docker inspect $(docker ps -q) | jq -r '.[] | "\(.Name)\t\(.Config.Image)"'
```
## Advanced Patterns

```bash
# Transpose an object of arrays to an array of objects
# Input: {"names":["a","b"],"scores":[10,20]}
jq '[.names, .scores] | transpose | map({name: .[0], score: .[1]})'

# Flatten one level
jq 'flatten(1)'

# Unique by field
jq 'unique_by(.email)'

# Sort and deduplicate (unique already sorts, so no separate sort is needed)
jq '[.[] | .name] | unique'

# Walk: apply a transformation to every node recursively
jq 'walk(if type == "string" then ascii_downcase else . end)'

# env: read environment variables inside jq
export API_KEY=secret
jq -n 'env.API_KEY'
```
## Best Practices

- Always use `-r` (raw output) when passing jq results to shell variables or other commands, to strip JSON string quotes
- Use `--arg`/`--argjson` to inject shell variables safely — never interpolate shell variables directly into filter strings
- Prefer `map(f)` over `[.[] | f]` for readability
- Use `-c` (compact) for newline-delimited JSON pipelines; omit it for human-readable debugging
- Test filters interactively with `jq -n` and literal input before embedding them in scripts
- Use `empty` to drop unwanted elements rather than filtering to `null`
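Several of these practices combine naturally in one pipeline. A minimal sketch (the input JSON and the `ROLE` variable are hypothetical): `--arg` injects the shell variable safely, and `-r` strips the quotes so the result is directly usable in the shell:

```shell
# --arg passes ROLE as data; -r emits the bare string for shell use.
ROLE="admin"
echo '[{"name":"alice","role":"admin"},{"name":"bob","role":"user"}]' \
  | jq -r --arg role "$ROLE" '.[] | select(.role == $role) | .name'
# → alice
```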
## Security & Safety Notes

- `jq` is read-only by design — it cannot write files or execute commands
- Avoid embedding untrusted JSON field values directly into shell commands; always quote them, or use `--arg`
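To see why `--arg` matters, compare splicing a value into the filter text against passing it as data. A sketch with a deliberately hostile-looking value (made up for illustration):

```shell
# UNSAFE (shown only as a comment): the value becomes part of the jq
# program text, so quotes inside it can change the filter:
#   jq ".[] | select(.name == \"$NAME\")"
# SAFE: --arg passes the value as a string datum, never as program text.
NAME='alice" or true'
echo '[{"name":"alice"},{"name":"bob"}]' \
  | jq -c --arg n "$NAME" '[.[] | select(.name == $n)]'
# → []  (the literal string matches no name)
```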
## Common Pitfalls

- **Problem:** `jq` outputs `null` instead of the expected value.
  **Solution:** Check for typos in key names; use `keys` to inspect the actual field names. Remember JSON is case-sensitive.
- **Problem:** Numbers are quoted as strings in the output.
  **Solution:** Use `--argjson` instead of `--arg` when injecting numeric values.
- **Problem:** A filter works in the terminal but fails in a script.
  **Solution:** Ensure the filter string uses single quotes in the shell to prevent variable expansion: `jq '.field'`, not `jq ".field"`.
- **Problem:** `add` returns `null` on an empty array.
  **Solution:** Use `add // 0` or `add // ""` to provide a fallback default.
- **Problem:** Streaming large files is slow.
  **Solution:** Use `jq --stream`, or switch to `jstream`/`gron` for very large files.
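For the streaming case, one common idiom (the tiny input here stands in for a large top-level array) re-assembles each array element from the `--stream` event stream, so elements can be processed one at a time instead of loading the whole document:

```shell
# --stream emits [path, value] events; truncate_stream/fromstream
# rebuild each top-level array element individually.
echo '[{"id":1},{"id":2}]' \
  | jq -cn --stream 'fromstream(1 | truncate_stream(inputs))'
# → one object per line: {"id":1} then {"id":2}
```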
## Related Skills

- `@bash-pro` — Wrapping jq calls in robust shell scripts
- `@bash-linux` — General shell pipeline patterns
- `@github-automation` — Using jq with GitHub CLI JSON output
## Limitations
- Use this skill only when the task clearly matches the scope described above.
- Do not treat the output as a substitute for environment-specific validation, testing, or expert review.
- Stop and ask for clarification if required inputs, permissions, safety boundaries, or success criteria are missing.