# Awesome-omni-skill summarize

Summarize or extract text/transcripts from URLs, podcasts, and local files.

## Install

Source · Clone the upstream repo:

```shell
git clone https://github.com/diegosouzapw/awesome-omni-skill
```

Claude Code · Install into `~/.claude/skills/`:

```shell
T=$(mktemp -d) && \
  git clone --depth=1 https://github.com/diegosouzapw/awesome-omni-skill "$T" && \
  mkdir -p ~/.claude/skills && \
  cp -r "$T/skills/data-ai/summarize" ~/.claude/skills/diegosouzapw-awesome-omni-skill-summarize && \
  rm -rf "$T"
```

Manifest: `skills/data-ai/summarize/SKILL.md`
## Summarize

Fast CLI workflow to summarize URLs, local files, and YouTube links using the `summarize` CLI tool.
### When to use (trigger phrases)

Use this skill immediately when the user asks any of:
- "use summarize.sh"
- "what's this link/video about?"
- "summarize this URL/article"
- "transcribe this YouTube/video"
- "what does this article say?"
### Prerequisites

Install the `summarize` CLI:

```shell
brew install steipete/tap/summarize
```
### Quick start

```shell
# Summarize a URL
summarize "https://example.com" --model google/gemini-3-flash-preview

# Summarize a local file
summarize "/path/to/file.pdf" --model google/gemini-3-flash-preview

# Summarize a YouTube video
summarize "https://youtu.be/dQw4w9WgXcQ" --youtube auto
```
### YouTube: summary vs. transcript

For best-effort transcript extraction (URLs only):

```shell
summarize "https://youtu.be/dQw4w9WgXcQ" --youtube auto --extract-only
```

If the user asked for a transcript but it's huge, return a tight summary first, then ask which section/time range to expand.
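The size gate above can be sketched in plain shell. The generated `transcript.txt` and the 5000-word threshold are illustrative stand-ins, not part of the CLI:

```shell
# Stand-in transcript; a real one would come from:
#   summarize "<url>" --youtube auto --extract-only > transcript.txt
printf 'word %.0s' $(seq 1 6000) > transcript.txt

# Gate on size before replying with the full text
words=$(wc -w < transcript.txt)
if [ "$words" -gt 5000 ]; then
  echo "Transcript is long; return a tight summary and ask which range to expand."
else
  cat transcript.txt
fi
```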
### Model + API keys

Set the API key for your chosen provider:

- OpenAI: `OPENAI_API_KEY`
- Anthropic: `ANTHROPIC_API_KEY`
- xAI: `XAI_API_KEY`
- Google: `GOOGLE_API_KEY` (aliases: `GEMINI_API_KEY`, `GOOGLE_GENERATIVE_AI_API_KEY`)

The default model is `google/gemini-3-flash-preview` if none is set.
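For example, wiring up the default Google model with a placeholder key (substitute a real credential before running `summarize`):

```shell
# Placeholder value; export the key for whichever provider your model uses
export GOOGLE_API_KEY="your-key-here"
# Then run e.g.: summarize "https://example.com" --model google/gemini-3-flash-preview
```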
### Useful flags

| Flag | Description |
|---|---|
| | Summary length |
| | Max output tokens |
| `--extract-only` | Extract text only (URLs) |
| | Machine-readable output |
| | Fallback extraction |
| | Apify fallback for YouTube |
### Fallback without the CLI

If the `summarize` CLI is not installed, fall back to fetching the content directly:

```shell
# Use fetch_url (or curl) to get the content, then have the LLM summarize it
curl -s "https://example.com" | head -c 50000
```

Then ask the LLM to summarize the fetched content.
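If the raw HTML is too noisy to summarize well, a crude tag strip before truncation can help. This sed one-liner is a rough sketch (it mangles scripts and entities), not the CLI's own extraction:

```shell
# Crude HTML-to-text: drop tags, then cap at 50k chars for the model context
html='<p>Fallback <b>extraction</b> example</p>'
text=$(printf '%s' "$html" | sed -e 's/<[^>]*>//g' | head -c 50000)
echo "$text"   # Fallback extraction example
```

In real use, the inline `html` variable would be the output of `curl -s "$URL"`.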
### Config

Optional config file at `~/.summarize/config.json`:

```json
{ "model": "openai/gpt-4o" }
```

Optional services:

- `FIRECRAWL_API_KEY` for blocked sites
- `APIFY_API_TOKEN` for YouTube fallback
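Writing the config from the shell might look like this; the model string is just the example above, not a default:

```shell
# Create the optional config file (model value is an example)
mkdir -p "$HOME/.summarize"
printf '%s\n' '{ "model": "openai/gpt-4o" }' > "$HOME/.summarize/config.json"
```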