## Installation

Clone the repository:

```sh
git clone https://github.com/LeoYeAI/openclaw-master-skills
```

Install into Claude Code (`~/.claude/skills`):

```sh
T=$(mktemp -d) && git clone --depth=1 https://github.com/LeoYeAI/openclaw-master-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/ai-news-aggregator-sl" ~/.claude/skills/leoyeai-openclaw-master-skills-ai-news-aggregator-sl && rm -rf "$T"
```

Install into OpenClaw (`~/.openclaw/skills`):

```sh
T=$(mktemp -d) && git clone --depth=1 https://github.com/LeoYeAI/openclaw-master-skills "$T" && mkdir -p ~/.openclaw/skills && cp -r "$T/skills/ai-news-aggregator-sl" ~/.openclaw/skills/leoyeai-openclaw-master-skills-ai-news-aggregator-sl && rm -rf "$T"
```
# 🦞 AI News Aggregator
Collects news on any topic, writes an English editorial digest using your choice of AI provider, and posts it to Discord.
- **Default (AI topic):** TechCrunch · The Verge · NYT Tech (RSS) + curated AI YouTube channels
- **Custom topics:** Tavily news search + YouTube topic search (no Shorts, sorted by views)
- **AI providers:** OpenAI (default) · DeepSeek · Anthropic Claude, switchable per request
## Network Endpoints

| Endpoint | Purpose | Condition |
|---|---|---|
| Selected AI provider API | AI editorial summarisation | Always (required) |
| Discord webhook | Post digest to Discord | Always (required) |
| TechCrunch feed | RSS news (AI topic) | Default AI topic only |
| The Verge feed | RSS news (AI topic) | Default AI topic only |
| NYT Tech feed | RSS news (AI topic) | Default AI topic only |
| Tavily API | Custom topic news search | Only if its API key is set |
| twitterapi.io | Twitter search | Only if its API key is set |
| YouTube Data API | YouTube search | Only if YOUTUBE_API_KEY is set |
When the DeepSeek provider is selected, the script does not contact OpenAI endpoints: the `openai` package is used solely as an HTTP client pointed at https://api.deepseek.com, and `OPENAI_API_KEY` is explicitly removed from the environment at startup so the SDK cannot pick it up.
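The startup key removal described above can be sketched as follows. This is an illustrative stand-in, not the script's actual code; the helper name `scrub_openai_key` is hypothetical:

```python
import os

def scrub_openai_key() -> None:
    """Illustrative: drop OPENAI_API_KEY from the environment so the
    openai SDK cannot silently authenticate against api.openai.com."""
    os.environ.pop("OPENAI_API_KEY", None)  # no error if already absent

os.environ["OPENAI_API_KEY"] = "sk-example"  # pretend the key was set
scrub_openai_key()
print("OPENAI_API_KEY" in os.environ)  # → False
```

`os.environ.pop` mutates the process environment, so any client library constructed afterwards in the same process sees no OpenAI credential.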
## Usage Examples
- "Get today's AI news"
- "Collect news about crypto"
- "Last week's news about climate change"
- "What's trending in AI today?"
- "Get crypto news from the last 3 days using OpenAI"
- "Show me recent Bitcoin YouTube videos"
- "Summarise WWIII news with Claude"
- "AI news using GPT-4o"
- "AI news dry run" (preview without posting to Discord)
- "Test my Discord webhook"
## API Keys

| Key | Required | Where to get it |
|---|---|---|
| `DISCORD_WEBHOOK_URL` | ✅ Always | Discord → Channel Settings → Integrations → Webhooks → Copy URL |
| `DEEPSEEK_API_KEY` | If using DeepSeek | platform.deepseek.com/api_keys |
| `OPENAI_API_KEY` | If using OpenAI (default) | platform.openai.com/api-keys |
| `ANTHROPIC_API_KEY` | If using Claude | console.anthropic.com → API Keys |
| Tavily API key | For custom topics | app.tavily.com |
| twitterapi.io key | Optional | twitterapi.io |
| `YOUTUBE_API_KEY` | Optional | console.cloud.google.com → YouTube Data API v3 |
## AI Providers & Models

| Provider | `--provider` value | Default model | Best for |
|---|---|---|---|
| OpenAI | `openai` (default) | | Quality, reliability |
| DeepSeek | `deepseek` | | Cost-effective, fast |
| Claude | `claude` | | Nuanced writing |
Override per request using the `--provider` flag. Set a permanent non-default with `openclaw config set env.AI_PROVIDER '"deepseek"'`. Override the model with `--model` (e.g. `--model gpt-4o` or `--model claude-3-5-sonnet-20241022`).
## Implementation
**IMPORTANT:** Always run `news_aggregator.py` using the steps below. Do NOT search the web manually or improvise a response — the script handles all fetching, summarisation, and Discord posting.
### Step 1 — Locate the script

The script is bundled with this skill. Find it (the first pattern matches the directory name used by the installer above):

```sh
SKILL_DIR=$(ls -d ~/.openclaw/skills/leoyeai-openclaw-master-skills-ai-news-aggregator-sl 2>/dev/null \
  || ls -d ~/.openclaw/skills/ai-news-aggregator-sl 2>/dev/null \
  || ls -d ~/.openclaw/skills/news-aggregator 2>/dev/null)
SCRIPT="$SKILL_DIR/news_aggregator.py"
echo "Script: $SCRIPT"
ls "$SCRIPT"
```
### Step 2 — Check uv is available

```sh
which uv && uv --version || echo "uv not found"
```
If `uv` is not found, ask the user to install it from their system package manager or from https://docs.astral.sh/uv/getting-started/installation/. Do not run a curl-pipe-sh command on the user's behalf.
### Step 3 — API keys

Env vars are passed automatically by OpenClaw from its config. No `.env` file is needed.

Verify the required keys are set (without revealing values):

```sh
[[ -n "$OPENAI_API_KEY" ]] && echo "OPENAI_API_KEY: set" || echo "OPENAI_API_KEY: MISSING (required for default provider)"
[[ -n "$DISCORD_WEBHOOK_URL" ]] && echo "DISCORD_WEBHOOK_URL: set" || echo "DISCORD_WEBHOOK_URL: MISSING"
```
If any are missing, ask the user to register them:

```sh
openclaw config set env.OPENAI_API_KEY '<key>'
openclaw config set env.DISCORD_WEBHOOK_URL '<url>'
# Optional alternatives:
openclaw config set env.DEEPSEEK_API_KEY '<key>'
openclaw config set env.ANTHROPIC_API_KEY '<key>'
```
### Step 4 — Parse the request

Extract topic, days, and provider from what the user said.

For AI provider:

| User said | --provider | --model |
|---|---|---|
| "use OpenAI" / "with GPT" / "using ChatGPT" / nothing specified | (omit — default) | (omit) |
| "use Claude" / "with Anthropic" | claude | (omit) |
| "use DeepSeek" | deepseek | (omit) |
| "use GPT-4o" / "with gpt-4o" | openai | gpt-4o |
| "use claude sonnet" | claude | claude-3-5-sonnet-20241022 |
| "use deepseek reasoner" | deepseek | deepseek-reasoner |
Extract topic and days from what the user said:

| User said | --topic | --days |
|---|---|---|
| "AI news" / "tech news" / nothing specific | (omit — default AI) | 1 |
| "crypto news" | "crypto" | 1 |
| "news about climate change" | "climate change" | 1 |
| "last week's crypto news" | "crypto" | 7 |
| "last 3 days of Bitcoin news" | "Bitcoin" | 3 |
| "yesterday's AI news" | (omit topic) | 1 |
| "this week in AI" | (omit topic) | 7 |
For report type:

| User said | flag to add |
|---|---|
| "news" / "articles" / "digest" | --report news |
| "trending" / "Twitter" / "YouTube" | --report trending |
| "dry run" / "preview" / "don't post" | --dry-run |
| "test Discord" / "test webhook" | --test-discord |
| anything else | (omit — runs all) |
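The mapping tables above can be sketched as a small keyword matcher. This helper is hypothetical (it is not part of `news_aggregator.py`, and topic extraction is left out since it needs judgment rather than keyword rules):

```python
import re

def request_to_flags(text: str) -> list[str]:
    """Hypothetical: map a user request to news_aggregator.py CLI flags."""
    t = text.lower()
    flags: list[str] = []
    # Provider keywords; the default provider is expressed by omitting the flag.
    if "claude" in t or "anthropic" in t:
        flags += ["--provider", "claude"]
    elif "deepseek" in t:
        flags += ["--provider", "deepseek"]
    elif "gpt-4o" in t:
        flags += ["--provider", "openai", "--model", "gpt-4o"]
    # Days: "last N days" beats "week"; default (1 day) is omitted.
    m = re.search(r"last (\d+) days", t)
    if m:
        flags += ["--days", m.group(1)]
    elif "week" in t:
        flags += ["--days", "7"]
    # Report type and dry-run keywords.
    if "trending" in t:
        flags += ["--report", "trending"]
    if "dry run" in t or "preview" in t:
        flags.append("--dry-run")
    return flags

print(request_to_flags("Last 3 days of Bitcoin news with Claude"))
# → ['--provider', 'claude', '--days', '3']
```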
### Step 5 — Run with uv

`uv run` automatically installs all dependencies from the script's inline metadata (PEP 723) — no venv setup needed.
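That inline metadata lives in a comment block at the top of the script, shaped like the fragment below. The dependency names here are illustrative placeholders; the real list is inside `news_aggregator.py`:

```python
# /// script
# requires-python = ">=3.10"
# dependencies = [
#     "feedparser",   # illustrative; see news_aggregator.py for the real list
#     "requests",
# ]
# ///
```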
```sh
uv run "$SCRIPT" [--topic "TOPIC"] [--days N] [--report TYPE] [--provider PROVIDER] [--model MODEL] [--dry-run]
```
Examples:

```sh
# AI news today — default provider (OpenAI)
uv run "$SCRIPT"

# Crypto news using OpenAI explicitly
uv run "$SCRIPT" --topic "crypto" --provider openai

# Last week's climate news using Claude
uv run "$SCRIPT" --topic "climate change" --days 7 --provider claude

# Use a specific model
uv run "$SCRIPT" --topic "Bitcoin" --provider openai --model gpt-4o

# Trending AI on Twitter and YouTube
uv run "$SCRIPT" --report trending

# Preview without posting to Discord
uv run "$SCRIPT" --topic "Bitcoin" --dry-run

# Test webhook connection
uv run "$SCRIPT" --test-discord
```
### Step 6 — Report back

Tell the user what was posted to Discord, how many items were found per source, and note any skipped sources (e.g. "YouTube skipped — YOUTUBE_API_KEY not set").
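The shape of that report can be sketched with a tiny formatter. The helper `summarize_run` is hypothetical, shown only to make the expected per-source output concrete:

```python
def summarize_run(counts: dict[str, int], skipped: dict[str, str]) -> str:
    """Hypothetical formatter for the final user-facing report."""
    lines = [f"{source}: {n} items" for source, n in counts.items()]
    lines += [f"{source} skipped ({reason})" for source, reason in skipped.items()]
    return "\n".join(lines)

print(summarize_run(
    {"TechCrunch": 5, "The Verge": 3},
    {"YouTube": "YOUTUBE_API_KEY not set"},
))
```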