openclaw-master-skills · ai-news-aggregator-sl

install
source · Clone the upstream repo
git clone https://github.com/LeoYeAI/openclaw-master-skills
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/LeoYeAI/openclaw-master-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/ai-news-aggregator-sl" ~/.claude/skills/leoyeai-openclaw-master-skills-ai-news-aggregator-sl && rm -rf "$T"
OpenClaw · Install into ~/.openclaw/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/LeoYeAI/openclaw-master-skills "$T" && mkdir -p ~/.openclaw/skills && cp -r "$T/skills/ai-news-aggregator-sl" ~/.openclaw/skills/leoyeai-openclaw-master-skills-ai-news-aggregator-sl && rm -rf "$T"
manifest: skills/ai-news-aggregator-sl/SKILL.md
safety · automated scan (medium risk)
This is a pattern-based risk scan, not a security review. Our crawler flagged:
  • dumps environment variables
  • references .env files
Always read a skill's source content before installing. Patterns alone don't mean the skill is malicious — but they warrant attention.
source content

🦞 AI News Aggregator

Collects news on any topic, writes an English editorial digest using your choice of AI provider, and posts it to Discord.

  • Default (AI topic): TechCrunch · The Verge · NYT Tech (RSS) + curated AI YouTube channels
  • Custom topics: Tavily news search + YouTube topic search (no Shorts, sorted by views)
  • AI providers: OpenAI (default) · DeepSeek · Anthropic Claude, switchable per request


Network Endpoints

| Endpoint | Purpose | Condition |
|---|---|---|
| `https://api.deepseek.com/chat/completions` | AI editorial summarisation | Always (required) |
| `https://discord.com/api/webhooks/...` | Post digest to Discord | Always (required) |
| `https://techcrunch.com/.../feed/` | RSS news (AI topic) | Default AI topic only |
| `https://www.theverge.com/rss/...` | RSS news (AI topic) | Default AI topic only |
| `https://www.nytimes.com/svc/collections/...` | RSS news (AI topic) | Default AI topic only |
| `https://api.tavily.com/search` | Custom topic news search | Only if `TAVILY_API_KEY` set |
| `https://api.twitterapi.io/twitter/tweet/advanced_search` | Twitter search | Only if `TWITTERAPI_IO_KEY` set |
| `https://www.googleapis.com/youtube/v3/...` | YouTube search | Only if `YOUTUBE_API_KEY` set |
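Posting the digest is a single JSON POST to the webhook URL. A minimal stdlib sketch of that call (the helper names and the 2000-character truncation are illustrative, not taken from the script; Discord webhooks do accept a JSON body with a `content` field):

```python
import json
import urllib.request

def build_payload(text: str) -> bytes:
    # Discord webhook messages carry a "content" field,
    # capped at 2000 characters per message.
    return json.dumps({"content": text[:2000]}).encode()

def post_digest(webhook_url: str, text: str) -> None:
    req = urllib.request.Request(
        webhook_url,
        data=build_payload(text),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # raises on non-2xx responses
```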

The script does not contact OpenAI endpoints. The `openai` package is used solely as an HTTP client pointed at `https://api.deepseek.com`. `OPENAI_API_KEY` is explicitly removed from the environment at startup.
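The startup key-scrub described above can be sketched in a few lines (the client construction is commented out and hypothetical; only the environment handling is shown):

```python
import os

# Remove the user's OpenAI key before any client is constructed, so the
# openai package cannot pick it up implicitly (no error if it is absent).
os.environ.pop("OPENAI_API_KEY", None)

# Hypothetical client construction: the SDK is used purely as an HTTP
# client aimed at DeepSeek's OpenAI-compatible endpoint.
# from openai import OpenAI
# client = OpenAI(api_key=os.environ["DEEPSEEK_API_KEY"],
#                 base_url="https://api.deepseek.com")
```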


Usage Examples

  • "Get today's AI news"
  • "Collect news about crypto"
  • "Last week's news about climate change"
  • "What's trending in AI today?"
  • "Get crypto news from the last 3 days using OpenAI"
  • "Show me recent Bitcoin YouTube videos"
  • "Summarise WWIII news with Claude"
  • "AI news using GPT-4o"
  • "AI news dry run" (preview without posting to Discord)
  • "Test my Discord webhook"

API Keys

| Key | Required | Where to get it |
|---|---|---|
| `DISCORD_WEBHOOK_URL` | ✅ Always | Discord → Channel Settings → Integrations → Webhooks → Copy URL |
| `DEEPSEEK_API_KEY` | If using DeepSeek (default) | platform.deepseek.com/api_keys |
| `OPENAI_API_KEY` | If using OpenAI | platform.openai.com/api-keys |
| `ANTHROPIC_API_KEY` | If using Claude | console.anthropic.com → API Keys |
| `TAVILY_API_KEY` | For custom topics | app.tavily.com |
| `TWITTERAPI_IO_KEY` | Optional | twitterapi.io |
| `YOUTUBE_API_KEY` | Optional | console.cloud.google.com → YouTube Data API v3 |

AI Providers & Models

| Provider | `--provider` value | Default model | Best for |
|---|---|---|---|
| OpenAI | `openai` (default) | `gpt-4o-mini` | Quality, reliability |
| DeepSeek | `deepseek` | `deepseek-chat` | Cost-effective, fast |
| Claude | `claude` | `claude-3-5-haiku-20241022` | Nuanced writing |

Override per request using the `--provider` flag. Set a permanent non-default with `openclaw config set env.AI_PROVIDER '"deepseek"'`. Override the model with `--model` (e.g. `--model gpt-4o` or `--model claude-3-5-sonnet-20241022`).


Implementation

IMPORTANT: Always run `news_aggregator.py` using the steps below. Do NOT search the web manually or improvise a response; the script handles all fetching, summarisation, and Discord posting.

Step 1 — Locate the script

The script is bundled with this skill. Find it:

SKILL_DIR=$(ls -d ~/.openclaw/skills/ai-news-aggregator-sl 2>/dev/null || ls -d ~/.openclaw/skills/news-aggregator 2>/dev/null)
SCRIPT="$SKILL_DIR/news_aggregator.py"
echo "Script: $SCRIPT"
ls "$SCRIPT"

Step 2 — Check uv is available

which uv && uv --version || echo "uv not found"

If `uv` is not found, ask the user to install it from their system package manager or from https://docs.astral.sh/uv/getting-started/installation/. Do not run a curl-pipe-sh command on the user's behalf.

Step 3 — API keys

Env vars are passed automatically by OpenClaw from its config. No `.env` file is needed.

Verify the required keys are set (without revealing values):

[[ -n "$OPENAI_API_KEY" ]]      && echo "OPENAI_API_KEY: set"      || echo "OPENAI_API_KEY: MISSING (required for default provider)"
[[ -n "$DISCORD_WEBHOOK_URL" ]] && echo "DISCORD_WEBHOOK_URL: set" || echo "DISCORD_WEBHOOK_URL: MISSING"

If any are missing, ask the user to register them:

openclaw config set env.OPENAI_API_KEY '<key>'
openclaw config set env.DISCORD_WEBHOOK_URL '<url>'
# Optional alternatives:
openclaw config set env.DEEPSEEK_API_KEY '<key>'
openclaw config set env.ANTHROPIC_API_KEY '<key>'

Step 4 — Parse the request

Extract topic, days, and provider from what the user said:

For AI provider:

| User said | `--provider` | `--model` |
|---|---|---|
| "use OpenAI" / "with GPT" / "using ChatGPT" | `--provider openai` | (omit) |
| "use Claude" / "with Anthropic" | `--provider claude` | (omit) |
| "use DeepSeek" / nothing specified | (omit — default) | (omit) |
| "use GPT-4o" / "with gpt-4o" | `--provider openai` | `--model gpt-4o` |
| "use claude sonnet" | `--provider claude` | `--model claude-3-5-sonnet-20241022` |
| "use deepseek reasoner" | `--provider deepseek` | `--model deepseek-reasoner` |

Extract topic and days from what the user said:

| User said | `--topic` | `--days` |
|---|---|---|
| "AI news" / "tech news" / nothing specific | (omit — default AI) | 1 |
| "crypto news" | `--topic "crypto"` | 1 |
| "news about climate change" | `--topic "climate change"` | 1 |
| "last week's crypto news" | `--topic "crypto"` | 7 |
| "last 3 days of Bitcoin news" | `--topic "Bitcoin"` | 3 |
| "yesterday's AI news" | (omit topic) | 1 |
| "this week in AI" | (omit topic) | 7 |

For report type:

| User said | Flag to add |
|---|---|
| "news" / "articles" / "digest" | `--report news` |
| "trending" / "Twitter" / "YouTube" | `--report trending` |
| "dry run" / "preview" / "don't post" | `--dry-run` |
| "test Discord" / "test webhook" | `--test-discord` |
| anything else | (omit — runs all) |
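The mappings above amount to a phrase-to-flag lookup. A partial Python sketch covering only the days and report-type rows (the phrase lists and helper name are illustrative; the real script may parse requests differently):

```python
import re

def build_flags(request: str) -> list[str]:
    """Map a user request to news_aggregator.py flags (days + report type only)."""
    text = request.lower()
    flags: list[str] = []
    # Report type: dry-run and webhook-test take precedence over a full report.
    if any(p in text for p in ("dry run", "preview", "don't post")):
        flags.append("--dry-run")
    elif "test" in text and ("discord" in text or "webhook" in text):
        flags.append("--test-discord")
    elif any(p in text for p in ("trending", "twitter", "youtube")):
        flags += ["--report", "trending"]
    # Days: an explicit "last N days" wins, then "week" phrases imply 7.
    m = re.search(r"last (\d+) days", text)
    if m:
        flags += ["--days", m.group(1)]
    elif "week" in text:
        flags += ["--days", "7"]
    return flags
```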

Step 5 — Run with uv

`uv run` automatically installs all dependencies from the script's inline metadata; no venv setup is needed.
uv run "$SCRIPT" [--topic "TOPIC"] [--days N] [--report TYPE] [--provider PROVIDER] [--model MODEL] [--dry-run]

Examples:

# AI news today — DeepSeek (default)
uv run "$SCRIPT"

# Crypto news using OpenAI
uv run "$SCRIPT" --topic "crypto" --provider openai

# Last week's climate news using Claude
uv run "$SCRIPT" --topic "climate change" --days 7 --provider claude

# Use a specific model
uv run "$SCRIPT" --topic "Bitcoin" --provider openai --model gpt-4o

# Trending AI on Twitter and YouTube
uv run "$SCRIPT" --report trending

# Preview without posting to Discord
uv run "$SCRIPT" --topic "Bitcoin" --dry-run

# Test webhook connection
uv run "$SCRIPT" --test-discord

Step 6 — Report back

Tell the user what was posted to Discord, how many items were found per source, and note any skipped sources (e.g. "YouTube skipped — YOUTUBE_API_KEY not set").