firecrawl-download (Firecrawl CLI skill)

Install

Source · Clone the upstream repo:
git clone https://github.com/firecrawl/cli

Claude Code · Install into ~/.claude/skills/:
T=$(mktemp -d) && git clone --depth=1 https://github.com/firecrawl/cli "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/firecrawl-download" ~/.claude/skills/firecrawl-cli-firecrawl-download && rm -rf "$T"

Manifest: skills/firecrawl-download/SKILL.md

Source content
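To sanity-check the install, you can test for the manifest at the path the Claude Code command above copies to (prints "installed" or "missing"):

```shell
# Check that the skill manifest landed where the install step put it.
SKILL="$HOME/.claude/skills/firecrawl-cli-firecrawl-download/SKILL.md"
if [ -f "$SKILL" ]; then echo installed; else echo missing; fi
```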

firecrawl download

Experimental. Convenience command that combines map + scrape to save an entire site as local files.

It maps the site first to discover pages, then scrapes each one into nested directories under .firecrawl/. All scrape options work with download. Always pass -y to skip the confirmation prompt.
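As a rough sketch of the idea (the exact on-disk layout is an assumption, not specified here), each discovered page URL becomes a nested path under .firecrawl/:

```shell
# Toy illustration of how a page URL could map to a nested local path.
# The real layout is produced by `firecrawl download`; this only shows the idea.
url="https://docs.example.com/features/scrape"
rel="${url#*://}"                # drop the scheme -> docs.example.com/features/scrape
echo ".firecrawl/$rel/index.md"  # -> .firecrawl/docs.example.com/features/scrape/index.md
```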

When to use

  • You want to save an entire site (or section) to local files
  • You need offline access to documentation or content
  • Bulk content extraction with organized file structure

Quick start

# Interactive wizard (picks format, screenshots, paths for you)
firecrawl download https://docs.example.com

# With screenshots
firecrawl download https://docs.example.com --screenshot --limit 20 -y

# Multiple formats (each saved as its own file per page)
firecrawl download https://docs.example.com --format markdown,links --screenshot --limit 20 -y
# Creates per page: index.md + links.txt + screenshot.png

# Filter to specific sections
firecrawl download https://docs.example.com --include-paths "/features,/sdks"

# Skip translations
firecrawl download https://docs.example.com --exclude-paths "/zh,/ja,/fr,/es,/pt-BR"

# Full combo
firecrawl download https://docs.example.com \
  --include-paths "/features,/sdks" \
  --exclude-paths "/zh,/ja" \
  --only-main-content \
  --screenshot \
  -y

Download options

Option                     Description
--limit <n>                Max pages to download
--search <query>           Filter URLs by search query
--include-paths <paths>    Only download matching paths
--exclude-paths <paths>    Skip matching paths
--allow-subdomains         Include subdomain pages
-y                         Skip confirmation prompt (always use in automated flows)
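The path filters select or drop pages from the mapped URL list by path. A toy grep approximation of --include-paths "/features,/sdks" with --exclude-paths "/zh,/ja" (not Firecrawl's actual matching logic):

```shell
# Toy stand-in for the include/exclude path filters over a mapped URL list.
printf '%s\n' \
  "https://docs.example.com/features/scrape" \
  "https://docs.example.com/sdks/node" \
  "https://docs.example.com/zh/features/scrape" \
  | grep -E '\.com/(features|sdks)/' \
  | grep -vE '\.com/(zh|ja)/'
# Keeps only the /features and /sdks URLs; the /zh translation is dropped.
```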

Scrape options (all work with download)

-f <formats>, -H, -S, --screenshot, --full-page-screenshot, --only-main-content, --include-tags, --exclude-tags, --wait-for, --max-age, --country, --languages
