linkedin-job-scraper (goose-skills)
Install
Clone the upstream repo:
git clone https://github.com/gooseworks-ai/goose-skills
Claude Code: install into ~/.claude/skills/:
T=$(mktemp -d) && git clone --depth=1 https://github.com/gooseworks-ai/goose-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/capabilities/linkedin-job-scraper" ~/.claude/skills/gooseworks-ai-goose-skills-linkedin-job-scraper && rm -rf "$T"
Manifest: skills/capabilities/linkedin-job-scraper/SKILL.md
LinkedIn Scraper
Overview
This skill finds LinkedIn job postings by running
tools/jobspy_scraper.py, a thin wrapper
around the JobSpy library. It handles installation,
parameter construction, execution, and result interpretation.
Quick Start
Install the dependency once (requires Python 3.10+):
python3.12 -m pip install -U python-jobspy --break-system-packages
Run the scraper:
python3.12 tools/jobspy_scraper.py \
  --search "software engineer" \
  --location "San Francisco, CA" \
  --results 25 \
  --output .tmp/jobs.csv
Results are saved as CSV and printed as a summary table.
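Under the hood, python-jobspy exposes a single `scrape_jobs(...)` entry point that returns a pandas DataFrame. A minimal sketch of how a thin wrapper might translate its options into that call (the `build_scrape_kwargs` helper is illustrative, not the actual contents of tools/jobspy_scraper.py):

```python
# Illustrative sketch of a thin wrapper's core: translate CLI-style
# options into keyword arguments for jobspy.scrape_jobs(...).
def build_scrape_kwargs(search, location=None, results=25, hours_old=None):
    kwargs = {
        "site_name": ["linkedin"],   # this skill targets LinkedIn only
        "search_term": search,
        "results_wanted": results,
    }
    if location:
        kwargs["location"] = location
    if hours_old is not None:
        kwargs["hours_old"] = hours_old
    return kwargs

# The real call would then be roughly:
#   from jobspy import scrape_jobs
#   df = scrape_jobs(**build_scrape_kwargs("software engineer", "San Francisco, CA"))
#   df.to_csv(".tmp/jobs.csv", index=False)
```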
Workflow
Step 1 — Understand the request
Identify from the user's message:
- Search term — job title, role, or keyword (required)
- Location — city, state, or "Remote" (optional but recommended)
- Results wanted — default to 25 if not specified
- Recency — use the hours_old filter if the user wants recent posts (e.g. "last 48 hours")
- Company filter — use linkedin_company_ids if targeting a specific company
- Full descriptions — set --fetch-descriptions if the user needs job description text
If anything is ambiguous (e.g. "find AI jobs"), pick reasonable defaults and tell the user what you used.
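The defaulting described above amounts to merging user-supplied values over defaults and recording which ones were assumed, so they can be reported back. A small sketch (function and key names are illustrative):

```python
# Illustrative: fill in defaults for anything the user didn't specify,
# and track which values were assumed so the agent can tell the user.
DEFAULTS = {"results": 25, "location": None, "hours_old": None}

def resolve_params(user_params):
    resolved = {**DEFAULTS,
                **{k: v for k, v in user_params.items() if v is not None}}
    assumed = [k for k in DEFAULTS if user_params.get(k) is None]
    return resolved, assumed
```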
Step 2 — Construct the command
Build the
tools/jobspy_scraper.py command using the parameters below.
Always save output to .tmp/ so it's disposable and easy to find.
python tools/jobspy_scraper.py \
  --search "<term>" \
  --location "<location>" \
  --results <N> \
  [--hours-old <N>] \
  [--fetch-descriptions] \
  [--company-ids <id1,id2>] \
  [--job-type fulltime|parttime|contract|internship] \
  [--remote] \
  --output .tmp/<descriptive_filename>.csv
Note:
--hours-old and --easy-apply cannot be used together (LinkedIn API constraint).
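Putting Step 2 together: command construction is just argv assembly, with the mutual-exclusion check applied before anything runs. A sketch under the flag names above (the helper itself is hypothetical):

```python
# Illustrative: build the argv list for tools/jobspy_scraper.py and
# reject the --hours-old / --easy-apply combination up front.
def build_command(search, output, location=None, results=25,
                  hours_old=None, easy_apply=False, remote=False):
    if hours_old is not None and easy_apply:
        raise ValueError("--hours-old and --easy-apply cannot be used together")
    cmd = ["python", "tools/jobspy_scraper.py",
           "--search", search, "--results", str(results)]
    if location:
        cmd += ["--location", location]
    if hours_old is not None:
        cmd += ["--hours-old", str(hours_old)]
    if remote:
        cmd.append("--remote")
    cmd += ["--output", output]
    return cmd
```

The resulting list can be passed straight to subprocess.run without shell quoting concerns.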
Step 3 — Run the script
Execute the command. The script will print a progress message and a summary of results found.
If the script is not found at
tools/jobspy_scraper.py, check whether the file needs to be created
by reading skills/linkedin-job-scraper/scripts/jobspy_scraper.py and copying it to tools/.
Step 4 — Interpret and present results
After the run:
- Report how many jobs were found
- Show a brief table: Title | Company | Location | Salary | Posted
- Note the output file path so the user can open it
- If 0 results: suggest broadening the search term or removing the location filter
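The presentation step needs nothing beyond the standard library. A sketch, assuming JobSpy's usual title/company/location column names (the `summarize` helper is illustrative):

```python
import csv

# Illustrative: summarize a results CSV the way Step 4 describes -
# report the count, show a brief table, and suggest fixes on 0 results.
def summarize(csv_path):
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return ("0 jobs found - try broadening the search term "
                "or removing the location filter.")
    lines = [f"{len(rows)} jobs found (saved to {csv_path})",
             "Title | Company | Location"]
    for r in rows[:5]:  # brief table: first few rows only
        lines.append(f"{r.get('title', '')} | {r.get('company', '')} | "
                     f"{r.get('location', '')}")
    return "\n".join(lines)
```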
Parameters Reference
| Flag | Description | Default |
|---|---|---|
| --search | Job title / keywords | required |
| --location | City, state, or country | none |
| --results | Number of results to fetch | 25 |
| --hours-old | Only jobs posted within N hours | none |
| --fetch-descriptions | Fetch full job descriptions (slower) | false |
| --company-ids | Comma-separated LinkedIn company IDs | none |
| --job-type | fulltime, parttime, contract, internship | any |
| --remote | Filter for remote jobs only | false |
| --output | Path for CSV output | .tmp/jobs.csv |
Output Columns
The CSV output includes:
| Column | Description |
|---|---|
| title | Job title |
| company | Employer name |
| location | City / State / Country |
| is_remote | True/False |
| job_type | fulltime, contract, etc. |
| date_posted | When the listing was posted |
| min_amount | Minimum salary |
| max_amount | Maximum salary |
| currency | Currency code |
| job_url | Direct link to the LinkedIn posting |
| description | Full job description (if --fetch-descriptions used) |
| job_level | Seniority level (LinkedIn-specific) |
| company_industry | Industry classification |
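Because the output is plain CSV, downstream filtering needs no extra dependencies. For example, keeping only remote rows that list a minimum salary, assuming JobSpy's standard is_remote and min_amount column names (the helper is illustrative):

```python
import csv

# Illustrative: post-filter a results CSV on is_remote and min_amount.
def remote_with_salary(csv_path):
    with open(csv_path, newline="") as f:
        return [r for r in csv.DictReader(f)
                if r.get("is_remote", "").lower() == "true"
                and r.get("min_amount")]
```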
Common Use Cases
Find recent engineering roles at a startup:
python tools/jobspy_scraper.py --search "growth engineer" --location "New York" \
  --results 50 --hours-old 72 --output .tmp/growth_eng_nyc.csv
Monitor what a specific company is hiring for:
# First find the LinkedIn company ID from the company's LinkedIn URL
python tools/jobspy_scraper.py --search "engineer" --company-ids 1234567 \
  --results 100 --fetch-descriptions --output .tmp/company_hiring.csv
Find remote contract roles:
python tools/jobspy_scraper.py --search "data analyst" --remote \
  --job-type contract --results 30 --output .tmp/remote_contracts.csv
Error Handling
| Error | Fix |
|---|---|
| ModuleNotFoundError: No module named 'jobspy' | Run the pip install command from Quick Start |
| 0 results returned | Broaden the search term, remove the location filter, or increase --hours-old |
| Rate limited / blocked | Wait a few minutes; avoid running back-to-back large scrapes |
| --hours-old combined with --easy-apply | Remove one of those flags |
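For the rate-limit case, waiting between retries can be automated with a simple exponential backoff around whatever runs the scrape. A generic sketch (the delay values and helper name are arbitrary choices, not part of the script):

```python
import time

# Illustrative: retry a scrape callable with exponential backoff when it
# raises. The sleep function is injectable so the logic is testable.
def run_with_backoff(fn, retries=3, base_delay=60, sleep=time.sleep):
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise
            sleep(base_delay * (2 ** attempt))  # 60s, 120s, ...
```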
Script Location
The scraper script lives at
tools/jobspy_scraper.py.
If it doesn't exist, copy it from
skills/linkedin-job-scraper/scripts/jobspy_scraper.py to tools/:
cp skills/linkedin-job-scraper/scripts/jobspy_scraper.py tools/