claude-code-templates: brightdata-local-search

Set up and run local web searches using the Bright Data SERP API with the unfancy-search pipeline (query expansion, SERP retrieval, RRF reranking).

Install

Clone the upstream repo:

git clone https://github.com/davila7/claude-code-templates

Or install the skill directly into ~/.claude/skills/ for Claude Code:

T=$(mktemp -d) \
  && git clone --depth=1 https://github.com/davila7/claude-code-templates "$T" \
  && mkdir -p ~/.claude/skills \
  && cp -r "$T/cli-tool/components/skills/development/brightdata-local-search" \
       ~/.claude/skills/davila7-claude-code-templates-brightdata-local-search \
  && rm -rf "$T"

Manifest: cli-tool/components/skills/development/brightdata-local-search/SKILL.md
Bright Data Local Search
Run powerful web searches locally using Bright Data's SERP API. This skill sets up the unfancy-search pipeline — a local search engine with query expansion, multi-engine SERP retrieval, RRF reranking, deduplication, and domain clustering.
Important: This skill uses the LOCAL version only. Do not use the hosted endpoint.
Prerequisites
- Bright Data account with SERP API access
- Anthropic API key (for query expansion, optional)
- Docker (recommended) or Node.js 18+ with pnpm
Setup
Step 1: Clone and Configure
git clone https://github.com/yaronbeen/unfancy-search.git
cd unfancy-search
cp .env.example .env
Step 2: Set Environment Variables
Edit .env with your credentials:

BRIGHT_DATA_API_TOKEN=your_brightdata_token
BRIGHT_DATA_SERP_ZONE=serp_api1
ANTHROPIC_API_KEY=your_anthropic_key   # Optional: enables AI query expansion
Get your Bright Data token from: https://brightdata.com (SERP API section)
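Before starting the server, a quick sanity check that the required variables are set can save a failed boot. A minimal sketch using the variable names from the .env above:

```shell
# Fail fast if required credentials are missing. Variable names follow the
# .env shown above; ANTHROPIC_API_KEY is optional (query expansion only).
check_env() {
  local missing=0
  if [ -z "$BRIGHT_DATA_API_TOKEN" ]; then
    echo "Missing required variable: BRIGHT_DATA_API_TOKEN" >&2
    missing=1
  fi
  if [ -z "$BRIGHT_DATA_SERP_ZONE" ]; then
    echo "Missing required variable: BRIGHT_DATA_SERP_ZONE" >&2
    missing=1
  fi
  if [ -z "$ANTHROPIC_API_KEY" ]; then
    echo "Note: ANTHROPIC_API_KEY unset; query expansion will be disabled" >&2
  fi
  return "$missing"
}

# Usage (before bringing the server up):
# check_env && docker compose up -d
```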
Step 3: Start the Local Server
Docker (recommended):
docker compose up -d
# Server runs at http://localhost:3000
Node.js:
pnpm install
pnpm dev
# Server runs at http://localhost:3000
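With either start method, it can help to wait until the server actually answers before submitting jobs. A minimal sketch; probing the root URL is an assumption, and any endpoint that responds once the server is up will do:

```shell
# Poll until the server accepts requests, with a per-second retry and a cap.
wait_for_server() {
  local url="${1:-http://localhost:3000}" tries="${2:-30}"
  for i in $(seq "$tries"); do
    if curl -sf -o /dev/null "$url"; then
      echo "server is up"
      return 0
    fi
    sleep 1
  done
  echo "server did not start within ${tries}s" >&2
  return 1
}
```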
API Endpoints
All requests go to http://localhost:3000:

| Endpoint | Method | Description |
|---|---|---|
| /api/search | POST | Start a search job |
| /api/search-status/{jobId} | GET | Poll for results |
| — | POST | Trigger baseline collection |
| — | GET | Poll baseline progress |
Running a Search
Step 1: Submit Search
curl -X POST http://localhost:3000/api/search \
  -H "Content-Type: application/json" \
  -d '{"query": "your search term"}'

The response returns a jobId.
Step 2: Poll for Results
curl http://localhost:3000/api/search-status/{jobId}
Poll every 3 seconds until status is "done".
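The submit/poll handshake can be sketched with jq. The JSON bodies below are illustrative; the documentation guarantees only that the submit response carries a jobId and the status response carries a status:

```shell
# Extract the job id from a (sample) submit response, then the job state
# from a (sample) status response. Field shapes beyond jobId/status are
# assumptions for illustration.
submit_response='{"jobId":"abc123"}'
JOB_ID=$(echo "$submit_response" | jq -r '.jobId')

status_response='{"status":"running"}'
STATUS=$(echo "$status_response" | jq -r '.status')

echo "job=$JOB_ID status=$STATUS"
```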
Search Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| query | string | required | Search query |
| expand | boolean | — | Enable AI query expansion via Claude |
| research | boolean | — | Research mode (12 sub-queries for max coverage) |
| — | string[] | all | SERP engines to use |
| — | string | — | Geographic region filter |
| — | number | 10 | Max results (up to 10) |
| — | string[] | — | Only include results from these domains |
| excludeDomains | string[] | — | Exclude results from these domains |
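When composing a request body with several of these parameters, building the JSON with jq -n avoids shell-quoting mistakes. A sketch; the parameter names follow the table and examples in this document, and the values are arbitrary:

```shell
# Build a search request payload safely with jq -n instead of hand-writing
# the JSON string.
payload=$(jq -n \
  --arg q "kubernetes scaling strategies" \
  --argjson expand true \
  '{query: $q, expand: $expand, excludeDomains: ["pinterest.com"]}')

echo "$payload"
# The payload can then be sent with: curl ... -d "$payload"
```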
Search Modes
- Basic (expand: false): Single query, fastest, no AI cost
- Expanded (expand: true): Claude Haiku generates 3 sub-queries for broader coverage
- Research (research: true): 12 sub-queries for maximum coverage
Usage Examples
Basic Search from an Agent
# Start search
JOB_ID=$(curl -s -X POST http://localhost:3000/api/search \
  -H "Content-Type: application/json" \
  -d '{"query": "best practices for API rate limiting"}' | jq -r '.jobId')

# Poll until done
while true; do
  RESULT=$(curl -s "http://localhost:3000/api/search-status/$JOB_ID")
  STATUS=$(echo "$RESULT" | jq -r '.status')
  if [ "$STATUS" = "done" ]; then
    echo "$RESULT" | jq '.results'
    break
  fi
  sleep 3
done
Research Mode with Domain Filtering
curl -X POST http://localhost:3000/api/search \
  -H "Content-Type: application/json" \
  -d '{
    "query": "kubernetes scaling strategies",
    "research": true,
    "excludeDomains": ["pinterest.com", "quora.com"]
  }'
Adding Search to an Existing Agent
To give your Claude Code agent search capabilities:
- Ensure the local server is running (docker compose up -d in the unfancy-search directory)
- Your agent can use curl or fetch to query http://localhost:3000/api/search
- Parse the ranked results to ground responses with real web data
Response Format
Results include:
- Ranked URLs with RRF scores
- Domain clustering (grouped by source)
- Cost transparency (per-search expense breakdown)
- Raw and unique result counts
- Search duration
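Picking fields out of the result payload is again a jq one-liner. The key names below are assumptions for illustration; the document lists the categories returned (ranked URLs, RRF scores, clusters, costs, counts, duration) but not the exact JSON field names:

```shell
# Sample (assumed) response shape; only the categories are documented.
sample='{"durationMs":4200,"rawCount":57,"uniqueCount":34,"results":[{"url":"https://example.com","rrfScore":0.031,"domain":"example.com"}]}'

# Pull the top result and the unique-result count.
echo "$sample" | jq '{top: .results[0].url, unique: .uniqueCount}'
```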
Troubleshooting
| Issue | Solution |
|---|---|
| Server won't start | Verify Docker is running or Node.js 18+ installed |
| No results returned | Check BRIGHT_DATA_API_TOKEN is valid and the SERP API zone is active |
| Query expansion not working | Verify ANTHROPIC_API_KEY is set and valid |
| Slow responses | Disable research mode for faster single-query searches |
| Port 3000 in use | Stop other services or modify the port in docker-compose.yml |
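For the port conflict, a compose override file avoids editing the upstream docker-compose.yml. A sketch; the service name `app` is a placeholder assumption, so check docker-compose.yml for the actual service name:

```yaml
# docker-compose.override.yml — remap the host side of the port mapping
# to 3001 while the container still listens on 3000.
services:
  app:
    ports:
      - "3001:3000"
```

Docker Compose merges this file automatically when it sits next to docker-compose.yml, so `docker compose up -d` picks up the new host port.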