human-paced-web-ops
Use human-paced browser interaction patterns for web navigation and search tasks: variable delays, hover-before-click, and light randomness. This improves robustness and reduces brittle, bot-like behavior while respecting website rules.
install
source · Clone the upstream repo
git clone https://github.com/openclaw/skills
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/openclaw/skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/1477009639zw-blip/human-paced-web-ops" ~/.claude/skills/clawdbot-skills-human-paced-web-ops && rm -rf "$T"
manifest:
skills/1477009639zw-blip/human-paced-web-ops/SKILL.md
Human-Paced Web Ops
Use this skill when the task involves:
- Web search and browsing on dynamic pages
- Multi-step page navigation that easily breaks with rigid scripts
- Long-running read-and-collect workflows
Interaction Pattern
Apply these defaults unless the task needs exact deterministic clicking:
- Before actions, wait for visible/interactive state first.
- Use small randomized delays between actions (for example 300-1200ms).
- Prefer hover before click on menus/buttons when possible.
- Use a small random click offset inside the same target element (for example 2-8px), not random page clicking.
- Add occasional small scroll steps during long pages.
- Avoid repeated fixed-interval requests; pace actions with jitter.
- Every 5-10 interactions, re-check page state and URL before continuing.
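The pacing defaults above can be sketched as small library-agnostic helpers. This is a minimal illustration in plain Python, not tied to any particular browser driver; the function and class names are invented for this sketch, and the numbers mirror the defaults listed (300-1200 ms delays, 2-8 px offsets, re-check every 5-10 interactions).

```python
import random

def jitter_delay(low=0.3, high=1.2):
    """Randomized pause between actions, in seconds (300-1200 ms by default)."""
    return random.uniform(low, high)

def click_offset(width, height, min_px=2, max_px=8):
    """Small random offset from the element center, clamped inside the element.

    Returns (x, y) coordinates relative to the element's top-left corner,
    so the click always lands within the same target element.
    """
    dx = random.choice([-1, 1]) * random.uniform(min_px, max_px)
    dy = random.choice([-1, 1]) * random.uniform(min_px, max_px)
    x = min(max(width / 2 + dx, 1), width - 1)
    y = min(max(height / 2 + dy, 1), height - 1)
    return x, y

class Pacer:
    """Tracks interactions and signals when to re-check page state and URL."""

    def __init__(self, low=5, high=10):
        self._low, self._high = low, high
        self._until_check = random.randint(low, high)

    def step(self):
        """Call after each interaction; True means re-verify state before continuing."""
        self._until_check -= 1
        if self._until_check <= 0:
            self._until_check = random.randint(self._low, self._high)
            return True
        return False
```

In a real run, `jitter_delay` would feed a sleep between driver calls and `click_offset` would feed the driver's click-position parameter; the jitter also serves the "avoid repeated fixed-interval requests" rule.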
Guardrails
- If blocked by CAPTCHA, login challenge, paywall, or anti-bot page, pause and report the blocker URL plus required manual step.
- Keep identity and session settings stable and traceable for repeatable runs.
- Respect robots/terms and prefer official APIs when available.
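The first guardrail (pause and report on CAPTCHA, login, paywall, or anti-bot pages) can be approximated with a simple marker scan. This is a hedged sketch: the marker strings and the `detect_blocker` name are assumptions for illustration, and real pages will need a richer signal than substring matching.

```python
# Marker text (lowercase) -> kind of blocker, purely illustrative examples.
BLOCKER_MARKERS = {
    "captcha": "CAPTCHA challenge",
    "verify you are human": "anti-bot interstitial",
    "sign in to continue": "login challenge",
    "subscribe to read": "paywall",
}

def detect_blocker(url, page_text):
    """Return a report line naming the blocker and URL, or None if none found."""
    text = page_text.lower()
    for marker, kind in BLOCKER_MARKERS.items():
        if marker in text:
            return f"Blocked by {kind} at {url}; manual step required."
    return None
```

When this returns a non-None report, the skill's rule is to stop and surface it rather than attempt to work around the block.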
Output Requirement
For web collection tasks, include:
- What was visited (titles + links)
- What was extracted
- What could not be accessed (and why)
- Next recoverable step
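The four required report fields above can be carried in one small structure so every collection run emits them consistently. A minimal sketch, assuming Python; the `CollectionReport` name and its rendering format are illustrative, not part of the skill spec.

```python
from dataclasses import dataclass, field

@dataclass
class CollectionReport:
    visited: list = field(default_factory=list)    # (title, url) pairs
    extracted: list = field(default_factory=list)  # collected snippets/facts
    blocked: list = field(default_factory=list)    # (url, reason) pairs
    next_step: str = ""                            # next recoverable step

    def render(self):
        """Render the four required sections as plain text."""
        lines = ["Visited:"]
        lines += [f"- {title} ({url})" for title, url in self.visited]
        lines.append("Extracted:")
        lines += [f"- {item}" for item in self.extracted]
        lines.append("Could not access:")
        lines += [f"- {url}: {reason}" for url, reason in self.blocked]
        lines.append(f"Next step: {self.next_step}")
        return "\n".join(lines)
```

Keeping the report as data until the end of the run also makes it easy to resume: the `next_step` field records where a recovered run should pick up.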