install
source · Clone the upstream repo
git clone https://github.com/a5c-ai/babysitter
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) \
  && git clone --depth=1 https://github.com/a5c-ai/babysitter "$T" \
  && mkdir -p ~/.claude/skills \
  && cp -r "$T/library/specializations/domains/science/mathematics/skills/derivative-free-optimization" \
       ~/.claude/skills/a5c-ai-babysitter-derivative-free-optimization \
  && rm -rf "$T"
manifest:
library/specializations/domains/science/mathematics/skills/derivative-free-optimization/SKILL.md
source content
Derivative-Free Optimization
Purpose
Provides optimization capabilities for problems where gradient information is unavailable or unreliable.
Capabilities
- Nelder-Mead simplex method
- Powell's method
- Surrogate-based optimization
- Bayesian optimization
- Pattern search methods
- Trust region methods
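Two of the methods listed above, Nelder-Mead and Powell, are available directly in scipy.optimize. A minimal sketch, assuming SciPy is installed and using its built-in Rosenbrock function as a stand-in objective:

```python
import numpy as np
from scipy.optimize import minimize, rosen

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# Nelder-Mead: simplex-based, uses only function values (no gradients)
res_nm = minimize(rosen, x0, method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 10000})

# Powell: sequential line searches along a set of search directions
res_pw = minimize(rosen, x0, method="Powell")

# Both should approach the known minimizer [1, 1, 1, 1, 1]
print(res_nm.x)
print(res_pw.x)
```

Both calls evaluate only the objective itself, which is what makes them applicable when gradients are unavailable or unreliable.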
Usage Guidelines
- Method Selection: Choose based on problem characteristics (dimensionality, noise level, smoothness, evaluation cost)
- Function Evaluations: Minimize calls to expensive objective functions
- Surrogate Models: Build and refine surrogate approximations of the objective
- Exploration-Exploitation: Balance global search against local refinement
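The pattern-search idea behind several of the listed methods can be sketched in plain Python: poll the objective along coordinate directions, accept the first improving point, and contract the step when no direction improves. This compass-search variant and its sphere test objective are illustrative assumptions, not part of the skill itself:

```python
import numpy as np

def compass_search(f, x0, step=0.5, tol=1e-6, max_evals=5000):
    """Minimal compass (pattern) search: derivative-free local minimization."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):          # poll +/- along each coordinate
            for sign in (1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)
                evals += 1
                if ft < fx:              # accept first improving point
                    x, fx = trial, ft
                    improved = True
                    break
            if improved:
                break
        if not improved:                 # no direction improved: contract step
            step *= 0.5
    return x, fx, evals

# Smooth convex test objective with known minimizer (1, -2)
sphere = lambda x: float(np.sum((x - np.array([1.0, -2.0])) ** 2))
x_best, f_best, n = compass_search(sphere, [5.0, 5.0])
```

The returned evaluation count `n` makes the evaluation budget explicit, matching the guideline of minimizing calls to expensive objectives.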
Tools/Libraries
- scipy.optimize
- Optuna
- GPyOpt