Babysitter derivative-free-optimization

Optimization without gradient information

install
source · Clone the upstream repo
git clone https://github.com/a5c-ai/babysitter
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/a5c-ai/babysitter "$T" && mkdir -p ~/.claude/skills && cp -r "$T/library/specializations/domains/science/mathematics/skills/derivative-free-optimization" ~/.claude/skills/a5c-ai-babysitter-derivative-free-optimization && rm -rf "$T"
manifest: library/specializations/domains/science/mathematics/skills/derivative-free-optimization/SKILL.md
source content

Derivative-Free Optimization

Purpose

Provides optimization capabilities for problems where gradient information is unavailable, unreliable, or too expensive to compute, for example noisy, black-box, or simulation-based objective functions.

Capabilities

  • Nelder-Mead simplex method (see the scipy sketch after this list)
  • Powell's method
  • Surrogate-based optimization
  • Bayesian optimization
  • Pattern search methods
  • Trust region methods
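
A minimal sketch of the first two methods above, using scipy.optimize.minimize. The quadratic test objective, starting point, and tolerances are illustrative assumptions, not part of the skill.

  import numpy as np
  from scipy.optimize import minimize

  # Black-box objective: only function values are available, no gradients.
  def objective(x):
      return (x[0] - 1.0) ** 2 + 5.0 * (x[1] + 2.0) ** 2

  x0 = np.array([0.0, 0.0])  # illustrative starting point

  # Nelder-Mead: reflects and contracts a simplex of d+1 points.
  nm = minimize(objective, x0, method="Nelder-Mead",
                options={"xatol": 1e-6, "fatol": 1e-6, "maxfev": 500})

  # Powell: sequential line searches along an evolving set of directions.
  pw = minimize(objective, x0, method="Powell", options={"maxfev": 500})

  print("Nelder-Mead:", nm.x, nm.fun, "evals:", nm.nfev)
  print("Powell:     ", pw.x, pw.fun, "evals:", pw.nfev)

Both solvers use only function values; on smooth problems Powell tends to need fewer evaluations, while Nelder-Mead is often more tolerant of mild noise.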

Usage Guidelines

  1. Method Selection: Choose a method based on problem characteristics such as dimensionality, noise, constraints, and evaluation cost
  2. Function Evaluations: Treat objective evaluations as the scarce resource and keep expensive function calls to a minimum
  3. Surrogate Models: Build and iteratively refine surrogate approximations of the objective to guide the search
  4. Exploration-Exploitation: Balance exploring untried regions against exploiting regions that already look promising (see the Optuna sketch after this list)
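
A hedged sketch of guidelines 2-4 above using Optuna's default TPE sampler, which fits a lightweight surrogate over completed trials and balances exploration against exploitation under a fixed evaluation budget. The search space and the 50-trial budget are illustrative assumptions.

  import optuna

  # Stand-in for an expensive black-box objective; each call is one evaluation.
  def objective(trial):
      x = trial.suggest_float("x", -5.0, 5.0)
      y = trial.suggest_float("y", -5.0, 5.0)
      return (x - 1.0) ** 2 + (y + 2.0) ** 2

  # TPE models the distribution of good vs. bad past trials and proposes
  # candidates where improvement looks likely (exploitation) while still
  # sampling less-explored regions (exploration).
  study = optuna.create_study(direction="minimize",
                              sampler=optuna.samplers.TPESampler(seed=0))
  study.optimize(objective, n_trials=50)  # hard cap on expensive evaluations

  print(study.best_params, study.best_value)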

Tools/Libraries

  • scipy.optimize (Nelder-Mead, Powell, and other derivative-free solvers)
  • Optuna (TPE-based black-box and hyperparameter optimization)
  • GPyOpt (Gaussian-process Bayesian optimization; see the sketch below)
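
For the last library, a Gaussian-process Bayesian optimization loop in GPyOpt's classic API. GPyOpt is no longer actively maintained, so treat this as an untested sketch; the two-dimensional domain, the expected-improvement acquisition, and the 20-iteration budget are assumptions.

  import numpy as np
  import GPyOpt

  # GPyOpt passes query points as a 2-D array (one row per point).
  def objective(X):
      return np.sum((X - 0.3) ** 2, axis=1, keepdims=True)

  domain = [{"name": "x1", "type": "continuous", "domain": (0.0, 1.0)},
            {"name": "x2", "type": "continuous", "domain": (0.0, 1.0)}]

  # Gaussian-process surrogate with an expected-improvement acquisition.
  bo = GPyOpt.methods.BayesianOptimization(f=objective, domain=domain,
                                           acquisition_type="EI")
  bo.run_optimization(max_iter=20)

  print(bo.x_opt, bo.fx_opt)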