**claude-skill-registry / adding-models**

Guide for adding new LLM models to Letta Code. Use when the user wants to add support for a new model, needs to know valid model handles, or wants to update the model configuration. Covers `models.json` configuration, the CI test matrix, and handle validation.

Install from https://github.com/majiayu000/claude-skill-registry:

```shell
git clone https://github.com/majiayu000/claude-skill-registry
# Or install directly into ~/.claude/skills:
T=$(mktemp -d) && git clone --depth=1 https://github.com/majiayu000/claude-skill-registry "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/data/adding-models" ~/.claude/skills/majiayu000-claude-skill-registry-adding-models && rm -rf "$T"
```

Skill file: `skills/data/adding-models/SKILL.md`. This skill makes HTTP requests (curl).
# Adding Models

This skill guides you through adding a new LLM model to Letta Code.
## Quick Reference

Key files:

- `src/models.json`: Model definitions (required)
- `.github/workflows/ci.yml`: CI test matrix (optional)
- `src/tools/manager.ts`: Toolset detection logic (rarely needed)
## Workflow

### Step 1: Find Valid Model Handles

Query the Letta API to see available models:

```shell
curl -s https://api.letta.com/v1/models/ | jq '.[] | .handle'
```

Or filter by provider:

```shell
curl -s https://api.letta.com/v1/models/ | jq '.[] | select(.handle | startswith("google_ai/")) | .handle'
```
Common provider prefixes:

- `anthropic/`: Claude models
- `openai/`: GPT models
- `google_ai/`: Gemini models
- `google_vertex/`: Vertex AI
- `openrouter/`: Various providers
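The jq queries above can also be mirrored in code when you want to verify a candidate handle programmatically. The sketch below assumes the response shape the jq filters rely on (an array of objects with a `handle` field); `checkHandle` is a hypothetical helper, and the sample list stands in for a live fetch of `https://api.letta.com/v1/models/`.

```typescript
// Sketch: check a candidate handle against the API's model list.
interface ModelInfo {
  handle: string;
}

function checkHandle(models: ModelInfo[], candidate: string): boolean {
  return models.some((m) => m.handle === candidate);
}

// In practice you would fetch the list first, e.g.:
//   const models: ModelInfo[] =
//     await (await fetch("https://api.letta.com/v1/models/")).json();
const models: ModelInfo[] = [
  { handle: "anthropic/claude-sonnet-4-5" },
  { handle: "google_ai/gemini-3-flash-preview" },
];

console.log(checkHandle(models, "google_ai/gemini-3-flash-preview")); // true
console.log(checkHandle(models, "google_ai/not-a-model")); // false
```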
### Step 2: Add to models.json

Add an entry to `src/models.json`:

```jsonc
{
  "id": "model-shortname",
  "handle": "provider/model-name",
  "label": "Human Readable Name",
  "description": "Brief description of the model",
  "isFeatured": true, // Optional: shows in featured list
  "updateArgs": {
    "context_window": 180000,
    "temperature": 1.0 // Optional: provider-specific settings
  }
}
```
Field reference:

- `id`: Short identifier used with the `--model` flag (e.g., `gemini-3-flash`)
- `handle`: Full provider/model path from the API (e.g., `google_ai/gemini-3-flash-preview`)
- `label`: Display name in the model selector
- `description`: Brief description shown in the selector
- `isFeatured`: If true, appears in the featured models section
- `updateArgs`: Model-specific configuration (context window, temperature, reasoning settings, etc.)
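To see how these fields fit together, here is a minimal sketch of how an entry might be consumed at startup. The `ModelDef` shape follows the field reference above; `resolveModel` is a hypothetical helper for illustration, not the actual Letta Code implementation.

```typescript
// Shape of one entry in src/models.json, per the field reference above.
interface ModelDef {
  id: string;
  handle: string;
  label: string;
  description: string;
  isFeatured?: boolean;
  updateArgs?: Record<string, unknown>;
}

const modelDefs: ModelDef[] = [
  {
    id: "gemini-3-flash",
    handle: "google_ai/gemini-3-flash-preview",
    label: "Gemini 3 Flash",
    description: "Fast Gemini preview model",
    isFeatured: true,
    updateArgs: { context_window: 180000, temperature: 1.0 },
  },
];

// Resolve the value passed to --model into its full provider handle.
function resolveModel(id: string): ModelDef {
  const def = modelDefs.find((m) => m.id === id);
  if (!def) throw new Error(`Unknown model id: ${id}`);
  return def;
}

console.log(resolveModel("gemini-3-flash").handle);
// google_ai/gemini-3-flash-preview
```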
Provider prefixes:

- `anthropic/`: Anthropic (Claude models)
- `openai/`: OpenAI (GPT models)
- `google_ai/`: Google AI (Gemini models)
- `google_vertex/`: Google Vertex AI
- `openrouter/`: OpenRouter (various providers)
### Step 3: Test the Model

Test with headless mode:

```shell
bun run src/index.ts --new --model <model-id> -p "hi, what model are you?"
```

Example:

```shell
bun run src/index.ts --new --model gemini-3-flash -p "hi, what model are you?"
```
### Step 4: Add to CI Test Matrix (Optional)

To include the model in automated testing, add it to `.github/workflows/ci.yml`:

```yaml
# Find the headless job matrix (around line 122)
model: [gpt-5-minimal, gpt-4.1, sonnet-4.5, gemini-pro, your-new-model, glm-4.6, haiku]
```
## Toolset Detection

Models are automatically assigned toolsets based on provider:

- `openai/*` → `codex` toolset
- `google_ai/*` or `google_vertex/*` → `gemini` toolset
- Others → `default` toolset

This is handled by `isGeminiModel()` and `isOpenAIModel()` in `src/tools/manager.ts`. You typically don't need to modify this unless adding a new provider.
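The dispatch above amounts to simple prefix checks. This standalone sketch illustrates the mapping only; the real checks live in `isGeminiModel()`/`isOpenAIModel()` in `src/tools/manager.ts`, and `toolsetFor` is a hypothetical name, so do not assume the actual code looks like this.

```typescript
// Illustrative provider-prefix dispatch for toolset assignment.
function isOpenAIModel(handle: string): boolean {
  return handle.startsWith("openai/");
}

function isGeminiModel(handle: string): boolean {
  return handle.startsWith("google_ai/") || handle.startsWith("google_vertex/");
}

function toolsetFor(handle: string): "codex" | "gemini" | "default" {
  if (isOpenAIModel(handle)) return "codex";
  if (isGeminiModel(handle)) return "gemini";
  return "default";
}

console.log(toolsetFor("openai/gpt-4.1"));           // codex
console.log(toolsetFor("google_vertex/gemini-pro")); // gemini
console.log(toolsetFor("anthropic/claude-haiku"));   // default
```

Adding a new provider would mean extending one of these checks (or adding a new one) so its handles stop falling through to the `default` toolset.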
## Common Issues

**"Handle not found" error:** The model handle is incorrect. Query the API as in Step 1 to see valid handles.

**Model works but wrong toolset:** Check `src/tools/manager.ts` to ensure the provider prefix is recognized.