Skills novelai-openclaw-adaptor
Explain how to connect NovelAI to OpenClaw through a local OpenAI-compatible shim. Use when the user wants configuration guidance for a local NovelAI adaptor, model selection, or OpenClaw `base_url` setup.
git clone https://github.com/openclaw/skills
T=$(mktemp -d) && git clone --depth=1 https://github.com/openclaw/skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/askkumptenchen/novelai-openclaw-adaptor" ~/.claude/skills/openclaw-skills-novelai-openclaw-adaptor && rm -rf "$T"
T=$(mktemp -d) && git clone --depth=1 https://github.com/openclaw/skills "$T" && mkdir -p ~/.openclaw/skills && cp -r "$T/skills/askkumptenchen/novelai-openclaw-adaptor" ~/.openclaw/skills/openclaw-skills-novelai-openclaw-adaptor && rm -rf "$T"
skills/askkumptenchen/novelai-openclaw-adaptor/SKILL.md

NovelAI OpenClaw Adaptor
Use this skill to explain how a local adaptor can bridge OpenClaw's OpenAI-style API calls to NovelAI, and how to configure OpenClaw to talk to that local endpoint.
Scope
This skill is for explanation and local configuration guidance:
- Explain that the adaptor is a local proxy/shim, not a hosted service.
- Explain how `base_url` and model names should be configured in OpenClaw.
- Help the user choose a supported text model or image model.
- Describe the local commands a user may run after they have verified the package source.
Do not use this skill to:
- Auto-install software.
- Ask the user to paste secrets into chat.
- Present unverified third-party packages as implicitly trusted.
Safety rules
Follow these rules whenever this skill is used:
- Treat installation as optional and approval-based.
- Before suggesting installation, tell the user to verify the package source, maintainer, and repository or project page.
- Prefer local source checkout or an already-verified package source when available.
- Never ask the user to paste a NovelAI API key into chat.
- Never include secrets inline in command examples.
- If the package source cannot be verified, stop at configuration guidance and ask the user how they want to proceed.
Verification-first workflow
If the user wants to enable runtime use, follow this order:
- Check whether the adaptor is already present locally or already installed.
- If it is not present, explain that the package source should be verified before any installation.
- Ask for approval before running any install command.
- After the user has verified the source and approved installation, use the package's documented install method.
- Prefer interactive or local-only credential entry during configuration.
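The first step above (checking whether the adaptor is already present) can be sketched as a quick local probe. The `novelai-shim` command and the pip package name are taken from this document's own examples; they may differ on your system.

```shell
# Probe for an existing local install before suggesting installation.
# The command and package names come from this skill's own examples
# and may differ on your system.
if command -v novelai-shim >/dev/null 2>&1; then
  status="installed"
elif pip show novelai-openclaw-adaptor >/dev/null 2>&1; then
  status="package-present"
else
  status="not-found"
fi
echo "adaptor status: $status"
```

If the result is `not-found`, stop at configuration guidance and walk the user through source verification before proposing any install command.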
Safe examples:
pip install novelai-openclaw-adaptor
novelai-config init
Do not use examples like:
novelai-config init --api-key "YOUR_NOVELAI_API_KEY"
If the user needs help deciding whether the package is trustworthy, suggest reviewing its repository, release history, and maintainers before installation.
Supported models
Text models:
- glm-4-6
- erato
- kayra
- clio
- krake
- euterpe
- sigurd
- genji
- snek
Image models:
- nai-diffusion-4-5-full
- nai-diffusion-4-5-curated
- nai-diffusion-4-full
- nai-diffusion-4-curated
- nai-diffusion-3
- nai-diffusion-3-furry
When helping with setup, ask the user which model they want instead of assuming silently. If they do not care, recommend:
- Text: `glm-4-6`
- Image: `nai-diffusion-4-5-full`
OpenClaw configuration
When explaining how to connect OpenClaw to the adaptor:
- Set `base_url` to the local adaptor endpoint, such as `http://127.0.0.1:xxxx/v1` or `http://localhost:xxxx/v1`.
- Set the OpenClaw model name to the adaptor-exposed model the user selected.
- Clarify that credential handling belongs to the local adaptor configuration, not the chat.
- If OpenClaw insists on an API key field, explain that some clients accept a placeholder value such as `sk-local`, but the real NovelAI credential should stay in the local adaptor config only.
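As a sketch, a client configuration pointing at the local shim might look like the fragment below. The key names are illustrative assumptions rather than OpenClaw's documented schema, and the `xxxx` port placeholder stands in for whatever port your adaptor listens on.

```json
{
  "base_url": "http://127.0.0.1:xxxx/v1",
  "model": "glm-4-6",
  "api_key": "sk-local"
}
```

The `api_key` here is only a placeholder to satisfy clients that require the field; the real NovelAI credential lives in the adaptor's own local configuration.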
Helpful local commands
If the user has already verified the package source and approved local usage, these commands are relevant:
novelai-config --help
novelai-shim --help
novelai-image --help
Use `novelai-config init` as the normal guided setup entry point. It should collect local configuration such as:
- UI language
- NovelAI credential through local input
- Default shim model
- Default image output directory
- Default image model
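Assuming a simple key-value store, the configuration produced by the guided setup might resemble the sketch below. Every key name here is a hypothetical illustration of the items listed above, not the adaptor's actual schema.

```json
{
  "ui_language": "en",
  "default_text_model": "glm-4-6",
  "default_image_model": "nai-diffusion-4-5-full",
  "image_output_dir": "~/novelai-images"
}
```

The NovelAI credential is deliberately absent: per the safety rules above, it is collected through local interactive input and never shown inline.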
Image generation usage
Once the local adaptor is configured and running, image generation can use the normal OpenClaw or OpenAI-style prompt flow.
Example prompt:
1girl, solo, masterpiece, best quality, highly detailed
The adaptor is responsible for translating that prompt into the format expected by NovelAI.
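As a sketch of what travels over the wire, the prompt above would be wrapped in an OpenAI-style image request by the client before the adaptor translates it. The field names follow the generic OpenAI images convention and the endpoint path is an assumption; the `xxxx` port placeholder is from this document.

```shell
# Build the OpenAI-style image request body the local adaptor would receive.
# Field names follow the generic OpenAI images convention (an assumption);
# the adaptor is what translates them into NovelAI's own format.
body='{
  "model": "nai-diffusion-4-5-full",
  "prompt": "1girl, solo, masterpiece, best quality, highly detailed",
  "n": 1
}'
echo "$body"
# Once the shim is verified and running, this could be sent with:
#   curl -s http://127.0.0.1:xxxx/v1/images/generations \
#     -H "Content-Type: application/json" -d "$body"
```

The `curl` line is left commented out on purpose: it should only be run after the package source has been verified and the local shim is actually listening.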