Trugen AI Skill

Build, configure, and deploy conversational video agents using the Trugen AI platform API. Use this skill when the user wants to create AI video avatars, manage knowledge bases, set up webhooks/callbacks, embed agents into websites, integrate with LiveKit, configure tools or MCPs, set up multilingual agents, or bring their own LLM to Trugen AI.

Install

Source (clone the upstream repo):
git clone https://github.com/openclaw/skills

Claude Code (install into ~/.claude/skills/):
T=$(mktemp -d) && git clone --depth=1 https://github.com/openclaw/skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/ajayk47/trugenai" ~/.claude/skills/openclaw-skills-trugen-ai && rm -rf "$T"

OpenClaw (install into ~/.openclaw/skills/):
T=$(mktemp -d) && git clone --depth=1 https://github.com/openclaw/skills "$T" && mkdir -p ~/.openclaw/skills && cp -r "$T/skills/ajayk47/trugenai" ~/.openclaw/skills/openclaw-skills-trugen-ai && rm -rf "$T"

Manifest: skills/ajayk47/trugenai/SKILL.md

Source content (SKILL.md):

Trugen AI

Build real-time conversational video agents — AI-powered avatars that see, hear, speak, and reason with users in under 1 second of latency.

API Base URL: https://api.trugen.ai
Authentication: x-api-key: <your-api-key> header on all requests
Official Docs: docs.trugen.ai
Developer Portal: app.trugen.ai

Required Credentials

| Variable | Description | Where to Get |
|---|---|---|
| TRUGEN_API_KEY | Primary API key for all Trugen API calls (sent as x-api-key header) | Developer Portal |
| TRUGEN_AVATAR_ID | (Optional) Default avatar ID for LiveKit integration | Developer Portal |

Security: Never expose TRUGEN_API_KEY in client-side code. For widget/iFrame embeds, use a server-side proxy to keep keys secret. See references/embedding.md for details.
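A minimal sketch of such a server-side proxy, using only the Python standard library. The local route prefix `/api/trugen/*` and the JSON content type are assumptions for illustration; the real embed flow is described in references/embedding.md. The point is that only this server process ever reads TRUGEN_API_KEY:

```python
# Server-side proxy sketch: the browser calls this server, which forwards
# to api.trugen.ai and injects the secret key. Illustrative only.
import http.server
import os
import urllib.request

TRUGEN_API_BASE = "https://api.trugen.ai"

class TrugenProxy(http.server.BaseHTTPRequestHandler):
    def do_POST(self):
        # Map /api/trugen/v1/ext/agent -> https://api.trugen.ai/v1/ext/agent
        upstream_path = self.path.removeprefix("/api/trugen")
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        req = urllib.request.Request(
            TRUGEN_API_BASE + upstream_path,
            data=body,
            headers={
                "x-api-key": os.environ["TRUGEN_API_KEY"],  # stays server-side
                "Content-Type": "application/json",
            },
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            self.send_response(resp.status)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(resp.read())

# To run: http.server.HTTPServer(("", 8080), TrugenProxy).serve_forever()
```

In production you would add allow-listing of upstream paths and rate limiting; this sketch only shows where the key injection belongs.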

Platform Pipeline

| Step | Component | Function |
|---|---|---|
| 1 | WebRTC | Bidirectional audio/video streaming |
| 2 | STT (Deepgram) | Streaming speech-to-text |
| 3 | Turn Detection | Natural conversation boundary detection |
| 4 | LLM (OpenAI, Groq, custom) | Contextual response generation |
| 5 | Knowledge Base | Grounding answers in your data |
| 6 | TTS (ElevenLabs) | Natural, expressive speech synthesis |
| 7 | Huma-01 | Neural avatar video generation with lip sync & microexpressions |
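Conceptually, each user turn flows through the stages above in order. The sketch below is pure illustration (the stage names mirror the table, not an actual SDK, and the real pipeline streams these stages concurrently server-side):

```python
# Conceptual per-turn flow of the Trugen pipeline (illustration only).
PIPELINE = [
    "webrtc_ingest",    # 1. receive user audio/video
    "stt",              # 2. Deepgram streaming speech-to-text
    "turn_detection",   # 3. detect the conversation boundary
    "llm",              # 4. generate a contextual response
    "knowledge_base",   # 5. ground the answer in your data
    "tts",              # 6. ElevenLabs speech synthesis
    "avatar_video",     # 7. Huma-01 lip-synced avatar video out
]

def run_turn(user_input: str) -> str:
    """Thread one turn through every stage, outermost stage last."""
    frame = user_input
    for stage in PIPELINE:
        frame = f"{stage}({frame})"  # each stage transforms the stream
    return frame
```

The sub-second latency comes from streaming: in the real system later stages start before earlier ones finish, rather than running sequentially as this toy loop does.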

Quickstart

  1. Create an agent → POST /v1/ext/agent — see references/agents.md
  2. Embed via iFrame or Widget — see references/embedding.md
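Step 1 can be sketched with the standard library as follows. The payload field `name` is a placeholder assumption (consult references/agents.md for the actual agent schema); the endpoint and x-api-key header are as documented above:

```python
# Sketch: create an agent via POST /v1/ext/agent.
import json
import urllib.request

def create_agent_request(api_key: str, payload: dict) -> urllib.request.Request:
    """Build the authenticated request; the caller decides when to send it."""
    return urllib.request.Request(
        "https://api.trugen.ai/v1/ext/agent",
        data=json.dumps(payload).encode(),
        headers={"x-api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = create_agent_request("YOUR_API_KEY", {"name": "Support Agent"})
# urllib.request.urlopen(req)  # uncomment to actually send
```

Run this from a server or script, never from the browser (see the security note above).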

API Endpoints Overview

| Resource | Endpoints | Reference |
|---|---|---|
| Agents | Create, Get, List, Update, Delete, Create from Template | agents.md |
| Knowledge Base | Create KB, Add Docs, Get, List, Update, Delete KB/Doc | knowledge-base.md |
| Templates | Create, Get, List, Update, Delete persona templates | templates.md |
| Tools & MCPs | Create/manage function-calling tools and MCP servers | tools-and-mcps.md |
| Webhooks | Callback events, payload format, handler examples | webhooks.md |
| Embedding | iFrame, Widget, LiveKit integration + avatar IDs | embedding.md |
| Providers/Avatars | Available LLMs, STT, TTS, avatars, languages, BYO-LLM | providers-avatars-languages.md |
| Prompting | Voice prompt strategies, guardrails, use case examples | prompting-and-use-cases.md |

Conversations

Retrieve transcripts for completed sessions:

GET /v1/ext/conversation/{id} — returns agent_id, status, transcript array, recording_url.
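A minimal sketch of the transcript fetch; the conversation id `conv_123` is a placeholder, and the response fields named in the comment are those listed above:

```python
# Sketch: fetch the transcript of a completed session.
import urllib.request

def conversation_request(api_key: str, conversation_id: str) -> urllib.request.Request:
    """Build the authenticated GET for /v1/ext/conversation/{id}."""
    return urllib.request.Request(
        f"https://api.trugen.ai/v1/ext/conversation/{conversation_id}",
        headers={"x-api-key": api_key},
    )

# import json
# with urllib.request.urlopen(conversation_request("YOUR_API_KEY", "conv_123")) as resp:
#     convo = json.load(resp)  # convo["transcript"], convo["recording_url"], ...
```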

Workflow Guide

Determine what the user needs, then load the appropriate reference:

| Task | Reference File |
|---|---|
| Creating/managing agents | agents.md |
| Attaching data/documents | knowledge-base.md |
| Reusing personas across agents | templates.md |
| Calling external APIs from agent | tools-and-mcps.md |
| Reacting to conversation events | webhooks.md |
| Embedding agent in website | embedding.md |
| Choosing LLM/voice/language | providers-avatars-languages.md |
| Writing effective prompts | prompting-and-use-cases.md |

Developer Resources

| Resource | Link |
|---|---|
| Documentation | docs.trugen.ai |
| API Reference | docs.trugen.ai/api-reference |
| Developer Portal | app.trugen.ai |
| Community Discord | discord.gg/4dqc8A66FJ |
| Support | support@trugen.ai |
| GitHub Examples | trugenai/trugen-examples |
| Changelog | docs.trugen.ai/changelog |