Awesome-omni-skill ai-sdk
Vercel AI SDK reference for building AI-powered applications. Use when implementing text/object generation (generateText, streamText, generateObject, streamObject), building chatbots with useChat/useCompletion hooks, defining tools with Zod schemas, creating agents with ToolLoopAgent, or integrating with AI providers (OpenAI, Anthropic, Google, etc.).
install
source · Clone the upstream repo

```bash
git clone https://github.com/diegosouzapw/awesome-omni-skill
```

Claude Code · Install into ~/.claude/skills/

```bash
T=$(mktemp -d) && git clone --depth=1 https://github.com/diegosouzapw/awesome-omni-skill "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/ai-agents/ai-sdk" ~/.claude/skills/diegosouzapw-awesome-omni-skill-ai-sdk-625544 && rm -rf "$T"
```
manifest:
skills/ai-agents/ai-sdk/SKILL.md
AI SDK
The AI SDK is Vercel's TypeScript toolkit for building AI-powered applications with React, Next.js, Vue, Svelte, Node.js, and more.
When to Use This Skill
Use this skill when:
- Generating text or structured data with LLMs
- Building chatbot UIs with streaming
- Implementing tool calling and function execution
- Creating AI agents that use tools in a loop
- Integrating with AI providers (OpenAI, Anthropic, Google, etc.)
- Working with useChat, useCompletion, or useObject hooks
Documentation
See the docs/2025-12-02/ directory for complete AI SDK documentation:
Getting Started
- 00-introduction/index.mdx: Overview and core concepts
- 02-getting-started/: Framework-specific quickstarts (Next.js, Svelte, Vue, Node.js)
- 02-foundations/: Prompts, providers, tools, streaming fundamentals
AI SDK Core
- 03-ai-sdk-core/01-overview.mdx: Core API overview
- 03-ai-sdk-core/05-generating-text.mdx: Text generation with generateText/streamText
- 03-ai-sdk-core/10-generating-structured-data.mdx: Structured output with generateObject/streamObject
- 03-ai-sdk-core/15-tools-and-tool-calling.mdx: Tool definitions and execution
- 03-ai-sdk-core/16-mcp-tools.mdx: MCP (Model Context Protocol) tools
- 03-ai-sdk-core/40-middleware.mdx: Request/response middleware
Agents
- 03-agents/01-overview.mdx: Agent fundamentals
- 03-agents/02-building-agents.mdx: Building agents with ToolLoopAgent
- 03-agents/03-workflows.mdx: Structured workflow patterns
- 03-agents/04-loop-control.mdx: stopWhen and prepareStep loop control
AI SDK UI (React/Vue/Svelte Hooks)
- 04-ai-sdk-ui/01-overview.mdx: UI hooks overview
- 04-ai-sdk-ui/02-chatbot.mdx: useChat hook for chat interfaces
- 04-ai-sdk-ui/03-chatbot-tool-usage.mdx: Tools in chatbots
- 04-ai-sdk-ui/05-completion.mdx: useCompletion for text completion
- 04-ai-sdk-ui/08-object-generation.mdx: useObject for streaming JSON
- 04-ai-sdk-ui/50-stream-protocol.mdx: Stream protocol details
AI SDK RSC (React Server Components)
- 05-ai-sdk-rsc/01-overview.mdx: RSC overview
- 05-ai-sdk-rsc/02-streaming-react-components.mdx: Streaming components
Reference
- 07-reference/01-ai-sdk-core/: Core API reference
- 07-reference/02-ai-sdk-ui/: UI hooks reference
- 07-reference/05-ai-sdk-errors/: Error types
Quick Reference
Core Functions
```ts
import { generateText, streamText, generateObject, streamObject } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { z } from 'zod';

// Generate text
const { text } = await generateText({
  model: anthropic('claude-sonnet-4-5-20241022'),
  prompt: 'Write a haiku about coding',
});

// Stream text
const result = streamText({
  model: anthropic('claude-sonnet-4-5-20241022'),
  prompt: 'Write a story',
});
for await (const chunk of result.textStream) {
  console.log(chunk);
}

// Generate structured data
const { object } = await generateObject({
  model: anthropic('claude-sonnet-4-5-20241022'),
  schema: z.object({
    name: z.string(),
    age: z.number(),
  }),
  prompt: 'Generate a person',
});

// Stream structured data
const { partialObjectStream } = streamObject({
  model: anthropic('claude-sonnet-4-5-20241022'),
  schema: z.object({ items: z.array(z.string()) }),
  prompt: 'List 5 fruits',
});
```
Tool Definition
```ts
import { generateText, tool } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { z } from 'zod';

const weatherTool = tool({
  description: 'Get the weather for a location',
  inputSchema: z.object({
    location: z.string().describe('City name'),
  }),
  execute: async ({ location }) => {
    return { temperature: 72, condition: 'sunny' };
  },
});

// Use with generateText/streamText
const result = await generateText({
  model: anthropic('claude-sonnet-4-5-20241022'),
  tools: { weather: weatherTool },
  prompt: 'What is the weather in San Francisco?',
});
```
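The execute handler is an ordinary (sync or async) function over the parsed input, so its logic can be unit-tested without calling a model. A minimal sketch, where `myGetWeather` and its stub values are illustrative, not SDK exports:

```typescript
// Stand-alone handler mirroring the tool's execute signature.
// Stub values below are placeholders, not real weather data.
type WeatherInput = { location: string };

function myGetWeather({ location }: WeatherInput) {
  return { location, temperature: 72, condition: 'sunny' as const };
}

console.log(myGetWeather({ location: 'San Francisco' }));
// → { location: 'San Francisco', temperature: 72, condition: 'sunny' }
```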
Agent (ToolLoopAgent)
```ts
import { ToolLoopAgent, stepCountIs, tool } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

const agent = new ToolLoopAgent({
  model: anthropic('claude-sonnet-4-5-20241022'),
  tools: {
    search: tool({ /* ... */ }),
    calculate: tool({ /* ... */ }),
  },
  stopWhen: stepCountIs(10), // Max 10 steps
});

const result = await agent.generate({
  prompt: 'Research and calculate...',
});
```
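stopWhen accepts stop conditions evaluated over the steps executed so far (see 03-agents/04-loop-control.mdx). A conceptual sketch of what a step-count condition does; `myStepCountIs` is an illustrative stand-in for the SDK's stepCountIs, not its actual implementation:

```typescript
// A stop condition is a predicate over the agent's step history:
// return true to end the tool loop.
type Step = { toolCalls: unknown[] };

const myStepCountIs = (max: number) =>
  ({ steps }: { steps: Step[] }) => steps.length >= max;

const stop = myStepCountIs(3);
console.log(stop({ steps: [{ toolCalls: [] }] }));              // false: only 1 step taken
console.log(stop({ steps: Array(3).fill({ toolCalls: [] }) })); // true: limit reached
```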
useChat Hook (React)
```tsx
import { useState } from 'react';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

function Chat() {
  const [input, setInput] = useState('');
  const { messages, sendMessage, status, stop } = useChat({
    transport: new DefaultChatTransport({ api: '/api/chat' }),
  });

  return (
    <>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.parts.map(p => (p.type === 'text' ? p.text : null))}
        </div>
      ))}
      <form
        onSubmit={e => {
          e.preventDefault();
          sendMessage({ text: input });
          setInput('');
        }}
      >
        <input
          value={input}
          onChange={e => setInput(e.target.value)}
          disabled={status !== 'ready'}
        />
      </form>
    </>
  );
}
```
API Route (Next.js)
```ts
import { streamText, convertToModelMessages, UIMessage } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const result = streamText({
    model: anthropic('claude-sonnet-4-5-20241022'),
    system: 'You are a helpful assistant.',
    messages: convertToModelMessages(messages),
  });

  return result.toUIMessageStreamResponse();
}
```
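convertToModelMessages flattens UI messages (which carry typed parts) into the content shape model messages use. A rough approximation for text-only parts, purely illustrative and not the SDK's implementation:

```typescript
// Simplified shapes: a UI message holds parts; a model message holds content.
type UIPart = { type: 'text'; text: string } | { type: string };
type UIMsg = { role: 'user' | 'assistant'; parts: UIPart[] };

// Concatenate the text parts, dropping non-text parts (tool calls, files, ...).
const toModelMessage = (m: UIMsg) => ({
  role: m.role,
  content: m.parts
    .filter((p): p is { type: 'text'; text: string } => p.type === 'text')
    .map(p => p.text)
    .join(''),
});

console.log(toModelMessage({ role: 'user', parts: [{ type: 'text', text: 'Hi!' }] }));
// → { role: 'user', content: 'Hi!' }
```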
Providers
```ts
// Official providers
import { anthropic } from '@ai-sdk/anthropic';
import { openai } from '@ai-sdk/openai';
import { google } from '@ai-sdk/google';
import { mistral } from '@ai-sdk/mistral';

// Use models
const claude = anthropic('claude-sonnet-4-5-20241022');
const gpt = openai('gpt-4o');
const gemini = google('gemini-1.5-flash');
```
Prompt Types
```ts
import { generateText } from 'ai';
import fs from 'node:fs';

// `model` is any provider model instance, e.g. anthropic('claude-sonnet-4-5-20241022')

// Text prompt
await generateText({
  model,
  prompt: 'Hello!',
});

// System + prompt
await generateText({
  model,
  system: 'You are a helpful assistant.',
  prompt: 'Hello!',
});

// Message array
await generateText({
  model,
  messages: [
    { role: 'user', content: 'Hi!' },
    { role: 'assistant', content: 'Hello!' },
    { role: 'user', content: 'How are you?' },
  ],
});

// Multi-modal (images)
await generateText({
  model,
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Describe this image' },
        { type: 'image', image: fs.readFileSync('./image.png') },
      ],
    },
  ],
});
```
Status Values (useChat)
- submitted: Message sent, awaiting response stream
- streaming: Response actively streaming
- ready: Complete, ready for new message
- error: Error occurred
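UI code typically gates the input on these values, as the useChat example above does with `disabled={status !== 'ready'}`. A tiny illustrative helper (not an SDK export) capturing that pattern:

```typescript
// The four useChat status values, per the AI SDK UI docs.
type ChatStatus = 'submitted' | 'streaming' | 'ready' | 'error';

// Accept a new message only when the previous exchange has finished cleanly.
const canSend = (status: ChatStatus): boolean => status === 'ready';

console.log(canSend('ready'));     // true
console.log(canSend('streaming')); // false
```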
Stream Result Properties
```ts
const result = streamText({ model, prompt });

// Async iterables
result.textStream   // Stream of text chunks
result.fullStream   // Full event stream with types

// Promises (resolve when complete)
result.text         // Full generated text
result.toolCalls    // Tool calls made
result.toolResults  // Tool execution results
result.usage        // Token usage
result.finishReason // Why generation stopped

// Response helpers
result.toUIMessageStreamResponse() // For useChat
result.toTextStreamResponse()      // Plain text stream
```
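fullStream emits typed events a consumer can switch on, while textStream yields only the text chunks. A sketch of accumulating text from such events; the event shapes here are simplified assumptions, not the exact SDK types:

```typescript
// Simplified event union: real fullStream events carry more variants and fields.
type StreamEvent =
  | { type: 'text-delta'; text: string }
  | { type: 'finish'; finishReason: string };

// Accumulate text deltas into the final text, as a fullStream consumer would.
function collectText(events: StreamEvent[]): string {
  let text = '';
  for (const e of events) {
    if (e.type === 'text-delta') text += e.text;
  }
  return text;
}

console.log(collectText([
  { type: 'text-delta', text: 'Hello, ' },
  { type: 'text-delta', text: 'world' },
  { type: 'finish', finishReason: 'stop' },
])); // → Hello, world
```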
Source
Documentation downloaded from: https://github.com/vercel/ai/tree/main/content/docs