Learn-skills.dev rig
install
source · Clone the upstream repo
git clone https://github.com/NeverSight/learn-skills.dev
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/NeverSight/learn-skills.dev "$T" && mkdir -p ~/.claude/skills && cp -r "$T/data/skills-md/0xplaygrounds/rig/rig" ~/.claude/skills/neversight-learn-skills-dev-rig && rm -rf "$T"
manifest:
data/skills-md/0xplaygrounds/rig/rig/SKILL.md

Source content
Building with Rig
Rig is a Rust library for building LLM-powered applications with a provider-agnostic API. All patterns use the builder pattern and async/await via tokio.
Quick Start
```rust
use rig::completion::Prompt;
use rig::providers::openai;

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    let client = openai::Client::from_env();

    let agent = client
        .agent(openai::GPT_4O)
        .preamble("You are a helpful assistant.")
        .build();

    let response = agent.prompt("Hello!").await?;
    println!("{}", response);
    Ok(())
}
```
Core Patterns
1. Simple Agent
```rust
let agent = client
    .agent(openai::GPT_4O)
    .preamble("System prompt")
    .temperature(0.7)
    .max_tokens(2000)
    .build();

let response = agent.prompt("Your question").await?;
```
2. Agent with Tools
Define a tool by implementing the `Tool` trait, then attach it:

```rust
let agent = client
    .agent(openai::GPT_4O)
    .preamble("You can use tools.")
    .tool(MyTool)
    .build();
```
See `references/tools.md` for the full `Tool` trait signature.
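The implementation itself looks roughly like the sketch below, modeled on Rig's published calculator example. The `Adder` tool, its argument struct, and the `ToolDefinition` field values are illustrative; confirm the exact associated types against `references/tools.md`.

```rust
use rig::completion::ToolDefinition;
use rig::tool::Tool;
use serde::Deserialize;
use serde_json::json;

#[derive(Deserialize)]
struct OperationArgs {
    x: i32,
    y: i32,
}

#[derive(Debug, thiserror::Error)]
#[error("math error")]
struct MathError;

struct Adder;

impl Tool for Adder {
    const NAME: &'static str = "add";
    type Error = MathError;
    type Args = OperationArgs;
    type Output = i32;

    // JSON-schema description the model sees when deciding to call the tool.
    async fn definition(&self, _prompt: String) -> ToolDefinition {
        ToolDefinition {
            name: Self::NAME.to_string(),
            description: "Add two numbers together".to_string(),
            parameters: json!({
                "type": "object",
                "properties": {
                    "x": { "type": "number", "description": "First number" },
                    "y": { "type": "number", "description": "Second number" }
                }
            }),
        }
    }

    // Invoked with already-deserialized arguments when the model calls the tool.
    async fn call(&self, args: Self::Args) -> Result<Self::Output, Self::Error> {
        Ok(args.x + args.y)
    }
}
```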
3. RAG (Retrieval-Augmented Generation)
```rust
let embedding_model = client.embedding_model(openai::TEXT_EMBEDDING_ADA_002);
let index = vector_store.index(embedding_model);

let agent = client
    .agent(openai::GPT_4O)
    .preamble("Answer using the provided context.")
    .dynamic_context(5, index) // top-5 similar docs per query
    .build();
```
See `references/rag.md` for vector store setup and the `Embed` derive macro.
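A minimal end-to-end setup with the built-in in-memory store might look like this sketch, based on Rig's vector-search examples. The `Doc` struct and its contents are illustrative; treat the exact builder calls as assumptions to verify against `references/rag.md`.

```rust
use rig::embeddings::EmbeddingsBuilder;
use rig::vector_store::in_memory_store::InMemoryVectorStore;
use rig::Embed;
use serde::Serialize;

// Illustrative document type; #[embed] marks the field whose text is embedded.
#[derive(Embed, Serialize, Clone, Debug)]
struct Doc {
    id: String,
    #[embed]
    content: String,
}

let embedding_model = client.embedding_model(openai::TEXT_EMBEDDING_ADA_002);

// Embed the documents up front...
let embeddings = EmbeddingsBuilder::new(embedding_model.clone())
    .documents(vec![
        Doc { id: "doc0".into(), content: "Rig is a Rust LLM library.".into() },
    ])?
    .build()
    .await?;

// ...then load them into the built-in in-memory store and build an index.
let vector_store = InMemoryVectorStore::from_documents(embeddings);
let index = vector_store.index(embedding_model);
```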
4. Streaming
```rust
use futures::StreamExt;
use rig::streaming::StreamedAssistantContent;
use rig::agent::prompt_request::streaming::MultiTurnStreamItem;

let mut stream = agent.stream_prompt("Tell me a story").await?;
while let Some(chunk) = stream.next().await {
    match chunk? {
        MultiTurnStreamItem::StreamAssistantItem(
            StreamedAssistantContent::Text(text),
        ) => print!("{}", text.text),
        MultiTurnStreamItem::FinalResponse(resp) => {
            println!("\n{}", resp.response());
        }
        _ => {}
    }
}
```
5. Structured Extraction
```rust
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};

#[derive(Deserialize, Serialize, JsonSchema)]
struct Person {
    pub name: Option<String>,
    pub age: Option<u8>,
}

let extractor = client.extractor::<Person>(openai::GPT_4O).build();
let person = extractor.extract("John is 30 years old.").await?;
```
6. Chat with History
```rust
use rig::completion::{Chat, Message};

let history = vec![
    Message::from("Hi, I'm Alice."),
    // ...previous messages
];

let response = agent.chat("What's my name?", history).await?;
```
Agent Builder Methods
| Method | Description |
|---|---|
| `.preamble(...)` | Set system prompt |
| `.context(...)` | Add static context document |
| `.dynamic_context(n, index)` | Add RAG with top-n retrieval |
| `.tool(...)` | Attach a callable tool |
| `.dynamic_tools(n, index, toolset)` | Attach multiple tools |
| `.temperature(...)` | Set temperature (0.0-1.0) |
| `.max_tokens(...)` | Set max output tokens |
| `.additional_params(...)` | Provider-specific params |
| `.tool_choice(...)` | Control tool usage |
| `.build()` | Build the agent |
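These methods chain freely on one builder. For instance, combining static context with provider passthrough params (the context string and the `json!` contents below are illustrative):

```rust
use serde_json::json;

let agent = client
    .agent(openai::GPT_4O)
    .preamble("You are a concise assistant.")
    .context("Background: the product launched in 2024.") // static context doc
    .temperature(0.2)
    .max_tokens(1024)
    .additional_params(json!({ "top_p": 0.9 })) // forwarded to the provider API
    .build();
```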
Available Providers
Create a client with `ProviderName::Client::from_env()` or `ProviderName::Client::new("key")`.
| Provider | Module | Example Model Constant |
|---|---|---|
| OpenAI | `openai` | `GPT_4O`, `TEXT_EMBEDDING_ADA_002` |
| Anthropic | `anthropic` | `CLAUDE_3_5_SONNET`, `CLAUDE_3_OPUS` |
| Cohere | `cohere` | `COMMAND_R` |
| Mistral | `mistral` | `MISTRAL_LARGE` |
| Gemini | `gemini` | model string |
| Groq | `groq` | model string |
| Ollama | `ollama` | model string |
| DeepSeek | `deepseek` | model string |
| xAI | `xai` | model string |
| Together | `together` | model string |
| Perplexity | `perplexity` | model string |
| OpenRouter | `openrouter` | model string |
| HuggingFace | `huggingface` | model string |
| Azure | `azure` | deployment string |
| Hyperbolic | `hyperbolic` | model string |
| Galadriel | `galadriel` | model string |
| Moonshot | `moonshot` | model string |
| Mira | `mira` | model string |
| Voyage AI | `voyageai` | embeddings only |
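Because the API is provider-agnostic, the same agent code works across providers by swapping the client and model constant. A short sketch (the placeholder API key is illustrative; both constructor forms are described above):

```rust
use rig::providers::{anthropic, openai};

// Key read from the environment (OPENAI_API_KEY).
let gpt_agent = openai::Client::from_env()
    .agent(openai::GPT_4O)
    .build();

// Key passed explicitly, e.g. from a secrets manager.
let claude_agent = anthropic::Client::new("your-api-key")
    .agent(anthropic::CLAUDE_3_5_SONNET)
    .build();
```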
Vector Store Crates
| Backend | Crate |
|---|---|
| In-memory | `rig-core` (built-in) |
| MongoDB | `rig-mongodb` |
| LanceDB | `rig-lancedb` |
| Qdrant | `rig-qdrant` |
| SQLite | `rig-sqlite` |
| Neo4j | `rig-neo4j` |
| Milvus | `rig-milvus` |
| SurrealDB | `rig-surrealdb` |
Key Rules
- All async code runs on tokio.
- Use `WasmCompatSend`/`WasmCompatSync` instead of raw `Send`/`Sync` for WASM compatibility.
- Use proper error types with `thiserror`, never `Result<(), String>`.
- Avoid `.unwrap()`; use the `?` operator.
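The last two rules can be illustrated with plain std. This sketch hand-writes the `Display` and `Error` impls that `thiserror` would generate from `#[derive(Error)]` and `#[error("...")]` attributes; the `AgentError` variants are illustrative.

```rust
use std::fmt;

// A typed error enum. With `thiserror`, the two impls below are
// generated by #[derive(Error)] plus #[error("...")] attributes.
#[derive(Debug)]
enum AgentError {
    MissingApiKey,
    Provider(String),
}

impl fmt::Display for AgentError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            AgentError::MissingApiKey => write!(f, "missing API key"),
            AgentError::Provider(msg) => write!(f, "provider error: {msg}"),
        }
    }
}

impl std::error::Error for AgentError {}

// `?` propagates the typed error instead of panicking via .unwrap().
fn load_key() -> Result<String, AgentError> {
    std::env::var("OPENAI_API_KEY").map_err(|_| AgentError::MissingApiKey)
}

fn main() {
    match load_key() {
        Ok(_) => println!("key loaded"),
        Err(e) => eprintln!("{e}"),
    }
}
```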
Further Reference
Detailed API documentation (available when installed via Claude Code skills):
- tools — Tool trait, ToolDefinition, ToolEmbedding, attachment patterns
- rag — Vector stores, Embed derive, EmbeddingsBuilder, search requests
- providers — Provider-specific initialization, model constants, env vars
- patterns — Multi-agent, hooks, streaming details, chaining, extraction
For the full reference, see the Rig examples at `rig-core/examples/` or https://docs.rig.rs