Babysitter llamaindex-agent
LlamaIndex agent and query engine setup for RAG-powered agents
install
source · Clone the upstream repo:

    git clone https://github.com/a5c-ai/babysitter

Claude Code · Install into ~/.claude/skills/:

    T=$(mktemp -d) && git clone --depth=1 https://github.com/a5c-ai/babysitter "$T" && mkdir -p ~/.claude/skills && cp -r "$T/library/specializations/ai-agents-conversational/skills/llamaindex-agent" ~/.claude/skills/a5c-ai-babysitter-llamaindex-agent && rm -rf "$T"
manifest: library/specializations/ai-agents-conversational/skills/llamaindex-agent/SKILL.md
LlamaIndex Agent Skill
Capabilities
- Set up LlamaIndex query engines
- Configure ReAct agents with tools
- Implement OpenAI function calling agents
- Design sub-question query engines
- Set up multi-document agents
- Implement chat engines with memory
Target Processes
- rag-pipeline-implementation
- knowledge-base-qa
Implementation Details
Agent Types
- ReActAgent: interleaves reasoning steps with tool calls in a reason-act loop
- OpenAIAgent: delegates tool selection to the OpenAI function-calling API
- StructuredPlannerAgent: plan-and-execute style; drafts a multi-step plan, then runs each step
- SubQuestionQueryEngine: decomposes a complex query into sub-questions, one per data source
Query Engine Types
- VectorStoreIndex query engine
- Summary index query engine
- Knowledge graph query engine
- SQL query engine
Configuration Options
- LLM selection
- Tool definitions
- Memory configuration
- Verbose/debug settings
- Query transform modules
Best Practices
- Choose the index type to match the query pattern (vector for pointed lookups, summary for corpus-wide questions)
- Write clear, specific tool descriptions; agents select tools based on that text alone
- Configure memory for multi-turn conversations
- Monitor query latency, token usage, and retrieval quality
Dependencies
- llama-index
- llama-index-agent-openai