Babysitter langfuse-integration
Langfuse LLM observability integration for tracing, analytics, and cost tracking
install
Source: clone the upstream repo
git clone https://github.com/a5c-ai/babysitter
Claude Code: install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/a5c-ai/babysitter "$T" && mkdir -p ~/.claude/skills && cp -r "$T/library/specializations/ai-agents-conversational/skills/langfuse-integration" ~/.claude/skills/a5c-ai-babysitter-langfuse-integration && rm -rf "$T"
manifest:
library/specializations/ai-agents-conversational/skills/langfuse-integration/SKILL.md
Langfuse Integration Skill
Capabilities
- Set up Langfuse tracing for LLM calls
- Configure cost tracking and analytics
- Implement prompt management
- Set up evaluation datasets
- Design custom trace metadata
- Create dashboards and alerts
Target Processes
- llm-observability-monitoring
- cost-optimization-llm
Implementation Details
Core Features
- Tracing: Track LLM calls, chains, and agents
- Prompts: Version and manage prompts
- Analytics: Usage, latency, cost metrics
- Datasets: Evaluation and testing data
- Scores: Track output quality
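The features above come together around a trace object: each user request becomes one trace, with generations (model calls) and scores attached to it. The following is a minimal sketch using the v2-style Langfuse Python client; method names and the usage-dict shape vary between SDK versions, so verify against the SDK reference for your installed version.

```python
# Sketch: manual tracing with the Langfuse Python SDK (v2-style client API).
# Requires a Langfuse project and credentials; will not run without them.
from langfuse import Langfuse

# Reads LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST from the environment
langfuse = Langfuse()

# One trace per user request; generations and scores hang off it.
trace = langfuse.trace(
    name="chat-request",
    user_id="user-123",            # enables per-user analytics
    metadata={"env": "prod"},
)
generation = trace.generation(
    name="answer",
    model="gpt-4o-mini",           # model name drives cost attribution
    input=[{"role": "user", "content": "Hello"}],
)
# ... call your LLM here ...
generation.end(output="Hi there!", usage={"input": 12, "output": 5})

# Scores track output quality, e.g. from a human rating or an evaluator
trace.score(name="helpfulness", value=0.9)

langfuse.flush()  # events are batched; flush before the process exits
```

The trace/generation split is what makes latency and cost roll up correctly: the trace carries request-level metadata, while each generation carries the model name and token usage that the cost metrics are computed from.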
Integration Methods
- LangChain callback handler
- Direct SDK integration
- OpenAI drop-in replacement
- Decorator-based tracing
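Decorator-based tracing wraps a function so that every call is recorded as a span with its name, inputs, output, and latency. The real SDK exposes this as `@observe`; the snippet below is a dependency-free sketch of the mechanism only, not the Langfuse implementation (which also nests spans and batches events to the server).

```python
import functools
import time

TRACES = []  # stand-in sink; a real integration sends events to the Langfuse backend

def observe(name=None):
    """Minimal stand-in for a tracing decorator: records the name, timing,
    inputs, and output of each call to the decorated function."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.time()
            result = fn(*args, **kwargs)
            TRACES.append({
                "name": name or fn.__name__,
                "input": {"args": args, "kwargs": kwargs},
                "output": result,
                "latency_s": time.time() - start,
            })
            return result
        return wrapper
    return decorator

@observe(name="summarize")
def summarize(text: str) -> str:
    # placeholder for an LLM call
    return text[:10]

summarize("The quick brown fox")
print(TRACES[0]["name"])  # -> summarize
```

Because the decorator is transparent (`functools.wraps`, return value passed through), it can be added to existing call sites without changing their behavior, which is why this is usually the lowest-friction integration method.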
Configuration Options
- Public/secret keys
- Host URL (cloud or self-hosted)
- Sampling rate
- Metadata configuration
- User tracking
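Keys and host are typically supplied via environment variables that the SDK reads at startup (the key values below are placeholders); sampling rate, metadata, and user tracking are usually configured in code when constructing the client or trace.

```shell
# Langfuse credentials from the project settings page (placeholder values)
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
# Cloud region host, or the URL of your self-hosted deployment
export LANGFUSE_HOST="https://cloud.langfuse.com"
```

Keeping the secret key out of code and in the environment also makes it straightforward to point the same application at a self-hosted instance in one environment and Langfuse Cloud in another.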
Best Practices
- Consistent trace naming
- Meaningful metadata
- Regular prompt versioning
- Set up alerting
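Consistent naming and metadata are easiest to enforce with a small helper that every call site goes through. The hierarchical `app/feature/step` convention and the fixed metadata shape below are illustrative choices, not something Langfuse prescribes.

```python
def trace_name(app: str, feature: str, step: str) -> str:
    """Build hierarchical trace names like 'shop/checkout/summarize'
    so dashboards can group and filter traces by prefix."""
    return "/".join([app, feature, step])

def base_metadata(env: str, release: str, user_id: str) -> dict:
    """Fixed metadata shape attached to every trace, so analytics
    queries can always rely on the same keys being present."""
    return {"env": env, "release": release, "user_id": user_id}

print(trace_name("shop", "checkout", "summarize"))  # shop/checkout/summarize
```

Routing every trace through helpers like these is what makes the alerting bullet practical: alert rules can match on a stable name prefix or metadata key instead of ad-hoc strings.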
Dependencies
- langfuse
- langchain (for callback integration)