```shell
# Option 1: clone the full registry
git clone https://github.com/majiayu000/claude-skill-registry

# Option 2: copy only this skill into ~/.claude/skills
T=$(mktemp -d) && git clone --depth=1 https://github.com/majiayu000/claude-skill-registry "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/data/dbt-projects-on-snowflake" ~/.claude/skills/majiayu000-claude-skill-registry-dbt-projects-on-snowflake && rm -rf "$T"
```
# dbt Projects on Snowflake
Deploy, manage, and monitor dbt projects natively within Snowflake using web-based workspaces, schema-level DBT PROJECT objects, and comprehensive event table telemetry.
## Quick Start

**Three Ways to Use dbt Projects:**
- Snowsight Workspaces - Web-based IDE for interactive development
- DBT PROJECT Objects - Deployed projects for production execution
- Snowflake CLI - Command-line deployment and execution
## Setup
Complete setup instructions, including prerequisites, external access integration, Git API integration, and event table configuration, are in `references/SETUP.md`.
## Deployment Methods

### Method 1: Snowflake CLI (Recommended)

```shell
# Deploy project
snow dbt deploy my_project --source .

# Execute commands
snow dbt execute my_project run
snow dbt execute my_project build
```
### Method 2: Snowsight
- Navigate to Projects → My Workspace
- Create new project from Git repository
- Configure profiles.yml
- Deploy as DBT PROJECT object
### Method 3: SQL Execution

Execute directly in SQL:

```sql
EXECUTE DBT PROJECT <db>.<schema>.<project> args='build';
EXECUTE DBT PROJECT <db>.<schema>.<project> args='build --full-refresh';
EXECUTE DBT PROJECT <db>.<schema>.<project> args='build --select tag:gold';
```
## Scheduling &amp; Automation
For automated scheduling with Snowflake Tasks, see the "Optional: Schedule Automated Runs" section in `references/SETUP.md`.
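As a minimal sketch, a Snowflake Task can wrap `EXECUTE DBT PROJECT` on a cron schedule. The task, warehouse, and project names below are hypothetical placeholders, not values from this project:

```sql
-- Sketch only: dbt_nightly_build, my_wh, and my_db.my_schema.my_project
-- are assumed names; adapt them to your environment.
CREATE OR REPLACE TASK dbt_nightly_build
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'  -- 02:00 UTC daily
AS
  EXECUTE DBT PROJECT my_db.my_schema.my_project args='build';

-- Tasks are created suspended; resume to start the schedule
ALTER TASK dbt_nightly_build RESUME;
```

See `references/SETUP.md` for the full, supported scheduling procedure.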
## Event Table Monitoring

### Event Table Configuration
Configure event tables following the Event Table Monitoring Configuration section in `references/SETUP.md`. This enables OpenTelemetry-based monitoring of dbt project executions.
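A minimal configuration might look like the following. All database, schema, and table names here are assumptions; `references/SETUP.md` has the full procedure:

```sql
-- Hypothetical names throughout; adapt to your environment.
CREATE EVENT TABLE IF NOT EXISTS my_db.telemetry.events;

-- Route telemetry for the database to that event table
ALTER DATABASE my_db SET EVENT_TABLE = my_db.telemetry.events;

-- Enable logs, traces, and metrics for the schema holding dbt projects
ALTER SCHEMA my_db.my_schema SET LOG_LEVEL = 'INFO';
ALTER SCHEMA my_db.my_schema SET TRACE_LEVEL = 'ALWAYS';
ALTER SCHEMA my_db.my_schema SET METRIC_LEVEL = 'ALL';
```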
### Monitoring Queries
All monitoring scripts use parameterized event table references. Specify your event table location when running:
```shell
# Example: Query recent executions
snow sql -f scripts/recent_executions.sql --enable-templating JINJA \
  -D event_table=MY_DATABASE.MY_SCHEMA.EVENT_LOG

# Example: Check for errors
snow sql -f scripts/execution_errors.sql --enable-templating JINJA \
  -D event_table=LOGS_DB.PUBLIC.DBT_EVENTS

# Example: Performance metrics
snow sql -f scripts/performance_metrics.sql --enable-templating JINJA \
  -D event_table=MY_DATABASE.MY_SCHEMA.EVENT_LOG
```
**Core Monitoring:**
- `recent_executions.sql` - Lists recent dbt project executions with severity
- `execution_errors.sql` - Queries ERROR logs to identify failures
- `performance_metrics.sql` - Queries CPU and memory usage metrics
- `trace_spans.sql` - Queries execution spans for timing analysis
- `execution_summary.sql` - Summarizes executions by project with error counts
**Advanced Use Cases:**
- `alert_failures.sql` - Alert trigger for execution failures (returns error count)
- `performance_regression.sql` - Week-over-week performance comparison
- `resource_usage.sql` - CPU and memory consumption by project
- `audit_trail.sql` - Complete execution audit trail for compliance
### Event Table Structure
Event tables follow the OpenTelemetry data model with these key columns:
| Column | Description |
|---|---|
| TIMESTAMP | UTC timestamp when event was created (end of time span for span events) |
| START_TIMESTAMP | For span events, the start of the time span |
| TRACE | Tracing context with `trace_id` and `span_id` |
| RESOURCE_ATTRIBUTES | Source identification: database, schema, user, warehouse, etc. |
| SCOPE | Event scopes (e.g., class names for logs) |
| RECORD_TYPE | Event type: `LOG`, `SPAN`, `SPAN_EVENT`, `METRIC`, `EVENT` |
| RECORD | JSON object with record-specific data (severity, metric type, span details) |
| RECORD_ATTRIBUTES | Event metadata set by Snowflake or code |
| VALUE | Actual log message, metric value, or null for spans |
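For example, these columns combine into a query for recent dbt ERROR logs roughly like this (the event table name `my_db.telemetry.events` is a placeholder):

```sql
-- Sketch: pull the last 24h of ERROR logs emitted by dbt projects.
-- my_db.telemetry.events is a hypothetical event table name.
SELECT
  timestamp,
  resource_attributes:"snow.database.name"::string AS database_name,
  record:severity_text::string                     AS severity,
  value::string                                    AS message
FROM my_db.telemetry.events
WHERE record_type = 'LOG'
  AND resource_attributes:"snow.executable.type" = 'DBT_PROJECT'
  AND timestamp > DATEADD('hour', -24, CURRENT_TIMESTAMP())
ORDER BY timestamp DESC;
```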
## Best Practices

**Performance Optimization:**
- Always filter by TIMESTAMP to limit scanned data (reduces cost)
- Use RESOURCE_ATTRIBUTES for efficient filtering by project/database/schema
- Archive old event table data (>90 days) to separate tables
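One way the archiving step can be sketched, assuming DML is permitted on your event table (all table names are hypothetical):

```sql
-- Sketch only: events_archive and my_db.telemetry.events are assumed names,
-- and this assumes your account allows DML against the event table.
CREATE TABLE IF NOT EXISTS my_db.telemetry.events_archive AS
  SELECT * FROM my_db.telemetry.events WHERE 1 = 0;

-- Copy events older than 90 days to the archive table
INSERT INTO my_db.telemetry.events_archive
  SELECT * FROM my_db.telemetry.events
  WHERE timestamp < DATEADD('day', -90, CURRENT_TIMESTAMP());

-- Then remove them from the live event table
DELETE FROM my_db.telemetry.events
  WHERE timestamp < DATEADD('day', -90, CURRENT_TIMESTAMP());
```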
**Monitoring Strategy:**
- Set event tables at DATABASE level, not account or schema
- Configure appropriate log/trace/metric levels per schema
- Always filter by
to avoid scanning large event tablesTIMESTAMP - Use
to isolate dbt eventssnow.executable.type = 'DBT_PROJECT' - Leverage
for filtering by project/database/schemaRESOURCE_ATTRIBUTES - Monitor ERROR severity logs for immediate alerts
- Use SPAN records to analyze execution timing and bottlenecks
**Alerting Priorities:**
- High: Any ERROR in execution, execution >2x historical avg, warehouse credit anomalies
- Medium: WARNING logs, test/model failures on critical models, performance trending down
- Low: INFO logs, scheduled job confirmations, performance metrics for analysis
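The high-priority ERROR case can be wired to a Snowflake alert roughly as follows. The alert, warehouse, event table, and notification integration names are all assumptions:

```sql
-- Sketch: hypothetical names throughout; assumes an email notification
-- integration named my_email_integration already exists.
CREATE OR REPLACE ALERT dbt_error_alert
  WAREHOUSE = my_wh
  SCHEDULE = '15 MINUTE'
  IF (EXISTS (
    SELECT 1
    FROM my_db.telemetry.events
    WHERE record_type = 'LOG'
      AND record:severity_text::string = 'ERROR'
      AND resource_attributes:"snow.executable.type" = 'DBT_PROJECT'
      AND timestamp > DATEADD('minute', -15, CURRENT_TIMESTAMP())
  ))
  THEN CALL SYSTEM$SEND_EMAIL(
    'my_email_integration',
    'data-team@example.com',
    'dbt execution errors detected',
    'Check the event table for ERROR records from dbt projects.'
  );

-- Alerts are created suspended; resume to activate
ALTER ALERT dbt_error_alert RESUME;
```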
## Event Table Troubleshooting
| Issue | Solution |
|---|---|
| No events captured | Verify the event table is set at DATABASE level (e.g., `ALTER DATABASE <db> SET EVENT_TABLE`) |
| Too many events | Adjust `LOG_LEVEL`/`TRACE_LEVEL`/`METRIC_LEVEL` per schema |
| Slow monitoring queries | Always filter by TIMESTAMP first; consider archiving old data |
| Missing metrics | Set `METRIC_LEVEL = 'ALL'` for the schema |
| Missing traces | Set `TRACE_LEVEL = 'ALWAYS'` for the schema |
| Cannot see project name | Verify the `snow.executable.type = 'DBT_PROJECT'` filter |
## Supported dbt Commands
| Command | Workspaces | EXECUTE DBT PROJECT | snow dbt execute |
|---|---|---|---|
| build | ✅ | ✅ | ✅ |
| run | ✅ | ✅ | ✅ |
| test | ✅ | ✅ | ✅ |
| compile | ✅ | ✅ | ✅ |
| seed | ✅ | ✅ | ✅ |
| snapshot | ✅ | ✅ | ✅ |
| deps | ✅ (workspace only) | ❌ | ❌ |
## Team Collaboration

**Flexibility:** Team members can use different development approaches simultaneously:
- Developer A: dbt Projects on Snowflake workspaces
- Developer B: dbt Cloud
- Developer C: Local VS Code with dbt CLI
- All check into the same Git repository
## Key Commands

| Command | Purpose |
|---|---|
| `snow dbt deploy <project>` | Deploy project to Snowflake |
| `snow dbt execute <project> run` | Run dbt models |
| `snow dbt execute <project> build` | Run and test models |
| `snow dbt execute <project> test` | Run tests only |
| `snow dbt list` | List all dbt projects |
## Troubleshooting

For setup and deployment issues, see `references/SETUP.md`.
For monitoring issues, see the Troubleshooting table in the Event Table Monitoring section above.
## Related Skills

**Complementary Observability:**
- `dbt-artifacts` skill - For cross-platform execution logging and historical trend analysis
**When to use both together:**
- dbt Projects on Snowflake for real-time monitoring with OpenTelemetry event tables
- dbt Artifacts for cross-platform historical analysis and long-term metrics
**When to use one vs the other:**
- Use dbt Projects on Snowflake alone if you exclusively run dbt within Snowflake
- Use dbt Artifacts alone if you run dbt outside Snowflake (dbt Cloud, Airflow, local)
- Use both for comprehensive enterprise monitoring (real-time + historical)
## Resources

### Local Files

- **Monitoring Scripts:** `scripts/` - Ready-to-use parameterized SQL scripts for monitoring
  - Core Monitoring: `recent_executions.sql`, `execution_errors.sql`, `performance_metrics.sql`, `trace_spans.sql`, `execution_summary.sql`
  - Advanced Monitoring: `alert_failures.sql`, `performance_regression.sql`, `resource_usage.sql`, `audit_trail.sql`
- **Setup Guide:** `references/SETUP.md` - Complete step-by-step setup including event table configuration and task scheduling