Skills-for-fabric powerbi-authoring-cli
```shell
# Quick clone of the full repo
git clone https://github.com/microsoft/skills-for-fabric

# Or install just this skill into ~/.claude/skills
T=$(mktemp -d) && git clone --depth=1 https://github.com/microsoft/skills-for-fabric "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/powerbi-authoring-cli" ~/.claude/skills/microsoft-skills-for-fabric-powerbi-authoring-cli && rm -rf "$T"
```
skills/powerbi-authoring-cli/SKILL.md

Update Check — ONCE PER SESSION (mandatory)

The first time this skill is used in a session, run the check-updates skill before proceeding.
- GitHub Copilot CLI / VS Code: invoke the `check-updates` skill.
- Claude Code / Cowork / Cursor / Windsurf / Codex: compare local vs remote package.json version.
- Skip if the check was already performed earlier in this session.
CRITICAL NOTES
- To find a workspace's details (including its ID) from its name: list all workspaces, then use JMESPath filtering.
- To find an item's details (including its ID) from workspace ID, item type, and item name: list all items of that type in that workspace, then use JMESPath filtering.
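The filtering step can be sketched offline with `jq` against an illustrative response shape (the workspace names and IDs below are made up; real data comes from `GET /v1/workspaces`). Against the live API, the same filter can also be expressed with `az`'s global `--query` JMESPath option.

```shell
# Illustrative response shape from GET /v1/workspaces (IDs/names are hypothetical)
RESP='{"value":[{"id":"aaa-111","displayName":"Finance"},{"id":"bbb-222","displayName":"Sales Analytics"}]}'

# Resolve a workspace ID by display name
WS_ID=$(echo "$RESP" | jq -r '.value[] | select(.displayName=="Sales Analytics") | .id')
echo "$WS_ID"
```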
Power BI Semantic Model Authoring — CLI Skill
Table of Contents
| Task | Reference | Notes |
|---|---|---|
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items in Fabric | Mandatory — READ link first [needed for finding workspace id by its name or item id by its name, item type, and workspace id] |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology & Key Concepts | Hierarchy; Finding Things in Fabric |
| Environment URLs | COMMON-CORE.md § Environment URLs | Production (Public Cloud) |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection Rationale | |
| Authentication Recipes | COMMON-CLI.md § Authentication Recipes | flows, environment detection, token acquisition, and debugging |
| Fabric Control-Plane API via `az rest` | COMMON-CLI.md § Fabric Control-Plane API via `az rest` | Always pass `--resource`; includes workspace/item operations, pagination, and LRO patterns |
| OneLake Data Access | COMMON-CLI.md § OneLake Data Access | Uses a different token audience than the Fabric API — do not reuse the Fabric token |
| Job Execution (CLI) | COMMON-CLI.md § Job Execution | Run notebooks/pipelines, refresh semantic models, check/cancel jobs |
| OneLake Shortcuts | COMMON-CLI.md § OneLake Shortcuts | Create a Shortcut; List Shortcuts; Delete a Shortcut |
| Capacity Management (CLI) | COMMON-CLI.md § Capacity Management | List Capacities; Assign Workspace to Capacity |
| Composite Recipes | COMMON-CLI.md § Composite Recipes | End-to-end workspace→lakehouse→file, SQL endpoint→query, and notebook execution recipes |
| Gotchas & Troubleshooting (CLI-Specific) | COMMON-CLI.md § Gotchas & Troubleshooting (CLI-Specific) | audience, shell escaping, token expiry |
| Quick Reference | COMMON-CLI.md § Quick Reference | Template; Token Audience ↔ CLI Tool Matrix |
| DAX Queries & Metadata Discovery | powerbi-consumption-cli | Read-only DAX queries; use for post-creation validation |
| Tool Stack | SKILL.md § Tool Stack | `az` CLI (primary), `jq` (JSON parsing), `base64` encoding |
| Authentication & API Audiences | SKILL.md § Authentication & API Audiences | Two audiences: Fabric API vs Power BI Datasets API |
| Must/Prefer/Avoid | SKILL.md § Must/Prefer/Avoid | Guardrails for semantic model authoring |
| SemanticModel Definition & Envelope | ITEM-DEFINITIONS-CORE.md § SemanticModel | TMDL format; required parts, envelope structure, support matrix |
| TMDL File Structure & Examples | SKILL.md § TMDL File Structure | Required parts, minimal content examples |
| TMDL CRUD (Create / Get / Update) | SKILL.md § Create Semantic Model | Create → Get/Download → Update; full lifecycle with LRO |
| Authoring Scope Matrix | SKILL.md § Authoring Scope Matrix | What Fabric API supports vs what to avoid |
| Refresh Operations | SKILL.md § Refresh Operations | Trigger, cancel, history, schedule (Power BI API) |
| Data Sources & Parameters | SKILL.md § Data Sources & Parameters | Get/update data sources and parameters |
| Permissions | SKILL.md § Permissions | Grant/update dataset user permissions |
| Deployment Pipelines | SKILL.md § Deployment Pipelines | List, get stages, deploy between stages |
| Agentic Workflow | SKILL.md § Agentic Workflow | Step-by-step: discover → create → verify → refresh → validate |
| Troubleshooting | SKILL.md § Troubleshooting | Common errors table: LRO, auth, TMDL encoding, refresh |
| Examples | SKILL.md § Examples | Create model, download definition, refresh, deploy |
| Property-to-API Mapping | semantic-model-properties-guide.md § Property-to-API Mapping | Maps each property category to the correct API surface |
| Owner, Storage Mode & Operational Metadata | semantic-model-properties-guide.md § Owner, Storage Mode | Power BI Datasets API properties |
| Refresh History Response Properties | semantic-model-properties-guide.md § Refresh History | Refresh detail response fields |
| Data Source Response Properties | semantic-model-properties-guide.md § Data Sources | Connection and gateway properties |
| DirectQuery / LiveConnection Refresh Schedule | semantic-model-properties-guide.md § DQ Refresh Schedule | DirectQuery/LiveConnection schedule settings |
| Upstream Dataflow Links | semantic-model-properties-guide.md § Upstream Dataflows | Dataflow dependency properties |
| Per-Table Storage Mode | semantic-model-properties-guide.md § Per-Table Storage | Table-level storage mode via TMDL |
| TMDL Syntax Rules | tmdl-authoring-guide.md § TMDL Syntax Rules | Tab indentation, object declaration, quoting rules |
| Modeling Best Practices | tmdl-authoring-guide.md § Modeling Best Practices | Naming conventions, column rules, measure & DAX rules, format strings |
| Relationships | tmdl-authoring-guide.md § Relationships | Relationship declarations, key rules |
| Hierarchies | tmdl-authoring-guide.md § Hierarchies | Hierarchy declarations and key rules |
| Direct Lake Guidelines | tmdl-authoring-guide.md § Direct Lake Guidelines | Direct Lake mode configuration and constraints |
| Calculated Tables | tmdl-authoring-guide.md § Calculated Tables | DAX-based calculated table definitions |
| Date/Calendar Table | tmdl-authoring-guide.md § Date/Calendar Table | Calendar table setup and marking |
| Parameters | tmdl-authoring-guide.md § Parameters | Expression-based parameter declarations |
| Annotations | tmdl-authoring-guide.md § Annotations | Model and object-level annotations |
| TMDL File Layout & Core Files | tmdl-advanced-features-guide.md § File Layout | Directory structure, database.tmdl, model.tmdl |
| Calculation Groups | tmdl-advanced-features-guide.md § Calculation Groups | Calculation group tables and items |
| Security Roles | tmdl-advanced-features-guide.md § Security Roles | RLS/OLS role definitions |
| Security Role Memberships | SKILL.md § Security Role Memberships | Add/list/delete users & groups in RLS roles (Power BI API) |
| Translations / Cultures | tmdl-advanced-features-guide.md § Translations / Cultures | Localization via culture files |
| Perspectives | tmdl-advanced-features-guide.md § Perspectives | Perspective definitions for subset views |
| Functions | tmdl-advanced-features-guide.md § Functions | User-defined DAX functions in the model |
| Calendar Objects | tmdl-advanced-features-guide.md § Calendar Objects | Auto date/time calendar table objects |
Tool Stack
| Tool | Role | Install |
|---|---|---|
| `az` CLI | Primary: `az rest` for Fabric and Power BI REST API calls, `az login` for auth | Pre-installed in most dev environments |
| `jq` | Parse JSON from `az rest` responses | Pre-installed or trivial |
| `base64` (Linux/macOS) / `[Convert]::ToBase64String` (PowerShell) | Encode TMDL file content for definition payloads | Built-in |
Agent check — verify before first operation:
```shell
az version 2>/dev/null || echo "INSTALL: https://learn.microsoft.com/cli/azure/install-azure-cli"
```
Authentication & API Audiences
This skill uses two distinct API audiences. Using the wrong audience returns a 401.
| API | Audience (`--resource`) | Use For |
|---|---|---|
| Fabric Items API | `https://api.fabric.microsoft.com` | Create/get/update/delete semantic model definitions, list items, LRO polling |
| Power BI Datasets API | `https://analysis.windows.net/powerbi/api` | Refresh, data sources, parameters, permissions, deployment pipelines |
```shell
# Fabric Items API — semantic model definition operations
az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
  ...

# Power BI Datasets API — refresh, data sources, permissions
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "https://api.powerbi.com/v1.0/myorg/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
  ...
```
Must/Prefer/Avoid
MUST DO
- Read the relevant TMDL reference sections BEFORE generating any TMDL — at minimum read TMDL Syntax Rules and Modeling Best Practices. If the task involves relationships, hierarchies, calculation groups, security roles, or translations, also read the corresponding sections in tmdl-authoring-guide.md and tmdl-advanced-features-guide.md. Do not generate TMDL from memory.
- Always pass `--resource` to `az rest` — omitting it causes silent auth failures. Use the correct audience per the table above.
- Always pass `--headers "Content-Type=application/json"` on POST/PATCH/PUT calls with a `--body` to the Power BI Datasets API — omitting it causes `Unsupported Media Type` errors.
- Include ALL definition parts in `updateDefinition` — modified + unmodified. The API replaces the entire definition; omitting parts deletes them.
- Never include `.platform` in `updateDefinition` payloads — it is Git integration metadata and causes errors.
- Poll LRO to completion — `createItemWithDefinition`, `getDefinition`, and `updateDefinition` return `202 Accepted` with an `Operation-Id` header. Poll until terminal state.
- Base64-encode TMDL content — all `payload` values in definition parts must be base64-encoded.
- Single-quote names with special chars — names containing spaces, `.`, `=`, or `:` must be wrapped in single quotes (`'`) in TMDL.
- Verify workspace has capacity before creating a semantic model — call `GET /v1/workspaces/{id}` and check `capacityId`.
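The no-wrap encoding requirement can be sanity-checked locally before building a payload; a minimal round-trip sketch (file name and content are arbitrary; GNU coreutils `base64` assumed, since `-w` is not available on stock macOS):

```shell
# Encode without line wrapping (-w 0), then decode and compare to the original
printf 'table Customer\n' > /tmp/sample.tmdl
PAYLOAD=$(base64 -w 0 < /tmp/sample.tmdl)
echo "$PAYLOAD" | base64 -d > /tmp/roundtrip.tmdl
diff /tmp/sample.tmdl /tmp/roundtrip.tmdl && echo "round-trip OK"
```

If `diff` reports differences, the payload would have produced a corrupted definition part.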
PREFER
- `createItemWithDefinition` (single POST) over create-then-update for new semantic models.
- TMDL format over TMSL — TMDL is text-based, diff-friendly, and the preferred format for Fabric.
- Measures before columns in TMDL table files — follows TMDL convention.
- Multi-line DAX in triple backticks — improves readability for complex expressions.
- Route fine-grained changes to `powerbi-modeling-mcp` — for adding/modifying individual measures, columns, or relationships, the MCP server is more efficient than full definition round-trips.
- Get definition before updating — always retrieve the current definition, modify, then POST back to avoid overwriting concurrent changes.
- Cross-reference `powerbi-consumption-cli` for post-creation validation — run DAX queries to verify measures, relationships, and data.
AVOID
- `updateDefinition` for small changes — a full definition round-trip is heavy; route to `powerbi-modeling-mcp` for individual object edits.
- Report creation — not supported by this skill. Reports require a separate definition format (PBIR/PBIR-Legacy).
- `lineageTag` on new objects — TMDL auto-generates lineage tags; adding them manually causes conflicts.
- `//` comments in TMDL — not supported. Use `///` descriptions instead.
- `description` property in TMDL — use `///` syntax above the object instead.
- Hardcoded workspace/item IDs — resolve dynamically via REST API (see COMMON-CLI.md § Finding Workspaces and Items in Fabric).
- Sending only modified parts in `updateDefinition` — the API replaces the full definition; missing parts are deleted.
TMDL File Structure
For the full definition envelope and part paths, see ITEM-DEFINITIONS-CORE.md § SemanticModel.
Required TMDL parts for `createItemWithDefinition` and `updateDefinition`:
| Part Path | Content | Required |
|---|---|---|
| `definition.pbism` | Semantic model connection settings | Yes |
| `definition/database.tmdl` | Database properties (compatibility level) | Yes |
| `definition/model.tmdl` | Model properties (culture, default summarization) | Yes |
| `definition/tables/<TableName>.tmdl` | Per-table: columns, measures, partitions | Yes (≥1) |
Critical: `updateDefinition` must include ALL parts — modified and unmodified. The API replaces the entire definition. Never include `.platform` in update payloads.
For TMDL syntax rules, naming conventions, and modeling best practices, see tmdl-authoring-guide.md.
Minimal TMDL Content Examples
definition.pbism
```json
{ "version": "4.2", "settings": { "qnaEnabled": true } }
```
database.tmdl
```
database
	compatibilityLevel: 1702
	compatibilityMode: powerBI
```
model.tmdl
```
model Model
	culture: en-US
	defaultPowerBIDataSourceVersion: powerBI_V3
	discourageImplicitMeasures
```
Note: `defaultPowerBIDataSourceVersion: powerBI_V3` is required for Import-mode models. Without it, the API returns `Import from JSON supported for V3 models only`.
Import-Mode Table
```
table Customer

	/// Total number of customers
	measure '# Customers' = COUNTROWS(Customer)
		formatString: #,##0

	column CustomerId
		dataType: int64
		isHidden
		isKey
		summarizeBy: none
		sourceColumn: CustomerId

	column 'Customer Name'
		dataType: string
		sourceColumn: CustomerName

	partition Customer = m
		mode: import
		source =
			let
				Source = Sql.Database(#"Server", #"Database"),
				Customer = Source{[Schema="dbo", Item="Customer"]}[Data]
			in
				Customer
```
Direct Lake Table
````
expression DL_Lakehouse =
	let
		Source = AzureStorage.DataLake("https://onelake.dfs.fabric.microsoft.com/<WorkspaceId>/<LakehouseId>", [HierarchicalNavigation=true])
	in
		Source

table Sales

	/// Total revenue
	measure 'Total Sales' = ```
			SUMX(
				Sales,
				Sales[Quantity] * Sales[UnitPrice]
			)
			```
		formatString: \$#,##0.00

	column SalesKey
		dataType: int64
		isHidden
		isKey
		summarizeBy: none
		sourceColumn: sales_key

	column Quantity
		dataType: int64
		sourceColumn: quantity

	column UnitPrice
		dataType: decimal
		summarizeBy: none
		sourceColumn: unit_price

	partition Sales = entity
		mode: directLake
		source
			entityName: Sales
			schemaName: dbo
			expressionSource: DL_Lakehouse
````
Create Semantic Model
Full lifecycle: author TMDL → base64-encode → construct payload → POST → poll LRO.
Per COMMON-CLI.md § Item CRUD Operations and ITEM-DEFINITIONS-CORE.md § Definition Envelope:
```shell
WS_ID="<workspaceId>"

# 1. Base64-encode each TMDL file
PBISM=$(base64 -w 0 < definition.pbism)
DB=$(base64 -w 0 < definition/database.tmdl)
MODEL=$(base64 -w 0 < definition/model.tmdl)
TABLE=$(base64 -w 0 < definition/tables/Customer.tmdl)

# 2. Construct payload and create — use --verbose to capture HTTP status and LRO headers
cat > /tmp/body.json << EOF
{
  "displayName": "MySalesModel",
  "definition": {
    "format": "TMDL",
    "parts": [
      {"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
      {"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
      {"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
      {"path": "definition/tables/Customer.tmdl", "payload": "$TABLE", "payloadType": "InlineBase64"}
    ]
  }
}
EOF

az rest --method post --verbose \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```
PowerShell — use `[Convert]::ToBase64String([System.IO.File]::ReadAllBytes("file"))` instead of `base64 -w 0`.
If the response is `202 Accepted`, poll using the LRO pattern from COMMON-CLI.md § Long-Running Operations.
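The polling step can be sketched as a simple status loop. The stubbed branch below stands in for a real `az rest --method get` call against the operation URL from the `Location` header (real code would also honor the `Retry-After` header between polls); here it simulates an operation that succeeds on the third poll.

```shell
# Poll until the operation reaches a terminal state (Succeeded/Failed)
POLLS=0
STATUS="Running"
while [ "$STATUS" != "Succeeded" ] && [ "$STATUS" != "Failed" ]; do
  POLLS=$((POLLS + 1))
  # Stub standing in for:
  #   STATUS=$(az rest --method get --resource "https://api.fabric.microsoft.com" \
  #     --url "$LOCATION_URL" | jq -r '.status')
  if [ "$POLLS" -ge 3 ]; then STATUS="Succeeded"; else STATUS="Running"; fi
done
echo "$STATUS after $POLLS polls"
```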
Get/Download Definition
Retrieve TMDL definition for backup, migration, or inspection.
`getDefinition` is a POST (not GET).
```shell
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"

# 1. Request definition — may return 200 (inline) or 202 (LRO)
RESPONSE=$(az rest --method post --verbose \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/getDefinition?format=TMDL" \
  --body '{}' \
  --output json 2>/dev/null)

# 2. If 202, poll the Location header URL until Succeeded, then GET /result

# 3. Decode each part
echo "$RESPONSE" | jq -r '.definition.parts[] | .path + " " + .payload' | \
while read -r path payload; do
  mkdir -p "$(dirname "$path")"
  echo "$payload" | base64 -d > "$path"
done
```
Update Definition
Critical rules: Must include ALL parts (modified + unmodified). Never include `.platform`. The API replaces the entire definition — omitted parts are deleted.
```shell
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"

# 1. Get current definition (see Get/Download Definition above)
# 2. Modify the relevant TMDL files
# 3. Re-encode ALL parts and POST
cat > /tmp/body.json << EOF
{
  "definition": {
    "format": "TMDL",
    "parts": [
      {"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
      {"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
      {"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
      {"path": "definition/tables/Customer.tmdl", "payload": "$TABLE", "payloadType": "InlineBase64"}
    ]
  }
}
EOF

az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/updateDefinition" \
  --body @/tmp/body.json
```
Use the `?updateMetadata=true` query parameter only when the `.platform` file must be included to update display name or description via the definition.
Authoring Scope Matrix
| Operation | Supported | Method |
|---|---|---|
| Create semantic model with TMDL | ✅ | `createItemWithDefinition` (POST with definition) |
| Get/download TMDL definition | ✅ | `getDefinition` (POST) |
| Update full TMDL definition | ✅ | `updateDefinition` (POST) |
| Delete semantic model | ✅ | `DELETE /v1/workspaces/{workspaceId}/semanticModels/{semanticModelId}` |
| Refresh dataset | ✅ | Power BI Datasets API (Phase 4) |
| Add/modify single measure or column | ⚠️ Route to `powerbi-modeling-mcp` | Full definition round-trip is inefficient |
| Create reports | ❌ | Not in scope — separate definition format (PBIR) |
Refresh Operations
All refresh operations use the Power BI Datasets API audience (`https://analysis.windows.net/powerbi/api`).
```shell
WS_ID="<workspaceId>"
DATASET_ID="<semanticModelId>"
PBI="https://api.powerbi.com/v1.0/myorg"

# Trigger full refresh
cat > /tmp/body.json << 'EOF'
{"notifyOption": "NoNotification"}
EOF
az rest --method post --verbose \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Get refresh history (latest first)
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes?\$top=5"

# Cancel an in-progress refresh
az rest --method delete \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes/<refreshId>"

# Get refresh schedule
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshSchedule"

# Update refresh schedule
cat > /tmp/body.json << 'EOF'
{
  "value": {
    "enabled": true,
    "days": ["Monday", "Wednesday", "Friday"],
    "times": ["02:00", "14:00"],
    "localTimeZoneId": "UTC"
  }
}
EOF
az rest --method patch \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshSchedule" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```
Data Sources & Parameters
```shell
# Get data sources for a dataset
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/datasources"

# Get parameters
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/parameters"

# Update parameters
cat > /tmp/body.json << 'EOF'
{
  "updateDetails": [
    {"name": "Server", "newValue": "newserver.database.windows.net"},
    {"name": "Database", "newValue": "ProductionDB"}
  ]
}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/Default.UpdateParameters" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```
After updating parameters or data source credentials, trigger a refresh for changes to take effect.
Permissions
```shell
# List dataset users
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users"

# Grant dataset permissions to a user
cat > /tmp/body.json << 'EOF'
{
  "identifier": "user@contoso.com",
  "principalType": "User",
  "datasetUserAccessRight": "Read"
}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Update existing user permissions
cat > /tmp/body.json << 'EOF'
{
  "identifier": "user@contoso.com",
  "principalType": "User",
  "datasetUserAccessRight": "ReadReshare"
}
EOF
az rest --method put \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```
Permission levels: `Read`, `ReadReshare`, `ReadExplore`, `ReadReshareExplore`.
Security Role Memberships
After defining RLS/OLS roles in TMDL (see Security Roles), use the Power BI Datasets API to assign users and groups to those roles.
```shell
PBI="https://api.powerbi.com/v1.0/myorg"

# List members of a security role
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  | jq '[.value[] | select(.datasetUserAccessRight == "Read" and .roles != null)]'

# Add a user to a security role
cat > /tmp/body.json << 'EOF'
{
  "identifier": "user@contoso.com",
  "principalType": "User",
  "datasetUserAccessRight": "Read",
  "roles": ["SalesRegion"]
}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Add a security group to a role
cat > /tmp/body.json << 'EOF'
{
  "identifier": "<group-object-id>",
  "principalType": "Group",
  "datasetUserAccessRight": "Read",
  "roles": ["SalesRegion"]
}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Update role membership (e.g., move user to a different role)
cat > /tmp/body.json << 'EOF'
{
  "identifier": "user@contoso.com",
  "principalType": "User",
  "datasetUserAccessRight": "Read",
  "roles": ["EuropeOnly"]
}
EOF
az rest --method put \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```
The `roles` array accepts one or more role names, which must match roles defined in the semantic model's TMDL. The user/group must also have at least `Read` permission on the dataset. `principalType` can be `User`, `Group`, or `App`.
Deployment Pipelines
Deployment pipelines use the Fabric API audience (`https://api.fabric.microsoft.com`).
```shell
FABRIC="https://api.fabric.microsoft.com/v1"

# List deployment pipelines
az rest --method get \
  --resource "https://api.fabric.microsoft.com" \
  --url "$FABRIC/deploymentPipelines"

# Get pipeline stages
az rest --method get \
  --resource "https://api.fabric.microsoft.com" \
  --url "$FABRIC/deploymentPipelines/<pipelineId>/stages"

# Deploy from one stage to the next (e.g., Dev → Test)
cat > /tmp/body.json << 'EOF'
{
  "sourceStageOrder": 0,
  "targetStageOrder": 1,
  "items": [
    { "sourceItemId": "<semanticModelId>", "itemType": "SemanticModel" }
  ],
  "options": {
    "allowCreateArtifact": true,
    "allowOverwriteArtifact": true
  }
}
EOF
az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "$FABRIC/deploymentPipelines/<pipelineId>/deploy" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```
Omit the `items` array to deploy all items in the stage. The deploy call returns `202 Accepted` — poll using the LRO pattern.
Agentic Workflow
Tool Selection Priority
- `powerbi-modeling-mcp` available → use MCP tools for fine-grained object changes (measures, columns, relationships)
- MCP unavailable, TMDL files available → edit TMDL files directly, deploy via `updateDefinition` with `az rest`
- MCP unavailable, workspace only → use this skill: `getDefinition` → edit TMDL → `updateDefinition`
Workflow Steps
- Discover workspace → list workspaces, find target by name (see COMMON-CLI.md § Finding Workspaces and Items)
- List semantic models → `GET /v1/workspaces/{id}/semanticModels` to find existing models or confirm name availability
- Analyze source schema → inspect source tables/columns via SQL, DAX, or Lakehouse metadata to inform star schema design
- Design star schema → identify fact and dimension tables, define relationship keys, plan measures
- Author TMDL files → create `definition.pbism`, `database.tmdl`, `model.tmdl`, and table files per Minimal TMDL Content Examples and tmdl-authoring-guide.md
- Create relationships → define in table TMDL files before creating measures that depend on them
- Create measures → add explicit measures with `formatString` for all aggregatable values
- Deploy → base64-encode all parts → POST `createItemWithDefinition` (see Create Semantic Model)
- Verify → run validation checks (see below)
- Refresh → trigger dataset refresh via Refresh Operations
- Validate with DAX → use powerbi-consumption-cli to run DAX queries against the deployed model
Post-Creation Validation
- TMDL structure — verify all required parts are present (definition.pbism, database.tmdl, model.tmdl, ≥1 table)
- Test measures — run `EVALUATE { [Measure Name] }` for each measure via DAX
- Verify relationships — confirm cardinality, cross-filter direction, matching `dataType` on both sides
- Verify columns — confirm `sourceColumn` mappings and `dataType` match source schema
- Check for duplicates — no duplicate measure names or orphan objects
Troubleshooting
Early-abort rule: If `getDefinition` returns `404 EntityNotFound` (on an item you can list/GET) and the Power BI refresh API returns `403 Forbidden` with `"identity None"`, stop retrying immediately — the user almost certainly has only Viewer role on the workspace. Verify by calling `GET /v1/workspaces/{id}/roleAssignments`; if that also returns `403 InsufficientWorkspaceRole`, confirm to the user that they need Contributor or higher role. Do not retry with different URL formats, endpoints, or parameters — the issue is permissions, not API usage.
| Symptom | Cause | Fix |
|---|---|---|
| `403 Forbidden` with `"identity None"` on Power BI API | User has Viewer role — refresh, data sources, and permissions APIs require Contributor+ | Stop immediately. Ask user to request Contributor/Member/Admin role on the workspace |
| `404 EntityNotFound` on getDefinition but item exists in list | Insufficient permissions masquerading as 404 — getDefinition requires Contributor+ | Check workspace role first; do not retry with different URL formats |
| `403 InsufficientWorkspaceRole` on roleAssignments | User is Viewer on the workspace | Confirms Viewer role — all authoring and most read operations are blocked |
| `401 Unauthorized` on Fabric API | Wrong or missing `--resource` | Use `--resource "https://api.fabric.microsoft.com"` |
| `401 Unauthorized` on Power BI API | Wrong audience | Use `--resource "https://analysis.windows.net/powerbi/api"` |
| `400 Bad Request` on getDefinition | Missing request body | Pass `--body '{}'` — getDefinition is a POST |
| LRO poll never completes | Token expired during long operation | Re-acquire token in poll loop; increase Retry-After interval |
| `202 Accepted` but no result | Didn't follow LRO to completion | Poll the `Location` header URL until `Succeeded`, then GET `/result` |
| TMDL validation error on create/update | Syntax error in TMDL content | Check TMDL rules in tmdl-authoring-guide.md; validate before encoding |
| Parts missing after updateDefinition | Only modified parts were sent | Must include ALL parts (modified + unmodified) in every update |
| Error when including `.platform` in update | `.platform` is not accepted by default | Remove `.platform` from parts, or use `?updateMetadata=true` |
| Base64 decode produces garbled content | Wrong encoding or line wrapping | Use `base64 -w 0` (no line wrap) or `[Convert]::ToBase64String` |
| Refresh fails with data source error | Credentials expired or parameters wrong | Check data sources and parameters; update credentials if needed |
| Deployment pipeline fails | Workspace not assigned to stage | Assign workspace to pipeline stage before deploying |
| `lineageTag` conflict on new objects | Manually added `lineageTag` | Remove `lineageTag` from new objects — it is auto-generated |
| DAX error testing measures | Measure name case mismatch | DAX measure names are case-sensitive; match exactly |
| Attempting to retrieve role members via DAX info functions | DAX functions do not reliably return role membership data and may return empty or incomplete results | Use the Power BI REST API instead: `GET .../datasets/{datasetId}/users` and filter by the `roles` field (see Security Role Memberships) |
Examples
Create a Semantic Model from TMDL
```shell
WS_ID="<workspaceId>"

# Encode all TMDL files
PBISM=$(base64 -w 0 < definition.pbism)
DB=$(base64 -w 0 < definition/database.tmdl)
MODEL=$(base64 -w 0 < definition/model.tmdl)
CUSTOMER=$(base64 -w 0 < definition/tables/Customer.tmdl)
SALES=$(base64 -w 0 < definition/tables/Sales.tmdl)

cat > /tmp/body.json << EOF
{
  "displayName": "SalesModel",
  "definition": {
    "parts": [
      {"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
      {"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
      {"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
      {"path": "definition/tables/Customer.tmdl", "payload": "$CUSTOMER", "payloadType": "InlineBase64"},
      {"path": "definition/tables/Sales.tmdl", "payload": "$SALES", "payloadType": "InlineBase64"}
    ]
  }
}
EOF

az rest --method post --verbose \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```
Download a Semantic Model Definition
```shell
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"

# Get definition (may return 202 — follow LRO)
RESULT=$(az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/getDefinition?format=TMDL" \
  --body '{}' --output json)

# Decode and save all parts
echo "$RESULT" | jq -r '.definition.parts[] | .path + "\t" + .payload' | \
while IFS=$'\t' read -r path payload; do
  mkdir -p "$(dirname "$path")"
  echo "$payload" | base64 -d > "$path"
  echo "Saved: $path"
done
```
Trigger a Refresh and Check Status
```shell
WS_ID="<workspaceId>"
DATASET_ID="<semanticModelId>"
PBI="https://api.powerbi.com/v1.0/myorg"

# Trigger refresh
cat > /tmp/body.json << 'EOF'
{"type": "Full"}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
  --body @/tmp/body.json

# Check latest refresh status
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes?\$top=1"
```
Deploy to Production via Pipeline
```shell
FABRIC="https://api.fabric.microsoft.com/v1"
PIPELINE_ID="<pipelineId>"

# Deploy from Test (stage 1) to Production (stage 2)
cat > /tmp/body.json << 'EOF'
{
  "sourceStageOrder": 1,
  "targetStageOrder": 2,
  "items": [
    {"sourceItemId": "<semanticModelId>", "itemType": "SemanticModel"}
  ],
  "options": {
    "allowCreateArtifact": true,
    "allowOverwriteArtifact": true
  }
}
EOF
az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "$FABRIC/deploymentPipelines/$PIPELINE_ID/deploy" \
  --body @/tmp/body.json
```