power-bi-agentic-development / fabric-cli

Expert guidance for using the Fabric CLI (`fab`) to fully interact with Fabric workspaces, items, and configuration. Automatically invoke this skill whenever the user mentions "Fabric", "Power BI Service", or a "Fabric/Power BI workspace".

Clone the full repository:

```shell
git clone https://github.com/data-goblin/power-bi-agentic-development
```

Or install just this skill:

```shell
T=$(mktemp -d) && git clone --depth=1 https://github.com/data-goblin/power-bi-agentic-development "$T" && mkdir -p ~/.claude/skills && cp -r "$T/plugins/fabric-cli/skills/fabric-cli" ~/.claude/skills/data-goblin-power-bi-agentic-development-fabric-cli && rm -rf "$T"
```

Skill file: `plugins/fabric-cli/skills/fabric-cli/SKILL.md`

# Fabric CLI
Guidance for using `fab` to programmatically manage the Fabric & Power BI service.
- Install via `uv tool install ms-fabric-cli` (get `uv` via `winget install uv` or `brew install uv`)
- Fabric CLI is for working with the cloud environment, not local files; it works with Power BI Pro, PPU, or Fabric; you DO NOT need a Fabric SKU to use the Fabric CLI

> [!IMPORTANT]
> Any time you encounter errors, user preferences, or learnings when using the Fabric CLI, ALWAYS note these down in the user memory rules, i.e. `.claude/rules/fabric-cli.md`, for future improvement. This is ONLY for generic learnings, not item- or task-specific learnings.
## When to use this skill
- Use whenever the user mentions "Fabric" or "Power BI"
- Use when user asks about Power BI workspaces, deployment, tenants, publishing, download, permissions, or data
## Critical general rules
- IMPORTANT: The first time you use `fab`, check that it is up to date with the latest version and run `fab auth status`; if the user isn't authenticated, ask them to run `fab auth login`
- Always use `fab --help` and `fab <command> --help` the first time you use a command to understand its syntax
- You must search the skill /references/ for relevant reference files that explain certain commands, examples, scripts, or workflows before you start using `fab`
- Before first use, ask the user if they have Fabric admin access, sensitivity labels or DLP policies, any API restrictions, or preferences for Fabric/Power BI API usage; remind the user to add this to memory files
- If a workspace or item name is unclear, ask the user first, then verify with `fab ls` or `fab exists` before proceeding
- Avoid removing or moving items, workspaces, or definitions, or changing properties, without explicit user direction
- If a command is blocked in your permissions and you try to use it, stop and ask the user for clarification; never try to circumvent it
- Create output directories before export: `fab export` does not create intermediate directories; `mkdir -p` the output path first or the command fails with `[InvalidPath]`
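The directory-creation rule above can be sketched end-to-end. The `fab export` line is commented out because it needs a live workspace, and the nested path is a hypothetical example:

```shell
# Create the full output path before exporting; fab export does not
# create intermediate directories and fails with [InvalidPath] otherwise.
ROOT=$(mktemp -d)                    # throwaway root for this sketch
OUT="$ROOT/backups/2025/models"      # hypothetical nested output path
mkdir -p "$OUT"                      # -p creates every missing parent
# fab export "ws.Workspace/Model.SemanticModel" -o "$OUT" -f   # now safe
[ -d "$OUT" ] && echo "output directory ready"
rm -rf "$ROOT"
```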
## Use `-f` (force) for non-interactive use

The `fab` CLI prompts for confirmation, so you must always append `-f` to prevent this UNLESS sensitivity labels are enabled, in which case you must ask the user. Do this for the following commands:

- `fab get -q "definition"` ; sensitivity label confirmation
- `fab export` ; sensitivity label confirmation
- `fab import` ; overwrite confirmation
- `fab cp` / `fab cp -r` ; overwrite and sensitivity label confirmation
- `fab rm` ; delete confirmation
- `fab assign` / `fab unassign` ; capacity/domain assignment confirmation
- `fab mv` ; rename/move confirmation
## Quickstart guide

You must read and understand the common list of operations with simple examples:
- Check the commands, syntax, and auth status: `fab --help` and `fab auth status`
- Check if the item exists if the user gave the workspace and item name: `fab exists "spaceparts-dev.Workspace/spaceparts-otc-full.SemanticModel"`
- Find the workspace: `fab ls`
- Find the item: `fab ls "Workspace Name.Workspace"`
- Check the commands for that item: `fab desc` to get itemTypes; `fab desc .<ItemType>` for commands, i.e. `fab desc .SemanticModel`
- What's in that item; what's it for; what is it?
  - Full TMDL definition: `fab get "spaceparts-dev.Workspace/spaceparts-otc-full.SemanticModel" -q "definition" -f`
  - Search a specific measure / table / column: `fab get "ws.Workspace/Model.SemanticModel" -q "definition" -f | rga -i "Sales Amount"`
- Get files, tables, or table schemas:
  - List lakehouse files: `fab ls "ws.Workspace/LH.Lakehouse/Files"`
  - List lakehouse tables: `fab ls "ws.Workspace/LH.Lakehouse/Tables"`
  - Table schema: `fab table schema "ws.Workspace/LH.Lakehouse/Tables/gold/orders"`
- Query data (always prefer the wrapper scripts over raw `fab api` / `duckdb` / `sqlcmd`; they resolve IDs, hosts, and auth for you):
  - Semantic model (DAX): `python3 scripts/execute_dax.py "ws.Workspace/Model.SemanticModel" -q "EVALUATE TOPN(10, 'Orders')"`
  - Lakehouse or warehouse (DuckDB + Delta against OneLake): `python3 scripts/query_lakehouse_duckdb.py "ws.Workspace/LH.Lakehouse" -q "SELECT * FROM tbl LIMIT 10" -t gold.orders`
  - Lakehouse SQL endpoint, warehouse, or SQL database (T-SQL via `sqlcmd` + `az` session): `python3 scripts/query_sql_endpoint.py "ws.Workspace/LH.Lakehouse" -q "SELECT TOP 10 * FROM dbo.orders"`
- Set properties for an item or workspace: `fab set "ws.Workspace/Item.Notebook" -q displayName -i "New Name"` or `fab set "ws.Workspace" -q description -i "Production environment"`
- Review or manage permissions:
  - Item ACL: `fab acl ls "ws.Workspace/Model.SemanticModel"` then `fab acl set "ws.Workspace/Model.SemanticModel" -I user@contoso.com -R Read`
  - Workspace roles: `fab acl ls "ws.Workspace"` then `fab acl set "ws.Workspace" -I user@contoso.com -R Member`
- Deploy items to Fabric: `fab import "ws.Workspace/New.Notebook" -i ./local-path/Nb.Notebook -f`
- Download items from Fabric: `fab export "ws.Workspace/Nb.Notebook" -o ./backup -f` (always `mkdir -p ./backup` first)
- Copy or move items between workspaces: `fab cp "dev.Workspace/Item.Notebook" "prod.Workspace" -f` or `fab mv "ws.Workspace/Old.Notebook" "ws.Workspace/New.Notebook" -f`
- Open item in Fabric via browser: `fab open "spaceparts-dev.SpaceParts/Amazing Report.Report"`
- Using Fabric or Power BI APIs: `fab api -A powerbi "groups/<ws-id>/datasets/<model-id>/refreshes" -X post -i '{"type":"Full"}'` or `fab api "workspaces/<ws-id>/items"`
- Using Azure CLI (advanced) when Fabric CLI doesn't suffice:
  - T-SQL over any SQL-capable item: use `scripts/query_sql_endpoint.py` (reuses `az login` via `ActiveDirectoryAzCli`; full walkthrough in querying-data.md)
  - Pass a Key Vault secret to a consumer without ever reading, echoing, or persisting it: `az login --service-principal -u <appId> -t <tenantId> --password "$(az keyvault secret show --vault-name <vault> --name <secret> --query value -o tsv)"` ; command substitution pipes the secret directly into the child process arg list, never stdout, a file, or a named shell variable
  - Full fab-vs-az decision matrix: fab-vs-az-cli.md
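A minimal illustration of why the command-substitution pattern above keeps the secret off stdout and out of shell variables. `consume` is a hypothetical stand-in for the real `az login --password ...` consumer:

```shell
# consume stands in for any CLI that accepts a secret as an argument.
consume() { printf 'received %s chars\n' "${#1}"; }

# The substituted value travels straight into consume's argument list;
# it is never echoed, written to a file, or bound to a named variable.
consume "$(printf '%s' 's3cr3t-value')"
```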
## Essential Concepts
For information about any concepts related to Power BI or Fabric, you must search or fetch via the `microsoft-learn` MCP server (or the `pbi-search` CLI as an alternative) and ask the user questions with the AskUserQuestion tool; NEVER guess or make assumptions.
### Workspaces
- Workspaces are containers for items like Notebooks (and other ETL items), Lakehouses (and other data items), SemanticModels, Reports (and other consumption items), and OrgApps.
- Workspaces can be assigned to different things:
- Deployment Pipelines for lifecycle management (Dev, Test, Prod, etc.)
- Domains for governance and tenant structuring
- Capacities for licensing and resources (Fabric or Premium capacities only; PPU and Pro work differently)
- Git repositories for Source Control via Git integration
## Key Patterns

Pay special attention to each of the following areas when using the Fabric CLI:
### Path Format

Fabric uses filesystem-like paths with type extensions:

`"WorkspaceName.Workspace/ItemName.ItemType"`

You must quote paths with spaces and punctuation:

`"Workspace Name.Workspace/Semantic Model Name.SemanticModel"`

For lakehouses this is extended into files and tables:

`WorkspaceName.Workspace/LakehouseName.Lakehouse/Files/FileName.extension` or `WorkspaceName.Workspace/LakehouseName.Lakehouse/Tables/TableName`

For Fabric capacities you have to use `fab ls .capacities`.

Examples:

- `"Production Workspace.Workspace/Sales Report.Report"`
- `Data.Workspace/MainLH.Lakehouse/Files/data.csv`
- `Data.Workspace/MainLH.Lakehouse/Tables/dbo/customers`
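The path convention above can be sketched with a small parser. This helper is illustrative only and not part of the fab CLI:

```python
# Illustrative parser for "Workspace.Workspace/Item.ItemType[/subpath]" paths.
def parse_fabric_path(path: str) -> dict:
    segments = path.strip("/").split("/")
    workspace, _, ws_ext = segments[0].rpartition(".")
    parsed = {"workspace": workspace, "workspace_ext": ws_ext}
    if len(segments) > 1:
        item, _, item_ext = segments[1].rpartition(".")
        parsed["item"], parsed["item_ext"] = item, item_ext
    if len(segments) > 2:
        # Lakehouse extension: Files/<file> or Tables/<schema>/<table>
        parsed["subpath"] = "/".join(segments[2:])
    return parsed

print(parse_fabric_path("Production Workspace.Workspace/Sales Report.Report"))
print(parse_fabric_path("Data.Workspace/MainLH.Lakehouse/Tables/dbo/customers"))
```

Note how the workspace segment keeps its spaces; this is exactly why paths with spaces must be quoted on the command line.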
### Common Item Types

- `.Workspace` ; Workspaces
- `.SemanticModel` ; Power BI datasets
- `.Report` ; Power BI reports
- `.Notebook` ; Fabric notebooks
- `.DataPipeline` ; Data pipelines
- `.Lakehouse` / `.Warehouse` / `.SQLDatabase` ; Data artifacts
- `.SparkJobDefinition` ; Spark jobs
- `.AISkill` ; Fabric Data Agents
- `.MirroredDatabase` / `.MirroredWarehouse` ; Mirrored databases
- `.Environment` ; Spark environments
- `.UserDataFunction` ; User data functions

Full list: you must use `fab desc` or `fab desc .<ItemType>` to check syntax and types if the user asks about an item type not listed above.
### JMESPath Queries

Filter and transform JSON responses with `-q`:

```shell
# Get single field
-q "id"
-q "displayName"

# Get nested field
-q "properties.sqlEndpointProperties"
-q "definition.parts[0]"

# Filter arrays
-q "value[?type=='Lakehouse']"
-q "value[?contains(name, 'prod')]"

# Get first element
-q "value[0]"
-q "definition.parts[?path=='model.tmdl'] | [0]"
```
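If the JMESPath syntax is unfamiliar, the filters above behave like the plain-Python equivalents below (the response shape is fabricated purely for illustration):

```python
# A fabricated fab api-style response used only to illustrate the filters.
response = {"value": [
    {"name": "prod-lh", "type": "Lakehouse"},
    {"name": "dev-nb", "type": "Notebook"},
    {"name": "prod-wh", "type": "Warehouse"},
]}

# -q "value[?type=='Lakehouse']"  -> equality filter over an array
lakehouses = [v for v in response["value"] if v["type"] == "Lakehouse"]

# -q "value[?contains(name, 'prod')]"  -> substring filter
prod_items = [v for v in response["value"] if "prod" in v["name"]]

# -q "value[0]"  -> first element
first = response["value"][0]

print([v["name"] for v in lakehouses])   # ['prod-lh']
print([v["name"] for v in prod_items])   # ['prod-lh', 'prod-wh']
print(first["name"])                     # prod-lh
```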
### Using fab api

`fab` has an `api` escape hatch that lets you call any API even if it doesn't have primary commands.
### Variable Extraction Pattern

To use `fab api` you need item IDs. Extract them like this:

```shell
WS_ID=$(fab get "ws.Workspace" -q "id" | tr -d '"')
MODEL_ID=$(fab get "ws.Workspace/Model.SemanticModel" -q "id" | tr -d '"')

# Then use in API calls
fab api -A powerbi "groups/$WS_ID/datasets/$MODEL_ID/refreshes" -X post -i '{"type":"Full"}'
```
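The `tr -d '"'` in the pattern above matters because `fab get -q "id"` returns the value JSON-quoted. A quick offline sketch with a sample value (not a real ID):

```shell
RAW='"3f2a1b"'                           # sample of what fab get -q "id" prints
WS_ID=$(printf '%s' "$RAW" | tr -d '"')  # strip the JSON quotes
echo "groups/$WS_ID/datasets"            # clean, splice-ready path segment
```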
### Admin APIs (Requires Admin Role)

Don't use admin commands or APIs if the user doesn't have admin access. Some examples:

```shell
# Find semantic models by name (cross-workspace)
fab api "admin/items" -P "type=SemanticModel" -q "itemEntities[?contains(name, 'Sales')]"

# Find all notebooks
fab api "admin/items" -P "type=Notebook" -q "itemEntities[].{name:name,workspace:workspaceId}"

# Find all lakehouses
fab api "admin/items" -P "type=Lakehouse"

# Common types: SemanticModel, Report, Notebook, Lakehouse, Warehouse, DataPipeline, Ontology
```

For the full admin API reference (cross-workspace discovery, tenant settings read/update, capacity/domain/workspace overrides, activity events): admin.md
### Error Handling & Debugging

```shell
# Show response headers
fab api workspaces --show_headers

# Verbose output
fab get "Production.Workspace/Item" -v

# Save responses for debugging
fab api workspaces -o /tmp/workspaces.json
```
## Common workflows

These are the most common workflows you'll encounter in Fabric.
### Finding or exploring workspaces, items, or metadata

| Command | Purpose | Example |
|---|---|---|
| `fab ls` | List workspaces / items | `fab ls "ws.Workspace" -l` |
| `fab exists` | Check if a path exists | `fab exists "ws.Workspace/Item"` |
| `fab get` | Get item details | `fab get "ws.Workspace/Item"` |
| `fab desc` | Supported commands per type | `fab desc .<ItemType>` |

Flags:

- `-l` (long listing)
- `-a` (show hidden items)
- `-q` (JMESPath filter)
- `-v` (verbose output)
- `-o` (save response to file)

Fabric discovery follows a drill-down pattern:

- Browsing:
  - List workspaces: `fab ls`
  - List items in a workspace: `fab ls "ws.Workspace" -l`
  - Confirm a path exists: `fab exists "ws.Workspace/Item"`
  - Check what commands an item type supports: `fab desc .<ItemType>`
- Inspection:
  - Get item details: `fab get "ws.Workspace/Item"`
  - Pull a single field: `fab get "ws.Workspace" -q "id"`
- Cross-workspace search:
  - Rich metadata, no admin required: `scripts/search_across_workspaces.py`
  - Downstream reports for a given model: `scripts/get-downstream-reports.py`
  - Tenant-wide admin APIs: admin.md

Check references before exploring:
### Querying data

| Command | Purpose |
|---|---|
| `fab get -q "definition"` | Get model schema |
| `scripts/execute_dax.py` | Execute DAX |
| `fab ls` | Browse files / tables |
| `fab table schema` | Lakehouse table schema |
| `fab cp` | Upload / download OneLake file |
| `scripts/query_lakehouse_duckdb.py` | Query Delta tables (requires DuckDB) |
| `scripts/query_lakehouse_duckdb.py --sql` | Query raw files (requires DuckDB) |

Flags:

- `-A fabric|powerbi|storage|azure` (API audience)
- `-X get|post|put|delete|patch` (HTTP method)
- `-i` (JSON body or file)
- `-f` (skip sensitivity prompt on definition pulls)

Fabric exposes three query paths depending on the source; always prefer the wrapper scripts, which resolve IDs, hosts, and auth for you:

- Semantic models (DAX):
  - Find model fields first: `fab get "ws.Workspace/Model.SemanticModel" -q "definition"`
  - Query: `scripts/execute_dax.py`
- Lakehouses / Warehouses via Delta over OneLake (DuckDB):
  - Query a single table: `scripts/query_lakehouse_duckdb.py` (use `tbl` as a placeholder and pass `-t schema.table`)
  - Multi-table joins or raw files in `Files/`: pass `--sql` with your own `delta_scan()` / `read_csv` / `read_json_auto` calls
  - Optionally scaffold a Direct Lake model instead: `scripts/create_direct_lake_model.py`
- Lakehouse SQL endpoint, Warehouse, or SQL Database (T-SQL via `sqlcmd`):
  - Query any SQL-capable item: `scripts/query_sql_endpoint.py` (auto-detects host per item type, reuses `az login` via `ActiveDirectoryAzCli`)
  - Prefer this over DuckDB when you need `INFORMATION_SCHEMA`, `sys.*` metadata, CTEs, or window functions

Check references before writing queries:
### Changing metadata or access (descriptions, tags, endorsement, properties, bindings, permissions)

| Command | Purpose |
|---|---|
| `fab set` | Update property |
| `fab mv` | Rename / move item |
| `fab acl ls` | List permissions |
| `fab acl set` | Grant permission |
| `fab acl rm` | Revoke permission |
| `fab label set` | Set sensitivity label |

Flags:

- `-q <field>` + `-i <value>` (set a single property)
- `-I` (object ID or UPN for `fab acl`)
- `-R Admin|Member|Contributor|Viewer` (role for `fab acl set`)
- `-f` (skip confirmation; ask user first if sensitivity labels are in play)

Metadata and access changes fall into a few groups:

- Properties (displayName, description, sensitivity config):
  - Native update: `fab set "<path>" -q <field> -i "<value>"`
  - Capture current state first so you can revert: `fab get -v -o /tmp/before.json`
- Endorsement, certification, and tags (no first-class `fab` commands):
  - Patch via `fab api` with item-specific endpoints
  - Tag workflow: tags.md
  - Endorsement patterns: reference.md
- Folder placement:
  - Move items between workspace subfolders: folders.md
- Access control and sensitivity labels:
  - Grant / revoke: `fab acl set`, `fab acl rm`
  - Set sensitivity label: `fab label set`
  - Verify the principal first: `az ad user show`
  - Never change permissions or labels without explicit user confirmation
- Bindings:
  - Rebind a thin `.Report` to a different `.SemanticModel`: reports.md
  - Semantic model source rebinds (e.g. swap a lakehouse): semantic-models.md

Check references before changing metadata:
### Working with workspaces

| Command | Purpose |
|---|---|
| `fab mkdir` | Create workspace / item |
| `fab assign` | Attach capacity / domain |
| `fab unassign` | Detach capacity / domain |
| `fab start` / `fab stop` | Resume / pause capacity |
| `fab cp -r` | Fork workspace |
| `fab rm` | Soft-delete (see recovery) |

Flags:

- `-P key=value` (creation params for `fab mkdir`)
- `-W` (target workspace for `fab assign` / `fab unassign`)
- `-r` (recursive copy/move)
- `-bpc` (block on path collision for `fab cp`)
- `-f` (skip confirmation)

Workspace-scope operations fall into a few groups:

- Create and provision:
  - Create workspace: `fab mkdir "<Name>.Workspace" -P capacityname=<cap>`
  - Attach capacity or domain: `fab assign .capacities/<cap>.Capacity -W <ws>.Workspace`
  - Planning context, create/get/set surface, large storage format, Spark pools, OneLake defaults, Git: workspaces.md
- Copy, fork, download:
  - Duplicate a workspace in-tenant: `fab cp -r "dev.Workspace" "prod.Workspace"`
  - Dry-run the source tree first: `fab ls "dev.Workspace"`
  - Full local snapshot (items + lakehouse files): `scripts/download_workspace.py`
- Permissions:
  - Inspect / grant / revoke: `fab acl ls | set | rm`
  - Tenant-wide governance audit: use the `audit-tenant-settings` skill from the `fabric-admin` plugin
- Connections and gateways (bound to, but outside, the workspace):
  - Credential types (WorkspaceIdentity, SPN, Basic), OAuth2 limits: connections.md
  - Datasource binding, credential rotation: gateways.md
- Folders inside a workspace:
  - Layout, nesting, conventions: folders.md

Check references before modifying workspaces:
### Executing or scheduling jobs (notebooks, notebook cells, pipelines, semantic model refresh)

| Command | Purpose |
|---|---|
| `fab job run` | Run synchronously |
| `fab job start` | Run asynchronously |
| `fab job run-list` | List executions |
| `fab job run-status` | Check status |
| `fab job run-cancel` | Cancel a job |
| `fab api` (refreshes endpoint) | Trigger semantic model refresh |

Flags:

- `-P key:type=value` (parameters; type is `string|int|bool`)
- `--id` (job run ID)
- `-w` (wait on cancel)
- `--timeout` (overall timeout for synchronous runs)
- `--polling_interval` (status poll cadence)
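The `-P key:type=value` convention can be sketched as a tiny parser. This is illustrative only; fab does the real parsing internally:

```python
# Illustrative parser for -P key:type=value job parameters (string|int|bool).
def parse_param(spec: str):
    key_type, _, raw = spec.partition("=")
    key, _, typ = key_type.partition(":")
    casters = {"string": str, "int": int,
               "bool": lambda s: s.lower() == "true"}
    return key, casters.get(typ, str)(raw)

print(parse_param("date:string=2025-01-01"))  # ('date', '2025-01-01')
print(parse_param("retries:int=3"))           # ('retries', 3)
print(parse_param("full:bool=true"))          # ('full', True)
```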
Jobs map to different endpoints depending on item type:

- Notebooks and pipelines:
  - Run synchronously: `fab job run "ws/ETL.Notebook" -P date:string=2025-01-01`
  - Run asynchronously: `fab job start "ws/ETL.Notebook"`
  - Check status: `fab job run-status "ws/Nb.Notebook" --id <job-id>`
  - List history: `fab job run-list "ws/Nb.Notebook"`
  - Python / PySpark kernels, Livy sessions, cell-level CRUD: notebooks.md
- Semantic model refresh (not exposed as `fab job`):
  - Trigger: `fab api -A powerbi "groups/<ws-id>/datasets/<model-id>/refreshes" -X post -i '{"type":"Full"}'`
  - Check the current run before starting a new one (409 if already running): `fab api -A powerbi "groups/<ws-id>/datasets/<model-id>/refreshes?\$top=1"`
  - Enhanced refresh, incremental policies, partition targeting: semantic-models.md
- Dataflow refresh:
  - Gen1 and Gen2 have different endpoints: dataflows.md
- Scheduling:
  - Per-item schedules via the scheduler API: notebooks.md, reference.md

Check references before running jobs:
### Fabric admin operations (auditing, management)

| Command | Purpose |
|---|---|
| `fab api "admin/items"` | Cross-workspace item search |
| `fab api "admin/workspaces"` | Workspace inventory |
| `fab api "admin/tenantsettings"` | Tenant settings |
| `fab api "admin/capacities"` | Capacity inventory |
| `fab api -X post` (see admin.md) | Update tenant setting |

Flags:

- `-P key=value` (query params, e.g. `type=SemanticModel`)
- `-q` (JMESPath filter)
- `-X post` + `-i` (write ops)
- `--show_headers` (inspect `Retry-After` on 429)

Admin-scope work is gated behind the Fabric / Power BI admin role. Confirm access first with `fab api "admin/capacities" 2>&1 | head -5`; if it errors, stop rather than retry.
Two entry points cover most admin tasks:

- Governance audits (tenant settings, delegated overrides, Entra SG scoping):
  - Use the `audit-tenant-settings` skill from the `fabric-admin` plugin. It owns the curated metadata baseline, the audit + change-detection script, delegated-override enumeration, and the Entra SG investigation workflow.
  - Invoke it whenever the question combines tenant posture with group membership, override scope, or drift against the baseline.
- Raw admin APIs (cross-workspace search, activity events, artifact access, item search):
  - Patterns in admin.md
  - Rate limit: 25 write requests / minute; honor `Retry-After` on 429
  - Print the exact command and wait for user confirmation before any destructive admin operation
Check references before admin work:
- admin.md
- permissions.md for workspace / item ACL exposure audits
### Definitions and deployment (item definitions, deployment pipelines, git integration, CI/CD)

| Command | Purpose |
|---|---|
| `fab get -q "definition"` | Read raw definition |
| `fab export` | Export item to local |
| `fab import` | Import item from local |
| `fab cp` | Copy between workspaces |
| `fab api` | Deployment pipelines API |

Flags:

- `-o` (output path for `fab export`)
- `-i` (input path or JSON body for `fab import`)
- `--format` (definition format for export / import)
- `-f` (skip overwrite and sensitivity prompts)

Every Fabric item has a serializable definition. Move definitions between environments depending on scope:

- Single item:
  - Round-trip locally: `fab export` then `fab import` (always `mkdir -p` the output directory first; `fab export` does not create intermediate directories and fails with `[InvalidPath]`)
  - Same-tenant shortcut, no local hop: `fab cp "dev/Item" "prod.Workspace"`
- Semantic model as PBIP (TMDL + blank report):
  - Power BI Desktop and git-ready format: `scripts/export_semantic_model_as_pbip.py`
- Full workspace snapshot (items + lakehouse files):
  - Backups, offline analysis, cross-tenant forks: `scripts/download_workspace.py`
- Promotion between Dev, Test, Prod:
  - Fabric deployment pipelines API (covers all item types)
  - Power BI pipelines API (Power BI items only, but finer-grained deploy flags like `allowPurgeData`, `allowTakeOver`)
  - When to use each, selective deploy, LRO polling: deployment-pipelines.md
- Git integration (connect workspace to repo, branch, commit, update from git):
  - Workspace git section in workspaces.md

Check references before deploying:

- import-download-deploy.md ; export / import / copy / move, PBIP round-trips, migration patterns, rebinding gotchas
- deployment-pipelines.md
- semantic-models.md
- reports.md
- paginated-reports.md
- notebooks.md
- workspaces.md
## Related skills

- `audit-tenant-settings` (in the `fabric-admin` plugin) ; Fabric governance workflow covering tenant settings, delegated overrides (capacity / domain / workspace), and the Entra security groups those settings reference. Read-only; holds the curated metadata baseline and the audit + change-detection script.
## Gotchas

- IMPORTANT: DON'T try to use `fab ls` on items that aren't data items (`.Lakehouse`, `.Warehouse`, etc); use `fab ls` to find workspaces and items, and use `fab get` to look at definitions
- ALWAYS use the `-f` flag when using `fab get`, `fab import`, `fab export`, etc. as described above
- ONLY fall back to `fab api` when a command doesn't exist
## References

Skill references:
- Import, Download, and Deploy - Export / import / copy / move items, PBIP round-trips, dev-to-prod migration patterns
- Querying Data - Query semantic models in DAX and lakehouses or warehouses in SQL with DuckDB
- Lakehouses - Endpoints, file/table operations, OneLake paths
- Warehouses - Create, browse, query via DuckDB, load data
- SQL Databases - Create, browse, query via DuckDB, auto-mirroring
- Semantic Models - TMDL, DAX, refresh, storage mode
- Reports - Export, import, visuals, fields
- Paginated Reports - RDL upload, export-to-file, datasources, parameters
- Notebooks - Python/PySpark kernels, metadata, cell CRUD, Livy execution, scheduling
- Workspaces - Create, manage, permissions
- Permissions - Sharing and distribution, workspace roles, item permissions, apps, embed, B2B, deployment pipeline permissions, licensing and capacity SKUs
- Deployment Pipelines - CI/CD, deploy stages, selective deploy, LRO polling
- Dataflows - Gen1 and Gen2, refresh, publish, admin
- Dashboards - Tiles, clone (dashboards are not reports)
- Org Apps - Read-only API for distributed content packages
- Scorecards - Goals, check-ins, status rules (Preview API)
- Gateways - Datasources, credentials, dataset binding
- Folders - Organize items into folders via API; includes best practices for structuring workspaces
- Tags - Create, apply, and audit tenant/domain tags on items and workspaces via `fab api` (no native `fab tag` command)
- Admin APIs - Cross-workspace search, tenant operations, governance
- API Reference - Capacities, domains, misc API patterns
- Connections - Create, update, list connections programmatically; credential types (WorkspaceIdentity, SPN, Basic); OAuth2 limitations
- Full Command Reference - All commands detailed
Scripts (scripts that you can execute):
- search_across_workspaces.py ; cross-workspace item search via DataHub V2 API; filters by type, owner, storage mode, last visited, capacity SKU
- get-downstream-reports.py ; find all reports connected to a given semantic model across accessible workspaces (no admin required)
- execute_dax.py ; execute DAX queries against semantic models; output as table, csv, or json
- query_lakehouse_duckdb.py ; query lakehouse or warehouse Delta tables via DuckDB against OneLake (reuses `az login`); output as table, csv, or json
- query_sql_endpoint.py ; query lakehouse SQL endpoint, warehouse, or SQL database via `sqlcmd` (reuses `az login` through `ActiveDirectoryAzCli`); output as table, csv, or json
- export_semantic_model_as_pbip.py ; export a semantic model as a PBIP project (TMDL definition + blank report)
- download_workspace.py ; download a full workspace with all item definitions and lakehouse files
See scripts/README.md for detailed usage, arguments, and examples. Always search the `scripts/` folder before writing a new helper; a script may already exist for the task.
External references (request markdown when possible):
- fab CLI: GitHub Source | Docs
- Microsoft: Fabric CLI Learn
- APIs: Fabric API | Power BI API
- DAX: dax.guide ; use `dax.guide/<function>/`, e.g. `dax.guide/addcolumns/`
- Power Query: powerquery.guide ; use `powerquery.guide/function/<function>`
- Power Query Best Practices