DP-600 Fabric Analytics Engineer – Structured Study Notes

Published: December 6, 2025 at 11:23 AM EST
4 min read
Source: Dev.to

Overview

The DP‑600 exam focuses on designing, building, governing, and optimizing analytics solutions in Microsoft Fabric.
Key responsibilities include:

  • Maintaining a data analytics solution
  • Preparing data
  • Implementing and managing semantic models

Governance, Administration, and Lifecycle

Security & Governance

  • Layers of security

    • Workspace‑level roles
    • Item‑level permissions
    • Data‑level security (RLS / CLS / OLS)
    • File‑level security
    • Sensitivity labels
    • Endorsement
  • Row‑Level Security (RLS) – restricts rows a user can see (e.g., a “Bangladesh” role sees only rows where Country = 'BD').

  • Column‑Level Security (CLS) – hides columns or whole tables from specific roles (e.g., hide Salary, SSN).

  • File‑system security – separate permissions for browsing the OneLake Files area versus querying tables through the SQL endpoint or semantic model.

  • Sensitivity labels – classify data (Public, General, Confidential, Highly Confidential – No Export) and can enforce:

    • Block export to Excel/CSV
    • Block publish to web
    • Restrict external sharing
  • Endorsement – signals trustworthiness of items:

    • Promoted – team‑level confidence
    • Certified – organization‑level single source of truth (restricted to data stewards/admins)
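
The row‑ and column‑level controls above can be sketched in T‑SQL against a Fabric Warehouse. This is a minimal illustration, not exam‑verified syntax for every Fabric surface; the table, role, and user names (dbo.Sales, dbo.Employee, AnalystRole, bd_analyst@contoso.com) are hypothetical:

```sql
-- Hypothetical objects: dbo.Sales (with a Country column), dbo.Employee,
-- role AnalystRole, user bd_analyst@contoso.com.

-- RLS: an inline predicate function plus a security policy filters rows per user.
CREATE FUNCTION dbo.fn_CountryPredicate (@Country AS varchar(2))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @Country = 'BD'
      AND USER_NAME() = 'bd_analyst@contoso.com';
GO

CREATE SECURITY POLICY dbo.SalesCountryFilter
    ADD FILTER PREDICATE dbo.fn_CountryPredicate(Country) ON dbo.Sales
    WITH (STATE = ON);
GO

-- CLS: grant SELECT only on non-sensitive columns, so Salary and SSN
-- stay hidden from members of AnalystRole.
GRANT SELECT ON dbo.Employee (EmployeeId, FullName, Department) TO AnalystRole;
```

On the semantic‑model side, the equivalent RLS rule is a DAX filter expression on the role, e.g. [Country] = "BD".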

Workspace Roles

| Role | Capabilities |
| --- | --- |
| Admin | Full control: change settings, add/remove users, delete items. |
| Member / Contributor | Create and edit items (Lakehouse, Dataflow, Reports, Pipelines, Notebooks). Cannot change workspace‑level admin settings (e.g., capacity). |
| Viewer | Read‑only: view reports, dashboards, semantic models, etc. Cannot create, edit, or publish new items. |

Exam tip: A user who can open a report but cannot edit it or create a new dataflow is likely a Viewer; they need at least Contributor rights for those actions.

Item‑Level Permissions

Each item inside a workspace (Lakehouse, Warehouse, Semantic model, Report, Notebook, Dataflow) has its own permission set.
A user may be a Contributor at the workspace level but still lack:

  • Build permission on a semantic model → cannot create new reports or use “Analyze in Excel”.

Exam tip: “User can view a report but cannot use ‘Analyze in Excel’ or create a new report from the dataset.” → Missing Build permission on the semantic model.

Tenant, Capacity, and Workspace Settings

| Scope | Primary Controls |
| --- | --- |
| Tenant | Organization‑wide policies: Fabric enablement, export policies, guest/external sharing, sensitivity‑label integration, trial activation. |
| Capacity | Compute configuration (e.g., F64, F128, Premium), region, workload settings (Spark, Dataflows, Pipelines, DirectLake), pause/resume, concurrency limits, memory. |
| Workspace | Team/project scope; assigned capacity (shared vs. Fabric/Premium); stores Lakehouses, Warehouses, Dataflows, Pipelines, Semantic models, Reports. |

Exam tip: “A workspace does not show the option to create a Lakehouse or Dataflow Gen2, but users can still create reports.” → The workspace is on a shared capacity or the tenant has disabled Fabric item creation.

Common Capacity‑Related Issues

  • Dataflows stuck in queue → capacity under pressure / workload disabled
  • Notebooks fail to start → Spark capacity exhausted
  • DirectLake slowdown → overloaded capacity, cache eviction

Development Lifecycle

Git Integration

  • Connect a Fabric workspace to a Git repository.
  • Store:
    • Reports as PBIP / PBIR (text‑based)
    • Notebooks (.ipynb or scripts)
    • SQL scripts
    • Pipeline definitions (code)

Benefits: collaboration, PR‑based review, history/rollback, CI/CD integration.

PBIX vs. PBIP / PBIR

| Format | Characteristics |
| --- | --- |
| PBIX | Binary, traditional Power BI file; harder to source‑control. |
| PBIP / PBIR | Text‑based project structure; separates metadata, model, and layout into files/folders; Git‑friendly. |

Exam focus: Understand why PBIP/PBIR is preferred for version control and automated deployments.
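
For orientation, a PBIP project looks roughly like this on disk. This is a simplified sketch; the exact files vary by Power BI Desktop version, and “Sales” is a placeholder name:

```
Sales.pbip                -- entry point opened by Power BI Desktop
Sales.Report/
  definition.pbir         -- report definition (PBIR)
Sales.SemanticModel/
  definition/             -- model as text files (TMDL)
```

Because every piece is plain text, diffs and pull‑request reviews work the same way they do for application code.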

Deployment Pipelines

  • Stages: Development → Test → Production
  • Deployable items: semantic models (datasets), reports, dashboards, some Fabric items (via scripts).
  • Stage‑specific rules/parameters (e.g., different data‑source connections per environment).

Exam tip: “Promote changes from Dev to Prod while pointing to different databases.” → Use deployment pipeline parameters for data‑source bindings.

XMLA Endpoint

Provides enterprise‑level management of semantic models via external tools such as:

  • Tabular Editor
  • SQL Server Management Studio (SSMS)
  • Custom scripts for deployment and partition management

Typical actions include partitioning large tables, refreshing models, and applying role‑based security programmatically.
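
As an illustration, a TMSL refresh command sent through the XMLA endpoint (from SSMS or a script) might look like the following; the database and table names (SalesModel, FactSales) are hypothetical:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "SalesModel",
        "table": "FactSales"
      }
    ]
  }
}
```

Tabular Editor uses the same endpoint to script out partitions and security roles, which is why it appears so often in enterprise‑deployment scenarios.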

Monitoring & Impact Analysis

  • Lineage visualizations show data flow from source (Lakehouse, Warehouse) through dataflows, pipelines, and semantic models to reports.
  • Impact analysis helps assess downstream effects of schema changes or security updates.

Summary of Key Exam Patterns

| Scenario | Underlying Concept |
| --- | --- |
| User can view a report but cannot edit or create a dataflow | Viewer role vs. needed Contributor rights |
| User cannot use “Analyze in Excel” on a dataset | Missing Build permission on the semantic model |
| User can query a Lakehouse table via SQL but cannot open the Files area | Lack of OneLake file‑system permission |
| Export blocked after applying a label | “Highly Confidential – No Export” sensitivity label |
| Central BI team wants a single‑source‑of‑truth endorsement | Use the Certified endorsement |
| Workspace missing Lakehouse/Dataflow creation options | Workspace on shared capacity, or tenant‑level Fabric item creation disabled |
| Promote Dev → Prod with different DB connections | Deployment pipeline with environment‑specific parameters |