RuiRomano/fabric-cli-powerbi-cicd-sample



This repository demonstrates a Fabric CI/CD scenario using fabric-cli and GitHub. It can be easily adapted to other use cases.

  • Fabric item source code is located in the /src folder.
  • Developers work in isolation on a feature branch.
  • Pull requests to the main branch trigger a best-practice analysis pipeline, bpa.yml, that checks both semantic models and reports. This step leverages community tools such as Tabular Editor and PBI-Inspector.
  • A successful merge into the main branch triggers the deployment pipeline, deploy.yml, which automatically deploys to the development environment.
  • The same deploy.yml pipeline also runs on a daily schedule to deploy to the production environment (see the trigger sketch after this list).
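
A minimal sketch of the triggers implied by the two deploy.yml bullets above — run on merges to main and on a daily schedule. The branch filter and cron expression shown here are assumptions for illustration, not copied from the repository's workflow:

# deploy.yml — trigger section only (illustrative)
on:
  push:
    branches:
      - main              # merge to main -> deploy to the development environment
  schedule:
    - cron: "0 6 * * *"   # assumed daily schedule -> deploy to the production environment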

Developer workflow demo (PR -> BPA -> Merge -> Deploy)

FABCLI_CICD_UserFlow.mp4

Instructions

Run scripts locally

Make sure you have the Fabric CLI installed. If not, run:

$ pip install ms-fabric-cli
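
Once installed, you can sign in and browse your tenant from the terminal before running any of the scripts. A minimal sketch assuming interactive browser sign-in; the GitHub workflows instead authenticate with the service principal credentials configured below, and exact commands and flags may vary by CLI version:

$ fab auth login     # interactive sign-in (opens a browser)
$ fab ls             # list the workspaces the signed-in identity can access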

Secrets and variables

Before running the GitHub Actions workflows, configure the following repository secrets and variables:

| Name | Type | Value |
| --- | --- | --- |
| FABRIC_CLIENT_ID | Secret | Service principal client ID from your tenant |
| FABRIC_CLIENT_SECRET | Secret | Service principal client secret |
| FABRIC_TENANT_ID | Secret | Your tenant ID |
| FABRIC_ADMIN_UPNS | Variable | Entra user object ID that will be assigned to the items created by the service principal |
| FABRIC_CAPACITY | Variable | Your Fabric capacity name, required to deploy Fabric items such as data pipelines |
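
For illustration, this is roughly how a workflow step could pass these values to the Fabric CLI. The step name, environment variable names, and exact fab auth login flags are assumptions and may differ from the actual deploy.yml; the secrets and vars contexts are standard GitHub Actions syntax:

# illustrative workflow step, not copied from deploy.yml
- name: Authenticate Fabric CLI
  env:
    FAB_CLIENT_ID: ${{ secrets.FABRIC_CLIENT_ID }}         # hypothetical env var names
    FAB_CLIENT_SECRET: ${{ secrets.FABRIC_CLIENT_SECRET }}
    FAB_TENANT_ID: ${{ secrets.FABRIC_TENANT_ID }}
    FAB_CAPACITY: ${{ vars.FABRIC_CAPACITY }}
  run: |
    # service principal login; flags assumed — check `fab auth login --help` for your CLI version
    fab auth login -u "$FAB_CLIENT_ID" -p "$FAB_CLIENT_SECRET" --tenant "$FAB_TENANT_ID"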

Topology

The topology below is described as a Mermaid flowchart covering the GitHub branches and pipelines and the Dev and Prod Fabric workspaces:

flowchart TB

    subgraph GitHub
        direction TB
        feature[feature branch]:::gh_branch
        main[main branch]:::gh_branch
        build[build pipeline - BPA rules]:::gh_action
        deploy[deploy pipeline]:::gh_action        
    end

    data_dev[(Files - Dev)]

    subgraph Dev
        direction TB
        
        subgraph workspace_dev[Fabric dev workspace]
            datapipeline_dev[data pipeline]
            notebook_dev[notebook]
            lakehouse_dev[lakehouse]
            semanticmodel_dev[semantic model]
            reports_dev[reports]
        end     

        data_dev-->datapipeline_dev
        datapipeline_dev-->lakehouse_dev
        notebook_dev-->lakehouse_dev
        semanticmodel_dev-->reports_dev
        lakehouse_dev-->semanticmodel_dev
    end

    data_prd[(Files - Prod)]

    subgraph Prod
        direction LR
        
        subgraph workspace_data[Fabric data workspace]
            datapipeline_prd[data pipeline]
            notebook_prd[notebook]
            lakehouse_prd[lakehouse] 
        end
        subgraph workspace_analytics[Fabric analytics workspace]
            semanticmodel_prd[semantic model]
            reports_prd[reports]
        end        
                
        data_prd-->datapipeline_prd
        
        datapipeline_prd-->lakehouse_prd
        notebook_prd-->lakehouse_prd
        semanticmodel_prd-->reports_prd    
        lakehouse_prd---semanticmodel_prd
    end
    
    feature-- PR --->build-- merge --> main --> deploy --> Dev
    
    deploy-- schedule every day --->Prod

    classDef gh_action fill: #1E90FF, color:white;
    classDef gh_branch fill:#e8b63e;    

