
Power BI Governance: Sharing and Deploying Power BI Content

Marcus Radue  |  
Sep 15, 2020
 
In this blog series, Marcus Radue, Data Analytics Engineer at Skyline Technologies, offers high-level guidance for implementing Power BI effectively in your organization. For a full overview on this topic, check out the original Power BI Governance A-Z Webinar.
 
In my previous blog in this series, I reviewed some Power BI delivery strategy options and how they might inform your licensing choice. We also touched on the importance of sharing content. In today’s blog, we are digging into the question, “How do you want to design the way your organization shares content, and what are the different ways to do that within Power BI?”
 

Apps

Apps are the preferred method of sharing Power BI content. In the Power BI service, an app packages and distributes content that has been published to a workspace. Apps can include dataflows, datasets, reports, paginated reports, Excel workbooks, and dashboards, and they work with either Power BI Pro or Premium licensing.
 

Workspaces

After you develop reports in the Power BI Desktop app, you can publish them to workspaces. That's where they sit until you either enable content in apps or share reports individually. We have several guidelines for organizations to follow when setting up workspaces in the Power BI service.
 

Separate Report and Dataset Files

If you're familiar with building reports in Power BI Desktop, then you know a single PBIX file can hold both the data model (or dataset) and all the visual content on your report pages. We recommend splitting these into separate PBIX files: one for just the data model or dataset, and another for the report side, including report pages and visualizations.
 
The reason for dividing the report and dataset files is that it encourages dataset re-use within your organization. Sharing datasets is far more efficient than having users build redundant copies of the same data across the organization. It also creates a cleaner setup where dataset files are managed like Analysis Services data models, and the report files are left to handle the visualization and cosmetic side of the Power BI report.
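To encourage re-use, report authors first need to discover which shared datasets already exist. Here is a minimal Python sketch against the Power BI REST API's datasets endpoint; the workspace id and access token are placeholders you would supply from your own tenant, and the token needs an appropriate read scope.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def datasets_url(workspace_id: str) -> str:
    """Build the REST URL that lists the datasets published to a workspace."""
    return f"{API_BASE}/groups/{workspace_id}/datasets"

def list_datasets(workspace_id: str, access_token: str) -> list:
    """Call the Power BI REST API and return the workspace's datasets,
    so report authors can connect to an existing shared dataset
    instead of creating a redundant copy."""
    req = urllib.request.Request(
        datasets_url(workspace_id),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```

In practice you would run this (or the equivalent Power BI management cmdlets) as part of an inventory script, then point new report PBIX files at the shared datasets it returns.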
 

Dividing Environments

Working with different customers and clients, I’ve seen several ways organizations divide and name their environments, including three-environment and two-environment approaches.
 
The standard approach is to have three environments: development, test, and production. This is recommended for organizations using deployment pipelines with Power BI Premium, since a pipeline automatically creates development, test, and production workspaces for you (so it makes a lot of sense to go that route). Having those three environments also keeps things standardized from a development strategy standpoint. I’ve also seen organizations use just two areas: a sandbox or test area, and a production workspace.
 
The environment approach you choose will largely depend on how you’re deploying content and how many resources you have to manage the content. You need resources to migrate reports through the environments you choose, and (with the three-environment approach) that means one extra environment to manage.
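With Premium deployment pipelines, promoting content between those stages can be scripted through the REST API's deploy-all operation. The sketch below only builds the request body; the pipeline id, the HTTP call, and authentication are left out, and the option names reflect my reading of the pipelines API rather than a definitive reference.

```python
def deploy_all_payload(source_stage: int) -> dict:
    """Request body for the deployment pipeline 'deploy all' operation.
    Stage 0 promotes Development -> Test; stage 1 promotes Test -> Production."""
    if source_stage not in (0, 1):
        raise ValueError("source stage must be 0 (Dev) or 1 (Test)")
    return {
        "sourceStageOrder": source_stage,
        "options": {
            "allowCreateArtifact": True,     # create items missing in the target stage
            "allowOverwriteArtifact": True,  # overwrite items that already exist there
        },
    }
```

A two-environment shop would only ever deploy from stage 1, effectively treating the pipeline's middle stage as its sandbox.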
 

Divide Workspaces by Business Content

Typically, we recommend dividing workspaces by business content; in other words, business subject matter determines the boundaries of each workspace. Keep these groupings broad enough to encompass multiple sets of data. You don't want to create a workspace for every single report or dataset; that's not efficient. Instead, group by business content or business units like sales, finance, manufacturing, purchasing, or other common business areas.
 
You may also divide workspaces by specific security groupings. The reasoning here is, "I have these 10 users who work in different departments, but they view the same types of content and data." It may make sense to group those individuals in their own workspace, such as an executive team or a management group that spans business units.
 

Divide Dataflows and Datasets

If you're utilizing dataflows within Power BI, consider dividing dataflows and datasets into different workspaces. Refer to this documentation for more recommended best practices on dataflows.
 

Use Security Groups for Apps and Workspaces

We strongly encourage using security groups, not only for app distribution but also for workspaces. Note that to assign security groups to workspaces, you need to be using the version 2 workspace experience rather than the legacy version 1 experience.
 
Version 2 workspaces have two additional roles compared to the version 1 experience (Contributor and Viewer, alongside Admin and Member). Make sure you understand what access each workspace role gives users.
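Granting a security group a workspace role can also be automated through the REST API's workspace users endpoint. This is a sketch of the request body only; the group object id is a placeholder, and the field names follow my understanding of that endpoint.

```python
# The four roles available in the version 2 workspace experience.
WORKSPACE_ROLES = {"Admin", "Member", "Contributor", "Viewer"}

def group_access_payload(aad_group_id: str, role: str) -> dict:
    """Request body that grants an Azure AD security group a workspace role,
    so membership is managed in the directory rather than user by user."""
    if role not in WORKSPACE_ROLES:
        raise ValueError(f"unknown workspace role: {role}")
    return {
        "identifier": aad_group_id,   # the security group's object id
        "principalType": "Group",
        "groupUserAccessRight": role,
    }
```

Managing access this way means onboarding a new analyst is a directory change, not a Power BI change.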
 

Standard Naming Convention

Whether you allow only certain people to create workspaces in your organization, or if you open that up to a larger group of people, make sure you standardize your naming convention.
 
Keeping things more consistent makes it easier from a management standpoint. It also makes it a lot easier to consume that information when workspaces are similarly named. We’ll dig into this point a little more when we cover monitoring in a future blog.
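One lightweight way to enforce a convention is to validate workspace names in an admin script. The convention below ("BusinessArea [Environment]") is purely hypothetical, shown only to illustrate the idea; substitute your own pattern.

```python
import re

# Hypothetical convention: "<Business Area> [<Environment>]", e.g. "Sales [Dev]".
NAME_PATTERN = re.compile(r"^[A-Z][A-Za-z ]+ \[(Dev|Test|Prod)\]$")

def is_valid_workspace_name(name: str) -> bool:
    """Check a workspace name against the organization's naming convention."""
    return NAME_PATTERN.fullmatch(name) is not None
```

Run a check like this over the output of a workspace inventory, and flag (or rename) anything that drifts from the standard before it multiplies.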
 
This wraps it up for my high-level recommendations on workspaces, but there are a few more suggestions below for getting the most out of your Power BI environments.
 

Avoid Content Packs

Avoid content packs: they are the legacy way of sharing Power BI content, and apps have replaced them. If your organization is still using content packs, seriously consider a migration strategy over to apps.
 

Row-Level Security

Row-level security may be an option for sharing content with a larger group of end users who should only see certain rows of your dataset. A common example is a sales team: the entire team may have access to the same broad categories of data, but each salesperson only needs to see information for their own territory, region, or set of customers.
 
Implementing row-level security through Power BI datasets or Analysis Services allows you to create one broad dataset you can distribute to a larger audience: in this case, the entire sales team. Then, when an individual salesperson opens the Power BI report, a security filter restricts that salesperson to seeing only their own data. Restricting access while maintaining a single broad dataset can save you hours of management time: instead of creating multiple datasets for your sales reports, you maintain one dataset with row-level security.
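In Power BI, that filter is defined as a DAX rule on a role (typically comparing a column to the signed-in user's identity). The Python sketch below just simulates the effect on made-up sample data so the concept is concrete; the users, territories, and rows are all invented for illustration.

```python
# Sample rows; each one tags the territory it belongs to.
SALES_ROWS = [
    {"territory": "East", "customer": "Contoso",         "amount": 1200},
    {"territory": "West", "customer": "Fabrikam",        "amount": 800},
    {"territory": "East", "customer": "Adventure Works", "amount": 450},
]

# Plays the role of the RLS rule mapping the signed-in user to a territory.
USER_TERRITORY = {
    "alice@example.com": "East",
    "bob@example.com":   "West",
}

def rows_for_user(rows: list, upn: str) -> list:
    """Return only the rows the signed-in user's territory allows,
    mimicking what a row-level security filter does at query time."""
    territory = USER_TERRITORY.get(upn)
    return [r for r in rows if r["territory"] == territory]
```

The key property is the one described above: there is one dataset, and what each user sees is decided at view time, not by maintaining per-user copies.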
 

Embedding Reports and “Shared with Me” Option

There are two final ways to share your Power BI information. By purchasing an embedded SKU or upgrading to Power BI Premium, you can embed reports in a website or custom application. This can be useful for sharing content with a larger audience without needing a Power BI Pro license for each user viewing the report.
 
Finally, a one-off technique that we don't recommend (but that can be useful in certain situations) is the "Shared with me" option, which has its own section in the Power BI service navigation menu. It allows you to share reports or dashboards with specific individuals or groups instead of using an app. This option requires everyone to have a Power BI Pro license, even if you are using Power BI Premium.
 

Deploying Content

Now that we’ve covered the sharing options available to you with Power BI, let’s dig into some of the best practices for deploying content because this can be a pain point for organizations already using Power BI.
 

Source Control

Currently, Power BI does not have a great solution for managing your PBIX (desktop) files: there is no standard source control integration for promoting Power BI content through different environments.
 
Microsoft is getting closer, though. With a couple of recent feature releases, they're starting to bridge the gap between standard source control integration and what's currently available for Power BI. These features include the enhanced metadata preview in Power BI Desktop, along with deployment pipelines for Power BI Premium users.
 
To get around the lack of source control options in the interim, we utilize a OneDrive workspace or a OneDrive folder area within a team SharePoint site. That area used to be automatically generated in the version 1 workspace experience. But since that’s turned off in the new workspace experience, this may be an additional step for those of you migrating to (or exclusively using) version 2 workspaces.
 
Another option is using a Git repository. Even though there is not a great way to do schema compare, like you could with a database project or a tabular model in Analysis Services, you still can hold your Power BI report files in a Git repository, along with your other database or data model artifacts.
 

Other Deployment Considerations

One thing I cannot stress enough is to utilize parameters in your Power BI Desktop files (this also includes utilizing templated files). Parameters can be used to easily migrate between your different development, test, and production environments.
 
Those parameters are best implemented with relational database-type data sources. You may have three separate servers for the same database in your dev, test, and prod environments. Parameterizing that server and database name in your desktop files easily allows for migration between those environments, instead of having to manage three separate report files.
 
With parameters, you're only managing that one report file (or two report files if it's a dataset and a report) and migrating that through the different environments. Having those parameters in your datasets also makes using deployment pipelines a lot easier.
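Once a dataset is published, its parameters can be repointed per environment through the REST API's update-parameters operation, so the same PBIX file serves dev, test, and prod. In the sketch below, the `ServerName`/`DatabaseName` parameter names and the server hostnames are hypothetical; use whatever parameter names your desktop files define.

```python
# Hypothetical server/database values per environment.
ENVIRONMENTS = {
    "dev":  {"ServerName": "sql-dev.contoso.com",  "DatabaseName": "SalesDW"},
    "test": {"ServerName": "sql-test.contoso.com", "DatabaseName": "SalesDW"},
    "prod": {"ServerName": "sql-prod.contoso.com", "DatabaseName": "SalesDW"},
}

def update_parameters_payload(environment: str) -> dict:
    """Request body for the dataset update-parameters operation, which
    repoints a published dataset's parameters at the target environment
    without editing or republishing the PBIX file."""
    values = ENVIRONMENTS[environment]
    return {
        "updateDetails": [
            {"name": name, "newValue": value} for name, value in values.items()
        ]
    }
```

A migration script would post this body against the dataset in the target workspace and then trigger a refresh, which is essentially what deployment pipeline parameter rules automate for you.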
 
There's been some work around automated deployment strategies through Azure DevOps release pipelines. Unfortunately, that technique is currently somewhat of a manual workaround, and not nearly as automated as you may be used to with Azure DevOps pipelines elsewhere. I would consider it a temporary measure while Microsoft continues to bridge the gap between its current source control integration for Power BI and the standard software lifecycle management processes you may be used to.
 
In the next edition of this blog series, I will explain how to get the most out of your report design strategy.
 
Data Analytics | Power BI

 
