Category: Building Reports

  • Handle Errors in Data Refreshes with Power Automate

    Handle Errors in Data Refreshes with Power Automate

This article examines using the Advanced Editor in Power Query to handle errors more gracefully when things go wrong. It also shows how to trigger custom actions using Power Automate, so the rest of the data can still refresh while the owner is alerted by email of data source errors.

Note that this article requires editing Power Query (M) code. If you want to review how to do this, consider reading this article:
    Query Editor – Editing M Code

    Data Sources

Data can be messy. Thankfully, Power Query gives us an excellent and efficient way to extract, transform and load data so we can shape it the way we want. It is extremely versatile and can connect to a huge range of data sources, including third-party sources as well as internal data.

While beneficial, we need to ask what happens if a third-party source suddenly shuts down or changes its access policies. What if there is bad data recorded in an on-premises Excel sheet or database? An error in any Power Query step can stop the entire report from refreshing. We can opt to receive alerts on the service, but these can be unspecific and require us to dig deep into the report.

The technique laid out here lets us receive a specific alert identifying the exact step where the error occurred. What's more, we can ensure the error won't break our queries, and the rest of the data will continue to refresh.

    Step 1 – Base Query

First, we need to set up the query that we want to add error handling to. For this example, I'm going to send a web request to get some information about the Microsoft stock price. For more information on this API, or to pull other stock data, check out this article.

Open Power Query and select New Source > Web on the Home ribbon.

    Paste the following link:

    https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d

    This will automatically return a JSON object and parse it for us.

Note: This link returns 5 years of daily historical stock prices.

For simplicity, I will just return the metadata to ensure the API call is working. The automatic parsing will return a table with clickable elements. To drill into the JSON, click through the following steps:

    chart:Record > result:list > Record > Meta:Record

    Note: See sample of nested structure below for chart:Record

Once we have expanded all the way down to the meta level, press the Convert Into Table button found on the newly appeared Convert ribbon.

Here is the final code, which you can see by clicking the Advanced Editor button on the Home ribbon:

    let 
        Source = Json.Document(Web.Contents("https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d")),
        chart = Source[chart],
        result = chart[result],
        result1 = result{0},
        #"meta" = result1[meta],
        #"Converted to Table" = Record.ToTable(#"meta")
    in
        #"Converted to Table
    

    Rename this “Stock Query” by editing the name in the properties pane on the right.

    Step 2 – Create the flow

Next, we create the Power Automate flow that will alert us that something is wrong. Navigate to the Power Automate website. Once logged in, click on the New Instant flow button.

    Give the flow an appropriate name. For the trigger, select the option When an HTTP request is received. Next press the button Create to make the flow.

Once we save, the flow will supply us with a URL. This URL will trigger the flow any time it is visited. You can use any action you want, but I will have the flow send me an email to let me know the refresh failed.

    I’ll search for the item Send an email (V2). Make sure you fill in the email address you want to send it to, and write an appropriate message and subject.

That's our flow! Hit Save. After you have saved, click on the When an HTTP request is received step to expand it. You'll see that a URL has been generated. Copy the link using the Copy button on the right; you'll need to enter it into Power BI.

    Step 3 – Make the Flow Trigger Query

Next, set up the query in Power BI that will call this flow. In Power Query, make a new query by selecting New Source > Web. Paste in the URL you copied in the previous step. Open the Advanced Editor; inside, you will see that the code uses the Web.Contents() function. You'll need to copy this code in a later step.
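
If you open the Advanced Editor at this point, the generated query should look roughly like the sketch below; the URL here is only a placeholder, so use the link copied from your own flow.

let
    // Placeholder URL - replace with the URL generated by your flow
    Source = Web.Contents("https://prod-00.westus.logic.azure.com:443/workflows/RestofURL")
in
    Source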

This will send an email every time the query runs. For testing, you can press the Refresh Preview icon to trigger the action again. If you don't want to wait for the email, or you chose a different action, you can check whether the flow ran on the Power Automate site. Click My Flows on the left, open the flow and scroll down to Runs.
Press the refresh button on the Runs section to check when the flow ran.

    Step 4 – Set up the Error Handling Code

Now we need to add the logic that triggers the Power Automate flow on error, which is going to take a little coding.

Back in Power Query, start by adding a blank query by clicking New Source > Blank Query on the Home ribbon.
Next, open the Advanced Editor; the code should look like this:
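
A brand new blank query contains nothing more than an empty Source step:

let
    Source = ""
in
    Source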

Now we will employ the try ... otherwise statement. This is Power Query's error handling statement. We can add a placeholder for now.
    Replace the step:

    Source = "" 

    with the following code:

Source = try 1+1 otherwise "error"

    How this works

Power Query attempts to execute any code between the try and otherwise keywords. If this succeeds, the result is returned and the next step is evaluated; in this case the result is the number 2. If it returns an error, however, the result is discarded and the expression after the word otherwise is returned instead, in this case the word "error".

We can add this statement as the Source step. I'll also wrap both expressions in parentheses, as this will come in handy as we add more steps. It's important to keep good syntax to make it readable, so here is my code:
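
Here is a sketch of how that placeholder step might look once wrapped in parentheses (reconstructed to match the description above):

let
    Source =
        try
        (
            1+1
        )
        otherwise
        (
            "error"
        )
in
    Source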

As 1+1 is valid, this will return the value 2. If you change the 1+1 to 1+"A", this is no longer valid, so it will return the word "error".

Now that we can see it's working, delete everything between the parentheses in both the try and the otherwise statements to set up for the next step. Do not worry if you get an error, as we will be adding code shortly.


    Step 5 – Update the Error Handling Code

Now that we've got the placeholder set up, we can copy our previous code into each part.
Open the Stock Query we made in Step 1.
Click Advanced Editor and copy the entire code.
Now, come back to the try..otherwise query.
Open the Advanced Editor in this query and make sure you delete anything between the parentheses in the try statement if you did not do so before.
Paste in the entire code you just copied.

Next, go to the Flow Error Query (the query we made in Step 3), open the Advanced Editor and copy all the code.
Go back to the try..otherwise query and paste everything between the two parentheses of the otherwise statement.

    Step 6 – Tidying the Code Up

The URLs are prone to change, so it is good practice to separate them out as variables. To do this, you can simply declare them at the beginning of the query as steps.
Here, add the URLs to variables called webURL and flowURL.
To add webURL, at the beginning of the query after the let keyword, add the line:

    webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d",

Don't forget the comma at the end. Do the same with flowURL. Then, replace the URLs in the code with the variable names.

Additionally, it helps to change the name of the step outside the try..otherwise from Source to Output. This makes the query easier to follow, as we have several steps called Source. Also update the name after the keyword in.

    Output:

Final code to copy (you will need to update it with your correct URLs):

    let
        flowURL ="https://prod-68.westus.logic.azure.com:443/workflows/ RestofURL",
        webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFaaT?range=5y&interval=1d",
        
        Output =   
        try 
        (
            let 
                Source = Json.Document(Web.Contents(webURL)),
                chart = Source[chart],
                result = chart[result],
                result1 = result{0},
                #"meta" = result1[meta],
                #"Converted to Table" = Record.ToTable(#"meta")
            in
                #"Converted to Table"
        )
        otherwise    
        (
            let
                Source =  Web.Contents(flowURL)         
            in
                Source
        ) 
    
    in
        Output
    

    Optional Step 7.1 – Adding POST Parameter – Flow

Now that we have set up our flow, it may be useful to reuse it for multiple datasets. Instead of setting up a different flow for each dataset, we can allow the flow to accept inputs, such as the dataset name.

Navigate back to the Power Automate site and, on the flow we set up previously, click Edit.

Open the When an HTTP request is received step and paste the following code into the Request Body JSON Schema box.

    {
        "type": "object",
        "properties": {
            "datasetName": {
                "type": "string"
            }
        }
    }

Next, expand the advanced options and change the Method to POST.

This will create a variable called datasetName. This is the name of the dataset we will pass in from Power BI, and we can use it in the email to tell us which dataset had an error. The variable will appear in a list when you click to edit the email message; click on the name to add it.

In addition, I added the expression utcNow(). You can find this by searching in the Expression tab of the pop-up. This simply displays the time the email is sent, so we can see in the email when the refresh failed. Notice the variable from the first step is green, while the expression is pink.
Personalize this message as you wish.

    Optional Step 7.2 – Adding POST Parameter – Power BI

The final step is to pass this in from Power BI. In our query, open the Advanced Editor and add three more variables before webURL and flowURL:

        datasetName = "Stock Query",
        content = " { ""datasetName"": """ & datasetName & """}",
        headers = [ #"Content-Type"="application/json"],
    

datasetName is the name of the dataset we pass into the flow, which will ultimately appear in the email. Right now it's set to Stock Query, but this is the value we edit for each dataset that reuses this query, so each one passes a different name.

content and headers build the correct format to pass into the API call. Together they create a POST request with the correct JSON body.

Now we can edit the Web.Contents(flowURL) step in the otherwise branch to include our parameters:

    Source =  Web.Contents(flowURL, [Headers = headers, Content=Text.ToBinary(content)])  

    Final output:

Final code to copy (you will need to update it with your correct URLs):

    let
        datasetName = "Stock Query",
        content = " { ""datasetName"": """ & datasetName & """}",
        headers = [ #"Content-Type"="application/json"],
        flowURL ="https://prod-68.westus.logic.azure.com:443/workflows/RestofURL",
        webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFaaT?range=5y&interval=1d",
        
        Output =   
        try 
        (
            let 
                Source = Json.Document(Web.Contents(webURL)),
                chart = Source[chart],
                result = chart[result],
                result1 = result{0},
                #"meta" = result1[meta],
                #"Converted to Table" = Record.ToTable(#"meta")
            in
                #"Converted to Table"
        )
        otherwise    
        (
            let
                Source =  Web.Contents(flowURL, [Headers = headers, Content=Text.ToBinary(content)])         
            in
                Source
        ) 
    
    in
        Output
    

    Limitations and Considerations

This technique uses premium Power Automate features, and a valid license is required. However, only one service account license is needed.

This query has been designed to return blank data if the source query fails. This could break your model if that data is required.

This blog does not examine securing the Power Automate endpoint, so be aware that if the URL were discovered, anyone could execute the end action (in this case, sending an email).

    If you like the content from PowerBI.Tips please follow us on all the social outlets to stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel.  Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
    Store Merchandise

  • KPIs in Power BI

    KPIs in Power BI

KPIs are a key visualization type used to convey high-level metrics to end users. They provide an at-a-glance metric that lets business users know whether they are on track or off track. Over time that single metric has been enhanced with trend lines, date stamps, variances and other elements to convey key metrics in a compact and concise way. There are a number of different takes on what a KPI can or should look like. If you would like to look at all the different KPI visuals you can download from AppSource, you can find them here.

I've used several of the visuals from AppSource before, and what I would recommend is that you always verify that the visual has the blue "Certified" label if you plan on just using it. That label means the custom visual has met certain code and verification criteria and is not accessing or using external resources. That is not to say that you can't use the other visuals, but I would recommend looking at them more closely and potentially reaching out to the owner of the custom visual to see if there are any data extraction or third-party interactions that would not be an acceptable risk for your company.

That being said, one of the great things about having custom visuals is that they typically offer more enhanced visual options or settings than you can find in the default visual set in Power BI. KPIs have been one of those visuals that needed a bit more enhancement, and I'm happy to say that in the latest December release of the Desktop it received just that. If you've been looking elsewhere for your KPI visual, it might be high time to give the latest default version a test run, as it now offers key features that are plenty enough for me.

    Basic Updates

    Prior to the December update the KPI visual was a bit standard. An example of what that looks like is here.

However, if you look at the example of the new one below, there have been several key updates that make this KPI pop by default and extend how polished it can look. The first change is obvious: the font is now DIN, which brings a bold pop to the overall number. We're off to a good start! We can also now change the font family of the larger metric, which allows us to mirror the fonts we've selected in our other visuals.

    The Indicator and Goal properties have received the most prominent updates, but there are also some key changes that allow us to now set the font colors both statically and conditionally.

    Indicator Properties

Let's see how these changes impact things overall. First, let's take a look at the Indicator. The options went from this:

    To this:

    What does that do for our KPI? Aside from what we outlined above, it gives us a simple two click option to adjust where we want the number (Top & Left), which makes our trend seem a bit more trendy?

    Another option we now have is to increase or decrease the transparency of the trend.

    Goal Properties

    The second property area that received a bunch of changes is the Goals area. That looked like this:

    While the new properties pane looks like this:

    This change is almost like the difference between the Edison bamboo filament light bulb and the latest LED that lets you pick between millions of colors… Ok, maybe that’s a bit extreme, but you get the point. This property is now extremely useful. Note: the Goal label name can be changed and the metric returned for the distance value can be updated to show the percentage, value or both.

    The color and font family for both Goal and Distance are updateable now as well along with a new property for the Distance direction.

    Conditional Formatting

    One of the “hidden” things to be aware of is that the above screen shots are the default view of all properties. To see if you can use a conditional setting you need to hover the mouse over the Font color area. Doing so will illuminate the “…”

Clicking that gives us:

    And one more click opens the dialogue where we can set our conditions.

    One final key property is the addition of the top level “Date” property. This is a great addition in that it takes away any doubt about what the current KPI context is without having to look at filters.

    Showcase

    Utilizing all the capabilities now given, we can create a series of KPIs that carry the vast majority of all our needs in the out of the box visual. Below is just an example of a bunch of different ways to format the KPI visual. The goal here is to show how many different ways we can see these metrics using the same KPI visual now that we have all these new properties at our disposal.

    Sometimes the features we want to have added to visuals in Power BI aren’t being worked on. It isn’t for lack of effort on the Power BI team, believe me. But there are only a finite number of resources, and a backlog a mile long. Check out ideas.powerbi.com to see all the ideas being requested. I believe that is why they opened up a program to work with Power BI experts to engage with the Power BI team directly. These KPI features are a result of that program. The individual we can all shower with our thanks is James Dales. You can check out what James is up to on his blog – https://powerbi.jamesdales.com/, and be sure to hit him up on twitter and offer up a bit of thanks – @jamesdales

    If you like the content from PowerBI.Tips, please follow us on all the social outlets to stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel, and follow us on Twitter where we will post all the announcements for new tutorials and content. Alternatively, you can catch us on LinkedIn (Seth) LinkedIn (Mike) where we will post all the announcements for new tutorials and content.

    As always, you’ll find the coolest PowerBI.tips SWAG in our store. Check out all the fun PowerBI.tips clothing and products:
    Store Merchandise


  • Introducing Scrims

    Introducing Scrims

    PowerBI.tips is excited to announce our new tool to help you build the best looking reports, Scrims.

We've built a fast and easy solution for snapping visualizations into place while giving your reports the extra-special look and feel that impresses your audience without losing focus on the important stuff. We developed Scrims to give you this shortcut to amazing-looking reports.

    16:9 (1280 x 720) examples

    What are Scrims?

    A Scrim is a background image that you would use on your Power BI Report pages. Why the name Scrim? The term scrim is used in theater productions. It is a backdrop that is placed on the stage behind the actors. It adds context & engages the audience with the production. We thought this idea crosses over well with Power BI.

    A scrim can change the mood of a theater just like a well designed background image in your report. Scrims were developed to be easily adaptable to different color themes and in each set of offerings we release we’ll be providing you with as many color options as possible.

    Why Do I need One?

    Scrims solve several problems.

1. The most important problem Scrims solve is time. As BI practitioners ourselves, we know the demands that are placed on you. More often than not, the visual look and feel gets the least amount of attention due to deadlines. We want all the time you invest in the data and building visuals to pay off by being presented in a beautiful way.
2. Scrims are designed the same way we developed Layouts, emphasizing proper design focused on the Gestalt design principles to ensure end users experience enjoyable and non-distracting reports.
3. Using a Scrim reduces the number of objects on the page. Fewer elements mean faster-rendering reports. Here is a blog that tests this by adding more visuals to a page: "More Visuals Mo Problems". Scrims add the illusion of a very large number of objects without the negative impact.

    Scrims come in a Bundle

A Scrim bundle contains a series of images that you can use in any way you want in your Power BI file. Each Scrim will have different page sizes to best suit your report needs. Every Scrim will contain a default 16:9 (1280 x 720) aspect ratio. Most Scrim bundles will have additional ratios such as 8:9 (1280 x 1440) or 4:3 (960 x 720). You will see the sizes prominently displayed for each bundle with the red tags.

    8:9 (1280 x 1440) examples

    Each bundle contains 6 pages minimum for each size, which means on average you will receive at least 12 pages in each bundle. Each Scrim bundle also includes the color theme that corresponds with it in JSON format to easily upload into your Power BI Report.

    Theme Color example
  • Scrims Instructions

    Scrims Instructions

    Thanks for your interest in our product Scrims. For more details on what is a scrim click this link to Learn More.

    Download a scrim from the products page. You can access all the available scrims here.

    Instructions

    After downloading, you will have a Zip file stored on your computer. Right Click on the zip file and Select the option Extract All from the drop down menu.

    Right click menu option Extract All

    The extract compressed folder menu will appear. Click on the Extract button found in the bottom right corner of the menu.

    Extract Compressed folder dialog box.

    A new folder will be extracted to the location noted in the previous menu screen. Open the newly created folder. Within this folder you will find all the images for the Scrims and a JSON theme file to use within your report.

    Folder contents, Scrims images, links to instructions, terms & conditions, and color theme file.

When working with scrims it is helpful to see which image contains the correct background layout for each of your pages. I find it helpful to review the images as Extra large icons. To turn this on, open the View ribbon in File Explorer, then in the Layout section select the option titled Extra large icons.

    Change view of File Explorer to Extra Large Icons

    Add Scrims to Report

    Open a Power BI report in the Power BI Desktop application.

    Image of a Power BI report in Power bi desktop

Note: Reports do not necessarily have to be brand new. You can use scrims on existing reports. For illustration purposes, an existing report containing only visuals was opened. The remainder of this tutorial will show you how to add scrims to a pre-developed report.

    Click on the Paint Roller button. Then Open up the Page background item in the menu options. Click on the option labeled Add image.

    Note: for these options to appear you have to have a report open and none of the visuals selected on the page.

    In the open file selection dialog box pick the scrim that you want to load. Click on Open to load the image to the report page.

The image will not appear at first. This is because the default background settings are not suited to this feature. Change the transparency to 0% and set the Image Fit drop down to Fit.

Note: The images provided in scrims are larger than the report page pixel size. This is because, in order for the images not to look blurry, we have to supply a larger image. The Fit setting then scales the image back down to the report canvas size while retaining a crisp and clean look.

    Our report should now look similar to the following:

    Add JSON Themes

    Initially the visuals will not be formatted for the style of the report. We can control this by using a JSON theme file to pre-format some options for the visuals.

    On the View ribbon, select the Drop Down Arrow icon. Then Select the option at the bottom of the menu titled Browse for themes…

Navigate to the scrims download folder and add the theme file supplied with the scrim download.

    This will apply formatting for the colors and some Visual style properties.

    Clean up Visuals

    Next Select individual visuals and Align them to the defined areas within the scrim.

    See sample image below with visuals aligned to the scrim.

    Finalize Report

    Apply any additional style properties for your visuals. In the below image the following settings are added for reference:

• The top 4 cards were grouped together
• The grouped cards were given a white background at 40% transparency
• Labels were added to the bar charts
• Unneeded axes were removed from the bar charts for clarity
• White lines were added to the scatter chart for the x and y axes

    Then rinse wash and repeat for every other page you need to develop.

    Here is a completed sample of this report:

    Check out scrims today

  • More Visuals Mo Problems

    More Visuals Mo Problems

In some recent conversations the notion of minimizing the number of visuals on a page came up as a topic. From talking with the Microsoft development team, I know that more visuals on a report page increase load time, but I haven't been able to find any substantial numbers on how performance is impacted as the visual count on a page grows. Spoiler alert: adding a ton of visuals to a page slows it down.

    Test Set up

To begin the test I started with a known report, the Microsoft September 2018 Layout. This was a good sample as it already had a number of visuals and buttons on the page. I then created a single text box with some text in it and copied it over one hundred times. All the text boxes were placed into a single group. Grouping the text boxes makes it possible to toggle all of them on and off with a single click.

    Here is the before image of the report with the text boxes turned off:

Now here it is with all 100 text boxes turned on. Yes, it's not pretty, I know, but it makes the point.

    Conducting the Test

Now that the setup was complete, we can use the Performance analyzer to measure how the visuals on the page render.

To open the Performance analyzer with the new modern Power BI ribbon, navigate to the View ribbon and click the Performance analyzer button.

With the Performance analyzer window open, we click the Start Recording button. This begins recording how the visuals perform when you interact with the report.

    There are two options at this point to start recording data.

    1. We can click on items on the report page
    2. Click the Refresh visuals button to refresh the entire page

I chose option number two since I wanted a consistent method to record performance. This removes the human error involved in performing a sequence of clicks across the screen.

After clicking Refresh visuals, the Performance analyzer generates a ton of data that we can sift through to understand the performance of the report page. You can expand one of the visual elements to see how many milliseconds each function takes to render that visual.

    Note: For more details on each performance component read up on the Microsoft documentation found here: https://docs.microsoft.com/en-us/power-bi/desktop-performance-analyzer

    We can now export the data from the recording by Clicking the Export button.

For my test I ran 5 performance tests with the text boxes turned off and 5 tests with the text boxes turned on. The process was the following:

    1. Click Start Recording
    2. Click Refresh visuals
    3. Click Export to extract the data, name the file for future review
    4. Click Clear to remove all data
    5. Go back to step 2 to Refresh visuals
    6. Repeat process until 5 performance tests are complete
    7. Turn on the Text boxes and repeat the process for 5 performance tests

    The Test Results

Finally, we can dig into the data and figure out how much impact all the visuals incurred. After a bit of playing around with the datasets in Power Query, we are able to come up with the following results.

At a high level, adding the 100 extra text boxes increased the load time from 174 ms up to 3,100 ms, a staggering increase of approximately 1,700%.

    Here is the detailed break down for average load times compared with and without text boxes.

There are some interesting notes here. Adding the text boxes caused all the other visuals to increase their load time by 22 to 28 percent per object. Clearly the text boxes took the longest to render.
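
For reference, here is a rough sketch of the kind of grouping step that can average the exported timings in Power Query. The table name CombinedRuns and the column names are illustrative; they assume the exported runs were appended into a single table first.

let
    // CombinedRuns is a hypothetical table holding all exported runs,
    // with one row per visual event and a duration column in milliseconds
    Source = CombinedRuns,
    AverageByVisual = Table.Group(
        Source,
        {"Visual"},
        {{"Avg Duration (ms)", each List.Average([#"Duration (ms)"]), type number}}
    )
in
    AverageByVisual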

    If you’d like to test this on your own, you can download these materials from this GitHub location: https://github.com/MikeCarlo/PBIReportVisualPerformanceTest

    Implications & Observations

    After completing this test there were a couple of observations that I felt would be best practices when building future reports.

    1. It is important to take time to clearly label your visual elements on the report canvas. Doing so makes it easy to identify each item in the performance analyzer.
    2. Increasing the number of visuals on a page hurts rendering performance. So think carefully about how many visuals you need to add to convey the data story you are trying to tell.
3. When a visual is hidden, it does not impact the rendering performance of the page.
4. A trend I am seeing is individuals creating really long pages, meaning the page is 1280 x 3000 or even 4000 pixels long. This is a nice feature that lets the report consumer scroll through multiple visuals. However, it has an unintended consequence: all the extra visuals slow the time it takes for the report to render. Instead of increasing the page length, it would be better to control which visuals are shown by using bookmarks and grouping. To learn more about bookmarks & grouping visuals visit:
5. Limit adding style elements such as drop shadows and visual shading images, as these will increase load times. Instead, push those types of changes down to a background image that can be placed on the page. This is the technique used in creating PowerBI.Tips Layouts.

    If you like the content from PowerBI.Tips please follow us on all the social outlets. Stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel.  Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:

    Check out the new Merch!

    Hasta La Vista Data
    Go Ahead Make My Data
    PBIX Hat


  • Scale a Visual

    Scale a Visual

For each visual in Power BI Desktop there is a button called Focus Mode. This feature highlights a single visual. While this can be helpful, it removes the ability to change or adjust the visual based on filters or slicers. I'd like to introduce you to the concept of scaling a visual.

In this tutorial we walk through how to take a collection of visuals, group them, and then control the group by hiding or showing it with bookmarks.

    Check out this video tutorial on how to accomplish this within Power BI Desktop.

    Tutorial Video

    Downloads

    Download the icons and sample PBIX file at my GitHub repo.

    Download the layout used in this tutorial here.

    Other Resources

    Here are some other great tutorials around using the grouping feature in Power BI Desktop.

    If you like the content from PowerBI.Tips please follow us on all the social outlets. Stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel.  Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:

    Check out the new Merch!

    Hasta La Vista Data
    Go Ahead Make My Data
    PBIX Hat


  • Consolidate Report Pages Easily with Visual Grouping

    Consolidate Report Pages Easily with Visual Grouping

We do a ton of Layouts here at powerbi.tips, and with the introduction of the new visual grouping feature I was looking for different ways to use that functionality to make templates for different reporting scenarios. First, if you aren't familiar with how to build a visual group, be sure to check out the blog Mike wrote on the subject a little while ago.

While I was exploring how I might leverage this new feature, a challenge I'd seen recurring in the Power BI Community forums popped to the forefront of my mind. This new feature gives a perfect solution to the question of report page consolidation.

Time and again I've seen people talk about navigation issues related to the number of pages or tabs they have in their reports. Sometimes, particularly in embedded scenarios, you can't reduce the pages and break them up into smaller reports. This could already be accomplished in the past, but man oh man, good luck finding what you were looking for in the selection pane. Let me show you how easy this is to accomplish with the addition of the new visual grouping feature. By walking through these few examples you can extend this to however many pages you need, following the same patterns I describe below.

    Setup:

    On your main page, make some room for buttons that you can create to toggle your pages on/off.  You can see I did that in the image below.

Next, we're going to click on the top object in the selection pane and hold SHIFT + Click on the last object; this will highlight all the objects.

    Right click and select Group from the drop down menu.

    This will create a “Group 1”

    Let’s rename this to Page 1. You can do that by Double Clicking on the group name and typing in the new name, Page 1.

    Next, we’ll create a button called “Page 1”

    Select the Home ribbon, then click the icon called Buttons.

    Choose a Blank button from the drop down. In the formatting pane under the visualization area choose the section titled Button Text and toggle it on, then type in Page 1.

    Place the button in on the right side of the screen.

Great, now that we've created this button, let's create a second one. This one will be for our "Page 2". Follow the same steps above, or select the Page 1 button and press CTRL + C (copy) / CTRL + V (paste) to create a new button. Rename it to Page 2 and you should end up with this:

    Prep Additional Page

Now, here is where this gets cool. Imagine you have N pages, each with roughly 10 objects. We are going to go to each page and condense all the objects down to a single one. Go to your second page and, in the same fashion that we created our Page 1 group, select all the objects in the selection pane.

    An alternative method would be to just click on your report canvas and CTRL + A to select all objects, then right click and create group. We now see our new group in the selection pane and we can double click to rename it Page 2.

    Bring It Together

Now click on the Page 2 group and press CTRL + C to copy the group.

    Navigate to page 1 and CTRL + V to paste the group.

This will bring all the objects from your Page 2 to Page 1, and it will look like a giant mess similar to this:

But you know what isn't a mess? The Selection pane! Because we created the group, all we need to do is toggle the Page 2 visual group's visibility icon to off.

    And our page looks normal again. Now let’s hook up the buttons and bookmarks.

    Go to the View ribbon and open the Bookmarks pane.

    Click the Add button at the top of the window. This will create a snapshot of the current state of the report page. Double Click the bookmark that is created and change the name to Page 1.

    Now, toggle the Page 1 visual group off from the Selection Window. Then turn the visibility of Page 2 visual group to on.

Create a second bookmark and change its name to Page 2. You'll likely notice that things just aren't aligned right (at least in my case that is true because I'm using a background).

    But since the visual grouping is its own object, all I need to do is select it and all the visuals resize for me! I don’t have to individually mess around with each one! Super cool.

The last thing we need to do is connect our buttons to our bookmarks. Select the Page 1 button, find Action in the formatting pane and toggle it On.

    Select the Bookmark from the Type drop down. Then Select the Page 1 bookmark we created.

    Repeat the same thing for Page 2. You have just created navigation buttons to the two views of your report pages.

CTRL + Click in Power BI Desktop will activate the actions on the buttons. Finally, you can toggle between the report "pages" on a single page. WHEW!

    Closing

This was possible before visual grouping, but now it is an EXTREMELY clean and efficient way to consolidate your report pages and add a ton of objects onto a single page. You do have a lot of objects on the page, but don't worry, there isn't a performance impact because they don't render until you click the button to make them appear. You can test this by enabling the Performance Analyzer in the View ribbon.

    Visual grouping coupled with bookmark grouping have made these features fun to use and easily manageable for all. This is just a simple use case where extending the use of visual grouping to help us manage our reports better can make a world of difference as we build. I hope you enjoyed this post and that it helps you clean up any reports you may have that got a little unwieldy.

    If you like the content from PowerBI.Tips, please follow us on all the social outlets to stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel, and follow us on Twitter where we will post all the announcements for new tutorials and content. Alternatively, you can catch us on LinkedIn (Seth) LinkedIn (Mike) where we will post all the announcements for new tutorials and content.

    As always, you’ll find the coolest PowerBI.tips SWAG in our store. Check out all the fun PowerBI.tips clothing and products:
    Store Merchandise

  • Average Household Income Function in Power Query

    Average Household Income Function in Power Query

This post walks through how to pull an estimated household income for a US address, entirely within the Power Query Editor. We will then convert the query to a function so it can be reused on any address we want.

    This is the second part in a series on free API calls. The first part demonstrated how to pull daily stock price from Yahoo! Finance, which is available here.

    Note: The content in this blog was first presented at the Power Platform Summit North America on October 18th, 2019.

    Introduction to the Data

We cannot get the exact income of a US address, but the US Census Bureau releases data aggregated over different groups of households (called Geography Hierarchies).
There are different levels of Geography Hierarchies, explained on the census website. The lowest level of granularity available for this data is the Block Group. This is a small group of blocks, usually consisting of a few hundred to just over 1,000 people. A block group is expressed as a code.

It is unlikely we will have the actual block group code, but rather a street address. In order to retrieve information, we need to find which block group the address falls into. The first step is to convert the address into geographic coordinates, a process called geocoding. You are probably familiar with this feature, for example when you enter an address into a map app to get directions.

The census.gov website offers a free service to do this (US Census TIGER). However, in my experience the match rate (the percentage of addresses successfully geocoded) is not as good as some other services, meaning some of the addresses you enter will not be found. Many other companies offer geocoding services, such as Bing, and they often provide better match rates; however, these tend to come at a cost. They can be used instead, but for this example we will focus on the free US Census TIGER service.

    Create an API Key

The first step is to sign up for an API key on the census website. API keys allow organizations to monitor usage and stop their databases from being overloaded with requests. Some APIs use keys to charge for paid calls; however, the census API is free and only requires an email address. If you do not work for an organization, you can write "none".

    Sign up here:
    https://api.census.gov/data/key_signup.html

    Storing the API Key

Back in Power BI, in the Power Query editor, click Home > New Source > Blank Query.

In the formula bar, paste in the API key value you received, then rename the query to P_APIKEY.
This stores the API key so it can be changed easily, without setting it as a parameter that the user would have to enter every time they use the function.
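
In the Advanced Editor, the whole query ends up being nothing more than the key as a text value; the key below is a made-up placeholder.

let
    // Hypothetical placeholder - paste your own census API key here
    Source = "0123456789abcdef0123456789abcdef01234567"
in
    Source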

    Setting up Parameters

In the Power Query window, under the Home ribbon, click the bottom half of the Manage Parameters button and select New Parameter from the drop down.

    Name the parameter P_Address. Change the Type field to Text. Enter 15010 NE 36th St in the Current Value input box.

    Repeat this step 3 more times, so you will have 4 parameters in total. Use the below table for the names and default values:

Name          Default Value
P_Address     15010 NE 36th St
P_City        Redmond
P_State       WA
P_ZIP         98052

    Function Part 1: Geocoding

On the Home tab, click New Source > Web. Switch to the Advanced tab. In the dialog box, click the Add part button to add a new box, and repeat this until you have 9 parts. Then enter part 1 of the URL in the first box. For the parameter rows, change the abc symbol to Parameter. Fill in the boxes as follows:

1. https://geocoding.geo.census.gov/geocoder/geographies/address?street=
2. P_Address
3. &city=
4. P_City
5. &state=
6. P_State
7. &zip=
8. P_ZIP
9. &benchmark=Public_AR_Census2010&vintage=Census2010_Census2010&layers=13&format=json

Note: Do not type the rows that begin with P_ directly (rows 2, 4, 6, 8). Instead, switch the box type to Parameter and pick the parameter from the list.
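
Once all nine parts are filled in, the generated step should look roughly like the call below, which is the same request that appears in the final function at the end of this article:

Source = Json.Document(Web.Contents("https://geocoding.geo.census.gov/geocoder/geographies/address?street=" & P_Address & "&city=" & P_City & "&state=" & P_State & "&zip=" & P_ZIP & "&benchmark=Public_AR_Census2010&vintage=Census2010_Census2010&layers=13&format=json"))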

This will return a result of type Record. Click on the Record value to drill into it.
On the Convert tab, click To Table to transform it into a table.
We have extra information here, but we are only interested in the block group data. Filter the Name column to include the following rows only:
"BLKGRP", "COUNTY", "STATE", "TRACT"
(make sure you keep the correct rows with these exact names).

Now we have the correct rows, but for our function to work we want them pivoted into a single row. Highlight the Name column, navigate to the Transform tab and click Pivot Column. The symbol is:

    Expand the Advanced Options, and change the Aggregate Value Function to Don’t Aggregate.
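
The step this produces should look something like the following line from the final function (step names may differ in your query):

#"Pivoted Column" = Table.Pivot(#"Filtered Rows", List.Distinct(#"Filtered Rows"[Name]), "Name", "Value")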

If you are following along with the Microsoft head office address, your data should look like this:

    Now that we have the address geocoded, we can find out census information.

    Function Part 2: Returning Household Income

To add the second API call, we can take advantage of a custom column, since it is possible to use Power Query (M) code inside a custom column.
Click Add Column, then Custom Column.

The code needed for the column is below. You can copy and paste this directly into the "Custom column formula" box:

    Json.Document(Web.Contents("https://api.census.gov/data/2018/pdb/blockgroup?get=Tot_Population_CEN_2010,avg_Agg_HH_INC_ACS_12_16&for=block%20group:" & [BLKGRP] & "&in=state:" & [STATE] & "%20county:" & [COUNTY] & "%20tract:" & [TRACT] & "&key=" & P_API))

Breaking this code down:
- Json.Document tells Power BI the value being returned is in JSON format, and decodes it.
- Web.Contents tells Power BI we are sending a web (API) request.
- https://api.census.gov/data/2018/pdb/blockgroup?get=Tot_Population_CEN_2010,avg_Agg_HH_INC_ACS_12_16 is our base URL with some query parameters; the parameters we pass ask for the population and the average household income.
- The second half of the URL takes the location from the current row; anything in [square brackets] is a column. P_APIKEY is the query we set up earlier that holds our API key.

This will add a new column holding a list of lists. Click on the word list to drill into it.

This will bring us to the second level: two lists. Transform this into a table by clicking the To Table button in the ribbon, under List Tools > Transform.
Once it is in table format, we expand the values to take them out of the lists. The data has two lists: the first is the column headers and the second is the values, so we need to transform it a little to see the data in a nice format.
    Firstly, expand the values by clicking the expand button and select Extract Values.

This will extract everything into one column, separated by a character of our choice. I'm going to use the caret symbol (^), as I'm confident it will not already be a character in the data. To do this, change the first box to –Custom– then type in the caret symbol.

After extracting the data from the lists, we can split it into columns. We will split on the delimiter we added, the caret symbol (^). Click Home > Split Column > By Delimiter.

Same as before, change the first box to –Custom– then type in the caret symbol. Leave the default setting of "At each occurrence of the delimiter".

Now that it is split, promote the first row to headers by going to the Transform tab and clicking "Use first row as headers".


    Finalizing the Function

As a result, this returns lots of columns. Highlight the Tot_Population_CEN_2010 and avg_Agg_HH_INC_ACS_12_16 columns, right click and select "Remove other columns".
Rename the two columns to "Population" and "Household Income" by double clicking each column header and typing the new name.
Finally, highlight both columns, go to the Transform tab and click "Detect Data Type" to set the data types.

    Optional: If you want to add the original address in, Click Add Column then Custom Column. Enter the below code, which will concatenate the address to one value. You can name the column Address.

    P_Address & ", " & P_City & ", " & P_State & ", " & P_ZIP  

    Creating the Function

This query uses parameters, which enables us to convert it to a function. To do this, right click on the query in the Queries pane on the left and select Create Function.

Now we have a function where we can input any address and return the estimated household income, as well as the population the average is taken from (the n value). To check multiple addresses, you can apply the function to any list of addresses. This can be done from the Add Column ribbon by clicking the Invoke Custom Function button, which returns a table for each row. Before expanding, it is important to handle errors, otherwise they could break the query. One option is to right click the column header, select the Replace Errors option, and type the text null.
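
As a sketch, invoking the function over a table of addresses and replacing errors might look like the following in the Advanced Editor; Addresses, GetHouseholdIncome and the column names are hypothetical placeholders for your own table, function and columns.

    // Addresses is a hypothetical table with Street, City, State and ZIP columns
    #"Invoked Custom Function" = Table.AddColumn(Addresses, "Income Data", each GetHouseholdIncome([Street], [City], [State], [ZIP])),
    // Replace any errors with null before expanding the returned tables
    #"Replaced Errors" = Table.ReplaceErrorValues(#"Invoked Custom Function", {{"Income Data", null}})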

    Final Function

    For those who like M code, here is the final function. You can copy and paste this directly into the advanced editor (See this article on how to do this).

    let
        Source = (P_Address as any, P_City as any, P_State as any, P_ZIP as text) => let
        
                Source = Json.Document(Web.Contents("https://geocoding.geo.census.gov/geocoder/geographies/address?street=" & P_Address & "&city=" & P_City & "&state=" & P_State & "&zip=" & P_ZIP & "&benchmark=Public_AR_Census2010&vintage=Census2010_Census2010&layers=13&format=json")),
                result = Source[result],
                addressMatches = result[addressMatches],
                addressMatches1 = addressMatches{0},
                geographies = addressMatches1[geographies],
                #"Census Blocks" = geographies[Census Blocks],
                #"Census Blocks1" = #"Census Blocks"{0},
                #"Converted to Table" = Record.ToTable(#"Census Blocks1"),
            #"Filtered Rows1" = Table.SelectRows(#"Converted to Table", each ([Name] = "BLKGRP" or [Name] = "COUNTY" or [Name] = "STATE" or [Name] = "TRACT")),
                #"Filtered Rows" = Table.SelectRows(#"Filtered Rows1", each ([Name] = "BLKGRP" or [Name] = "COUNTY" or [Name] = "STATE" or [Name] = "TRACT")),
                #"Pivoted Column" = Table.Pivot(#"Filtered Rows", List.Distinct(#"Filtered Rows"[Name]), "Name", "Value"),
            #"Added Custom" = Table.AddColumn(#"Pivoted Column", "Custom", each Json.Document(Web.Contents("https://api.census.gov/data/2018/pdb/blockgroup?get=State_name,County_name,Tot_Population_CEN_2010,avg_Agg_HH_INC_ACS_12_16&for=block%20group:" & [BLKGRP] & "&in=state:" & [STATE] & "%20county:" & [COUNTY] & "%20tract:" & [TRACT] & "&key=" & P_APIKEY))),
            Custom = #"Added Custom"{0}[Custom],
            #"Converted to Table1" = Table.FromList(Custom, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
            #"Extracted Values" = Table.TransformColumns(#"Converted to Table1", {"Column1", each Text.Combine(List.Transform(_, Text.From), "^"), type text}),
            #"Split Column by Delimiter" = Table.SplitColumn(#"Extracted Values", "Column1", Splitter.SplitTextByDelimiter("^", QuoteStyle.Csv), {"Column1.1", "Column1.2", "Column1.3", "Column1.4", "Column1.5", "Column1.6", "Column1.7", "Column1.8"}),
            #"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Column1.1", type text}, {"Column1.2", type text}, {"Column1.3", type text}, {"Column1.4", type text}, {"Column1.5", type text}, {"Column1.6", type text}, {"Column1.7", type text}, {"Column1.8", type text}}),
            #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
            #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"State_name", type text}, {"County_name", type text}, {"Tot_Population_CEN_2010", Int64.Type}, {"avg_Agg_HH_INC_ACS_12_16", Currency.Type}, {"state", Int64.Type}, {"county", Int64.Type}, {"tract", Int64.Type}, {"block group", Int64.Type}}),
            #"Removed Other Columns" = Table.SelectColumns(#"Changed Type1",{"Tot_Population_CEN_2010", "avg_Agg_HH_INC_ACS_12_16"}),
            #"Renamed Columns" = Table.RenameColumns(#"Removed Other Columns",{{"Tot_Population_CEN_2010", "Population"}, {"avg_Agg_HH_INC_ACS_12_16", "Houshold Income"}})
        in
            #"Renamed Columns"
    in
        Source

    If you like the content from PowerBI.Tips please follow us on all the social outlets to stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel.  Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
    Store Merchandise

  • Stacked Bar Chart

    Stacked Bar Chart

This week we are building a stacked bar chart on https://Charts.PowerBI.Tips. While you can build this type of chart within Power BI Desktop, this video is more of an example showing how you can build a similar custom visual chart. There are two areas that I'd like to point out.

    1. When working with a chart you can add a Legend to a shape object (2:40)
    2. Adding a grand total requires an additional Glyph (3:14)

While these two tips are pretty simple, they really do help when building more complex custom visuals.

    Watch the Tutorial

    Download this Custom Visual

    If you liked this visual and want to download it, head over to this repository to download.

    Learn More About Custom Visuals

    We have been working hard to help you learn how to make custom visuals. Check out our full YouTube Playlist to learn more about using Charts.PowerBI.Tips.

    If you like the content from PowerBI.Tips please follow us on all the social outlets. Stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel.  Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:

    Check out the new Merch!

    Hasta La Vista Data
    Go Ahead Make My Data
    PBIX Hat


  • Act like the business, Think like I.T.

    Act like the business, Think like I.T.

This month at our Power BI User Group in Milwaukee, Seth and I walk through some of the basics of Power BI. We discuss how we can leverage Power BI to act like the business and think like I.T., cover the concept of global and local measures within a data model, and share some tips and tricks for working in the Power Query editor. Check out the presentation we recorded for this month.

    YouTube Video

    If you like the content from PowerBI.Tips please follow us on all the social outlets. Stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel.  Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:

    Check out the new Merch!

    Hasta La Vista Data
    Go Ahead Make My Data
    PBIX Hat