Paste JSON/XML as classes in Visual Studio – Quick Tip

Hi Folks,

In this post, I will show you how you can quickly generate classes for your JSON and XML payloads in Power Platform projects using Visual Studio.

Sometimes there will be requirements where you need to convert and replace your Power Automate flows with custom code, using either plugins or custom actions. In such cases, you will likely need to parse the responses returned by REST API calls and create the relevant classes to hold the parameters and attributes. Creating these manually is cumbersome and takes a few minutes even for a good developer.

Here I am taking the example using JSON.

So, without further ado, let's see this in action.

Step 1: Copy the JSON using the Ctrl + C shortcut. This is mandatory; otherwise, you will not see Paste JSON As Classes and Paste XML As Classes under the Edit menu.

{
  "orderId": "ORD123456",
  "customerName": "John Doe",
  "orderDate": "2024-04-27T08:30:00Z",
  "items": [
    {
      "itemId": "ITEM001",
      "itemName": "Product A",
      "quantity": 2,
      "unitPrice": 25.99
    },
    {
      "itemId": "ITEM002",
      "itemName": "Product B",
      "quantity": 1,
      "unitPrice": 35.50
    }
  ],
  "totalAmount": 87.48,
  "shippingAddress": {
    "street": "456 Elm St",
    "city": "Metropolis",
    "zipcode": "54321",
    "country": "USA"
  },
  "status": "Shipped"
}

Step 2: Open Visual Studio and go to Edit –> Paste Special.

Step 3: Click on Paste JSON As Classes, and you should soon see something like the below.
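For reference, the classes Visual Studio generates for the JSON above look roughly like this (the exact class and property names may vary slightly by Visual Studio version):

```csharp
public class Rootobject
{
    public string orderId { get; set; }
    public string customerName { get; set; }
    public DateTime orderDate { get; set; }
    public Item[] items { get; set; }
    public float totalAmount { get; set; }
    public Shippingaddress shippingAddress { get; set; }
    public string status { get; set; }
}

public class Shippingaddress
{
    public string street { get; set; }
    public string city { get; set; }
    public string zipcode { get; set; }
    public string country { get; set; }
}

public class Item
{
    public string itemId { get; set; }
    public string itemName { get; set; }
    public int quantity { get; set; }
    public float unitPrice { get; set; }
}
```

You can then deserialize an API response straight into these classes, for example with System.Text.Json: `JsonSerializer.Deserialize<Rootobject>(json)`.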

That's it! Your classes are now generated from the copied JSON, and you can do much the same thing with XML.

Hope this helps someone trying to achieve a similar goal…

Cheers,
PMDY

#02 – Copilot Learn Series: Test and Publish your bot

Thanks for visiting my blog. This post is a continuation of my previous blog post on creating a Copilot; if you haven't gone through that, I strongly recommend checking my introductory post on this topic, which you can find here.

In this blog post, we will see how you can test and publish your bot, completing your bot development.

Your bot can be tested in the Test chat pane, where messages are displayed on the chat screen.

Step 1: Test your Copilot:

The bot calls the topics based on the trigger phrases you have entered as below.

You can return to the authoring canvas for the topic at any time to revise its conversation path. The Test chat pane will automatically refresh itself when you save changes to a topic.

As you fine-tune your bot, it can be useful to enable tracking between topics so you can follow through the conversation path step by step.

Step 2: Publish your bot

Once you confirm that everything is good, you can publish your Copilot.

Publishing your bot helps you engage with your customers on multiple platforms or channels.

Each time you make an update, you need to publish the bot in Copilot Studio; this publishes the changes to all your configured channels. If you haven't configured any channels yet, proceed to the next step to learn how.

Step 3: Configure Channels

You can see which channels are currently supported by selecting Manage and going to the Channels tab in the side navigation pane. Each channel may require different setup and configuration.

Channel settings

Step 4: Finally, let's see what the Copilot looks like when you embed it...

a. Navigate to channels as highlighted from publish tab:

b. Verify if the channels are enabled:

If a channel is not enabled, make sure you set up authentication properly for that channel before embedding your bot. For simplicity, I have chosen No authentication.

c. Find the embed code:

As shown above, you can find the embed code under Share your website.

d. Try the embed code in a browser:

You can test your Copilot in your browser by pasting your embed code; it should look something like the below.
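The embed code itself is just an HTML iframe. A minimal sketch with hypothetical placeholder values (always copy the exact snippet Copilot Studio gives you, which contains your real environment and bot IDs):

```html
<!-- Placeholder src: use the exact URL from the embed code on the Channels page -->
<iframe src="https://copilotstudio.microsoft.com/environments/{environment-id}/bots/{bot-id}/webchat"
        frameborder="0" style="width: 100%; height: 600px;"></iframe>
```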

Step 5: Bot analytics

a. Note that there might be a delay of up to one hour between when conversations occur and when the statistics for those conversations appear in the analytics views.

b. All the channels’ analytics will be logged here.

c. You can find a summary of how your bot is performing, understand how the topics you defined are performing, and derive CSAT scores from this.

Now that we have learnt how to build a simple Copilot, in the next blog post I will cover variable management and topic management in Copilots, which will help you design the desired conversation path for your chatbot.

Hope this helps…

Cheers,

PMDY

#01 – Copilot Learn Series – Getting started with understanding Copilot Studio and the basic building blocks of a Copilot (a.k.a. Power Virtual Agents)

In the next few blog posts in this series, I will be talking all about Microsoft Copilot, a.k.a. Power Virtual Agents, from beginner to advanced topics. You might see longer posts, but you don't need any prerequisite knowledge of Copilot to follow along.

So, let's get started and learn together in this blog series as we dive into the capabilities of Generative AI using #copilot.

Copilot empowers teams to quickly and easily create powerful bots using a guided, no-code graphical experience.

In this blog post, we will see how you can create a simple chatbot. Excited? Let's get started.

Step 1: Go to https://aka.ms/TryPVA to try it out.

Step 2: Click on Try free option available.

Step 3: Enter your email address and click on Next.

Step 4: If you have an existing Microsoft 365 subscription, you will see something like the below.

Step 5: Click on Sign in. I already had an account in Copilot Studio; if you don't have one, it will be created for you.

Step 6:

a. Once you click on Get Started, Copilot Studio opens in a new tab and you will be asked to log in once; enter your sign-in details.

b. If you previously created a Copilot Studio trial, you can continue using it, or extend it by 30 days if prompted. Click on Done at the bottom.

Step 7: Below is the home page of Copilot Studio.

I will be talking about each of the highlighted topics in great detail in the upcoming posts.

Note: Copilots are supported only in certain locations listed in the supported data locations, with data stored in respective data centers. If your company is located outside of the supported data locations, you need to create a custom environment with Region set to a supported data location before you can create your chatbot.

Step 8: In this step, we will create a Copilot, so click on New Copilot.

Step 9: I have provided a name and chosen a language. You can also provide an existing website to utilize the Generative AI capabilities built into Copilot Studio.

Here I have provided my blog address. Using this, Copilot can generate suggested topics, which you have the option to add to your existing topics.

Step 10:

a. Click on advanced options; you will be able to choose an icon for the Copilot

b. You can optionally include Lesson Topics

c. Choose an existing solution you wish to add this Copilot to

d. Choose an appropriate schema name.

e. You can optionally enable voice capabilities by choosing the language in which your Copilot will speak.

Step 11: You will be shown the below screen and within a few seconds, your Copilot will be ready.

Step 12: This is the main page of your Copilot; you will also be able to add additional Copilots or delete the Copilot.

Step 13: Now, let's understand Topics, one of the building blocks of a Copilot. Topics are nothing but predefined categories or subjects that help classify and organize KB/support articles.

The topics shown on the right are prebuilt topics for the Copilot you have just created. Here you may wish to create new topics as necessary.

Step 14: Trigger phrases are what the customer enters in the chat window to start the conversation, which then calls the relevant topic. There can be multiple trigger phrases for a single topic.

You can click to create a topic, which then asks you to provide the trigger phrases. When you add a topic, a Trigger phrases node and a blank Message node are inserted for you.

Copilot opens the topic in the authoring canvas and displays the topic’s trigger phrases. You can add up to 1000 topics in a Copilot.

Step 15: You can add additional nodes by selecting the Plus (+) icon on the line or branch between or after a node.

Screenshot of adding a node

Step 16: When adding a new node, you can choose from the options available below.

You can use any of the available options shown above:

a. Ask a question

If you want to ask a question and get a response from the end user, add a node and click on Ask a question.

For example, I chose Multiple choice options.

Based on what you enter in the Identify field, you can define the options the user may choose from. You can add further nodes to create branching logic.

b. Add a condition

You can add a condition in the canvas, as below, to make your Copilot branch out conditionally.

c. Call an action: The options shown below are self-explanatory. You can branch out with the possible options.

d. Show a message

You may choose to show a message to the user by entering your message in the text box available.

e. Go to another topic

f. End the conversation:

Finally, you can end the conversation by choosing one of the available options, or you can transfer to an agent to take the user's queries further.

Step 17:

Copilot conversations are all about natural language understanding. An entity is the fundamental piece of information that can be recognized from a user's input.

It can simply be thought of as a real-world subject, such as a person's name, a phone number, or a postal code. There are both system and custom entities available while building.

You can also build custom entities by choosing from the options available.

Now that you have seen the building blocks of a Copilot, in the upcoming blog posts we will look at how to test and publish your Copilots.

Thank you for reading.

Cheers,

PMDY

How do you deal with overlapping data labels in Power BI…? – Quick Tip

Hi Folks,

This post is a tip that I implemented in one of my projects, which can help improve your Power BI reports' accessibility.

Enabling data labels is a great way to show the numbers in a visual.

But what if they keep overlapping even though you ensured an optimal size and set the data labels to display at the outside end, like below? This decreases the report's accessibility.

There are two options for you…

1. Changing the theme colors

You can change themes by going to the View option if you would like to use the ones shipped with Power BI; if you want custom themes, you can download them from https://powerbi.tips/ and install them.

After changing the theme color, the data label is clear and readable, thereby increasing accessibility.

2. Enable background color and set the transparency

This is the other option, where you can enable the background for the data labels and set the transparency based on your requirement; it is good to set it to a low number, as below.

There it is; now your report is much easier to read, as users can make out the data labels.

Hope this helps someone trying to improve the readability and accessibility of their Power BI reports using data labels…

Cheers,

PMDY

All you need to know for migrating your Power Platform environments from one region to another

Geo migration is a great feature/flexibility offered by Microsoft for customers who wish to move to the region closest to their operations, even though their Power Platform environment was initially based in a different region when they signed up. I looked around online but couldn't find a good reference blog article yet, hence this post.

I will make this post a detailed yet comprehensive one, so that anyone can understand the migration. Customers who need to store data in multiple geographies to satisfy data residency requirements can also opt for Multi-Geo. If you don't know where your Power Platform environment resides, you can check from the Power Platform Admin Center.

If you weren't aware, Microsoft Azure offers services in more regions than AWS (Amazon Web Services) and GCP (Google Cloud Platform). The geo migration feature seamlessly allows customers to move their environments from one region to another within a single tenant. For Singapore, for example, it is as below.

Important:

1. Geo migration is not generally available, so please exercise caution.
2. You may reach out to your TAM (Microsoft Technical Account Manager) quoting your request.
3. There are several limitations; see the references below for more details.

Mandatory Pre-Migration Checklist:

1. Any Power Apps and Power Automate flows should be manually exported prior to the migration. Custom connectors aren't supported as of now; they must be manually reconfigured or recreated in the new environment. You can export them individually or as a group.
2. Canvas apps, custom pages, and code components such as PCF controls and libraries should be deleted from the environment before your migration activity starts; otherwise they might be in a corrupted state after the migration.
3. If any of your apps are not solution-aware for any reason (for example, an app calls a Power Automate flow when a button is clicked), you may need to explicitly export them and take a backup.

Post-Migration Checklist:

1. After the migration, import all the packages you backed up during pre-migration. Import the ones that were not solution-aware manually.
2. If you have Power Portals or Power Virtual Agents, those should be handled explicitly.
3. Make sure you test all functionalities so as not to impact end users.

Notes:

You don't need to build your apps and flows from scratch. The Dynamics 365 Marketing app is not supported yet. There could be some configuration changes post-migration.

While I have tried to present the information as accurately as the available Microsoft sources allow, it may change over time, and the specifics will vary as each customer has different workloads and dependencies on other services, so please read the references carefully before proceeding. Contact Microsoft Support or your TAM as necessary.

Hope this helps to get a sneak peek into the migration process.

References:

Where is your data stored?

MultiGeo Architecture

Dynamics 365 & Power Platform new regions

Advance Data Residency Move Program

Geo to Geo Migrations

Cheers,

PMDY

It looks like something went wrong error in Power Automate – Quick Fix

Hi Folks,

While I was working with Power Automate, I encountered this error. FYI, I was using the old designer UI. The fix I found was an easy one, and it could help someone else avoid digging deeper and troubleshooting by changing the flow. I tried refreshing the page a couple of times as suggested in the error message, but the issue remained.

To solve this, just grab the Power Automate URL from your browser's address bar… you should see something like the below.

Just replace it with the one below.

While this could be an intermittent issue, it can persist for a few minutes or so, so this tip may help someone.

Cheers,

PMDY

Execution Timeout Expired. The timeout period elapsed prior to completion of the operation, or the server is not responding – Troubleshooting timeouts in Power BI

Hi Folks,

When I was working on my Power BI reports, I suddenly started encountering this error. I had no clue other than the error message I could see in Power BI Desktop, as below. Initially I thought there could be some problem connecting to the SQL endpoint of my Dataverse connection, but that wasn't it.

The error message above clearly says that the queries are blocked. I quickly started reviewing the model of the Power BI report to see if there were any issues, for example with the relationships, but I couldn't find anything. Since I was using a SQL connection to Dataverse, I tried increasing the command timeout in minutes (the maximum value being 120 minutes) from the Advanced options of my connection, but I still got the same error.

Cause: I then noticed that in my model I had fetched the same table's data twice, once using DirectQuery and once using Import mode. So, when I was refreshing, because of the relationships, the imported table depended on the DirectQuery one.

Fix: After the review, the unnecessary DirectQuery table was removed, and voilà, that fixed the issue.

If anyone is facing the same problem, I strongly recommend reviewing the semantic model of your Power BI report.

Cheers,

PMDY

Enabling TDS End Point for Dataverse (Preview Feature) from Power Platform Admin Center and its advantages

Hi Folks,

Exciting news…

Here is how you can enable TDS End Point in Dataverse…

1. Navigate to the Power Platform Admin Center: https://admin.powerplatform.microsoft.com/home
2. Navigate to the environment for which you want to enable the TDS endpoint.
3. Choose your environment and go to Settings.
4. In the Settings window, select Product and then Features.
5. Scroll down to the TDS endpoint setting and enable the toggle.
6. Once this is enabled, you can also grant user-level access to the TDS endpoint by configuring the security role, as in the step below.
7. Open Security from the Power Platform Admin Center, navigate to the available security roles, go to Miscellaneous privileges, and search for "tds"; you will find a privilege named Allow user to access TDS endpoint.

Advantages:

1. With the TDS endpoint enabled, you can directly access the data in Dataverse tables using SSMS (preview) and Power BI.
2. The interesting part is that the Dataverse security model is applied to the data being viewed by the user.
3. That is, whenever you query Dataverse data using SSMS, the user's security role is applied.
4. In the same way, if a Power BI report is built using the TDS (SQL) endpoint, any user who accesses the report will see only the data they can access based on their current security roles in Dataverse.
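To illustrate point 1, here is a minimal sketch: connect SSMS to your environment URL on port 5558 (for example, yourorg.crm.dynamics.com,5558, where "yourorg" is a hypothetical org name), authenticate with your Microsoft Entra ID account, and query a standard table:

```sql
-- Read-only query against the standard Dataverse 'account' table;
-- the rows returned are filtered by the querying user's security roles.
SELECT TOP 10 name, accountid, createdon
FROM account
ORDER BY createdon DESC;
```

Note that the TDS endpoint is read-only, so only SELECT statements work here.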

This is a cool feature for anyone trying to bring Dataverse security into Power BI without building row-level security, which has been a mystery till now.

Hope this helps…

Cheers,

PMDY

Adjusting the time zone in Power BI DAX – Quick Tip

Hi Folks,

Here is the use case: I was pulling data from a Model-Driven App. While testing my report and cross-validating the counts against CRM, I noticed that the record counts were mismatched by exactly the time zone difference. If you are in any time zone other than UTC and use date-difference functions, you may encounter this problem with Power BI, and you should definitely handle it.

The Power BI Service runs in the UTC time zone. We can't set the time zone in either Power BI Desktop or the Power BI Service; we have to adjust for the time difference ourselves, that's it.

Say you have a measure that calculates a date difference: consider adding the time zone offset in your DAX measures or in Power Query.

Here I have adjusted it in DAX, as below.

Since I was in the Singapore time zone (SGT), I added the UTC + 8 offset using the TIME function available in Power BI, and there you go…
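As an illustration (this is not my exact formula, and 'Orders'[createdon] is a hypothetical table and column), the adjustment is simply adding the offset to the UTC datetime:

```dax
// Calculated column: shift the stored UTC value to Singapore time (UTC+8)
Local Created On = 'Orders'[createdon] + TIME ( 8, 0, 0 )
```

For time zones behind UTC, subtract the TIME ( … ) value instead of adding it.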

Now the numbers should match exactly with those in your data source, i.e., Power Apps.

If you want to do this in Power Query instead, based on your scenario, you can follow the article in the references.

Hope this helps someone…

References:

Edit time zone in Power Query

Cheers,

PMDY

Delete audit log information from Power Platform Admin Center effortlessly

Hi Folks,

Did you know that you can set the audit log information to be auto-deleted from the admin center? Yes, you can do this from the Power Platform Admin Center. Of course, this tip is a very small one, but not knowing about the feature can cost you a lot of time figuring out audit log deletion.

Open the Power Platform Admin Center: https://admin.powerplatform.microsoft.com/

Select an environment –> Navigate to Settings –> Audit Settings

As highlighted below, you can specify a custom number of days after which your audit logs will be deleted.

Then click on Save, available at the bottom of the screen.

Hope someone would find this useful…

Cheers,

PMDY