My top 3 favorite features released at Build 2024 for Canvas Apps…

Hi Folks,

This sounds good both for Pro Devs and for those working on Fusion teams (Pro + Low Code). Just note that all of these features are in preview or experimental, and they are available only in the US Preview region right now, as they were just released at Microsoft Build 2024 last week. Public preview is expected in June 2024, so you can try them out in your own region then. If you want to check them out now, spin up a trial in the US Preview region.

Here are the top new features:

  1. View the Code behind the Canvas Apps
  2. Use Copilot to generate comments and expressions for your Canvas Apps
  3. Integrate Canvas Apps with GitHub

Feature #1: View the Code behind the Canvas Apps

Now you can view the code behind your Canvas Apps. Beside the screen where the components reside, click on the ellipsis as below.

You should be able to see the YAML source code for your Canvas App. The code is currently read-only; you can click the Copy code option at the bottom of the page shown above.
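
For orientation, here is a rough, hypothetical sketch of what the copied YAML for a screen containing a container and a label might look like (the control names and exact schema are placeholders and may differ from what your app produces):

Screen1:
  Children:
    - Container1:
        Control: GroupContainer
        Properties:
          Height: =Parent.Height
          Width: =Parent.Width
        Children:
          - Label1:
              Control: Label
              Properties:
                Text: ="Hello from YAML"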

If you wish, you can make the necessary changes to the YAML code, create a new blank screen, and then paste the YAML into it to recreate the screen you copied it from.

Here I will copy the code for the container inside the screen, and then I will create a new blank screen.

Once the blank screen is added, expand it so that it occupies the entire size of the app, then click on Paste as below.

Give it a minute and your new screen is ready with the container inside, as below. Here it was Screen3; just rename it accordingly.

How easy was that… just make sure you paste into the relevant item, meaning if you copied the code of a container, you can only paste it into another container and not onto a screen.

Feature #2: Use Copilot to generate comments and expressions for your Canvas Apps

Do you want to generate comments for the expressions you wrote? Or have you forgotten the logic you wrote in a Canvas App a long time back? Don’t worry, use this approach.

Let’s say I am choosing the OnSelect property, where I have the below formula.

Let’s ask Copilot what this means; click on the Copilot icon available as below.

Choose to explain the formula

Now click on the Copy option and paste the explanation above your formula; this serves as a comment for your expression. You can try this for any complex expression you wrote in your app. It improves the readability of your app, and makers can use existing knowledge to quickly get up to speed, minimize errors, and build fast the next time they work on the same app.

You can also generate Power Fx using Copilot. Start typing what you need in natural language as a comment, and as soon as you stop typing it shows a generating indicator as below. You can use either // or /* */ comments, and the comments can remain in the formula bar as documentation, just like with traditional code.
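
As a minimal, hypothetical sketch (the table and column names are placeholders, not from the actual app), the pattern is a natural-language comment followed by the kind of formula Copilot might suggest:

// Show only the accounts located in Hyderabad
Filter(Accounts, 'Address 1: City' = "Hyderabad")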

It generates the Power Fx command for your input as below; you then need to press the Tab key on your keyboard to accept it, and it will show something like below.

And finally, you can see the output as below.

You can apply these two tips for complex formulas as well.

Feature 3: Integrate Canvas Apps with GitHub

Did you ever notice that when a canvas app is opened by one user and another user tries to open the same app, a warning message appears and you need to explicitly click Override to take it forward? In other words, at any point in time only one person could work on the Canvas App.

Now, with the Git integration feature enabled, we get first-class DevOps: many people can work on the same canvas app at the same time, and the canvas app source is committed to Git. Let’s see this.

Prerequisites:

  1. You need to have a GitHub repo created; creating branches is optional, as we can otherwise use the main branch.
  2. Enable the experimental feature as below

Then you should see

Next, you need to configure Git version control as below. You can use either GitHub or Azure DevOps for this. I want to create a new directory for storing my canvas app, like GitHub Test, which does not yet exist in my GitHub account.

You need to go to your GitHub account settings to create a new token as below.

For the personal access token, give repo-level scope and click Generate token.

Paste the personal access token in the Connect to a Git repository window; once authenticated, you should see a message like below.

Click Yes, and you should see something like below.

Within a minute, you should see the below screen.

You should then see the code files being created in your GitHub account as below.

Now your team can make changes in GitHub; since Git allows multiple people to work simultaneously, the latest commit will be reflected whenever you open the Canvas App from the maker portal. This helps developers build and commit to source control without leaving the maker portal. Whenever you open this app, it will ask for your Git account credentials.

Do note that these features are currently available in the US Preview region, as they were just released last week at Build, and they will be released to other regions in the near future, possibly in June 2024.

Hope you learned something new. More coming up next month or sooner…

That’s it for today…

Cheers,

PMDY

Setup Copilot in a Model-driven app – Quick Review

Hi Folks,

Wondering how you can enable Copilot in a Dynamics 365 Model-Driven App…? Then you have come to the right place. A few days ago I was trying to use it but couldn’t, hence this blog post from my experience.

There are a few things to configure before Copilot will respond to your queries, so that is what I will be talking about in this blog post today. Let’s get started…

Copilot in model-driven Power Apps has been in preview since July 2023.

Prerequisite: You must have a non-production environment with Dataverse database, apps, and data.

Step 1: Go to Power Platform Admin Center –> Select the environment –> Settings –> Product –> Features –> Select On for AI-powered experience, as highlighted below. If you are an app maker and want to try it for yourself, you also need to check the option highlighted in yellow below.

Step 2: Go to Power Platform Admin Center –> Select the environment –> Settings –> Product –> Behavior –> Select Monthly channel or Auto for the Model-driven app release channel option and click Save.

Step 3: This step is important: here we configure a Dataverse table and its columns for Copilot.

Go to Power Apps and make sure that you have the correct environment.

Select tables and navigate to the respective table for which you want to enable Copilot capability.

Step 4: Here I am using the OOB Account entity; you can choose whichever entity you wish to set up.

Step 5: Navigate to Properties for the Account table as below

Step 6: Choose settings as highlighted below and click on save.

Step 8: Open the Account table and go to Views.

Step 9: In this step, we need to configure the Quick Find view: add the necessary fields to the view so they are searchable by Copilot. Add the fields your users will be searching for with Copilot.

Step 10: Make sure the fields are added to the view, then save and publish.

That’s it, the configuration is done.

Step 11: In this step, we will test Copilot by opening the app in which the configured entity is available. Click on the Copilot icon as highlighted below; this opens the chat window for Copilot.

Step 12:

Test 1: Prompt: How many Accounts are there whose Primary Contact starts with H? It showed the result correctly, as below.

Test 2: Prompt: Show Accounts whose Annual Revenue is more than 300,000. It showed the result correctly, as below.

Hope this helps you set up Copilot for your Model-Driven Apps. I will leave it to you to try this out.

Make sure you give all the details in the prompt itself; Copilot cannot retain the previous response, meaning you can’t continue the conversation by providing information in bits and pieces. You can set up the same for your custom entities as well; just make sure you add the fields to the Quick Find view of that entity.

It is not recommended for production environments, as it is still a preview feature. In case a response is not accurate, you can report it to Microsoft by hitting thumbs up or thumbs down and providing the relevant feedback.

There is a lot more to come in the upcoming days; learning the different aspects of Copilot has become a necessity these days.

That’s it for today…hope this helps…

Cheers,

PMDY

Using Bulk Operations messages – #01 (Plugins)

Well, this could be a very interesting post, as we talk about optimizing Dataverse performance using bulk operation messages, and from Dataverse plugin customizations too. But wait, this post is not complete, because of an issue which I will talk about later in the blog. First, let’s dig into this feature by actually trying it out. Generally, every business wants improved performance for any logic attached to out-of-the-box messages, so developers try to optimize their code in various ways when using Dataverse messages.

Before diving deeper into this article, let’s first understand the differences between standard and elastic tables. If you want a short introduction to elastic tables, which were newly introduced last year, you can refer to my previous post on elastic tables here.

The type of table you choose to store your data has the greatest impact on how much throughput you can expect with bulk operations. You can choose between two types of tables in Dataverse; below are some key differences:

Standard and Elastic Table Differences:

  • Data structure: defined schema (standard) vs. flexible schema (elastic)
  • Storage: Azure SQL (standard) vs. Azure Cosmos DB (elastic)
  • Data integrity: ensured (standard) vs. less strict (elastic)
  • Relationship model: supported (standard) vs. limited (elastic)
  • Performance: predictable (standard) vs. variable (elastic); elastic is preferred for unpredictable and spiky workloads
  • Agility: limited (standard) vs. high (elastic)
  • Personalization: limited (standard) vs. extensive (elastic)

Plugins:

With bulk operation messages, the APIs being introduced are CreateMultiple, UpdateMultiple, DeleteMultiple (only for elastic tables) and UpsertMultiple (preview). As of now, you’re not required to migrate your plug-ins to use CreateMultiple and UpdateMultiple instead of the Create and Update messages. Your logic for Create and Update continues to be applied when applications use CreateMultiple or UpdateMultiple.

This is mainly done to prevent two separate sets of business logic for short-running and long-running operations. Microsoft has merged the message processing pipelines for these messages (Create with CreateMultiple, Update with UpdateMultiple), which means the Create and Update messages continue to trigger for your existing scenarios, and when applications move to CreateMultiple and UpdateMultiple your Create and Update steps still behave the same way.

A few points for consideration:

  1. While I have tested this, I can still see only IPluginExecutionContext providing the information, and I have noted that the Microsoft documentation suggests using IPluginExecutionContext4 for bulk messages in plugins, which is still being shown as null for me.
  2. While working with Create, Update, and Delete, you would use the Target property to get the input parameters; while working with bulk operation messages, you need to use Targets instead of Target.
  3. Instead of checking whether the target is an Entity, you check for an EntityCollection, loop through it, and perform your desired business logic.
  4. Coming to images in plugins, these are retrieved only when you use IPluginExecutionContext4.

Below is the image from the Plugin Registration Tool for reference (e.g. I have taken UpdateMultiple as the reference; you can utilize any of the bulk operation messages).

Sample:

Below is a sample of how your bulk operation message plugin can look… you don’t need to use all the contexts; I have used them here just to check that out.

using System;
using Microsoft.Xrm.Sdk;

namespace Plugin_Sample
{
    public class BulkMessagePlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IPluginExecutionContext2 context2 = (IPluginExecutionContext2)serviceProvider.GetService(typeof(IPluginExecutionContext2));
            IPluginExecutionContext3 context3 = (IPluginExecutionContext3)serviceProvider.GetService(typeof(IPluginExecutionContext3));
            IPluginExecutionContext4 context4 = (IPluginExecutionContext4)serviceProvider.GetService(typeof(IPluginExecutionContext4));
            ITracingService trace = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            // Bulk operation messages pass the records in the "Targets" parameter as an EntityCollection
            if (context.InputParameters.Contains("Targets") && context.InputParameters["Targets"] is EntityCollection entityCollection)
            {
                // Pre-images for bulk messages are exposed per record through IPluginExecutionContext4
                if (context4.PreEntityImagesCollection.Length == entityCollection.Entities.Count)
                {
                    int count = 0;
                    foreach (Entity entity in entityCollection.Entities)
                    {
                        EntityImageCollection entityImages = context4.PreEntityImagesCollection[count];

                        // Verify the expected entity image from the step registration
                        if (entityImages.TryGetValue("preimage", out Entity preImage))
                        {
                            bool entityContainsName = entity.Contains("fieldname");
                            bool entityImageContainsName = preImage.Contains("fieldname");

                            if (entityContainsName && entityImageContainsName)
                            {
                                string newName = (string)entity["fieldname"];
                                string oldName = (string)preImage["fieldname"];

                                // Only act when the 'fieldname' value has actually changed
                                if (newName != oldName)
                                {
                                    string message = $"\r\n - 'fieldname' changed from '{oldName}' to '{newName}'.";

                                    // If 'sample_description' is included in the update, do not overwrite it, just append to it.
                                    if (entity.Contains("sample_description"))
                                    {
                                        entity["sample_description"] = (string)entity["sample_description"] + message;
                                    }
                                    else // Not included in the update: start from the pre-image value and append.
                                    {
                                        entity["sample_description"] = (string)preImage["sample_description"] + message;
                                    }
                                }
                            }
                        }

                        count++;
                    }
                }
            }
        }
    }
}

I have posted this question to Microsoft to get more details on why IPluginExecutionContext4 is null; I am still not sure whether this simply hasn’t been deployed to my region yet, as my environment is in India.

Recommendations for Plugins:

  • Don’t introduce CreateMultiple, UpdateMultiple, or UpsertMultiple in a separate step, as that would cause your logic to fire twice: once for the Create operation and again for CreateMultiple.
  • Don’t use batch request types such as ExecuteMultipleRequest and ExecuteTransactionRequest in plugins, as user experience is degraded and timeout errors can occur.
  • Instead, use bulk operation messages like CreateMultipleRequest, UpdateMultipleRequest, and UpsertMultipleRequest (see the sketch after this list).
    • There is no need to use ExecuteTransactionRequest in synchronous plugins, as they already execute within the transaction.
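
    As a minimal sketch of that recommendation (assuming a couple of hypothetical account records and the IOrganizationService instance already available to your plugin as service; CreateMultipleRequest lives in Microsoft.Xrm.Sdk.Messages):

    // Build one EntityCollection containing all records of the same table
    var targets = new EntityCollection { EntityName = "account" };
    targets.Entities.Add(new Entity("account") { ["name"] = "Contoso" });
    targets.Entities.Add(new Entity("account") { ["name"] = "Fabrikam" });

    // One CreateMultipleRequest instead of many individual Create requests
    var request = new CreateMultipleRequest { Targets = targets };
    var response = (CreateMultipleResponse)service.Execute(request);

    // response.Ids holds the ids of the created records, in the same order as Targets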

    Hope this guidance will help someone trying to customize their Power Platform solutions using Plugins.

    I will write another blog post on using Bulk operation messages for Client Applications…

    Cheers,

    PMDY

    All you need to know for migrating your Power Platform environments from one region to another

    Geo Migration is a great feature and flexibility offered by Microsoft for customers who wish to move to the region closest to their operations, even though their Power Platform environment was initially based in a different region when they signed up. I checked online but couldn’t find a good reference blog article yet, hence this post.

    I will make this post a detailed but easy-to-follow one for anyone to understand the migration. Customers who need to store data in multiple geographies to satisfy data residency requirements can also opt for Multi-Geo. If you don’t know where your Power Platform environment resides, you can check from the Power Platform Admin Center.

    If you were not aware, Microsoft Azure offers services in more regions than AWS (Amazon Web Services) and GCP (Google Cloud Platform). The Geo Migration feature seamlessly allows customers to move their environments within a single tenant from one region to another, e.g. for Singapore it is as below.

    Important:

    1. Geo Migration is not generally available, so please exercise caution.
    2. You may reach out to your TAM (Microsoft Technical Account Manager) quoting your request.
    3. There are several limitations; see the references below for more details.

    Mandatory Pre-Migration Check list:

    1. Any Power Apps and Power Automate flows should be manually exported prior to the migration. Custom connectors aren’t supported as of now; they must be manually reconfigured or recreated in the new environment. You can export them individually or as a group.
    2. Canvas Apps, custom pages, and code components like PCF controls and libraries should be deleted from the environment before your migration activity starts; otherwise they might be in a corrupted state after the migration.
    3. If any of your apps are not solution-aware for any reason (for example, an app calls a Power Automate flow when a button is clicked), you may need to explicitly export it and take a backup.

    Post Migration Check list:

    1. After the migration, import all the packages you backed up during pre-migration. Those which were not solution-aware must be imported manually.
    2. If you have Power Portals or Power Virtual Agents, those should be exported explicitly.
    3. Make sure you test all functionality so as not to impact end users.

    Notes:

    You don’t need to build apps and flows from scratch. The Dynamics 365 Marketing app is not supported yet. There could be some configuration changes post-migration.

    While I try to put together this information as best I can from Microsoft sources, it may change over time, and the picture will differ for each customer, since every customer has different workloads and dependencies on other services, so please read the references carefully before proceeding. Contact Microsoft Support or your TAM as necessary.

    Hope this helps to get a sneak peek into the migration process.

    References:

    Where is your data stored?

    MultiGeo Architecture

    Dynamics 365 & Power Platform new regions

    Advance Data Residency Move Program

    Geo to Geo Migrations

    Cheers,

    PMDY

    Execution Timeout Expired. The timeout period elapsed prior to completion of the operation, or the server is not responding – Troubleshooting timeouts in Power BI

    Hi Folks,

    When I was working with my Power BI reports, I suddenly started encountering this error. I didn’t have any clue other than the error message I could see in Power BI Desktop, as below. Initially I thought there could be a problem connecting to the SQL endpoint of my Dataverse connection, but that wasn’t it.

    The error message above clearly says that the queries are blocked. I then quickly started reviewing the model of the Power BI report to see if there were any issues, such as with the relationships, but I couldn’t find anything wrong there. Since I was using a SQL connection to Dataverse, I tried to increase the Command timeout in minutes (the maximum value being 120 minutes) in the Advanced options of my connection, but I still got the same error.

    Cause: I then noticed that in my model I had fetched the same table data using both DirectQuery and Import mode. So, when I was refreshing, because of the relationships, the imported table depended on the DirectQuery one.

    Fix: After review, the unnecessary DirectQuery table was removed, and voila, that fixed the issue.

    If anyone is facing the same problem, I strongly recommend you review the Semantic Model of your Power BI Report.

    Cheers,

    PMDY

    Use environment variable to deploy different version of Power BI Reports across environments in Power Platform

    Hi Folks,

    Thank you for visiting my blog…in this post, we will see how we can create and manage a Power BI Environment variable in Model driven apps in Power Platform.

    So, let’s say we have two environments: 1. Dev and 2. Default. We want to export the solution with the Power BI report from the Dev environment as a managed solution and import it into the Default environment. The report in the Default environment should point to the Production workspace in Power BI.

    I have the following reports in workspaces.

    Development workspace:

    Production Workspace:

    Now, in order to deploy the report to Production, we need to use a managed solution, and the report should point to the Production workspace. To handle this, we need to define an environment variable to store the workspace information. So, let’s get started.

    First, we will create a Power BI embedded report in Development environment.

    While creating a Power BI embedded report, you will be presented with an option to choose from the Power BI workspaces.

    To achieve this requirement of deploying different versions of the Power BI report to different instances, we need to use an environment variable, so check the Use environment variable option.

    1. The environment variable will be specific to this report and should be included in the solution when we want to deploy the report to a higher environment.
    2. The next thing to note is that the Default workspace reflects the default value for this report, and the current value is required when we want to point to another report in a different environment.

    In Development environment, we choose as below..

    Once the environment variable is saved, we now have 1 Dashboard and 1 environment variable component in the solution.

    This solution is published and then exported as a managed solution, and imported into another environment (the Default environment, which serves as the Production environment here).

    While importing, it asks you to update the environment variable; you can proceed to click Import.

    Now we have the solution in Default environment.

    In order to update the report to take its value from the Production environment, we need to open the report and click on the pencil icon beside the Power BI environment variable.

    Then choose the Prod workspace and its respective report, and click Save and then Publish.

    That’s it…

    You will be able to see two different reports in your Development and Default instances.

    In this way, it is very easy to manage and deploy different versions of Power BI Report to different environments like Dev, Test, Prod.

    Hope this helps…

    Cheers,

    PMDY

    What are Named Formulas in Canvas Apps?

    Hi Folks,

    Most of us know how to declare variables in our programs… declaring a var variable is the simplest thing possible, whether in C#, JavaScript, or any scripting language.

    Did you know that we can declare variables similarly in Canvas Apps using Power Fx? It’s a feature which is now generally available: none other than named formulas.

    With named formulas, we can easily define and declare values, and they are only evaluated when required; you don’t need to initialize them beforehand, thus improving performance. You don’t even need a Var keyword while declaring one; you just name it (a minimal example follows the list below). Named formulas also offer the advantages below.

    • The formula’s value is always available. There is no timing dependency, no App.OnStart that must run first before the value is set, and no time at which the formula’s value is incorrect. Named formulas can refer to each other in any order, so long as they don’t create a circular reference. They can be calculated in parallel.
    • The formula’s value is always up to date. The formula can perform a calculation that is dependent on control properties or database records, and as they change, the formula’s value automatically updates. You don’t need to manually update the value as you do with a variable.
    • The formula’s definition is immutable. The definition in App.Formulas is the single source of truth, and the value can’t be changed somewhere else in the app. With variables, it is possible that some code unexpectedly changes a value, but this is not possible with named formulas. That doesn’t mean a formula’s value needs to be static – it can change – but only if its dependencies change.
    • The formula’s calculation can be deferred. Because its value is immutable, it can always be calculated when needed, which means it need not actually be calculated until it is needed. If the value is never used, the formula need never be calculated. Formula values that aren’t used until screen2 of an app is displayed need not be calculated until screen2 is visible. This can dramatically improve app load time, and it is declarative in nature.
    • Named formulas is an Excel concept. Power Fx leverages Excel concepts where possible, since so many people know Excel well.
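
    As a minimal sketch (the Orders data source and its OwnerEmail column are hypothetical), declaring named formulas in App.Formulas looks like this:

    TaxRate = 0.0825;
    UserEmail = User().Email;
    MyOrders = Filter(Orders, OwnerEmail = UserEmail);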

    Tip: Use App.Formulas instead of App.OnStart

    The best way to reduce loading time for both Power Apps Studio and your app is to replace variable and collection initialization in App.OnStart with named formulas in App.Formulas.

    Example without Named Formulas:

    ClearCollect(
        MySplashSelectionsCollection,
        {
            MySystemCol: First(
                Filter(
                    Regions,
                    Region = MyParamRegion
                )
            ).System.'System Name',
            MyRegionCol: First(
                Filter(
                    Regions,
                    Region = MyParamRegion
                )
            ).'Region Name',
            MyFacilityCol: ParamFacility,
            MyFacilityColID: LookUp(
                FacilitiesList,
                Id = GUID(Param("FacilityID"))
            ).Id
        }
    );

    Example with Named Formulas:

    MyRegion = LookUp(
        Regions,
        Region = MyParamRegion
    );
    MyFacility = LookUp(
        FacilitiesList,
        Id = GUID(Param("FacilityID"))
    );
    MySplashSelectionsCollection =
    {
        MySystemCol: MyRegion.System.'System Name',
        MyRegionCol: MyRegion.'Region Name',
        MyFacilityCol: ParamFacility,
        MyFacilityColID: MyFacility.Id
    };

    You can see the difference between the two: the one with named formulas is more readable while also improving your app’s performance. Isn’t that great?

    References:

    https://powerapps.microsoft.com/en-us/blog/power-fx-introducing-named-formulas/

    https://learn.microsoft.com/en-gb/power-apps/maker/canvas-apps/working-with-large-apps?WT.mc_id=5004279#use-appformulas-instead-of-apponstart

    Cheers,

    PMDY

    Start Transitioning your Dynamics 365 Client Applications to use Dataverse Client

    Hi Folks,

    This blog post is about what you need to do for your client applications, specifically to use the Dataverse ServiceClient API instead of the existing CrmServiceClient (CoreAssemblies) API.

    Below are 3 reasons cited by Microsoft for why we need to be aware of this move.

    1. Cross-platform application support: With the introduction of Microsoft.PowerPlatform.Dataverse.Client, the new Dataverse ServiceClient supports cross-platform scenarios.

    2. MSAL authentication: The new Dataverse ServiceClient API uses MSAL, while the older CrmServiceClient API uses ADAL. ADAL.NET is no longer supported.

    3. Performance and functional benefits: We can have one authentication handler per web service connection instead of just one per process. The Dataverse Service Client class supports a smaller interface surface, inline authentication by instance, and Microsoft.Extensions.Logging.ILogger.

    What’s the impact?

    • Plug-ins or custom workflow activities – no changes
    • New or existing online applications – changes are needed but not immediately…
    • On-premises applications – this article is not for you, yet

    So it impacts online client applications only. You don’t really need to worry much about this: the class member signatures of ServiceClient and CrmServiceClient are the same, except for the class names themselves being slightly different. Application code should not need any significant changes.

    As of now, no changes to your code are required, but it is better to keep in mind that in the future the CRM 2011 service endpoint will be deprecated and this change will become mandatory.

    So, what should you do to incorporate this change?

    Use the Microsoft.PowerPlatform.Dataverse.Client assemblies from NuGet instead of CrmSdk.CoreAssemblies.

    Add a using statement for Microsoft.PowerPlatform.Dataverse.Client.

    Use ServiceClient instead of CrmServiceClient; ServiceClient will give you your OrganizationService.

    Instead of newing up CrmServiceClient, instantiate ServiceClient, as in the sketch below.
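
    A minimal sketch (the environment URL and connection string details are placeholders; adjust them to your own setup):

    using Microsoft.PowerPlatform.Dataverse.Client;
    using Microsoft.Xrm.Sdk;

    // Hypothetical OAuth connection string - replace the URL and credentials with your own
    var connectionString = "AuthType=OAuth;Url=https://yourorg.crm.dynamics.com;Username=someone@yourorg.onmicrosoft.com;LoginPrompt=Auto";

    // ServiceClient replaces CrmServiceClient and implements IOrganizationService
    using var serviceClient = new ServiceClient(connectionString);
    IOrganizationService service = serviceClient;

    // Use the connection exactly as before, e.g. create an account record
    var accountId = service.Create(new Entity("account") { ["name"] = "Contoso" });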

    Be strategic to minimize the impact to your apps.

    Cheers,

    PMDY

    Unable to profile Custom Workflow using Profiler – Quick Fix

    Hi Folks,

    I am a big fan of Power Automate… but this post is not about flows; it is about a custom workflow in Dynamics 365 CE.

    Did you ever come across a problem where you were not able to debug a custom workflow extension? I came across this, and this blog post is all about it… I successfully registered my custom workflow, but it was not triggering at all.

    So I needed to debug it to see what the exact issue was, as I was encountering this error.

    The error message says Duplicate workflow activity group name: ‘EcellorsDemo.Cases(1.0.0.0) (Profiled)‘. So I tried to check my code, plugin steps, and activated plugins, but couldn’t find any duplicates.

    Usually, while debugging your custom workflow using the profiler, your workflow goes into draft mode and another copy of the same workflow gets created with (Profiled) appended to the name. However, in my case I didn’t see that behavior, and at the same time I was unable to use the profiler after the first profiling session; it gave me the error shown above.

    To resolve this, just delete the plugin assemblies which you can find in the default solution, as highlighted below…

    Once you have deleted these, try to debug the custom workflow again and voila!!!

    Hope this helps someone troubleshooting Custom workflow…!

    Cheers,

    PMDY

    Debug Plugins with Dataverse Browser – Quick Recap

    Hi Folks,

    This post is for all who are working on D365 Model Driven Apps and mainly Plugins.

    Yes, you saw it right: in this blog post we will see how we can debug a plugin without using our favorite plugin profiler, which has been very widely used for quite some time by everyone working on plugins for Dynamics 365. All of this is done with a tool called Dataverse Browser, which is not yet on XrmToolBox. Please note that there are some limitations, as detailed in the Limitations section below.

    Here are 4 simple steps to follow:

    1. Install Dataverse Browser
    2. Attach the Debugger
    3. Run your actual operation.
    4. Step into your code and debug it.

    The tool embeds a web browser based on Chromium. It works by translating Web API requests into SDK requests. It then analyzes whether plugin steps are registered on the message, loads them, and makes them run locally. All other requests are sent to Dataverse, so the plugins interact with the real database.

    Download the latest version of Dataverse Browser here.

    Next, extract the downloaded zip file as highlighted below.

    Once extracted, open the Dataverse.Browser application as highlighted below.

    In the popup window, click on More info as highlighted below…

    Then run the application anyway… you will be presented with a window where you can select the environment. Going forward, any time you want to open Dataverse Browser, just open Dataverse.Browser.exe and choose the environment as below.

    Click on New and enter the details as above.

    • Enter the settings of your environment:
      • A name meaningful for you
      • The host name of your instance (without the https://)
      • The path to the plugins assembly file (the dll). For a better experience, it should be compiled in debug mode with the pdb file generated.

    Then click Go.

    You just need to Authenticate to your instance.

    Once authenticated to the respective model-driven app, all the Web API requests sent to Dataverse will be shown as below.

    I have following Plugin Libraries registered.

    The next step is to choose the instance and perform the operation which triggers the plugin. So here, I will perform an update to the Account entity from the Dataverse Browser, which triggers the plugin.

    Once an update is performed, a Web API request gets recorded in the Dataverse browser as highlighted below.

    The plugin here is registered in the post-operation stage, i.e. stage number 40.

    Just expand the Patch request; you should see two operations, at stages 30 and 40, but the area of interest here is the plugin which was registered on stage 40.

    Make sure you open the Visual Studio and perform the below steps from Dataverse Browser.

    Attach the debugger from Dataverse Browser by clicking on the plug symbol as below, which will show the list of debugger options available for you to select from. Here I have selected Execute plugins, so the plugin will be invoked. You can select any of the three options presented below.

    1. Do not execute plugins – recommended when you want to debug without actually triggering your plugin logic, i.e. with this approach you can even check the code in a production environment.

    2. Execute plugins / Execute plugins with auto break – recommended when you want to debug by triggering your actual plugin; this is recommended when your plugin code has changed recently, and in development environments.

    Just select the Ecellors Demo – Microsoft Visual Studio: Visual Studio Professional 2022 option, which will launch the existing Visual Studio 2022 instance in break mode, as below. Next, click Continue as highlighted below or press F5 on your keyboard.

    When you navigate back to Dataverse Browser, it shows that the debugger has been attached and asks you to place your breakpoints.

    Now just place breakpoints in your code in Visual Studio, then go back to Dataverse Browser and click OK on the dialog box.

    Perform the operation which triggers the Plugin from Dataverse Browser itself, this will hit the break point in Visual Studio from where you can debug your plugin.

    As you might have observed, your code does not need to throw an exception in order to be debugged; you can work similarly to the way you would debug using the profiler, except here you don’t need to deploy the latest code to Dataverse just for debugging purposes.

    This gives a lot more flexibility and eases the way you debug plugins.

    Limitations:

    • There is no support for transactions.
    • When plugins are triggered because of a server-side operation, they will not be run locally.
    • For many reasons, behavior will never be perfectly similar to the one when plugins are executed on server side.

    Happy debugging, I hope you found this post useful…

    References:

    Dataverse Dev Browser

    Cheers,

    PMDY