Creating In-App Notifications in Model-Driven Apps the Easier Way – Quick Review

Hi Folks,

In-app notifications are trending these days, and many customers are showing interest in implementing them for their businesses.

So, in this blog post, I am going to show you the easiest way to generate an in-app notification using XrmToolBox in a few clicks. Use the tool below to generate one.

So, let me walk you through it step by step.

Step 1: Open the In App Notification Builder tool in XrmToolBox.

Step 2: In-app notifications are a setting enabled at the app level, meaning that if you have developed multiple model-driven apps, you can enable in-app notifications individually for each of them.

Step 3: In the above snapshot, select the app for which you want to enable in-app notifications. The red bubble beside it indicates that in-app notifications are not enabled for this app.

Step 4: We need to enable it by clicking on the red icon itself; you should then get a prompt like the one below.

Step 5: Upon confirming the dialog box, in-app notifications will be enabled for that app, and the red icon turns green as below, indicating that in-app notifications are enabled.

Now that in-app notifications are enabled for the app, we can proceed with the remaining setup.

Step 6: Give a meaningful title and body for your in-app notification. Also select the required toast type, expiry duration, and icon. Then click on the Add icon and choose the action to be performed when the in-app notification is clicked.

Step 7: You can even choose the type of action to be performed…

For example, let's choose to open as a dialog and show a list view.

Your screen should look something like below

Step 8: Once done, click on Create, and that's it: you have now created an in-app notification. Now let's test this with a user who has privileges to access this app.

If the user doesn't have the required privileges, you will face this error…

Log in with the user account for which the in-app notification is triggered.

Hurray! That's it; see how easy it was to create an in-app notification in a low-code manner.

You can even get the code behind this…

However, there are other ways to trigger in-app notifications from a pro-code angle; let's discuss those as well.

In this case, you first need to manually turn the in-app notification feature on from the settings of the model-driven app, as below.

Notifications can be sent using the SendAppNotification message via the SDK.

You can trigger a similar notification from any of the following, based on your convenience.

Client Scripting

var systemuserid = '<user-guid>';
var data = {
    "actions": [
        {
            "data": {
                "url": "?pagetype=entitylist&etn=account&viewid=00000000-0000-0000-00aa-000010001002",
                "navigationTarget": "dialog"
            },
            "title": "Link to list of notifications"
        }
    ]
};
var notificationRecord = {
    'title': 'Learning In App Notification',
    'body': `In-App Notifications in Model-Driven Apps are messages or alerts designed to notify users of important events or actions within the app. These notifications appear directly inside the application, providing a seamless way to deliver information without relying on external methods such as emails.`,
    'ownerid@odata.bind': '/systemusers(' + systemuserid + ')',
    'icontype': 100000003, // Warning
    'toasttype': 200000000, // Timed
    'ttlinseconds': 1209600, // 14 days
    'data': JSON.stringify(data)
};
Xrm.WebApi.createRecord('appnotification', notificationRecord).then(
    function success(result) {
        console.log('notification created with single action: ' + result.id);
    },
    function (error) {
        console.log(error.message);
        // handle error conditions
    }
);

Plugin/SDK

var notification = new Entity("appnotification")
{
    ["title"] = "Learning In App Notification",
    ["body"] = "In-App Notifications in Model-Driven Apps are messages or alerts designed to notify users of important events or actions within the app. These notifications appear directly inside the application, providing a seamless way to deliver information without relying on external methods such as emails.",
    ["ownerid"] = new EntityReference("systemuser", new Guid("00000000-0000-0000-0000-000000000000")),
    ["icontype"] = new OptionSetValue(100000003), // Warning
    ["toasttype"] = new OptionSetValue(200000000), // Timed
    ["ttlinseconds"] = 1209600, // 14 days
    ["data"] = @"{
        ""actions"": [
            {
                ""data"": {
                    ""url"": ""?pagetype=entitylist&etn=account&viewid=00000000-0000-0000-00aa-000010001002"",
                    ""navigationTarget"": ""dialog""
                },
                ""title"": ""Link to list of notifications""
            }
        ]
    }"
};
service.Create(notification);
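Alternatively, the notification can be raised through the SendAppNotification message itself instead of creating an appnotification row directly. Below is a minimal sketch, assuming the documented message parameter names (Title, Recipient, Body, IconType, ToastType, Expiry); do verify them against the current SDK documentation before relying on this.

// Minimal sketch: raise the same notification via the SendAppNotification message.
var request = new OrganizationRequest("SendAppNotification")
{
    ["Title"] = "Learning In App Notification",
    ["Recipient"] = new EntityReference("systemuser", new Guid("00000000-0000-0000-0000-000000000000")),
    ["Body"] = "A notification raised through the SendAppNotification message.",
    ["IconType"] = new OptionSetValue(100000003), // Warning
    ["ToastType"] = new OptionSetValue(200000000), // Timed
    ["Expiry"] = 1209600 // time to live, in seconds
};
service.Execute(request);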

Power Automate:

You should design your Power Automate flow something like the below to trigger a similar notification.

Note: Currently, in-app notifications are supported only in model-driven apps.

Reference:

In App Notification Documentation

Hope this saves some of your time…

Cheers,

PMDY

Simplify Power BI Management with Environment Variables

Introduction

Power Platform solutions often rely on dynamic configuration data, like Power BI workspace IDs, report URLs, or API endpoints. Environment variables make it easier to manage such configurations, especially in managed solutions, without hardcoding values. This blog will walk you through the steps to update a Power BI environment variable in managed solutions, focusing on switching to the correct workspace directly within the Power BI integration when working across different environments.

What are Environment Variables in Power Platform?

Before we dive into the steps, let’s quickly cover what environment variables are and their role in solutions:

  • Environment Variables are settings defined at the environment level and can be used across apps, flows, and other resources in Power Platform.
  • They store values like URLs, credentials, or workspace IDs that can be dynamically referenced (see the sketch after this list).
  • In managed solutions, these variables allow for configuration across multiple environments (e.g., development, testing, production).
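
Because environment variable definitions and their current values are stored in Dataverse tables (environmentvariabledefinition and environmentvariablevalue), they can also be read programmatically. Here is a minimal sketch of how server-side SDK code (using the Microsoft.Xrm.Sdk and Microsoft.Xrm.Sdk.Query namespaces) might resolve a variable by schema name; the schema name new_PowerBIWorkspaceId is a hypothetical example, not one from this post.

// Minimal sketch: resolve an environment variable's current value (or default) by schema name.
// "new_PowerBIWorkspaceId" is a hypothetical schema name used purely for illustration.
string GetEnvironmentVariableValue(IOrganizationService service, string schemaName)
{
    var query = new QueryExpression("environmentvariabledefinition")
    {
        ColumnSet = new ColumnSet("defaultvalue"),
        Criteria = { Conditions = { new ConditionExpression("schemaname", ConditionOperator.Equal, schemaName) } }
    };
    var valueLink = query.AddLink("environmentvariablevalue", "environmentvariabledefinitionid",
        "environmentvariabledefinitionid", JoinOperator.LeftOuter);
    valueLink.Columns = new ColumnSet("value");
    valueLink.EntityAlias = "v";

    var results = service.RetrieveMultiple(query);
    if (results.Entities.Count == 0) return null;

    var definition = results.Entities[0];
    // Prefer the environment-specific value; fall back to the default value.
    var currentValue = definition.GetAttributeValue<AliasedValue>("v.value")?.Value as string;
    return currentValue ?? definition.GetAttributeValue<string>("defaultvalue");
}

// Usage: var workspaceId = GetEnvironmentVariableValue(service, "new_PowerBIWorkspaceId");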

Why Update Power BI Environment Variables in Managed Solutions?

Updating environment variables for Power BI in managed solutions ensures:

  • Simplified Management: You don’t need to hardcode workspace or report IDs; you can simply update the values as needed.
  • Better Configuration: The values can be adjusted depending on which environment the solution is deployed in, making it easier to scale and maintain.
  • Dynamic Reporting: Ensures that Power BI reports or dashboards are correctly linked to the right workspace and data sources.
  • Best and Recommended: Changing the environment variables to point to the right workspace is the correct, Microsoft-recommended way to point your Power BI report to the respective workspace.

Prerequisites

Before proceeding with the update, ensure you meet these prerequisites:

  1. You have the necessary permissions to edit environment variables and manage solutions.
  2. The Power BI integration is already set up within your Power Platform environment.
  3. You have a managed solution where the Power BI environment variable is defined.

Steps to Update a Power BI Environment Variable in Managed Solutions

Step 1: Navigate to the Power Platform Admin Center
Step 2: Open the Solution in Which the Environment Variable is Defined
  • Go to Solutions in the left navigation menu.
  • Select the Managed Solution that contains the Power BI environment variable you need to update.
Step 3: Find the Environment Variable
  • In the solution, locate Environment Variables under the Components section.
  • Identify the Power BI environment variable (such as API URL or workspace ID) that you need to modify.
Step 4: Click on Dashboards to Update the Workspace
  • To update the Power BI environment variable related to the workspace, click on Dashboards.
  • Find the existing environment variable tied to the workspace and click to edit it.
  • Here, you’ll see the current workspace configuration for the Power BI resource.
Step 5: Update the Workspace ID
  • In the environment variable settings, you will now change the workspace to the new one.
  • Select the appropriate workspace from the list or manually enter the new workspace ID, ensuring it aligns with the target environment (development, production, etc.).
  • If necessary, update other properties like report or dataset IDs based on your environment needs.
Step 6: Save and Apply Changes
  • After updating the workspace and any other relevant properties, click Save.
  • The environment variable will now reflect the new workspace or configuration.
Step 7: Publish the Solution
  • If you’re using a managed solution, ensure that the updated environment variable is properly published to apply the changes across environments.
  • You may need to export the solution to other environments (like test or production) if applicable.
Step 8: Test the Changes
  • After saving and publishing, test the Power BI integration to ensure that the updated workspace is correctly applied.
  • Check the relevant Power BI reports, dashboards, or flows to confirm that the new workspace is being used.

Best Practices

  • Document Changes: Always document the updates to environment variables, including what changes were made and why.
  • Use Descriptive Names: When defining environment variables, use clear and descriptive names to make it easy to understand their purpose.
  • Cross-Environment Testing: After updating environment variables, test them in different environments (dev, test, prod) to ensure consistency and reliability.
  • Security Considerations: If the environment variable includes sensitive information (like API keys), make sure it’s properly secured.

Conclusion

Updating Power BI environment variables in managed solutions allows you to maintain flexibility while keeping your configurations centralized and dynamic. By following the steps outlined in this blog post, you can efficiently manage workspace IDs and other key configuration data across multiple environments. This approach reduces the need for hardcoded values and simplifies solution deployment in Power Platform.

Cheers,

PMDY

Dataverse – Git Integration – Preview – Quick Review

Hi Folks,

This post is about Dataverse and Git integration, one of the most sought-after features in today's automation era. This is a preview feature; you would need to create a new environment with early access enabled to test it, or you can use an existing US Preview environment.

Every MDA (model-driven application) and its components can be safely moved across environments using solutions with the help of Azure DevOps pipelines. However, when it came to integrating Power Platform solutions with Azure DevOps, we had to manually export and download the solution every single time we wanted to commit the solution artifacts to an Azure DevOps repo.

With this new preview feature, we can directly integrate Power Platform solutions with Azure DevOps.

Let's see this in action… wait a moment, there are some prerequisites to be considered first…

  1. The environment should be a Managed Environment, and you need to be an admin for the environment.
  2. An Azure DevOps subscription and license should be available to set this up, along with permission to read source files and commits from a repo (you should be a member of the Contributors group in Azure DevOps).
  3. The email address you use for Azure DevOps and for Power Platform should be the same.

Setup:

Connecting Dataverse with Azure DevOps is easy but requires a bit of understanding of the Binding options available.

Well, there are two types of binding options:

  1. Environment Binding – a single root folder binds to all the unmanaged solutions in the environment
  2. Solution Binding – each solution uses a different root folder in Azure DevOps for binding

Note: Once the binding is set up, there isn't a way to change it, so set this up carefully; otherwise you may need to delete the folder and create a new one in Azure DevOps.

Let's see them one by one… for demo purposes, I have created two projects in my Azure DevOps instance.

  1. Solution Binding: When we use this, all the components will be available as pending changes
  2. Environment Binding: When we use this, all the unmanaged solution components will be mapped to one Azure DevOps root folder. Let’s set this up.

Currently we are able to use only Solution Binding, as Environment Binding doesn't show any changes to be committed, but there is a catch here.

We can still set up Environment Binding and verify whether the solution components get marked as pending changes or not. Do note that setting up the binding is a one-time activity for an environment; once set up, it can't be changed from one type to another.

Open https://make.powerapps.com, navigate to Solutions, and click on the ellipsis as below.

Then click on Connect to Git.

Since we are using Environment Binding here, let's select the Connection Type as Environment.

Then click on Connect. Once connected, you should see an alert message at the top of the Power Apps maker portal.

Now create a new solution as below named ecellors Solution

Verify the integration by clicking on Git Integration as below

It should show as below

Now let's add a few components to the solution we created.

Once added, let’s publish the unmanaged solution and verify it..

Look closely; you should see a Source Control icon (highlighted in yellow for illustration).

Also, you should see a commit option available at the top

You should now be able to commit the solution components as if you are committing the code changes.

It also specifies the branch to which we are committing…

Unlike pushing code to Azure DevOps, pushing the changes takes a few minutes, and the time depends on the number of solution components you are pushing. Once it is done, it will show a commit message like the one below…

Now let's verify our Azure DevOps repo. For this, you can go back to the main solutions page and click on Git Connection at the top…

After clicking on Git Connection, click on the link to Microsoft Azure DevOps as below

You should then be navigated to the Azure DevOps folder as below, where all the solution files are tracked component-wise.

Now we will move back to Power Apps maker portal and make some changes to any of the components inside the solution…

Let’s say, I just edited the flow name and created a new connection reference, saved and published the customizations.

If you made some changes at the Azure DevOps repo level, you can come back and click on Check for updates; if there are any conflicts between the changes made in Azure DevOps and the components in the solution, they will be shown as conflicts.

We now have 3 component changes and all were listed here…you can click on Commit.

As soon as the changes are committed, you should see a message saying Commit Successful and 0 Changes, 0 Updates, 0 Conflicts.

You have now successfully integrated Dataverse solution components with Azure DevOps, with no manual intervention required when deploying solutions using Azure DevOps Pipelines.

Hope you learned something new today… the feature is still in preview and only available for early release, and a couple of issues still need to be fixed by Microsoft.

I have tested this feature by creating an environment in the US Preview region. It will be of good value to projects using automation, and the solution repository can be further deployed to other environments using Azure DevOps Pipelines.

This will be rolled out next year.

Cheers,

PMDY

Understanding Dataverse search in Dynamics 365 – Quick Review

Hi Folks,

One of my colleagues asked about Dataverse search, hence I am writing this article on Dataverse search in Dynamics 365; at the end, I will compare the different search options available in Dynamics 365.

Dataverse Search:

In layman's terms, Dataverse search is a powerful search tool that helps you find information quickly across your organization's data in Microsoft Dataverse (the underlying data platform for apps like Power Apps, Dynamics 365, and more) and shows you all the related information from across different tables or records in one place.

In short, Dataverse search is the evolved version of Relevance Search, offering a more robust, faster, and user-friendly search experience, including search results for text in documents that are stored in Dataverse, such as PDF, Microsoft Office documents, HTML, XML, ZIP, EML, plain text, and JSON file formats. It also searches text in notes and attachments. Before enabling it, just note that once Dataverse search is enabled, it affects all your model-driven apps.

It is on by default; here is where you can turn Dataverse search on or off:

  1. Navigate to https://admin.powerplatform.com
  2. Click on Environments –> Choose your required environment –> Settings –> Features
  3. Disable/Enable the Dataverse search feature.

Once enabled, we need to configure the tables for Dataverse search so that indexing is performed on the backend. To do this…

  1. Navigate to https://make.powerapps.com, select your desired solution –> Click on Overview as shown below

Now choose Manage search index, then select your desired tables and fields. There isn't a limit on the number of tables you can configure, but there is a limit on the number of fields you can configure for an environment: a maximum of 1,000 fields is permitted, including both system and custom fields. 50 fields are used by the system, so you can configure up to 950 fields.

Just note that some field types are treated as multiple fields in the Dataverse search index, as indicated in the table below.

Field type | Number of fields used in the Dataverse search index
Lookup (customer, owner, or Lookup type attribute) | 3
Option Set (state or status type attribute) | 2
All other types of fields | 1
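
For example, indexing 3 lookup columns, 2 option set columns, and 10 text columns on a table would consume 3 × 3 + 2 × 2 + 10 × 1 = 23 of the 950 configurable fields.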

At the bottom of the snapshot above, you can see the percentage of columns indexed in this environment.

When Dataverse search is enabled, the search box is always available at the top of every page in your app. You can start a new search and quickly find the information that you’re looking for.

When Dataverse search is turned on, it becomes your default and only global search experience for all of your model-driven apps. You won’t be able to switch to quick find search also known as categorized search.

You can also enable Quick actions as shown in the below table

Table | Quick actions
Account | Assign, Share, Email a link
Contact | Assign, Share, Email a link
Appointment | Mark complete, Cancel, Set Regarding, Assign, Email a link
Task | Mark complete, Cancel, Set Regarding, Assign, Email a link
Phone Call | Mark complete, Cancel, Set Regarding, Assign, Email a link
Email | Cancel, Set Regarding, Email a link

Here is the short table comparing all types of searches in Dynamics 365…

Enabled by default?
  • Dataverse search: Yes. Note: For non-production environments an administrator must manually enable it.
  • Quick Find: Yes, for the table grid. No, for multiple-table quick find (categorized search); an administrator must first disable Dataverse search before multiple-table grid find can be enabled.
  • Advanced Find: Yes.

Single-table search scope
  • Dataverse search: Not available in a table grid. You can filter the search results by a table on the results page.
  • Quick Find: Available in a table grid.
  • Advanced Find: Available in a table grid.

Multi-table search scope
  • Dataverse search: There is no maximum limit on the number of tables you can search.
  • Quick Find: Searches up to 10 tables, grouped by a table.
  • Advanced Find: Multi-table search not available.

Search behavior
  • Dataverse search: Finds matches to any word in the search term in any column in the table.
  • Quick Find: Finds matches to all words in the search term in one column in a table; however, the words can be matched in any order in the column.
  • Advanced Find: Query builder where you can define search criteria for the selected row type. Can also be used to prepare data for export to Office Excel so that you can analyze, summarize, or aggregate data, or create PivotTables to view your data from different perspectives.

Searchable columns
  • Dataverse search: Text columns like Single Line of Text, Multiple Lines of Text, Lookups, and Option Sets. Doesn't support searching in columns of Numeric or Date data type.
  • Quick Find: All searchable columns.
  • Advanced Find: All searchable columns.

Search results
  • Dataverse search: Returns the search results in order of their relevance, in a single list.
  • Quick Find: For single-table, returns the search results in a table grid. For multi-table, returns the search results grouped by categories, such as accounts, contacts, or leads.
  • Advanced Find: Returns search results of the selected row type with the columns you have specified, in the sort order you have configured.

Hope you learned something today…if you have any questions, do let me know in the comments…

Cheers,

PMDY

Setup Copilot in a Model-driven app – Quick Review

Hi Folks,

Wondering how you can enable Copilot in a Dynamics 365 model-driven app? Then you have come to the right place. A few days ago, I was trying to use it but couldn't, hence this blog post from my experience.

There are a few things to configure for Copilot to respond to your queries, so I will be talking about those in this blog post today. Let's get started…

Copilot in model-driven Power Apps has been in preview since July 2023.

Prerequisite: You must have a non-production environment with Dataverse database, apps, and data.

Step 1: Go to Power Platform Admin Center –> Select the environment –> Settings –> Product –> Features –> Select On for AI-powered experience, as highlighted below. If you are an app maker and want to try it for yourself, you would also need to check the option highlighted in yellow below.

Step 2: Go to Power Platform Admin Center –> Select the environment –> Settings –> Product –> Behaviour –> Select Monthly channel or Auto for the Model-driven app release channel option and click Save.

Step 3: This step is important; in this task, we configure a Dataverse table and its columns for Copilot.

Go to Power Apps and make sure that you have the correct environment.

Select tables and navigate to the respective table for which you want to enable Copilot capability.

Step 4: Here I am using the OOB Account entity; you can choose whichever entity you wish to set up.

Step 5: Navigate to Properties for the Account table as below

Step 6: Choose settings as highlighted below and click on save.

Step 7: Open the Account table and go to Views.

Step 8: In this step, we need to configure the Quick Find view and add the necessary fields to the view so that they are searchable by Copilot. Add the fields that your users will be searching for in Copilot.

Step 9: Make sure the fields are added to the view, then save and publish.

That’s it, the configuration is done.

Step 10: In this step, we will test Copilot by opening the app in which the configured entity is available. Click on the Copilot icon as highlighted below; this shows the Copilot chat window.

Step 11:

Test 1: Prompt: How many Accounts are there which Primary Contact starting with H? It responded correctly, as shown below.

Test 2: Prompt: Show Accounts whose Annual Revenue is more than 300,000? It responded correctly, as shown below.

Hope this helps you set up Copilot for your model-driven apps. I will leave it to you to try this out.

Make sure you give all the details in the prompt itself; Copilot will not store the previous response, meaning you can't continue the conversation by providing information in bits and pieces. You can set up the same for your custom entities as well; just make sure you add the fields to the Quick Find view of that entity.

It is not recommended for production environments as it is still a preview feature. In case a response is not accurate, you can report it to Microsoft by hitting thumbs up or thumbs down and providing the relevant feedback.

A lot more is to come in the upcoming days; learning the different aspects of Copilot has become a necessity these days.

That’s it for today…hope this helps…

Cheers,

PMDY

Execution Timeout Expired. The timeout period elapsed prior to completion of the operation, or the server is not responding – Troubleshooting timeouts in Power BI

Hi Folks,

When I was working with my Power BI reports, I suddenly started encountering this error. I didn't have any clue other than the error message I could see in Power BI Desktop, as below. Initially I thought there could be some problem connecting to the SQL endpoint of my Dataverse connection, but that wasn't it.

The error message above clearly says that the queries are blocked. I then quickly started reviewing the model of the Power BI report to see if there were any issues, such as with the relationships, but I couldn't find anything. Since I was using a SQL connection to Dataverse, I tried to increase the Command timeout in minutes (the maximum value being 120 minutes) from the Advanced options of my connection, but I still got the same error.

Cause: I then noticed that in my model I had fetched the same table data using both DirectQuery and Import mode. So, when refreshing, because of the relationships, the imported table was dependent on the DirectQuery one.

Fix: After review, the unnecessary DirectQuery table was removed, and voilà, that fixed the issue.

If anyone is facing the same problem, I strongly recommend you review the Semantic Model of your Power BI Report.

Cheers,

PMDY

Start Transitioning your Dynamics 365 Client Applications to use Dataverse Client

Hi Folks,

This blog post deals with what you need to do for your client applications, specifically to use the Dataverse ServiceClient API instead of the existing CrmServiceClient (CoreAssemblies) API.

Below are 3 reasons cited by Microsoft for why we need to be aware of this move.

1. Cross-platform application support: With the introduction of Microsoft.PowerPlatform.Dataverse.Client, the new Dataverse ServiceClient supports cross-platform applications.

2. MSAL authentication: The new Dataverse ServiceClient API uses MSAL, while the older CrmServiceClient API uses ADAL, and ADAL.NET is no longer supported.

3. Performance and functional benefits: We can have one authentication handler per web service connection instead of just one per process. The Dataverse Service Client class supports a smaller interface surface, inline authentication by instance, and Microsoft.Extensions.Logging.ILogger.

What’s the impact?

  • Plug-ins or custom workflow activities – no changes
  • New or existing online applications – changes are needed but not immediately…
  • On-premises applications – this article is not for you, yet

So, this impacts online client applications only. You really don't need to worry much about this: the class member signatures of ServiceClient and CrmServiceClient are the same, except for the class names themselves being slightly different, so application code should not need any significant changes.

As of now, no changes to your code are required, but keep in mind that in the future the CRM 2011 service endpoint will be deprecated, and this change will become mandatory.

So, what should you do to incorporate this change?

Use the Microsoft.PowerPlatform.Dataverse.Client package from NuGet instead of CrmSdk.CoreAssemblies.

Add a using statement for the Microsoft.PowerPlatform.Dataverse.Client namespace.

Use ServiceClient instead of CrmServiceClient; ServiceClient implements IOrganizationService just like CrmServiceClient does, so it can be passed wherever your code expects an organization service. A minimal before-and-after sketch is shown below.
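
Here is a minimal sketch of the switch, assuming a connection string kept in configuration; the URL and connection string below are placeholders rather than values from this post.

// Old: CrmServiceClient from Microsoft.Xrm.Tooling.Connector (CrmSdk.CoreAssemblies/XrmTooling)
// using Microsoft.Xrm.Tooling.Connector;
// var client = new CrmServiceClient(connectionString);

// New: ServiceClient from the Microsoft.PowerPlatform.Dataverse.Client package
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;

var connectionString = "AuthType=OAuth;Url=https://yourorg.crm.dynamics.com;..."; // placeholder
var client = new ServiceClient(connectionString);

// ServiceClient implements IOrganizationService, so existing code keeps working.
IOrganizationService service = client;
var response = service.Execute(new Microsoft.Crm.Sdk.Messages.WhoAmIRequest());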

Be strategic to minimize the impact to your apps.

Cheers,

PMDY

Unable to profile Custom Workflow using Profiler – Quick Fix

Hi Folks,

I am a big fan of Power Automate… but this post is not about flows; it is about custom workflows in Dynamics 365 CE.

Did you ever come across a problem where you were not able to debug a custom workflow extension? I came across this, and this blog post is all about it… I had successfully registered my custom workflow, but it was not triggering at all.

So, I needed to debug it to see what the exact issue was… and that's when I encountered this error.

The error message says Duplicate workflow activity group name: ‘EcellorsDemo.Cases(1.0.0.0) (Profiled)‘. So, I checked my code, plugin steps, and any activated plugins, but couldn't find any duplicates.

Usually, while debugging your custom workflow using the profiler, your workflow goes into draft mode and another copy of the same workflow gets created with (Profiled) appended to the name. However, in my case, I didn't see this behavior, and at the same time, I was unable to use the profiler after the first profiling session; it gave me the error shown above.

To resolve this, just delete the plugin assemblies that you can find in the default solution, as highlighted below…

Once you have deleted them, try to debug the custom workflow again and voilà!

Hope this helps someone troubleshooting Custom workflow…!

Cheers,

PMDY

Debug Plugins with Dataverse Browser – Quick Recap

Hi Folks,

This post is for all who are working on D365 Model Driven Apps and mainly Plugins.

Yes, you saw it right: in this blog post, we will see how we can debug a plugin without using our favorite plugin profiler, which has been very widely used for quite some time by everyone working on plugins for Dynamics 365. All of this is done with a tool called Dataverse Browser, which is not yet on XrmToolBox. Please note that there are some limitations, as detailed in the limitations section below.

Here are 4 simple steps to follow:

  1. Install Dataverse Browser
  2. Attach the Debugger
  3. Run your actual operation.
  4. Step into your code and debug it.

The tool embeds a web browser based on Chromium. It works by translating Web API requests into SDK requests. It then analyzes whether plugin steps are registered on the message, loads them, and makes them run locally. All other requests are sent to Dataverse, so that the plugins interact with the real database.

Download the latest source code of Dataverse browser here.

Next extract the zip file downloaded as highlighted below

Once extracted, open the Dataverse.Browser application as highlighted below.

In the popup window, click on More info as highlighted below…

Then run the application anyway…you will be presented with a window where you can select the environment. Going forward, any time you want to open Dataverse browser, just open the Dataverse.Browser.exe and choose the environment as below.

Click on New and key in the details as below.

  • Enter the settings of your environment:
    • A name meaningful for you
    • The host name of your instance (without the https://)
    • The path to the plugins assembly file (the dll). For a better experience, it should be compiled in debug mode with the pdb file generated.

Then click Go.

You just need to Authenticate to your instance.

Once authenticated to the respective model-driven app, all the Web API requests sent to Dataverse will be shown as below.

I have the following plugin libraries registered.

The next step is to choose the instance and perform the operation that triggers the plugin. So here, I will perform an update on the Account entity from Dataverse Browser, which triggers the plugin.
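
For illustration, the kind of plugin being debugged here could be as simple as the following hypothetical post-operation step on the Account update message; this is a minimal sketch for placing breakpoints, not the actual EcellorsDemo assembly.

using System;
using Microsoft.Xrm.Sdk;

public class AccountPostUpdatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        if (context.MessageName != "Update" || !context.InputParameters.Contains("Target"))
            return;

        var target = (Entity)context.InputParameters["Target"];

        // A convenient line to set a breakpoint on while debugging with Dataverse Browser.
        tracing.Trace("Post-operation update on {0} ({1})", target.LogicalName, target.Id);
    }
}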

Once an update is performed, a Web API request gets recorded in the Dataverse browser as highlighted below.

The plugin is registered in the post-operation stage, i.e., stage number 40.

Just expand the Patch request; you should see two operations, on stages 30 and 40, but the area of interest here is the plugin registered on stage 40.

Make sure you have Visual Studio open, then perform the below steps from Dataverse Browser.

Attach the debugger from Dataverse Browser by clicking on the plug symbol as below, which shows the list of debugger options available for you to select from. Here I have selected Execute plugins, so the plugin will be invoked. You can select any of the three options presented below.

1. Do not execute plugins – recommended when you want to debug without actually triggering your plugin logic; with this approach you can even check the code in a production environment.

2. Execute plugins / Execute plugins with auto break – recommended when you want to debug by triggering your actual plugin; this is suitable when your plugin code has changed recently, and for development environments.

Just select Ecellors Demo – Microsoft Visual Studio: Visual Studio Professional 2022, which will launch the existing Visual Studio 2022 instance in break mode as below. Next, click on Continue as highlighted below or press F5 on your keyboard.

When you navigate back to Dataverse Browser, a dialog asking you to place your breakpoints confirms that the debugger has been attached.

Now place breakpoints in your code in Visual Studio, then go back to Dataverse Browser and click OK on the dialog box.

Perform the operation that triggers the plugin from Dataverse Browser itself; this will hit the breakpoint in Visual Studio, from where you can debug your plugin.

As you might have observed, your code doesn't need to throw an exception in order to debug; you can work much the same way you would when debugging with the profiler, except that you don't need to deploy the latest code to Dataverse just for debugging purposes.

This gives a lot more flexibility and eases the way you debug plugins.

Limitations:

  • There is no support for transactions.
  • When plugins are triggered because of a server-side operation, they will not be run locally.
  • For many reasons, behavior will never be perfectly similar to the one when plugins are executed on server side.

Happy debugging, I hope you found this post useful…

References:

Dataverse Dev Browser

Cheers,

PMDY

Connecting to your Dataverse instance to run SQL Queries without using XrmToolBox

Hi Folks,

Did you know that you can connect to your Dataverse DB right from your old toolbox, SSMS? An Express edition would be more than enough to try this out. We possibly didn't think of it, but yes, we can… so let's see that in this blog post.

Open SSMS..

1. Select Server type as Database Engine.

2. Enter the Server name as the environment URL from your Power Platform Admin Center, as below.

3. Key in those details as below, making sure to select the Authentication method as Azure Active Directory – Universal with MFA.

Once you click on Connect, you will be prompted for authentication via browser.

Once your sign-in is successful, you should see something like the below.

That's it; that's how simple it is to connect to your Dataverse instances…

Having said that it's easy to connect to Dataverse, not all operations available in normal Transact-SQL are supported by Dataverse SQL. You can see it says Read-Only beside the instance name, which means you don't have any capability to modify data from SQL.

That's because Dataverse SQL is a subset of Transact-SQL. If you want to see which statements are supported and which are not, just go to this link to find out.
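
For example, a simple read query such as SELECT TOP 10 name, revenue FROM account ORDER BY name (using the standard account table) runs fine, whereas INSERT, UPDATE, and DELETE statements are not supported.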

This opens a whole lot of opportunities to explore, so don’t forget to check this out.

References:

Dataverse SQL and Transact SQL

Cheers,

PMDY