When I was working with my Power BI reports, I suddenly started encountering this error. I had no clue other than the error message I could see in Power BI Desktop, shown below. Initially I thought there could be some problem connecting to the SQL endpoint of my Dataverse connection, but that wasn't it.
The error message above clearly says that the queries are blocked. I quickly started reviewing the model of the Power BI report to see if there were any issues, such as with the relationships, but I couldn't find anything. Since I was using a SQL connection to Dataverse, I tried increasing the command timeout in minutes (the maximum value being 120 minutes) from the Advanced options of my connection, but I still got the same error.
Cause: I then noticed that my model fetched the same table twice, once using DirectQuery and once using Import mode. So, during refresh, because of the relationships, the imported table was dependent on the DirectQuery table, which blocked the queries.
Fix: After review, I removed the unnecessary DirectQuery table, and voila, that fixed the issue.
If anyone is facing the same problem, I strongly recommend reviewing the semantic model of your Power BI report.
Thank you for visiting my blog…in this post, we will see how we can create and manage a Power BI Environment variable in Model driven apps in Power Platform.
So, let's say we have two environments: 1. Dev and 2. Default. We want to export the solution with the Power BI report from the Dev environment as a managed solution and import it into the Default environment. The report in the Default environment should point to the Production workspace in Power BI.
I have the following reports in workspaces.
Development workspace:
Production Workspace:
Now in order to deploy the report to Production, we need to use a managed solution and the report should point to Production workspace. So, in order to handle this, we will need to define an environment variable to store the workspace information. So, let’s get started.
First, we will create a Power BI embedded report in Development environment.
While creating a Power BI embedded report, you will be presented with an option to choose from the Power BI workspaces.
In order to meet this requirement of deploying different versions of a Power BI report in different instances, we need to use an environment variable, so check the Use environment variable option.
The environment variable will be specific to this report and should be included in the solution when we want to deploy this report to higher environment.
The next thing to note is that the Default value of the environment variable reflects the default workspace for this report, while the Current value is what we set when we want to point to a different report in another environment.
In Development environment, we choose as below..
Once the environment variable is saved, we now have 1 Dashboard and 1 environment variable component in the solution.
This solution is published, exported as a managed solution, and then imported into another environment (the Default environment, which serves as the Production environment here).
While importing, you will be asked to update the environment variable; you can proceed by clicking Import.
Now we have the solution in Default environment.
In order to update the report to point to the Production workspace, we need to open the report and click the pencil icon beside the Power BI environment variable.
Then choose the Prod workspace and its respective report, click Save, and then Publish.
That’s it…
You will be able to see two different reports in your Development and Default instances.
In this way, it is very easy to manage and deploy different versions of Power BI Report to different environments like Dev, Test, Prod.
Most of us know how to declare variables in our programs… declaring a var variable is about the simplest thing possible, whether in C#, JavaScript, or any other scripting language.
Did you know that we can declare variables similarly in Canvas Apps using Power Fx? The feature, now generally available, is none other than named formulas.
With named formulas, we can easily define and declare values that are evaluated only when required; you don't need to initialize them beforehand, which improves performance. You don't even need a keyword like var when declaring one; you just name it. Named formulas also offer the advantages below.
The formula’s value is always available. There is no timing dependency, no App.OnStart that must run first before the value is set, no time in which the formula’s value is incorrect. Named formulas can refer to each other in any order, so long as they don’t create a circular reference. They can be calculated in parallel.
The formula’s value is always up to date. The formula can perform a calculation that is dependent on control properties or database records, and as they change, the formula’s value automatically updates. You don’t need to manually update the value as you do with a variable.
The formula’s definition is immutable. The definition in App.Formulas is the single source of truth and the value can’t be changed somewhere else in the app. With variables, it is possible that some code unexpectedly changes a value, but this is not possible with named formulas. That doesn’t mean a formula’s value needs to be static – it can change – but only if dependencies change.
The formula's calculation can be deferred. Because its value is immutable, it can always be calculated when needed, which means it need not actually be calculated until it is needed. If the value is never used, the formula need never be calculated. Formula values that aren't used until screen2 of an app is displayed need not be calculated until screen2 is visible. This can dramatically improve app load time, and the approach is declarative in nature.
Named formulas is an Excel concept. Power Fx leverages Excel concepts where possible since so many people know Excel well.
Tip: Use App.Formulas instead of App.OnStart
The best way to reduce loading time for both Power Apps Studio and your app is to replace variable and collection initialization in App.OnStart with named formulas in App.Formulas.
Example without Named Formulas:
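The code from the original post was embedded as gists; below is a minimal sketch of the two approaches. The Orders data source and its Status column are assumptions for illustration. First, the classic imperative initialization in App.OnStart:

```powerfx
// App.OnStart – imperative; must run before the values are usable
Set(varUserEmail, User().Email);
Set(varGreeting, "Welcome " & User().FullName);
ClearCollect(colActiveOrders, Filter(Orders, Status = "Active"));
```

The same values expressed as named formulas in App.Formulas:

```powerfx
// App.Formulas – declarative; each value is computed lazily and stays up to date
UserEmail = User().Email;
Greeting = "Welcome " & User().FullName;
ActiveOrders = Filter(Orders, Status = "Active");
```

Note that the named formulas can reference each other in any order, and none of them is evaluated until a screen actually uses it.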
This blog post covers what you need to do for your client applications, specifically to use the Dataverse ServiceClient API instead of the existing CrmServiceClient (Core Assemblies) API.
Below are the reasons cited by Microsoft for this move, and why we just need to be aware of it.
1. Cross-platform application support: with the introduction of Microsoft.PowerPlatform.Dataverse.Client, the new Dataverse ServiceClient runs cross-platform.
3. Performance and functional benefits: We can have one authentication handler per web service connection instead of just one per process. The Dataverse Service Client class supports a smaller interface surface, inline authentication by instance, and Microsoft.Extensions.Logging.ILogger.
What’s the impact?
Plug-ins or custom workflow activities – no changes
New or existing online applications – changes are needed but not immediately…
On-premises applications – this article is not for you, yet
So, this means it impacts online client applications only. You really don't need to worry much about this: the class member signatures of ServiceClient and CrmServiceClient are the same, except for the class names themselves being slightly different. Application code should not need any significant changes.
As of now, no changes to your code are required, but keep in mind that in the future the CRM 2011 service endpoint will be deprecated, and this change will become mandatory.
So, what should you do to incorporate this change?
Use the following assemblies from NuGet instead of CrmSdk.CoreAssemblies:
Add the below using statement to use Microsoft.PowerPlatform.Dataverse.Client
Use ServiceClient instead of CrmServiceClient; ServiceClient implements IOrganizationService, so it provides your organization service.
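Putting those steps together, a minimal connection sketch might look like the following. The organization URL and connection-string details are placeholders, not from the original post; substitute your own environment's values.

```csharp
using System;
using Microsoft.Crm.Sdk.Messages;                 // WhoAmIRequest / WhoAmIResponse
using Microsoft.PowerPlatform.Dataverse.Client;   // ServiceClient
using Microsoft.Xrm.Sdk;                          // IOrganizationService

// Placeholder connection string – replace with your environment's values.
var connectionString =
    "AuthType=OAuth;Url=https://yourorg.crm.dynamics.com;LoginPrompt=Auto";

using var client = new ServiceClient(connectionString);

// ServiceClient implements IOrganizationService, so code previously
// written against CrmServiceClient needs little more than a rename.
IOrganizationService service = client;
var response = (WhoAmIResponse)service.Execute(new WhoAmIRequest());
Console.WriteLine($"Connected as user {response.UserId}");
```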
I am a big fan of Power Automate… but this post is not about flows; it is about custom workflow extensions in Dynamics 365 CE.
Did you ever come across a situation where you were not able to debug a custom workflow extension? I did, and this blog post is all about it… I successfully registered my custom workflow, but it was not triggering at all.
So, I needed to debug it to see what the exact issue was, as I was encountering this error.
The error message says Duplicate workflow activity group name: 'EcellorsDemo.Cases(1.0.0.0) (Profiled)'. I checked my code, my plugin steps, and any activated plugins, but couldn't find any duplicates.
Usually, while debugging your custom workflow using the Profiler, your workflow goes into Draft mode and another copy of the same workflow gets created with '(Profiled)' appended to its name. However, in my case I didn't see that behavior, and at the same time I was unable to use the Profiler after the first profiling session; it gave me the error shown above.
To resolve this, just delete the profiler's plugin assemblies, which you can find in the Default solution, as highlighted below…
Once you have deleted this, try to debug the custom workflow and voila!!!
Hope this helps someone troubleshooting Custom workflow…!
This post is for all who are working on D365 Model Driven Apps and mainly Plugins.
Yes, you saw it right: in this blog post, we will see how we can debug a plugin without using our favorite Plugin Profiler, which has been widely used for quite some time by everyone working on plugins for Dynamics 365. All this is done by a tool called Dataverse Browser, which is not yet on XrmToolBox. Please note that there are some limitations, as detailed in the Limitations section below.
Here are the simple steps to follow:
Install Dataverse Browser
Attach the Debugger
Run your actual operation.
Step into your code and debug it.
The tool embeds a web browser based on Chromium. It works by translating Web API requests into SDK requests. It then analyzes whether plugin steps are registered on the message, loads them, and runs them locally. All other requests are sent to Dataverse, so the plugins interact with the real database.
Download the latest release of Dataverse Browser here.
Next, extract the downloaded zip file and open the Dataverse.Browser application as highlighted below.
In the popup window, click on More info as highlighted below…
Then run the application anyway… You will be presented with a window where you can select the environment. Going forward, any time you want to open Dataverse Browser, just open Dataverse.Browser.exe and choose the environment as below.
Click on New and key in the details as below.
Enter the settings of your environment:
A name meaningful for you
The host name of your instance (without the https://)
The path to the plugins assembly file (the dll). For a better experience, it should be compiled in debug mode with the pdb file generated.
Then click Go.
You just need to Authenticate to your instance.
Once authenticated to the respective model-driven app, all the Web API requests sent to Dataverse will be shown as below.
I have the following plugin libraries registered.
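For context, the kind of code we will be stepping through is an ordinary IPlugin implementation. The class name and trace message below are illustrative, not the actual registered assembly from this post:

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Illustrative post-operation plugin registered on update of Account.
public class AccountPostUpdatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider
            .GetService(typeof(ITracingService));

        if (context.MessageName == "Update" &&
            context.InputParameters["Target"] is Entity target)
        {
            // A breakpoint placed here is hit when Dataverse Browser
            // runs the registered step locally.
            tracing.Trace("Post-operation update on {0}", target.LogicalName);
        }
    }
}
```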
The next step is to choose the instance and perform the operation that triggers the plugin. So here, I will perform an update to the Account entity from the Dataverse Browser, which triggers the plugin.
Once an update is performed, a Web API request gets recorded in the Dataverse browser as highlighted below.
Since the plugin is registered in the post-operation stage, the stage number is 40.
Just expand the Patch request; you should see operations at stages 30 and 40, but the area of interest here is the plugin registered on stage 40.
Make sure you open the Visual Studio and perform the below steps from Dataverse Browser.
Attach the debugger from Dataverse Browser by clicking on the plug symbol as below, which will show the list of debugger options available for you to select from. Here I have selected Execute plugins, so the plugin will be invoked. You can select any of the three options presented below.
1. Do not execute plugins – recommended when you want to debug without actually triggering your plugin logic; with this approach you can even step through the code in a Production environment.
2. Execute plugins / Execute plugins with auto break – recommended when you want to debug by triggering your actual plugin, particularly when your plugin code has changed recently; best suited to Development environments.
Just select the Ecellors Demo – Microsoft Visual Studio: Visual Studio Professional 2022 entry, which will attach to the existing Visual Studio 2022 instance as below in break mode. Next, click Continue as highlighted below or press F5 on your keyboard.
When you navigate back to Dataverse Browser, a prompt asking you to place your breakpoints shows that the debugger has been attached.
Now just place breakpoints in your code in Visual Studio, then go back to Dataverse Browser and click OK on the dialog box.
Perform the operation that triggers the plugin from Dataverse Browser itself; this will hit the breakpoint in Visual Studio, from where you can debug your plugin.
As you might have observed, your code does not need to throw an exception in order to be debugged; you can work much the way you would when debugging with the Profiler, except that you don't need to deploy the latest code to Dataverse just for debugging purposes.
This gives a lot more flexibility and eases the way you debug plugins.
Limitations:
There is no support for transactions.
When plugins are triggered because of a server-side operation, they will not be run locally.
For many reasons, the behavior will never be perfectly identical to plugins executing on the server side.
Happy debugging, I hope you found this post useful…
Did you know that you can connect to your Dataverse database right from your old toolbox, SSMS? An Express edition is more than enough to try this out. We possibly never thought of it, but yes, we can… so let's see that in this blog post.
Open SSMS..
1. Select the Server type as Database Engine.
2. Use the environment URL from your Power Platform admin center as the Server name, as below.
3. Key in those details as below, making sure to select the Authentication method Azure Active Directory – Universal with MFA.
Once you click on Connect, you will be prompted for authentication via browser.
Once your sign-in is successful, you will be able to see your Dataverse database.
That’s it, how simple it was connecting to your Dataverse instances…
Having said that it's easy to connect to Dataverse, not all operations performed with normal Transact-SQL are supported by Dataverse SQL. You can see that it says Read-Only beside the instance name, which means you don't have any capability to modify data from SQL. That's because Dataverse SQL is a subset of Transact-SQL. If you want to see which statements are supported and which are not, just go to this link to find out.
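For instance, read-only queries like the following sketch work against the standard account table (column availability may vary by environment):

```sql
-- Latest ten accounts, newest first
SELECT TOP 10 name, accountid, createdon
FROM account
ORDER BY createdon DESC;
```

An INSERT, UPDATE, or DELETE against the same table would be rejected, consistent with the Read-Only connection.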
This opens a whole lot of opportunities to explore, so don’t forget to check this out.
In my previous blog post, I explained how you can utilize Power Platform pipelines, the out-of-the-box product capability.
Power Platform has many ways to deploy our solutions to higher environments… in this blog post, we will see how we can utilize the Power Platform CLI to deploy our solutions.
Prerequisites: Power Platform CLI
If you don't have it installed on your machine yet, you can download the Power Platform CLI from this link in either of the ways below.
Once you have it installed, make sure you set your environment variable on your machine as below.
Then you can use your favorite IDE or command line. I personally recommend Visual Studio Code because of the flexibility it offers and its ease of installation and use.
Export and import can be done very easily with the CLI with a few commands once you are authenticated with your instance.
To authenticate with your instance, open a new terminal in Visual Studio Code.
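The command from the original post was embedded as a gist; it is along these lines, with the URL being a placeholder for your own environment (this opens a browser prompt to sign in):

```shell
pac auth create --url https://yourorg.crm.dynamics.com
```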
Once set up correctly, it will show that it is connected.
Now, in order to export your solution, use the below command from VS Code.
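The original command was embedded as a gist; a sketch of the export, with the solution name and output path as placeholders, looks like this (drop --managed for an unmanaged export):

```shell
pac solution export --name MySolution --path .\MySolution.zip --managed
```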
You should see that a solution zip file gets created with the name specified in the command…
Similarly, you can import solutions using the CLI…
Here I have a solution named ecellorstest in the same folder on my machine…
Let's try importing using the CLI. In order to import your solution, use the below command from VS Code…
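The original command was embedded as a gist; importing the ecellorstest solution from the current folder is along these lines:

```shell
pac solution import --path .\ecellorstest.zip
```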
Whether you have noticed this or not, it's real. Dynamics 365 CE's existing table types now have a new companion called Elastic, though it is yet to be announced.
However, let's take a quick look at the table types that show up when you try to create a new table in Dataverse.
While everyone is aware of the Standard, Activity, and Virtual types in model-driven apps, Elastic tables are new tables that have come into Dataverse, and they will probably be announced at the upcoming Microsoft Build 2023.
From my view, Elastic tables are:
1. Built on a concept similar to elastic queries in Azure, which are usually meant for data archiving needs.
2. You can scale out queries to large data tiers and visualize the results in business intelligence (BI) reports.
3. A way to provide full T-SQL database querying capability, as elastic queries do on Azure SQL, possibly in Dataverse as well.
I hope all the capabilities released with elastic queries in Azure SQL get released in Dataverse as well.