After installing the prerequisites, I tried to connect to the Power Pages sites available in my organization from the VS Code terminal using the command below.
pac paportal list
That's when I encountered the error below.
It was then that I understood it was failing due to inactivity…
The Power Platform CLI connection was failing because of an expired refresh token and an ExternalTokenManagement authentication configuration issue. Here's how you can resolve it:
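The original resolution steps are best verified against your own CLI version, but a common remedy for an expired refresh token is to clear the cached auth profile and sign in again. A minimal Python sketch of the commands involved (the flag names follow the `pac auth` command group; confirm them with `pac auth help`):

```python
def pac_reauth_commands(env_url: str) -> list[list[str]]:
    """Commands to clear a stale auth profile and sign in again.

    env_url is your environment URL, e.g. "https://contoso.crm.dynamics.com".
    """
    return [
        ["pac", "auth", "clear"],                     # drop cached (expired) tokens
        ["pac", "auth", "create", "--url", env_url],  # interactive sign-in again
        ["pac", "paportal", "list"],                  # retry the original command
    ]

# To actually run them (requires the Power Platform CLI on PATH):
# import subprocess
# for cmd in pac_reauth_commands("https://contoso.crm.dynamics.com"):
#     subprocess.run(cmd, check=True)
```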
This is another post related to Plugins in Dynamics 365 CE.
Considering medium to large scale implementations, there is hardly a Power Platform project that doesn't require merging external assemblies.
We relied on ILMerge to merge those assemblies into a single DLL; we used to search for the ILMerge package on NuGet and install it for use.
The plugin assemblies were then signed, primarily for reasons related to security, assembly integrity, and versioning within the sandbox worker process.
But neither of the above is needed any longer thanks to the Dependent Assembly feature: with a few simple steps, you can build the plugin. Interesting, isn't it? Read on…
Prerequisites:
Download Visual Studio 2022 Community Edition here
Download and install NuGet Package Explorer from this link, then open it
Avoid Direct Plugin Project Creation in Visual Studio
Never create a plugin project directly from Visual Studio or any other IDE from here on.
Use Microsoft PowerApps CLI instead
Always use the Power Apps CLI, as it is easy and requires only a single command to create the entire plugin project scaffolding.
This ensures a standardized and reliable development environment.
It automatically creates a NuGet package file that will be used to avoid the 'Could not load assemblies or its dependencies' error.
Ok, let’s begin.
Once you have downloaded all the prerequisites mentioned, make sure you have installed them on your local machine. Most are straightforward to download; for NuGet Package Explorer, you need to search the Microsoft Store to install it.
Create a local folder for the Plugins
Navigate to that folder from VS Code
Now open the terminal and run the pac command as below:
Browse to the directory where you want to create the plugin project
Execute the command "pac plugin init" to create the plugin project
A plugin project will be created at your desired location, as follows
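The scaffolding step can also be scripted. A small Python sketch that builds (and optionally runs) the CLI call; the `--outputDirectory` flag is an assumption here, so check `pac plugin init help` on your version:

```python
import subprocess

def scaffold_plugin(output_dir: str, run: bool = False) -> list[str]:
    """Build the CLI call that scaffolds a Dataverse plugin project.

    pac plugin init generates the .csproj, a sample plugin class, and the
    packaging targets that produce a NuGet package on every build.
    """
    cmd = ["pac", "plugin", "init", "--outputDirectory", output_dir]
    if run:
        # Requires the Power Platform CLI on PATH
        subprocess.run(cmd, check=True)
    return cmd
```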
That's it, you can close VS Code for now.
Click on the .csproj file and open it in Visual Studio
By default, two files are created automatically when you create a plugin project, as shown above.
Now we will install Bouncy Castle, an external library: right-click on the plugin solution –> Manage NuGet Packages
I have added the Bouncy Castle NuGet package to my plugin project for encryption and decryption. You can add whichever NuGet package your scenario needs.
Build your project
After a successful build, you will get the output result as follows
Browse the directory of your project
Open the file Plugin_Project.1.0.0.nupkg in NuGet Package Explorer by double-clicking it
Now you can see that this NuGet package file contains the information for the Bouncy Castle package that we want to include in our plugin package, as follows. In your case, it will list whichever packages you added.
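Besides NuGet Package Explorer, you can verify the bundled dependencies programmatically: a .nupkg file is an ordinary zip archive. A short Python sketch (file names below are placeholders):

```python
import zipfile

def list_bundled_files(nupkg_path: str) -> list[str]:
    """List every file packed into a .nupkg (it is just a zip archive)."""
    with zipfile.ZipFile(nupkg_path) as pkg:
        return pkg.namelist()

def bundled_dlls(nupkg_path: str) -> list[str]:
    """Just the assemblies, e.g. your plugin DLL plus BouncyCastle."""
    return [f for f in list_bundled_files(nupkg_path) if f.lower().endswith(".dll")]

# Example: bundled_dlls("Plugin_Project.1.0.0.nupkg")
```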
Now open up plugin registration tool
Click to create new connection
Provide login details and login
Click to Register New Package
Browse to the directory where your nuget package file was created automatically when you build the project and import this file
Select the Common Data Service Default Solution and import it
Click on View and select Display by Package
Now your Plugin Project is successfully registered with all dependent assemblies and ready to use.
While this post gives you the structure for building a plugin assembly, you can add the business logic as per your need.
Conclusion:
In conclusion, navigating the intricacies of Microsoft Dynamics 365 CRM plugins demands a nuanced approach, especially when dealing with NuGet Packages and dependent assemblies. This article has delved into the critical process of resolving the persistent ‘Could not load assemblies or its dependencies‘ issue, offering a comprehensive, step-by-step demonstration.
By following the recommended best practices, such as avoiding direct plugin project creation in Visual Studio and using the Microsoft Power Apps CLI, developers can establish a standardized and reliable development environment. The CLI's automatic creation of a NuGet package file not only streamlines the process but also reduces errors.
To further facilitate your journey, prerequisites such as downloading and installing essential tools like the Plugin Registration tool, Microsoft PowerApps CLI, and NuGet Package Explorer are highlighted. The guide emphasizes the significance of these tools in ensuring a smooth plugin development experience.
By adopting these practices and incorporating the suggested steps into your workflow, you not only troubleshoot existing issues but also fortify your understanding of the entire process. Take charge of your Dynamics 365 CRM plugin development, elevate your skills, and sidestep common pitfalls by mastering the art of handling NuGet Packages and dependencies seamlessly.
It's been a while since I posted on Dynamics 365 Plugins, so this blog post shares one small tip for connecting to your Dynamics 365 instance from the Plugin Registration Tool, whether you use the standalone tool or the Plugin Registration Tool inside XrmToolBox.
If you are looking to install the Plugin Registration Tool itself, you can check the post below; if you want to learn about all plugin-related issues at once, check the references at the bottom of this post. Otherwise, continue reading.
If you don't know this tip, you will at the very least spend many minutes figuring out the error message you see in the Plugin Registration Tool.
This is applicable for tenants that have MFA enabled; even if you haven't enabled it yourself, Microsoft enables it by default to enforce security.
As usual, you select:
Office 365
Enable 'Display list of available organizations' and 'Show Advanced'
Provide User Name, Password
Click on Login
In that case, you will be prompted with this error:
Error : AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.QueryOAuthDiscoveryServer(Uri discoServer, ClientCredentials liveCreds, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean useGlobalDisco)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.QueryOnlineServerList(ObservableCollection`1 svrs, OrganizationDetailCollection col, ClientCredentials liveCreds, Uri trimToDiscoveryUri, Uri globalDiscoUriToUse)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.FindCrmOnlineDiscoveryServer(ClientCredentials liveCreds)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.ValidateServerConnection(CrmOrgByServer selectedOrg)
Error : {"error":"interaction_required","error_description":"AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z","error_codes":[50076],"timestamp":"2025-04-12 09:09:52Z","trace_id":"7a7cac23-056c-4e77-ba82-98d50c0b7001","correlation_id":"d8b32fe6-6197-4d9a-a460-3834c8dc292a","error_uri":"https://login.microsoftonline.com/error?code=50076","suberror":"basic_action"}: Unknown error
======================================================================================================================
Inner Exception Level 2 :
Source : Not Provided
Method : Not Provided
Date : 12/4/2025
Time : 5:09:52 pm
Error : {"error":"interaction_required","error_description":"AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z","error_codes":[50076],"timestamp":"2025-04-12 09:09:52Z","trace_id":"7a7cac23-056c-4e77-ba82-98d50c0b7001","correlation_id":"d8b32fe6-6197-4d9a-a460-3834c8dc292a","error_uri":"https://login.microsoftonline.com/error?code=50076","suberror":"basic_action"}: Unknown error
Stack Trace : Not Provided
======================================================================================================================
From the inner exception above, we can clearly see that multi-factor authentication is required, so untick the Show Advanced checkbox; the tool then prompts for multi-factor authentication as shown below.
That's it: simply unchecking Show Advanced lets you get past this error. How cool is that?
I have written a lot of articles on the Plugin Registration Tool; you can check them below.
This post covers a pretty basic but important tip: the best way to get a flow's metadata.
Have you ever needed to get metadata about the flow you are running in? Then this post is for you.
Let's say a flow is running and you want to report on when it has been running, its GUID, and so on; add the below action to get the details in a Compose step…
Run the flow to see the details retrieved…
If you carefully review the parameters returned..
Have you noticed the Visualize this view button in the app bar of any grid view in Dynamics 365?
Here is a dashboard built within a couple of minutes. This can greatly help end users visualize the data present in the system, so in this post let's understand this capability in a bit more detail, along with some of the features it leaves behind.
Let's understand how this is generated, its capabilities, and its disadvantages compared to a traditional Power BI dashboard, from both a developer and an end-user perspective. Please note that this is my understanding.
For Developers:
a. Visualize this view uses a PCF control that calls the Power BI REST API to generate an embed token, embedding the report into an iframe.
b. It then uses the Power BI JavaScript API to handle user interactions with the embedded report, such as filtering or highlighting data points.
c. When Power BI first generates your report, it looks through your data to identify patterns and distributions and picks a couple of fields as starting points for the initial set of visuals when no data is preselected.
d. Any change to the data fields calls the updateView method of the PCF control, passing the updated fields to the REST API, which then refreshes the visuals.
e. Visuals are created from both the selected fields and the non-selected fields related to them in the data pane.
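As a rough illustration of step (a): embed tokens come from the Power BI REST API's GenerateToken endpoint. The sketch below only describes the request shape; the IDs and token are placeholders, and the actual control acquires its Azure AD token internally:

```python
def embed_token_request(group_id: str, report_id: str, aad_token: str) -> dict:
    """Describe the REST call that exchanges an AAD token for an embed token.

    Endpoint shape follows the Power BI REST API:
    POST .../groups/{groupId}/reports/{reportId}/GenerateToken
    """
    return {
        "method": "POST",
        "url": ("https://api.powerbi.com/v1.0/myorg/groups/"
                f"{group_id}/reports/{report_id}/GenerateToken"),
        "headers": {
            "Authorization": f"Bearer {aad_token}",
            "Content-Type": "application/json",
        },
        "json": {"accessLevel": "View"},  # read-only embed token
    }
```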
For End Users & Developers:
Advantages:
Visuals are generated when no data is selected
Cross Highlighting is possible
Click on the Report to see Personalize this visual option
People with Contributor, Member, or Admin role assigned can save the Report to workspace
Users with no access to Power BI can't view this feature; they can request a free Power BI license
Free-license users can save the report to their personal workspace
Users get Build permission when a role of Contributor or above is assigned, and Reshare permission is also given
The report will be saved as DirectQuery with SSO enabled and honours Dataverse settings
Show data table presents a list of tables if the model comprises multiple tables.
You can specify the aggregation for each field in the model.
Disadvantages:
You can only export summarized data from visuals; full data can be exported from the data table.
Only visual-level filters are available; there are no page-level or report-level filters.
During report creation, the model is configured to use DirectQuery with single sign-on.
Embedding a report on a Dataverse form requires modifying the solution XML.
Reports published to the workspace are available to download, but downloaded reports cannot be customized further in Power BI Desktop, as they are built using native queries.
If the page is kept idle for a long time or the user navigates to another browser window, the session and the report are lost.
Considerations & Limitations:
Power BI Pro license is required to create these reports
While this is wonderful for end users to visualize data, it is not an alternative to building reports with Power BI Desktop.
Looking to improve your experience developing with Power Fx? Then try this feature, which was in preview at the time of writing. I want to show you how you can execute Power Fx expressions directly; it is ideal for prototyping, debugging, and testing to see how a Power Fx expression behaves. If you are new to Power Fx, you can check the introductory post I wrote back in 2021.
All you have to do is execute a few commands from VS Code; they run directly against your Dataverse environment.
Install PowerShell Module for VS Code (install from here)
Install Power Platform Tools for VS Code (install from here)
Once you have everything installed you're good to go; a few more steps are required to set up VS Code to run successfully.
As you have already installed the Power Platform Tools extension above, you should see an icon in the side bar, as highlighted below.
Create an authentication profile for your target environment by clicking the plus symbol beside AUTH PROFILES; I have already created a few earlier.
Provide your login credentials using the login prompt.
Once authenticated, you should be able to see all your environments in your tenant like below.
Open a terminal in VS Code
You should see something like below.
Now you're all set to run Power Fx commands targeting your environment, so let's try it out. To interact with Dataverse, use the commands below, reducing the time and complexity of your Dataverse operations with Power Fx.
1: pac org who: displays information about your current Dataverse organization
You may know the Read-Eval-Print Loop (REPL) from other languages, mainly Python; Power Fx now includes one too. To start using it, enter the command below in your VS Code terminal; it should show something like below, and once connected you can execute Dataverse commands.
By the way, we need to use plural table names below.
pac power-fx repl command:
a. Add Rows: Use Collect (Contacts, { firstname: “Pavan”, lastname: “Mani Deep”})
With this command, you can run a file of Power Fx instructions.
a. Create Dataverse records: By using same Collect Command we used above in a file.
Now execute the command
pac power-fx run --file <name of the file> -e true
b. Query a Dataverse table: Save the below command in a file located in the folder path.
Now execute the command
c. Filter a Dataverse table: while I used the Filter command, I was not able to filter the data; I was getting null instead. I hope Microsoft fixes this when these features come out of preview.
I hope this gives you an idea of how you can execute Power Fx commands within your favorite IDE (VS Code).
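For completeness, the `pac power-fx run` invocation shown above can be composed from a script. A minimal sketch; the `--echo` flag name is an assumption about what `-e` abbreviates in the preview CLI, so confirm it with `pac power-fx run help`:

```python
def powerfx_run_command(file_name: str, echo: bool = True) -> list[str]:
    """Argument list for running a file of Power Fx instructions via pac."""
    cmd = ["pac", "power-fx", "run", "--file", file_name]
    if echo:
        cmd += ["--echo", "true"]  # echo each instruction as it executes
    return cmd

# To run it (requires pac on PATH and an active auth profile):
# import subprocess
# subprocess.run(powerfx_run_command("create-contacts.powerfx"), check=True)
```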
🚀 You’re Invited to the Global AI Bootcamp 2025 – Singapore Edition!
Are you passionate about AI, Power Platform, and Microsoft technologies? Want to learn how AI is transforming businesses and industries? Then this event is for you!
🎯 What to Expect?
✅ Expert-led sessions on AI, Copilot, Power Platform, and more
✅ Hands-on workshops to apply AI in real-world scenarios
✅ Networking opportunities with industry leaders and AI enthusiasts
✅ Absolutely FREE to attend!
Here is how you can quickly call an action using the Web API; with this method you can execute a single action, function, or CRUD operation. In the example below, let's see how to call an action. Here is the function to achieve this…
This example deals with an unbound action, which is not tied to any entity; for a bound action, you specify the entity name for the bound parameter instead of null. You need to specify the metadata accordingly for your action. Let's understand its syntax first…
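Under the covers, an unbound action is a plain POST to the action's endpoint on the Dataverse Web API. This sketch only describes the request shape; the action name and parameters below are hypothetical placeholders:

```python
import json

def unbound_action_request(org_url: str, action_name: str, parameters: dict) -> dict:
    """Describe the raw Dataverse Web API call behind an unbound action.

    An unbound action is POSTed to /api/data/v9.2/<ActionName> with its
    input parameters as the JSON body; a bound action would instead be
    addressed through the owning entity's resource path.
    """
    return {
        "method": "POST",
        "url": f"{org_url}/api/data/v9.2/{action_name}",
        "headers": {
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
            "Content-Type": "application/json",
            # "Authorization": "Bearer <token>" acquired via Azure AD
        },
        "body": json.dumps(parameters),
    }

# Example with a hypothetical custom action:
# unbound_action_request("https://contoso.crm.dynamics.com",
#                        "new_SendReminder", {"Subject": "Follow up"})
```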
Thank you for visiting my blog today. This post is mainly for pro developers. Encryption is crucial for maintaining the confidentiality of our sensitive information in this digital age, so here is a blog about it. This is a continuation of my previous blog post on encrypting files using GnuPG.
In this blog post, I will show you a sample of how you can encrypt/decrypt using GnuPG with command-line scripts from C# code.
If you didn't go through my previous article, I strongly recommend reading it first to understand the background.
Next, to encrypt/decrypt a given CSV file (chosen for simplicity), we can use the following C# code. For illustration purposes, I have provided the logic in the form of a console application.
Encryption:
string gpgPath = @"D:\Softwares\Kleo patra\GnuPG\bin\gpg.exe"; // once GnuPG is installed, you can find gpg.exe in the bin folder of the installation
string inputFile = "Input encrypted file"; // replace with the location of your gpg-encrypted file
string outputFile = "Decrypted CSV file"; // give a name and location for the decrypted file; the output path doesn't need to exist yet
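The same command-line approach the C# code takes can be sketched in Python as well. This only illustrates the gpg invocation: the paths are placeholders, and the flags are standard GnuPG batch-mode options:

```python
def gpg_decrypt_command(gpg_path: str, input_file: str, output_file: str) -> list[str]:
    """GnuPG batch-mode decryption, mirroring the C# Process approach.

    --batch/--yes suppress interactive prompts so the call works from code;
    gpg asks for the key passphrase via its agent unless you supply one.
    """
    return [gpg_path, "--batch", "--yes",
            "--output", output_file, "--decrypt", input_file]

# To run it (requires GnuPG installed):
# import subprocess
# subprocess.run(gpg_decrypt_command(r"C:\Program Files\GnuPG\bin\gpg.exe",
#                                    "data.csv.gpg", "data.csv"), check=True)
```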
All you need to do is copy the code and replace the file locations. Sit back and enjoy encrypting and decrypting with GnuPG. I should say that once known, this is the easiest way to encrypt/decrypt from C# code, no strings attached.
If you need any other information, please do let me know in comments.
This post is a continuation of my previous post on the CoE Starter Kit. If you have just landed on this page, I suggest you go here and check out my blog post introducing the CoE Starter Kit first.
Important:
Do test each and every component before rolling it out to production: keep in mind that many of the flows can trigger emails to users, which may annoy them.
Install the components present in the extracted CoE Starter Kit folder into a dedicated environment, preferably a sandbox (not the default environment, so you can test changes before moving them to production), and make sure Dataverse is provisioned in that environment. First let's install the solutions; later we can proceed to customize them.
Install the CenterofExcellenceCoreComponents managed solution from your extracted folder. The exact version will differ over time; at the time of writing it was CenterofExcellenceCoreComponents_4.24_managed
Then proceed to click Import; we will configure the environment variables later whenever required. It takes a couple of seconds to process, then asks you to set the connections I talked about in the previous post: just create a new connection if one is not available and click Next. Make sure every connection shows a green checkbox, and you are good to click Next.
You will then be presented with a screen to input environment variables, as below; we will configure them later, so for now just proceed by clicking the Import button.
The import process may take around 15 minutes. Once imported, you should see a notification message on your screen, something like below.
Step 1:
You will now have a bunch of apps and flows installed in your environment. Configure the CoE settings by opening the Center of Excellence Setup and Upgrade Wizard from the installed Center of Excellence – Core Components managed solution.
It should look something like below when opened; you will be presented with some prerequisites.
Proceed with this step-by-step configuration; you don't need to change any of the settings, just proceed by clicking Next.
Step 2: In this step, you can configure different communication groups to coordinate by creating different personas
You can click on Configure group, choose the group from the dropdown, enter the details, and click Create a group.
Provide a group name and an email address without the domain in the next steps, then proceed to create the group; these are actually Microsoft 365 groups.
Once you have set it up, it should show…
This step is optional, but for efficient tracking and maximum benefit from the CoE, it is recommended.
Step 3: The tenant ID is populated automatically. Make sure to select No here (instead of Yes) if you are using a sandbox or production environment, configure your admin email, and click Next.
Step 4: Configure the inventory data source.
Tip: In case you were not able to see the entire content in the page, you can minimize the Copilot and press F11 so that entire text in the page would be visible to you.
This is required for the Power Platform admin connectors to crawl your tenant data and store it in Dataverse tables, similar to how search engines crawl the internet to serve search results. Since Data Export is still in preview, we proceed with cloud flows.
Click Next.
Step 5:
This step runs the setup flows: click Refresh to start the process. In the background, all the necessary admin flows will run. Refresh again after about 15 minutes to see all three admin flows running and collecting your tenant data, as below, then click Next.
Step 6:
In the next step, make sure you set all the inventory flows to On.
By the way, inventory flows are a set of flows that repeatedly gather information about your Power Platform tenant: all canvas apps, model-driven apps, Power Pages, cloud flows, desktop flows, Power Virtual Agents bots, connectors, solutions, and more.
To enable them, open the COE Admin Command Center App from Center of Excellence – Core Components Solution. Make sure you turn on all the flows available.
So, after turning on all the flows, come back to the Center of Excellence Setup Wizard; you should see a message like the one below saying all flows have been turned on.
Configuring dataflows is optional; as we haven't configured it earlier, this step is skipped.
Step 7: In the next step, all the apps that came with the Power Platform CoE Kit should be shared with the different personas based on your actual requirements.
Step 8:
This part of the wizard currently consists of a collection of links to resources, helping to configure and use the Power BI Dashboards included in the CoE.
Finish
Once you click Done, you will be presented with more features to setup.
These setups have a similar structure but vary a bit based on each feature's architecture.
We got started with the Starter Kit and set up its Core Components, the most important part; you can keep customizing further from here. In future posts, we will see how to set up the Center of Excellence – Governance Components and the Center of Excellence – Innovation Backlog; these components are required to finally set up the Power BI dashboard and use it effectively to plan your strategy.
Everyone who's ever installed or updated the CoE knows how time-consuming it can be: not just the setup procedure, but also the learning process, the evaluation, and finally the configuration and adoption of new features. It's definitely challenging to keep up with all this, especially since new features are delivered almost every month. My attempt here is to keep this guide concise while still helping you understand the process.
The setup wizard is a clear and handy resource for getting an overview of the CoE architecture and a great starting point for finding the documentation; it simplifies administration, operations, maintenance, and maybe even customizations.