It’s been a while since I posted on Dynamics 365 plugins, so this blog post shares one small tip for connecting to your Dynamics 365 instance from the Plugin Registration Tool, whether you are using the standalone Plugin Registration Tool or the one bundled with XrmToolBox.
If you are looking to install the Plugin Registration Tool itself, check the post below; if you want to learn about all plugin-related issues at once, check the references at the bottom of this post. Otherwise, continue reading.
If you don’t know this tip, you will, at the very least, spend many minutes figuring out the error message you see in the Plugin Registration Tool.
This is applicable to tenants that have MFA enabled; even if you haven’t enabled it yourself, Microsoft enables it by default to enforce security.
As usual, you select:
Office 365
Enable "Display list of available organizations" and "Show Advanced"
Provide User Name, Password
Click on Login
In that case, you will be prompted with the following error:
Error : AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.QueryOAuthDiscoveryServer(Uri discoServer, ClientCredentials liveCreds, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean useGlobalDisco)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.QueryOnlineServerList(ObservableCollection`1 svrs, OrganizationDetailCollection col, ClientCredentials liveCreds, Uri trimToDiscoveryUri, Uri globalDiscoUriToUse)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.FindCrmOnlineDiscoveryServer(ClientCredentials liveCreds)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.ValidateServerConnection(CrmOrgByServer selectedOrg)
Error : {"error":"interaction_required","error_description":"AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z","error_codes":[50076],"timestamp":"2025-04-12 09:09:52Z","trace_id":"7a7cac23-056c-4e77-ba82-98d50c0b7001","correlation_id":"d8b32fe6-6197-4d9a-a460-3834c8dc292a","error_uri":"https://login.microsoftonline.com/error?code=50076","suberror":"basic_action"}: Unknown error
======================================================================================================================
Inner Exception Level 2 :
Source : Not Provided
Method : Not Provided
Date : 12/4/2025
Time : 5:09:52 pm
Error : {"error":"interaction_required","error_description":"AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z","error_codes":[50076],"timestamp":"2025-04-12 09:09:52Z","trace_id":"7a7cac23-056c-4e77-ba82-98d50c0b7001","correlation_id":"d8b32fe6-6197-4d9a-a460-3834c8dc292a","error_uri":"https://login.microsoftonline.com/error?code=50076","suberror":"basic_action"}: Unknown error
Stack Trace : Not Provided
======================================================================================================================
Based on the inner exception above, we can clearly see that multi-factor authentication is expected. So untick the Show Advanced checkbox; the tool then prompts for multi-factor authentication as shown below.
That’s it. By simply unchecking Show Advanced, you can overcome this error. How cool is that?
I have written a lot of articles about the Plugin Registration Tool; you can check them below.
🚀 You’re Invited to the Global AI Bootcamp 2025 – Singapore Edition!
Are you passionate about AI, Power Platform, and Microsoft technologies? Want to learn how AI is transforming businesses and industries? Then this event is for you!
🎯 What to Expect? ✅ Expert-led sessions on AI, Copilot, Power Platform, and more ✅ Hands-on workshops to apply AI in real-world scenarios ✅ Networking opportunities with industry leaders and AI enthusiasts ✅ Absolutely FREE to attend!
Here is how you can quickly call an action using the Web API. With this method you can execute a single action, function, or CRUD operation. In the example below, let’s see how you can call an action.
This example deals with an unbound action, which is not tied to any entity. For a bound action, you specify the entity name for the bound parameter instead of null. You also need to specify the metadata according to your action. Let’s understand its syntax first…
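The shape of such a call, sketched in client-side JavaScript, looks roughly like this (the action name new_SampleAction and its InputText parameter are hypothetical placeholders for your own action):

```javascript
// Hypothetical unbound action "new_SampleAction" with one string input parameter.
// The request object carries the parameter values plus a getMetadata() function
// that tells Xrm.WebApi how to serialize the call.
var sampleActionRequest = {
  InputText: "Hello from Web API", // hypothetical input parameter

  getMetadata: function () {
    return {
      boundParameter: null, // null => unbound action; use an entity name for bound actions
      parameterTypes: {
        InputText: { typeName: "Edm.String", structuralProperty: 1 } // 1 = primitive type
      },
      operationType: 0, // 0 = Action, 1 = Function, 2 = CRUD
      operationName: "new_SampleAction"
    };
  }
};

// Xrm is provided by the framework only inside a model-driven app.
if (typeof Xrm !== "undefined") {
  Xrm.WebApi.online.execute(sampleActionRequest).then(
    function (response) { console.log("Action succeeded:", response.ok); },
    function (error) { console.error(error.message); }
  );
}
```

For a bound action, you would set boundParameter to "entity" and also pass the target record in the request object; the metadata is what varies per action, so adjust it to match your own action's signature.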
Thank you for visiting my blog today. This post is mainly for pro developers. Encryption is crucial for maintaining the confidentiality of our sensitive information in this digital age, so here is a blog post about it. This is a continuation of my previous blog post on encrypting files using GnuPG.
In this blog post, I will show you a sample of how to encrypt/decrypt using GnuPG with command-line scripts from C# code.
If you didn’t go through my previous article, I strongly recommend reading it first (linked below) to understand the background.
Next, to encrypt/decrypt a given CSV file (chosen for simplicity), we can use the following C# code. For illustration purposes, I have provided the logic in the form of a console application.
Encryption:
string gpgPath = @"D:\Softwares\Kleo patra\GnuPG\bin\gpg.exe"; // Once GnuPG is installed, look for gpg.exe in the bin folder of the installation
string inputFile = "Input encrypted file"; // Replace with the location of your GPG-encrypted file
string outputFile = "Decrypted CSV file"; // Name and location for the decrypted output file; it doesn't need to exist yet
All you need to do is copy the code and replace the file locations. Sit back and enjoy encrypting and decrypting with GnuPG. I should say, once known, this is the easiest way to encrypt/decrypt from C# code, no strings attached.
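For reference, the gpg command lines that such C# code typically shells out to look roughly like this. This is a sketch: the file names and the recipient key demo@example.com are hypothetical placeholders, and the commands are echoed rather than executed so you can inspect the exact arguments before wiring them into Process.Start.

```shell
GPG="gpg"                       # path to the gpg binary (gpg.exe on Windows)
INPUT="data.csv"                # hypothetical file to encrypt
ENCRYPTED="data.csv.gpg"        # encrypted output file
DECRYPTED="data_decrypted.csv"  # decrypted output file
RECIPIENT="demo@example.com"    # hypothetical recipient key

# Encrypt: --batch and --yes suppress interactive prompts, which matters when launching from code
ENCRYPT_CMD="$GPG --batch --yes --recipient $RECIPIENT --output $ENCRYPTED --encrypt $INPUT"

# Decrypt: --passphrase-fd 0 reads the passphrase from stdin instead of prompting
DECRYPT_CMD="$GPG --batch --yes --passphrase-fd 0 --output $DECRYPTED --decrypt $ENCRYPTED"

echo "$ENCRYPT_CMD"
echo "$DECRYPT_CMD"
```

In the C# console application, these strings become the Arguments passed to the gpg.exe process.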
If you need any other information, please let me know in the comments.
This post is about Dataverse and Git integration, one of the most sought-after features in today’s automation era. This is a preview feature; you need to create a new environment with Early Access enabled to test it, or you can use an existing US Preview environment.
While every Model-Driven Application (MDA) and its components can be safely moved across environments using solutions and Azure DevOps pipelines, integrating Power Platform solutions with Azure DevOps previously meant manually exporting and downloading the solution every single time we wanted to commit the solution artifacts to an Azure DevOps repo.
With this new preview feature, we can integrate Power Platform solutions directly with Azure DevOps.
Let’s see this in action… wait a moment, there are some prerequisites to consider:
The environment should be a Managed Environment, and you need to be an admin for the environment
An Azure DevOps subscription and license should be available, along with permission to read source files and commits from the repo (you should be a member of the Contributors group in Azure DevOps)
The email address you use for Azure DevOps and for Power Platform should be the same
Setup:
Connecting Dataverse with Azure DevOps is easy but requires a bit of understanding of the available binding options.
There are two types of binding options:
Environment binding – a single root folder binds to all the unmanaged solutions in the environment
Solution binding – each solution uses a different root folder in Azure DevOps for binding
Note: once the binding is set up, there isn’t a way to change it, so set it up carefully; otherwise you may need to delete the folder and create a new one in Azure DevOps.
Let’s see them one by one… for demo purposes, I have created two projects in my Azure DevOps instance.
Solution binding: when we use this, all the components are available as pending changes.
Environment binding: when we use this, all the unmanaged solution components are mapped to one Azure DevOps root folder. Let’s set this up.
We are currently able to use only Solution binding, as Environment binding doesn’t show any changes to be committed, but there is a catch here.
We can set up Environment binding and verify whether the solution components get marked as pending changes. Do note that setting up the binding is a one-time activity for an environment; once set up, it can’t be changed from one type to another.
Since we are using Environment binding here, let’s select the Connection Type as Environment.
Then click on Connect. Once connected, you should see an alert message at the top of the Power Apps maker portal.
Now create a new solution, named ecellors Solution, as below
Verify the integration by clicking on Git Integration as below
It should show as below
Now let’s add a few components to the solution we created
Once added, let’s publish the unmanaged solution and verify it…
Look closely: you should see a Source Control icon, highlighted here in yellow for illustration.
Also, you should see a commit option available at the top
You should now be able to commit the solution components as if you are committing the code changes.
It also specifies the branch to which we are committing…
Unlike pushing code changes to Azure DevOps, pushing solution changes takes a few minutes, depending on the number of solution components you are pushing. Once it is done, it shows a commit message like the one below…
Now let’s verify our Azure DevOps repo. For this, go back to the main solutions page and click on Git Connection at the top…
After clicking on Git Connection, click on the link to Microsoft Azure DevOps as below
Then you should be navigated to Azure DevOps folder as below where all the solution files will be tracked component wise.
Now we will move back to Power Apps maker portal and make some changes to any of the components inside the solution…
Let’s say, I just edited the flow name and created a new connection reference, saved and published the customizations.
If you made changes at the Azure DevOps repo level, you can come back and click Check for updates; if there are any conflicts between changes made in Azure DevOps and a component in the solution, they will be shown as conflicts.
We now have 3 component changes, all listed here… you can click on Commit.
As soon as the changes are committed, you should see a message saying Commit Successful and 0 Changes, 0 Updates, 0 Conflicts.
You have now integrated Dataverse solution components with Azure DevOps, with no manual intervention required when deploying solutions using Azure DevOps pipelines.
Hope you learned something new today. The feature is still in preview and only available for early release, and a couple of issues still need to be fixed by Microsoft.
I have tested this feature by creating an environment in the US Preview region. It will add good value to projects using automation, and the solution repository can be further deployed to other environments using Azure DevOps pipelines.
This will be rolled out next year; hope you learned something new today…
This is an introductory post, but it’s worth going through, as I will be sharing the basics of using a Center of Excellence (CoE) in Power Platform. Let’s get started.
So, what’s a Center of Excellence? A CoE plays a key role in driving strategy and keeping up with innovation in this fast-paced world. First, we may need to ask ourselves a few questions: does your organization have a lot of flows, apps, and copilots (aka Power Virtual Agents)? Do you want to manage them effectively? Then the CoE Starter Kit is a great choice. It is absolutely free to download; the starter kit is a collection of components and tools that help you oversee and adopt Power Platform solutions. The assets in the CoE Starter Kit should be seen as templates from which you derive your individual solution, or as inspiration for implementing your own apps and flows.
There are some prerequisites before you can install the CoE Starter Kit; most medium-to-large enterprise Power Platform implementations should already have these in their tenant:
Microsoft Power Platform Service Admin, global tenant admin, or Dynamics 365 service admin role.
Dataverse is the foundation for the kit.
Power Apps Per User license (non-trial) and Microsoft 365 license.
Power Automate Per User license, or Per Flow licenses (non-trial).
The identity must have access to an Office 365 mailbox that has the REST API enabled, meeting the requirements of the Outlook connector.
Make sure you enable Power Apps code components in the Power Platform admin center.
If you want to track unique users and app launches, you need an Azure app registration with access to the Microsoft 365 audit log.
If you would like to share the reports in Power BI, you minimally require a Power BI Pro license.
Setting up communication groups so that admins, makers, and users can communicate.
Create two environments: one for test and one for production use of the Starter Kit
Install Creator Kit in your environment by downloading the components from here
The following connectors should be allowed so that data loss prevention (DLP) policies can be used effectively
Once you are done checking the requirements, you can download the starter kit here.
You can optionally install from App Source here or using Power Platform CLI here.
The kit provides some automation and tooling to help teams build monitoring and automation necessary to support a CoE.
We have now seen the advantages of having a CoE in your organization, along with the prerequisites. In the upcoming blog post, we will see how to install the CoE Starter Kit in your Power Platform tenant and set it up to plan your organization’s resources to the greatest advantage.
Well, this post is not related to Power Platform, but I want to bring it up here to highlight the significance of using NOLOCK in Power Platform implementations that use SQL Server.
Recently, during a deployment activity, we had an SSIS job writing a lot of data into SQL Server while, at the same time, we were trying to read data from the same table. I kept getting a never-ending "Executing query…" message. That’s when the arguments started, hence I would like to share the significance of NOLOCK.
The default behaviour in SQL Server is for every query to acquire its own shared lock prior to reading data from a given table. This ensures that you only read committed data. However, the NOLOCK table hint instructs the query optimizer to read a given table without obtaining an exclusive or shared lock. The benefits of querying data with the NOLOCK table hint are that it requires less memory and avoids blocking and deadlocks with other queries that may be reading the same data.
In SQL Server, the NOLOCK hint, also known as the READUNCOMMITTED isolation level, allows a SELECT statement to read data from a table without acquiring shared locks on the data. This means it can potentially read uncommitted changes made by other transactions, which can lead to what’s called dirty reads.
Here’s an example:
Let’s say you have a table named Employee with columns EmployeeID and EmployeeName.
Now, if two transactions are happening concurrently:
Transaction 1:
BEGIN TRANSACTION
UPDATE Employee
SET EmployeeName = 'David'
WHERE EmployeeID = 1;
Transaction 2:
SELECT EmployeeName
FROM Employee WITH (NOLOCK)
WHERE EmployeeID = 1;
If Transaction 2 uses WITH (NOLOCK) when reading the Employee table, it might read the uncommitted change made by Transaction 1 and retrieve 'David' as the EmployeeName for EmployeeID 1. However, if Transaction 1 rolled back the update, Transaction 2 would have obtained inaccurate or non-existent data, resulting in a dirty read.
Key takeaways about NOLOCK:
✅ Pros: Reduces memory use, avoids blocking, speeds up reads.
❌ Cons: May read uncommitted or inconsistent data.
Using NOLOCK can be helpful in scenarios where you prioritize read speed over strict consistency. So, in my case, as I just wanted to view the data, using NOLOCK was a good fit, since it avoids blocking the query. However, it’s essential to be cautious, since it can lead to inconsistent or inaccurate results, especially in critical transactional systems.
Other considerations like potential data inconsistencies, increased chance of reading uncommitted data, and potential performance implications should be weighed before using NOLOCK.
Conclusion:
There are benefits and drawbacks to the NOLOCK table hint, so it should not just be included in every T-SQL script without a clear understanding of what it does. Nevertheless, should you decide to use the NOLOCK table hint, it is recommended that you include the WITH keyword: using NOLOCK without WITH is deprecated. And always use a COMMIT at the end of the transaction.
In-app notifications are trending these days, and many customers are showing interest in implementing them for their businesses.
So, in this blog post, I am going to show you the easiest way to generate an in-app notification using XrmToolBox in a few clicks. Use the tool below to generate one.
So, let me walk you through step by step
Step 1: Open In App Notification Builder in XrmToolBox
Step 2: In-app notification is a setting enabled at the app level, meaning that if you have developed several model-driven apps, you can enable in-app notifications individually for each of them.
Step 3: In the snapshot above, select the app for which you want to enable in-app notifications. The red bubble beside an app indicates that in-app notifications are not enabled for it.
So, we need to enable it by clicking the red icon itself; you should then get the prompt shown below.
Step 5: Upon confirming the dialog box, in-app notifications will be enabled for that app, and the red button turns green, as below, indicating that in-app notifications are enabled.
Now that the In App notification is enabled in the App, we will proceed with the remaining setup.
Step 6: Give a meaningful title and body for your in-app notification. Also select the required toast type and specify the expiry duration and icon. Then click the Add icon and choose the action to be performed when the in-app notification is clicked.
Step 9: You can even choose the type of action to be performed…
For example, let’s choose to open as a dialog and show a list view
Your screen should look something like below
Step 10: Once done, click Create, and that’s it: you have now created an in-app notification. Now let’s test this with a user who has privileges to access the app.
If not, you will face this error..
Log in with the user account for which the in-app notification is triggered.
Hurray! That’s it. See how easy it was to create an in-app notification in a low-code manner.
You can even get the code behind this as well…
However, there are other ways to trigger in-app notifications from a pro-code angle; let’s discuss those as well.
In this case, you first need to manually turn the in-app notification feature on by going to the settings of the model-driven app, as below.
Notifications can be sent using the SendAppNotification message via the SDK.
You can trigger a similar notification either from client scripting or from the SDK, whichever is more convenient.
Client Scripting
'body': `In-App Notifications in Model-Driven Apps are messages or alerts designed to notify users of important events or actions within the app. These notifications appear directly inside the application, providing a seamless way to deliver information without relying on external methods such as emails.`,
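From client scripting, one way to create such a notification is to insert a row into the appnotification table with Xrm.WebApi. Here is a minimal sketch; the title text and the all-zero user GUID are placeholders you would replace with your own values:

```javascript
// Create an in-app notification by inserting a row into the appnotification table.
// The recipient GUID below is a placeholder; use a real systemuser id.
var notification = {
  "title": "Welcome",
  "body": "In-App Notifications in Model-Driven Apps are messages or alerts designed to notify users of important events or actions within the app.",
  "ownerid@odata.bind": "/systemusers(00000000-0000-0000-0000-000000000000)", // recipient user
  "icontype": 100000000, // Info
  "toasttype": 200000000 // Timed
};

// Xrm is available only inside a model-driven app.
if (typeof Xrm !== "undefined") {
  Xrm.WebApi.createRecord("appnotification", notification).then(
    function (result) { console.log("Notification created with id: " + result.id); },
    function (error) { console.error(error.message); }
  );
}
```

The icontype and toasttype option-set values match the ones used in the C# snippet further down (for example, 100000003 for Warning).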
["body"] = @"In-App Notifications in Model-Driven Apps are messages or alerts designed to notify users of important events or actions within the app. These notifications appear directly inside the application, providing a seamless way to deliver information without relying on external methods such as emails.",
["ownerid"] = new EntityReference("systemuser", new Guid("00000000-0000-0000-0000-000000000000")),
["icontype"] = new OptionSetValue(100000003), // Warning
["toasttype"] = new OptionSetValue(200000000), // Timed
This blog post is all about the latest features released in Power BI Desktop for DAX (Data Analysis Expressions) using the DAX query view.
Have you ever needed to document the DAX functions used in your Power BI report? Then use the DAX query view, which introduces new DAX functions to get metadata about your semantic model: the INFO DAX functions.
First, if you weren’t aware, DAX query view is a recent addition that lets you query the model, similar to how analysts and developers previously used Power BI Desktop or other third-party tools to get the same information. You can access DAX query view as shown below in green.
When you navigate to the DAX Query view, key points to note are as below
DAX Queries will be directly saved to your Model when saved from DAX Query View
DAX Query View will not be visible when the Power BI Report is published to the Power BI Service
The results of the DAX will be visible at the bottom of the page as shown below
IntelliSense is provided by default
There are 4 DAX INFO.VIEW functions, introduced below.
List all your measures using INFO.VIEW.MEASURES(). This lists all the measures in your semantic model, along with the expression used for each measure and the table on which it was created.
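In the DAX query view, this is a one-line query, since EVALUATE returns the function's result as a table; the other INFO.VIEW functions follow the same pattern:

```dax
-- Lists every measure in the semantic model, with its home table and expression
EVALUATE INFO.VIEW.MEASURES()

-- The remaining INFO.VIEW functions work the same way:
-- EVALUATE INFO.VIEW.TABLES()
-- EVALUATE INFO.VIEW.COLUMNS()
-- EVALUATE INFO.VIEW.RELATIONSHIPS()
```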
I selected the entire result set of the measures and copied the results you see in the table below.
Just go to Model View and click Enter Data
You will be shown a screen like this
Just press Ctrl + V, as you previously copied the table information
That’s it; see how easy it was to document all the measures. Similarly, you can document all the metadata available for the Power BI report.
That’s it for today, hope you learned a new feature in Power BI Desktop…
This is the second post in the blog series on canvas apps, where you can grow from zero to hero in Power Apps canvas apps. In this post, we will talk about different ways to get started with creating canvas apps.
Introduction
Power Apps Canvas Apps allow users to build powerful applications with a drag-and-drop interface, requiring little to no coding. Whether you’re a beginner or an experienced user, setting up your first Canvas App is a straightforward process. This guide walks you through each step.
Basic knowledge of what you want to build (e.g., a simple data entry form).
Step 1: Accessing Power Apps Studio
There are different ways you can create a canvas app:
1. You can create a canvas app by describing your requirement to Copilot, which will in turn build the canvas app for you.
2. You can design them using any of the existing templates available
3. You can also design your app using the Plan designer, the latest feature released and still in preview; for this you need to enable it.
For this, you need to have a plan available.
You can click on See more plans option available, create new plans if necessary
You have to state your business problem. This is pretty much the same as using Copilot in the old experience, but here you just describe the problem you are trying to solve by creating the app, that’s it.
I entered "Tracking Student Attendances" as my problem, and within a matter of a minute it designed the whole data model, which you can accept or propose changes to.
Once you accept, it will go ahead and start preparing the necessary data.
After you accept this, it will start designing the user experiences.
Once everything is done, it will ask you to save to a solution; this way you can save all your changes to a solution that can be safely moved across environments.
And that’s it, your fully functional app is ready in a few minutes.
Step 2: Designing Your App
Once inside the Power Apps Studio:
Drag and drop controls from the left-side panel to the canvas.
Add labels, text inputs, buttons, and galleries as needed.
Resize and align elements for a clean layout.
Below is the sample Power App screen in Studio containing the components.
Step 3: Connecting to a Data Source
Click on Data in the left panel.
Select Add data and choose a source like SharePoint, Dataverse, or Excel.
Connect your app to the data source.
Step 4: Adding Functionality with Formulas
Power Apps uses Excel-like formulas to add functionality. Example:
To navigate to another screen: Navigate(Screen2, ScreenTransition.Fade)
To filter data: Filter(Orders, Status="Pending")
Step 5: Previewing and Testing Your App
Click on the Play button in the top-right corner.
Test the app’s functionality.
Fix any layout or data issues as needed.
Step 6: Saving and Publishing
Click File > Save As.
Choose Cloud as the storage option.
Click Publish to make your app available.
Conclusion
Congratulations! You’ve built your first Canvas App. You can continue refining it by adding more features, integrating AI, or automating workflows.
Are you ready to explore more? Share your first Canvas App experience in the comments!
Do you want a step-by-step guided walkthrough? Then check out the App in a Day workshop from Microsoft, where you can start from scratch and build a fully functional canvas app.