Establishing tenant hygiene with the CoE Starter Kit – Learn COE #04

Hi Folks,

In this blog post, I am going to talk about establishing tenant hygiene using the CoE Starter Kit. In today’s world of ever-increasing Power Platform demand, organizations have matured to the point where every implementation now looks to establish some form of governance.

If you are someone who wants to learn about implementing governance, you are in the right place.

To implement governance efficiently, we need to understand the environment strategy your current implementation uses. If you are looking for guidance, the CoE Starter Kit and out-of-the-box capabilities provide tooling examples that help CoE teams effectively manage and optimize their Power Platform solutions.

Here are a few key steps to consider for maintaining tenant hygiene in your environment, so let’s get started…

  1. Define Environment Strategy
  • Assign your admins the Power Platform service admin or Dynamics 365 service admin role.
  • Restrict the creation of net-new trial and production environments to admins
  • Rename the default environment to ‘Personal Productivity’
  • Provision a new Production environment for non-personal apps/flows
  • Define and implement your DLP policies for your environments
  • When establishing a DLP strategy, you may need multiple environments for the same department
  • When establishing your Power Platform environment strategy, based upon your licensing, you may find that you need to provision environments without a Dataverse (previously called Common Data Service) database and also use DLP policies to restrict the use of premium connectors.
  • Establish a process for requesting access or creation of environments
  • Dev/Test/Production environments for specific business groups or application
  • Individual-use environments for Proof of Concepts and training workshops
  • Use a service account to deploy production solutions
  • Reduce the number of shared development environments
  • Share resources with Microsoft Entra Security Groups.
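A few of these steps can be scripted with the Power Platform CLI. Here is a minimal sketch, assuming you hold the Power Platform admin role; the environment name, type, and region are illustrative placeholders, and the exact flags may vary by CLI version (check `pac admin help`):

```shell
# Review the current environment landscape before defining the strategy
pac admin list

# Provision a new production environment for non-personal apps/flows
# (name and region below are illustrative placeholders)
pac admin create --name "Contoso Production" --type Production --region unitedstates
```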

2. Compliance and Adoption:

The Compliance page in the CoE Starter Kit’s Compliance and adoption dashboard can help you identify apps and flows with no owners, noncompliant apps, and suspended flows.

  • Rename and secure the default environment
  • Identify unused apps, apps pending suspension, suspended cloud flows, and apps or flows without an owner or not in solutions
  • Quarantine noncompliant apps and clean up orphaned resources
  • Enable Managed Environments and establish a data loss prevention policy
  • Apply cross tenant isolation
  • Assign Administrator roles appropriately
  • Apps and flows with duplicate names, or not compliant with DLP or billing policies
  • Apps shared with everyone or with more than 100 users, and apps not launched in the last month or quarter
  • Flows using plain-text passwords or HTTP actions
  • Cross-tenant connections
  • Environments with no apps or flows
  • Custom connectors using HTTP

3. Managing Dataverse for Teams environments

If you are not using Dataverse for Teams, you can safely skip this step; otherwise, please read on.

The Microsoft Teams environments page in the CoE Starter Kit’s dashboard provides you with an overview of your existing Teams environments, apps and flows in those environments, and the last launched date of apps.

Screenshot of a Microsoft Teams Environments overview.

By checking for new Dataverse for Teams environments daily, organizations can ensure they’re aware of all environments in use. 

State of the Dataverse for Teams environment and the corresponding Power Platform action:

  • 83 days after no user activity: Send a warning that the environment will be disabled. Update the environment state on the Environments list page and the Environment page.
  • 87 days after no user activity: Send a warning that the environment will be disabled. Update the inactive environment state on the Environments list page and the Environment page.
  • 90 days after no user activity: Disable the environment. Send a notice that the environment has been disabled. Update the disabled environment state on the Environments list page and the Environment page.
  • 113 days after no user activity: Send a warning that the environment will be deleted. Update the disabled environment state on the Environments list page and the Environment page.
  • 117 days after no user activity: Send a warning that the environment will be deleted. Update the disabled environment state on the Environments list page and the Environment page.
  • 120 days after no user activity: Delete the environment. Send a notice that the environment has been deleted.

Please note that a warning is displayed only if the Dataverse for Teams environment is within 7 days of disablement.
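The inactivity process above is automated by the kit’s flows, but here is a hedged sketch of inspecting and cleaning up environments manually with the Power Platform CLI (the GUID is a placeholder for the environment ID):

```shell
# List all environments; Dataverse for Teams environments appear alongside the rest
pac admin list

# Delete an environment that has passed the 120-day inactivity window
# (replace the GUID with the environment ID taken from the list above)
pac admin delete --environment 00000000-0000-0000-0000-000000000000
```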

4. Highly used apps

The Power BI dashboard available out of the box with the CoE Starter Kit provides guidance on your highest-performing apps as well as your most active users.

5. Communicating governance to your makers

This is one of the most important steps when setting up a CoE and governance guidelines; follow the approaches below:

  • Clearly communicate the purpose and benefits of governance policies: Explain how governance policies protect organizational data
  • Make governance policies and guidelines easily accessible: Place the policies and guidelines in a central location that is easily accessible to all makers
  • Provide training and support: Offer training sessions and resources to help makers understand and comply with governance policies.
  • Encourage open communication: Foster a culture where makers can ask questions and raise concerns about governance policies.
  • Incorporate governance into the development process: For example, you can require a compliance review before deploying a solution.

6. Administration of the platform

The Power Platform Administrator Planning Tool, which comes with the CoE Starter Kit, provides guidance and best practices for administration. The planning tool can also help you optimize environments, security, data loss prevention, monitoring, and reporting.

7. Securing the environments

It is critical to establish a Data Loss Prevention (DLP) strategy to control connector availability.

The DLP editor (impact analysis) tool is available for use before making changes to existing policies or creating new DLP policies. It reveals the impact of changes on existing apps and cloud flows and helps you make informed decisions.

Reference: CoE Starter Kit Documentation

If you face issues using the COE Starter Kit, you can always report them at https://aka.ms/coe-starter-kit-issues

Hope this helps someone maintaining tenant governance with the CoE Starter Kit… if you have any feedback or questions, do let me know in the comments…

Cheers,

PMDY

The refresh token has expired due to inactivity when connecting to Power Pages using Power Apps CLI – Quick Fix

Hi Folks,

This post is about a quick fix for an error that occurred with the Power Apps CLI.

I was trying to connect to my organization using CLI and that’s when I encountered this error.

Prerequisites:

Power Apps CLI, Visual Studio Code

After installing the prerequisites, I tried to connect to the Power Pages sites available in my organization from the VS Code terminal using the below command.

pac paportal list

It’s then that I encountered the below error.

It’s then that I understood it was failing due to inactivity…

Your Power Platform CLI connection is failing due to an expired refresh token and an ExternalTokenManagement Authentication configuration issue. Here’s how you can resolve it:

Fix:

Reauthenticate with Dataverse

pac auth clear
pac auth create --url https://orgXXX.crm8.dynamics.com --username admin@Ecellors.onmicrosoft.com --password [your password]

Creating a new authentication profile resolves this issue…

    Now try to run the above command.

    This should prompt a new login window to authenticate your request; provide the details and you should be able to log in.
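Putting the fix together, the full reset sequence looks like this; the URL is the placeholder from this post, and omitting the credentials triggers the interactive login prompt mentioned above:

```shell
# Remove the cached (expired) authentication profiles
pac auth clear

# Create a fresh profile; with no credentials supplied, a login window opens
pac auth create --url https://orgXXX.crm8.dynamics.com

# Confirm the active profile, then retry the Power Pages command
pac auth list
pac paportal list
```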

    Hope this helps..

    Cheers,

    PMDY

    Deploy dependent assemblies easily using PAC CLI

    Hi Folks,

    This is another post related to Plugins in Dynamics 365 CE.

    Considering medium- to large-scale implementations, there is hardly a Power Platform project that doesn’t require merging external assemblies.

    We relied on ILMerge to merge those assemblies into a single DLL: we would search for the ILMerge package on NuGet and install it for use.

    The plugin assemblies were then signed for several reasons, primarily related to security, assembly integrity, and versioning within the sandbox worker process.

    Neither of the above is needed any longer thanks to the dependent-assembly feature… with a few simple steps, you can build the plugin. Interesting, isn’t it? Read on…

    Prerequisites:

    • Download Visual Studio 2022 Community Edition here
    • Download VS Code from here
    • Download Plugin registration tool from here
    • Download PAC CLI from here
    • Download and install NuGet Package Explorer from the Microsoft Store

    Avoid Direct Plugin Project Creation in Visual Studio

    • From here on, never create a plugin project directly from Visual Studio or any other IDE.
    Use Microsoft Power Apps CLI instead
    • Always use the Power Apps CLI, as it is easy and only requires a single command to create the entire plugin project scaffolding
    • This ensures a standardized and reliable development environment.
    • It automatically creates a NuGet package file that helps avoid ‘Could not load assemblies or its dependencies‘.
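The scaffolding itself is only a couple of commands; the folder name below is illustrative:

```shell
# Create and enter a folder for the plugin project
mkdir ContosoPlugins && cd ContosoPlugins

# Scaffold the plugin project (generates the .csproj, a sample plugin class,
# and the NuGet packaging setup used for dependent assemblies)
pac plugin init

# Build; the output includes a .nupkg bundling the plugin and its dependencies
dotnet build
```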

    Ok, let’s begin.

    Once you have downloaded all the prerequisites mentioned, make sure you have installed them on your local machine. Most are straightforward to download; for NuGet Package Explorer, you need to search the Microsoft Store to install it.

    1. Create a local folder for the Plugins

    Navigate to that folder from VS Code

    Now open terminal, run the pac command as below

    Execute the following command to create plugin project 

    • Browse to the directory where you want to create the plugin project
    • Execute the command “pac plugin init” in the terminal to create the plugin project

    A plugin project will be created at your desired location as follows


    That’s it, you can close the VS Code for now.

    Click on the CS Proj file and open it in Visual Studio

    By default, 2 files are automatically created when you create a plugin project as shown above.

    Now we will install Bouncy Castle, which is an external library: right-click the plugin solution –> Manage NuGet Packages

    I have added Bouncy Castle NuGet Package to my plugin project for Encryption and Decryption. You can have your own required NuGet Package as per your need.

    Build your project

    After a successful build, you will get the output result as follows

    Browse the directory of your project

    Open the file Plugin_Project.1.0.0.nupkg in NuGet Package Explorer by double-clicking it

    Now you can see that this NuGet package file contains the information for the Bouncy Castle package that we want to include in our plugin project package. In your case, it will show whichever NuGet package you added.

    Now open up plugin registration tool

    Click to create new connection

    Provide login details and login

    Click to Register New Package

    Browse to the directory where your NuGet package file was created automatically when you built the project, and import this file

    Select the Common Data Service Default Solution and import it

    Click on view and Display by package

    Now your Plugin Project is successfully registered with all dependent assemblies and ready to use.

    While this post gives you a structure for how you can build a plugin assembly, you can add the business logic as per your need.

    Conclusion:

    In conclusion, navigating the intricacies of Microsoft Dynamics 365 CRM plugins demands a nuanced approach, especially when dealing with NuGet Packages and dependent assemblies. This article has delved into the critical process of resolving the persistent ‘Could not load assemblies or its dependencies‘ issue, offering a comprehensive, step-by-step demonstration.

    By following the recommended best practices, such as avoiding direct plugin project creation in Visual Studio and harnessing the Microsoft Power Apps CLI, developers can establish a standardized and reliable development environment. The CLI’s automatic creation of a NuGet package file not only streamlines the process but also reduces errors.

    To further facilitate your journey, prerequisites such as downloading and installing essential tools like the Plugin Registration tool, Microsoft PowerApps CLI, and NuGet Package Explorer are highlighted. The guide emphasizes the significance of these tools in ensuring a smooth plugin development experience.

    By adopting these practices and incorporating the suggested steps into your workflow, you not only troubleshoot existing issues but also fortify your understanding of the entire process. Take charge of your Dynamics 365 CRM plugin development, elevate your skills, and sidestep common pitfalls by mastering the art of handling NuGet Packages and dependencies seamlessly.

    References:

    Build and package plug-in code

    Cheers,

    PMDY 

    Fix Plugin Registration Tool Connection Issues with Multi-Factor Authentication (MFA) Enabled in Dynamics 365

    Hi Folks,

    It’s been a while since I posted on Dynamics 365 plugins, so this blog post shares a small tip for connecting to your Dynamics 365 instance from the Plugin Registration Tool, whether you are using the standalone Plugin Registration Tool or the one inside XrmToolBox.

    If you are looking to install the Plugin Registration Tool itself, check the post below; if you want to review all plugin-related issues at once, see the references at the bottom of this post. Otherwise, continue reading.

    If you don’t know this tip, you will likely spend many minutes puzzling over the error message you see in the Plugin Registration Tool.

    This applies to tenants that have MFA enabled; even if you haven’t enabled it yourself, Microsoft enables it by default to enforce security.

    As usual, you select:

    1. Office 365
    2. Enable Display list of available organizations, Show Advanced
    3. Provide User Name, Password
    4. Click on Login

    You will be prompted with this error in such a case:

    ======================================================================================================================
    Source : Microsoft.IdentityModel.Clients.ActiveDirectory
    Method : MoveNext
    Date : 12/4/2025
    Time : 5:09:52 pm
    Error : AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z
    Stack Trace : at Microsoft.Xrm.Tooling.Connector.CrmWebSvc.ProcessAdalExecption(Uri serviceUrl, ClientCredentials clientCredentials, X509Certificate2 userCert, UserIdentifier& user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean isOnPrem, String authority, Uri& targetServiceUrl, AuthenticationContext& authContext, String& resource, CrmLogEntry logSink, Boolean useDefaultCreds, String& authToken, AdalException adalEx)
    at Microsoft.Xrm.Tooling.Connector.CrmWebSvc.ExecuteAuthenticateServiceProcess(Uri serviceUrl, ClientCredentials clientCredentials, X509Certificate2 userCert, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean isOnPrem, String authority, Uri& targetServiceUrl, AuthenticationContext& authContext, String& resource, UserIdentifier& userIdent, CrmLogEntry logSink, Boolean useDefaultCreds, SecureString clientSecret)
    at Microsoft.Xrm.Tooling.Connector.CrmWebSvc.DiscoverGlobalOrganizations(Uri discoveryServiceUri, ClientCredentials clientCredentials, X509Certificate2 loginCertificate, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean isOnPrem, String authority, UserIdentifier& userOut, CrmLogEntry logSink, Boolean useGlobalDisco, Boolean useDefaultCreds)
    at Microsoft.Xrm.Tooling.Connector.CrmWebSvc.DiscoverOrganizations(Uri discoveryServiceUri, ClientCredentials clientCredentials, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean isOnPrem, String authority, UserIdentifier& userOut, CrmLogEntry logSink, Boolean useGlobalDisco, Boolean useDefaultCreds)
    at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.QueryOAuthDiscoveryServer(Uri discoServer, ClientCredentials liveCreds, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean useGlobalDisco)
    at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.QueryOnlineServerList(ObservableCollection`1 svrs, OrganizationDetailCollection col, ClientCredentials liveCreds, Uri trimToDiscoveryUri, Uri globalDiscoUriToUse)
    at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.FindCrmOnlineDiscoveryServer(ClientCredentials liveCreds)
    at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.ValidateServerConnection(CrmOrgByServer selectedOrg)
    ======================================================================================================================
    Inner Exception Level 1 :
    Source : Not Provided
    Method : Not Provided
    Date : 12/4/2025
    Time : 5:09:52 pm
    Error : Response status code does not indicate success: 400 (BadRequest).
    Stack Trace : Not Provided
    ======================================================================================================================
    Inner Exception Level 2 :
    Source : Not Provided
    Method : Not Provided
    Date : 12/4/2025
    Time : 5:09:52 pm
    Error : {"error":"interaction_required","error_description":"AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z","error_codes":[50076],"timestamp":"2025-04-12 09:09:52Z","trace_id":"7a7cac23-056c-4e77-ba82-98d50c0b7001","correlation_id":"d8b32fe6-6197-4d9a-a460-3834c8dc292a","error_uri":"https://login.microsoftonline.com/error?code=50076","suberror":"basic_action"}: Unknown error
    Stack Trace : Not Provided
    ======================================================================================================================

    Based on the above inner exception, we can clearly see that it requires multifactor authentication. Untick the Show Advanced checkbox; it will then prompt for multifactor authentication as shown below.

    That’s it; by simply unchecking Show Advanced, you can overcome this error. How cool is that?

    I have written a lot of articles about the Plugin Registration Tool; you can check them below.

    Issues related to Plugins and Plugin Registration Tool

    Hope this helps…

    Cheers,

    PMDY

    Visualize this view – what does this mean for developers and end users…?

    Hi Folks,

    Have you noticed the Visualize this view button in the app bar of any grid view in Dynamics 365?

    Here is a dashboard built within a couple of minutes. This can greatly help end users visualize the data present in the system. So, in this post, let’s understand this capability in a bit more detail, including some of the features it leaves behind.

    Let’s understand how this is generated, along with its capabilities and disadvantages compared to a traditional Power BI dashboard, from both a developer and an end-user perspective. Please note that this is my understanding…

    For Developers:

    a. Visualize this view uses a PCF control that calls the Power BI REST API and then generates the embed token for the report, embedding it into an iframe.

    b. It then uses the Power BI JavaScript API to handle user interactions with the embedded report, such as filtering or highlighting data points.

    c. When Power BI first generates your report, it takes a look through your data to identify patterns and distributions and pick a couple of fields to use as starting points for creating the initial set of visuals when data is not preselected.

    d. Any change to the data fields calls the updateView method of the PCF control, thereby passing the updated data fields to the REST API, which then displays the visuals.

    e. Visuals are created from both the selected fields and the non-selected fields that are related to the selected fields in the data pane.

    For End Users & Developers:

    Advantages:

    1. Visuals are generated when no data is selected
    2. Cross Highlighting is possible
    3. Click on the Report to see Personalize this visual option
    4. People with Contributor, Member, or Admin role assigned can save the Report to workspace
    5. Users with no access to Power BI can’t view this feature; they can request a free Power BI license
    6. Free-license users can save the report to their personal workspace
    7. Users get Build permission when any role above Contributor is assigned and Reshare permission is given
    8. The report is saved as DirectQuery with SSO enabled and honours Dataverse settings
    9. Show data table presents a list of tables if the model comprises multiple tables.
    10. You can specify the aggregation for each field in the model.

    Disadvantages:

    1. You can only export summarized data from visuals; tabular data can be exported from Show data table.
    2. Only visual-level filters are available; there are no page-level or report-level filters
    3. During report creation, the model is configured to use DirectQuery with single sign-on.
    4. Embedding a report on a Dataverse form requires modifying the XML of the solution
    5. Reports published to the workspace are available for download, but downloaded reports cannot be customized further in Power BI Desktop, as they are built using native queries.
    6. If the page is kept idle for a long time, or the user navigates to another browser window, the session and the report are lost.

    Considerations & Limitations:

    1. Power BI Pro license is required to create these reports
    2. While this is wonderful for end users to visualize data, it is not an alternative to building reports with Power BI Desktop.

    Hope this helps.

    Cheers,

    PMDY

    Using Power-Fx commands in the VS Code Terminal – Preview

    Hi Folks,

    Looking to improve your experience developing code using Power Fx? Then try this feature, which is in preview… I want to show you how you can execute Power Fx commands. It is quite handy for prototyping, debugging, and testing Power Fx expression behavior. If you are new to Power Fx, you can check my introductory post on it here, which I wrote back in 2021.

    All you need to do is execute a few commands from VS Code; they run directly against your Dataverse environment.

    Few prerequisites to follow:

    1. Install VS Code (install from here)
    2. Install Power Platform CLI (install from here)
    3. Install PowerShell Module for VS Code (install from here)
    4. Install Power Platform Tools for VS Code (install from here)
    5. Once you have everything installed you are good to go; a few more steps are required to set up VS Code to run successfully.

    As you have already installed the Power Platform Tools extension above, you should see an icon at the side bar as highlighted below.

    Create an authentication profile for your target environment by clicking the plus symbol beside AUTH PROFILES; I have already created a few earlier.

    Provide your login credentials using the login prompt.

    Once authenticated, you should be able to see all your environments in your tenant like below.

    Open a terminal in VS Code

    You should see something like below.

    Now you are all set to run Power-Fx commands targeting your environment, so let’s try it out. To interact with Dataverse, use the below commands, thereby reducing the time and complexity of your Dataverse operations with Power Fx.

    1. pac org who: Displays information about your current Dataverse organization.

    2. pac admin list: Lists all environments in your tenant.

    3. Read-Eval-Print Loop & Dataverse Commands

    We have heard about the Read-Eval-Print Loop while working in other languages, mainly Python; we now have it for Power Fx. To start using it, enter the below in your VS Code terminal. It should show something like below, and once connected you can execute Dataverse commands.

    By the way, we need to use plural table names below.

    pac power-fx repl command:

    a. Add rows: Use Collect(Contacts, { firstname: "Pavan", lastname: "Mani Deep" })

    b. Help: Use Help()

    c. Exit: Use Exit() to exit the REPL
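A sample REPL session might look like the following; only the first line runs in the shell, and the Power Fx expressions typed at the REPL prompt are shown as comments:

```shell
pac power-fx repl
# >> Collect(Contacts, { firstname: "Pavan", lastname: "Mani Deep" })
# >> Help()
# >> Exit()
```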

    4. Run Power-Fx Commands from a file: All you need to do is save the Power-Fx commands in a text file at the below location.

    pac power-fx run:

    With this command, you can run a file of Power Fx instructions.

    a. Create Dataverse records: By using same Collect Command we used above in a file.

    Now execute the command

    pac power-fx run --file <name of the file> -e true
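For example, assuming a file named create-contact.fx holding the same Collect expression used earlier (the file name is illustrative, and the -e true flag is used exactly as in this post):

```shell
# Save a Power Fx instruction to a file
printf 'Collect(Contacts, { firstname: "Pavan", lastname: "Mani Deep" })\n' > create-contact.fx

# Run the file against the connected environment
pac power-fx run --file create-contact.fx -e true
```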

    b. Query a Dataverse table: Save the below command in a file located in the folder path.

    Now execute the command

    c. Filter a Dataverse table: While I used the Filter command, I was not able to filter the data; I was getting null instead. I hope Microsoft will fix this when these features are out of preview.

    I hope this gives you an idea of how you can execute Power-Fx commands within your favorite IDE (VS Code).

    Cheers,

    PMDY

    Global AI Bootcamp 2025 – Singapore

    🚀 You’re Invited to the Global AI Bootcamp 2025 – Singapore Edition!

    Are you passionate about AI, Power Platform, and Microsoft technologies? Want to learn how AI is transforming businesses and industries? Then this event is for you!

    🎯 What to Expect?
    ✅ Expert-led sessions on AI, Copilot, Power Platform, and more
    ✅ Hands-on workshops to apply AI in real-world scenarios
    ✅ Networking opportunities with industry leaders and AI enthusiasts
    ✅ Absolutely FREE to attend!


    Event Date: Saturday, 8 March 2025

    10:00 am – 4:00 pm (GMT+8)

    Meeting Link

    This is your chance to upgrade your AI skills, connect with like-minded professionals, and explore the latest AI innovations.

    #GlobalAIBootcamp #AI #MicrosoftAI #PowerPlatform #SingaporeTech #FreeEvent

    Call Custom Actions in Dataverse using Web API – Quick Recap

    Hi Folks,

    Here is how you can quickly call an action using the Web API; with this method you can execute a single action, function, or CRUD operation. In the example below, let’s see how you can call an action. Here is the function to achieve this:

function validateNRIC(executionContext) {
    var formContext = executionContext.getFormContext();
    var message = "Please enter a valid NRIC Number";
    var uniqueId = "nric_valid";
    var NRIC = formContext.getAttribute("new_nric").getValue();
    if (NRIC !== null) {
        var execute_ValidateNRIC = {
            NRIC: NRIC, // Input parameter; we only reach here when NRIC is non-null
            getMetadata: function () {
                return {
                    boundParameter: null, // null = unbound action
                    parameterTypes: {
                        NRIC: { typeName: "Edm.String", structuralProperty: 1 }
                    },
                    operationType: 0, // 0 = action
                    operationName: "new_ValidateNRIC",
                    outputParameterTypes: {
                        IsValid: { typeName: "Edm.Boolean" }
                    }
                };
            }
        };
        Xrm.WebApi.execute(execute_ValidateNRIC).then(
            function success(response) {
                if (response.ok) {
                    response.json().then(function (data) {
                        if (!data.IsValid) {
                            formContext.getControl("new_nric").setNotification(message, uniqueId);
                        } else {
                            formContext.getControl("new_nric").clearNotification(uniqueId);
                        }
                    }).catch(function (error) {
                        Xrm.Navigation.openAlertDialog("Error occurred in Validate NRIC: " + error);
                    });
                }
            }
        ).catch(function (error) {
            Xrm.Navigation.openAlertDialog(error.message);
        });
    }
}

    This example deals with an unbound action, which is not bound to any table. In the case of a bound action, you specify the entity name for the bound parameter instead of null, and you need to specify the metadata accordingly for your action. Let’s understand its syntax first…

    Xrm.WebApi.online.execute(request).then(successCallback, errorCallback);

    Parameters

    Encrypting/Decrypting a file using Public & Private Key Pair with GnuPG

    Hi Folks,

    Thank you for visiting my blog today. This post is mainly for pro developers. Encryption is crucial for maintaining the confidentiality of our sensitive information in this digital age, so here is a blog post about it. This is a continuation of my previous blog post on encrypting files using GnuPG.

    In this blog post, I will give you a sample of how you can encrypt/decrypt using GnuPG with command-line calls from C# code.

    If you didn’t go through my previous article, I strongly recommend reading it first to understand the background.

    Next, to encrypt/decrypt a given CSV file (chosen for simplicity), we can use the following C# code. For illustration purposes, I have provided the logic in the form of a console application.
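Before the C# code, it helps to see the raw gpg round trip it wraps. Here is a self-contained sketch: the key email, file names, and empty passphrase are purely illustrative, and a throwaway key pair is generated in a temporary keyring so your real keyring is untouched:

```shell
# Work in an ephemeral GPG home so the real keyring is not modified
export GNUPGHOME="$(mktemp -d)"
printf 'id,name\n1,Pavan\n' > data.csv

# Generate a demo key pair non-interactively (no passphrase, demo only)
gpg --batch --passphrase '' --quick-generate-key demo@example.com default default never

# Encrypt with the recipient's public key
gpg --batch --yes --trust-model always -r demo@example.com -o data.csv.gpg -e data.csv

# Decrypt with the private key
gpg --batch --yes --pinentry-mode=loopback --passphrase '' -d -o decrypted.csv data.csv.gpg

diff data.csv decrypted.csv && echo "Round trip OK"
```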

    Encryption:

using System;
using System.Diagnostics;

namespace BatchEncryptionTest
{
    class Program
    {
        static void Main(string[] args)
        {
            string gpgPath = @"D:\Softwares\Kleo Patra\GnuPG\bin\gpg.exe"; // Where you installed GnuPG (gpg.exe lives in the bin folder)
            string inputFile = "location of input file";
            string outputFile = "location of output file";
            string recipient = "key ID or email of the public key to encrypt for";
            EncryptGPGFile(gpgPath, inputFile, outputFile, recipient);
        }

        static void EncryptGPGFile(string gpgPath, string inputFile, string outputFile, string recipient)
        {
            using (Process process = new Process())
            {
                // Encrypt the input file with the recipient's public key
                process.StartInfo.FileName = gpgPath;
                process.StartInfo.Arguments = $"--batch --yes --trust-model always -r \"{recipient}\" -o \"{outputFile}\" -e \"{inputFile}\"";
                process.StartInfo.UseShellExecute = false;
                process.StartInfo.RedirectStandardOutput = true;
                process.StartInfo.RedirectStandardError = true;
                process.StartInfo.RedirectStandardInput = true;
                process.StartInfo.CreateNoWindow = true;
                process.Start();
                string output = process.StandardOutput.ReadToEnd();
                string error = process.StandardError.ReadToEnd();
                process.WaitForExit();
                if (process.ExitCode == 0)
                {
                    Console.WriteLine("Encryption successful.");
                }
                else
                {
                    Console.WriteLine("Encryption failed. Error: " + error);
                }
            }
        }
    }
}

    Decryption:

using System;
using System.Diagnostics;

namespace BatchDecryptionTest
{
    class Program
    {
        static void Main(string[] args)
        {
            string gpgPath = @"D:\Softwares\Kleo patra\GnuPG\bin\gpg.exe"; // Once GnuPG is installed, look for gpg.exe in the bin folder of the installation
            string inputFile = "Input encrypted file"; // Replace with your GPG-encrypted file location
            string outputFile = "Decrypted CSV file"; // Path and name for the decrypted file; it doesn't need to exist yet
            string passphrase = "passPhrase"; // Passphrase protecting the private key
            DecryptGPGFile(gpgPath, inputFile, outputFile, passphrase);
        }

        static void DecryptGPGFile(string gpgPath, string inputFile, string outputFile, string passphrase)
        {
            using (Process process = new Process())
            {
                // Pass the passphrase plus the input and output file paths as arguments
                process.StartInfo.FileName = gpgPath;
                process.StartInfo.Arguments = $"--batch --yes --pinentry-mode=loopback --passphrase {passphrase} -d -o \"{outputFile}\" \"{inputFile}\"";
                process.StartInfo.UseShellExecute = false;
                process.StartInfo.RedirectStandardOutput = true;
                process.StartInfo.RedirectStandardError = true;
                process.StartInfo.RedirectStandardInput = true;
                process.StartInfo.CreateNoWindow = true;
                process.Start();
                string output = process.StandardOutput.ReadToEnd();
                string error = process.StandardError.ReadToEnd();
                process.WaitForExit();
                if (process.ExitCode == 0)
                {
                    Console.WriteLine("Decryption successful.");
                }
                else
                {
                    Console.WriteLine("Decryption failed. Error: " + error);
                }
            }
        }
    }
}

    All you need to do is replace the file locations in the code, then sit back and enjoy encrypting and decrypting with GnuPG. I should say that once known, this is the easiest way to encrypt/decrypt from C# code, no strings attached.

    If you need any other information, please do let me know in comments.

    Cheers,

    PMDY