Want to check if you have added metadata to an entity in Power Apps? – Table Segmentation properties

Hi Folks,

After a break, I am back with my next blog post; this is a very short one.

When working on an implementation, you will often add entity assets to a solution. Many people miss adding metadata for the entity, and since they don't have an easy way to check, they end up removing and re-adding the entity with the metadata toggle turned on.

But don’t worry, here is a simple way to check this…

Let’s say you have added a table to the solution like below

Now you want to add the metadata for this; click on the table name below

Click on the ellipsis (…)

Choose table segmentation as shown above

So as highlighted above, you can include all the objects or include table metadata.
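Under the hood, this choice is stored on the entity's RootComponent entry in the solution's solution.xml. A reference sketch (the schemaName here is just an example; per the solution file schema, behavior 0 includes all subcomponents, 1 includes no subcomponents, and 2 includes entity metadata only):

<!-- behavior="2" means only the table metadata is included, no subcomponents -->
<RootComponent type="1" schemaName="account" behavior="2" />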

Hope this small tip helps…so even if you miss adding metadata, you can safely add it later at any point in time.

Cheers,

PMDY

Building a Cloud-Native Power Apps ALM Pipeline with GitLab, Google Cloud Build and PAC CLI: Streamlining Solution Lifecycle Automation

A unique combination to achieve deployment automation of Power Platform Solutions

Hi Folks,

This post is about ALM in Power Platform integrating with a different ecosystem than usual, i.e. Google Cloud. Sounds interesting? This approach is mainly intended for folks using Google Cloud or GitLab as part of their implementation.

Integrating Google Cloud Build with Power Platform for ALM (Application Lifecycle Management) using GitLab is feasible and beneficial. This integration combines GitLab as a unified DevOps platform with Google Cloud Build for executing CI/CD pipelines, enabling automated build, test, export, and deployment of Power Platform solutions efficiently. This was the core idea for my session on Friday, 28 November, at the New Zealand Business Applications Summit 2025.

Detailed Steps for this implementation

Create an access token in GitLab for API Access and Read Access

Click on Add new token. While working with CI/CD using GitLab, select at minimum the scopes that grant API and repository read access (for example, api and read_repository).

Create a host connection for the repository in GitLab

Specify the personal access token created in the previous step

Link your repository

The host connections created in the previous step will be shown in the Connections drop-down

Create Trigger in Google Cloud Build

Click on Create trigger above, provide a name, and select the nearest region

Event:

For now, I am choosing Manual invocation for illustration

Specify the name of the repository in GitLab where your YAML file resides

You can optionally specify substitution variables, which are simply parameters you can pass to your pipeline from the Google Cloud Build configuration; see the example below
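For reference, substitution variables can also be supplied when starting a build from the command line. A minimal sketch, assuming the config file is named GitLabDemo.yaml and uses the substitution names defined in the pipeline shown later in this post:

gcloud builds submit --no-source \
  --config=GitLabDemo.yaml \
  --substitutions=_SOLUTION_NAME="PluginsForALM_GitLab",_SOURCE_ENV_URL="https://org.crm.dynamics.com"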

You can optionally require an approval before the build runs, and choose the service account tagged to your Google account in the drop-down.

Click on Save.

Next, proceed to the Cloud Build YAML stored in GitLab

You can find the full code below

steps:
  - id: "export_managed"
    name: "mcr.microsoft.com/dotnet/sdk:9.0"
    entrypoint: "bash"
    args:
      - "-c"
      - |
        echo "=== 🏁 Starting Export Process ==="
        # ✅ Define solution name from substitution variable
        SOLUTION_NAME="${_SOLUTION_NAME}"
        # ✅ Install PAC CLI
        mkdir -p "$$HOME/.dotnet/tools"
        dotnet tool install --global Microsoft.PowerApps.CLI.Tool --version 1.48.2 || true
        # Add the dotnet global tools dir to the shell PATH for this step ($$ escapes a literal $ for Cloud Build)
        export PATH="$$PATH:$$HOME/.dotnet/tools"
        echo "=== 🔐 Authenticating to Power Platform Environment ==="
        pac auth create --name "manual" --url "https://ecellorsdev.crm8.dynamics.com" --tenant "XXXXX-XXXX-XXXXX-XXXXXX-XXXXX" --applicationId "XXXXXXXXXXXXXX" --clientSecret "XXXXXXXXXXXXXXXX"
        pac auth list
        echo "=== 📦 Exporting Solution: ${_SOLUTION_NAME} ==="
        pac solution export \
          --name "${_SOLUTION_NAME}" \
          --path "/tmp/${_SOLUTION_NAME}.zip" \
          --managed true \
          --environment "${_SOURCE_ENV_URL}"
        echo "=== ✅ Solution exported to /tmp/${_SOLUTION_NAME}.zip ==="
        echo "=== 🔐 Authenticating to Target Environment ==="
        pac auth create \
          --name "target" \
          --url "https://org94bd5a39.crm.dynamics.com" \
          --tenant "XXXXXXXXXXXXXXXXXXXXXXXX" \
          --applicationId "XXXX-XXXXX-XXXXX-XXXXXX" \
          --clientSecret "xxxxxxxxxxxxxxxxxxxx"
        echo "=== 📥 Importing Solution to Target Environment ==="
        pac solution import \
          --path "/tmp/${_SOLUTION_NAME}.zip" \
          --environment "${_TARGET_ENV_URL}" \
          --activate-plugins \
          --publish-changes
        echo "=== 🎉 Solution imported successfully! ==="
options:
  logging: CLOUD_LOGGING_ONLY
substitutions:
  _SOLUTION_NAME: "PluginsForALM_GitLab"
  _SOURCE_ENV_URL: "https://org.crm.dynamics.com"
  _TARGET_ENV_URL: "https://org.crm.dynamics.com"
  _TENANT_ID: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
  _CLIENT_ID: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
  _CLIENT_SECRET: "xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
  _SOLUTIONS_DIR: "/workspace/Plugins/08112025"

Solution from Source Environment

Now let’s run the trigger, which will export the solution from the source environment and import it into the target environment. We have a manual trigger here; you can also use an automatic trigger that fires whenever there is a commit to the repo in GitLab, etc., so pick whatever suits your needs best.

Solution imported to the target environment using Google Cloud Build

The table below illustrates key differences between Google Cloud Build and Azure DevOps.

| Aspect | Google Cloud Build | Azure DevOps Build Pipelines |
| --- | --- | --- |
| Pricing model | Pay-as-you-go with per-second billing | Per-minute billing with tiered pricing |
| Cost optimization | Sustained use discounts, preemptible VMs | Reserved capacity and enterprise agreements |
| Build environment | Serverless, container-native, managed by Google Cloud | Requires self-hosted or paid hosted agents |
| Free tier | Available with build minutes and credits | Available but more limited |
| Operational overhead | Low; no need to manage build agents | Higher; managing agents or paying for hosted agents |
| Ideal for | Variable, short, or containerized workloads | Large Microsoft-centric organizations |
| Integration cost impact | Tightly integrated with Google Cloud serverless infrastructure | Integrated with Microsoft ecosystem but may incur additional licensing costs |

Conclusion:

PAC CLI is a powerful command-line tool that automates authentication, environment, and solution management within Power Platform ALM, enabling consistent and repeatable deployment workflows. It integrates smoothly with DevOps tools like GitLab and Google Cloud Build, helping teams scale ALM practices efficiently while maintaining control and visibility over Power Platform environments. Just note, my intention was to showcase the power of PAC CLI with the wider ecosystem, not only with Microsoft.

Cheers,

PMDY

Power Platform Solution Blueprint Review – Quick Recap

The Solution blueprint review covers all required topics. The workshop can also be conducted remotely. When the workshop is done remotely, it is typical to divide the review into several sessions over several days.

The following sections cover the top-level topics of the Solution blueprint review and provide a sampling of the types of questions that are covered in each section.

Program strategy

Program strategy covers the process and structures that will guide the implementation. It also reviews the approach that will be used to capture, validate, and manage requirements, and the plan and schedule for creation and adoption of the solution.

This topic focuses on answering questions such as:

  • What are the goals of the implementation, and are they documented, well understood, and can they be measured?
  • What is the methodology being used to guide the implementation, and is it well understood by the entire implementation team?
  • What is the structure that is in place for the team that will conduct the implementation?
  • Are roles and responsibilities of all project roles documented and understood?
  • What is the process to manage scope and changes to scope, status, risks, and issues?
  • What is the plan and timeline for the implementation?
  • What is the approach to managing work within the plan?
  • What are the external dependencies and how are they considered in the project plan?
  • What are the timelines for planned rollout?
  • What is the approach to change management and adoption?
  • What is the process for gathering, validating, and approving requirements?
  • How and where will requirements be tracked and managed?
  • What is the approach for traceability between requirements and other aspects of the implementation (such as testing, training, and so on)?
  • What is the process for assessing fits and gaps?

Test strategy

Test strategy covers the various aspects of the implementation that deal with validating that the implemented solution works as defined and will meet the business need.

This topic focuses on answering questions such as:

  • What are the phases of testing and how do they build on each other to ensure validation of the solution?
  • Who is responsible for defining, building, implementing, and managing testing?
  • What is the plan to test performance?
  • What is the plan to test security?
  • What is the plan to test the cutover process?
  • Has a regression testing approach been planned that will allow for efficient uptake of updates?

Business process strategy

Business process strategy considers the underlying business processes (the functionality) that will be implemented on the Microsoft Dynamics 365 platform as part of the solution and how these processes will be used to drive the overall solution design.

This topic focuses on answering questions such as:

  • What are the top processes that are in scope for the implementation?
  • What is currently known about the general fit for the processes within the Dynamics 365 application set?
  • How are processes being managed within the implementation and how do they relate to subsequent areas of the solution such as user stories, requirements, test cases, and training?
  • Is the business process implementation schedule documented and understood?
  • Are requirements established for offline implementation of business processes?

Based on the processes that are in scope, the solution architect who is conducting the review might ask a series of feature-related questions to gauge complexity or understand potential risks or opportunities to optimize the solution based on the future product roadmap.

Application strategy

Application strategy considers the various apps, services, and platforms that will make up the overall solution.

This topic focuses on answering questions such as:

  • Which Dynamics 365 applications or services will be deployed as part of the solution?
  • Which Microsoft Azure capabilities or services will be deployed as part of the solution?
  • What new external application components or services will be deployed as part of the solution?
  • What legacy application components or services will be deployed as part of the solution?
  • What extensions to the Dynamics 365 applications and platform are planned?

Data strategy

Data strategy considers the design of the data within the solution and the design for how legacy data will be migrated to the solution.

This topic focuses on answering questions such as:

  • What are the plans for key data design issues like legal entity structure and data localization?
  • What is the scope and planned flow of key master data entities?
  • What is the scope and planned flow of key transactional data entities?
  • What is the scope of data migration?
  • What is the overall data migration strategy and approach?
  • What are the overall volumes of data to be managed within the solution?
  • What are the steps that will be taken to optimize data migration performance?

Integration strategy

Integration strategy considers the design of communication and connectivity between the various components of the solution. This strategy includes the application interfaces, middleware, and the processes that are required to manage the operation of the integrations.

This topic focuses on answering questions such as:

  • What is the scope of the integration design at an interface/interchange level?
  • What are the known non-functional requirements, like transaction volumes and connection modes, for each interface?
  • What are the design patterns that have been identified for use in implementing interfaces?
  • What are the design patterns that have been identified for managing integrations?
  • What middleware components are planned to be used within the solution?

Business intelligence strategy

Business intelligence strategy considers the design of the business intelligence features of the solution. This strategy includes traditional reporting and analytics. It includes the use of reporting and analytics features within the Dynamics 365 components and external components that will connect to Dynamics 365 data.

This topic focuses on answering questions such as:

  • What are the processes within the solution that depend on reporting and analytics capabilities?
  • What are the sources of data in the solution that will drive reporting and analytics?
  • What are the capabilities and constraints of these data sources?
  • What are the requirements for data movement across solution components to facilitate analytics and reporting?
  • What solution components have been identified to support reporting and analytics requirements?
  • What are the requirements to combine enterprise data from multiple systems/sources, and what does that strategy look like?

Security strategy

Security strategy considers the design of security within the Dynamics 365 components of the solution and the other Microsoft Azure and external solution components.

This topic focuses on answering questions such as:

  • What is the overall authentication strategy for the solution? Does it comply with the constraints of the Dynamics 365 platform?
  • What is the design of the tenant and directory structures within Azure?
  • Do unusual authentication needs exist, and what are the design patterns that will be used to solve them?
  • Do extraordinary encryption needs exist, and what are the design patterns that will be used to solve them?
  • Are data privacy or residency requirements established, and what are the design patterns that will be used to solve them?
  • Are extraordinary requirements established for row-level security, and what are the design patterns that will be used to solve them?
  • Are requirements in place for security validation or other compliance requirements, and what are the plans to address them?

Application lifecycle management strategy

Application lifecycle management (ALM) strategy considers those aspects of the solution that are related to how the solution is developed and how it will be maintained given that the Dynamics 365 apps are managed through continuous update.

This topic focuses on answering questions such as:

  • What is the preproduction environment strategy, and how does it support the implementation approach?
  • Does the environment strategy support the requirements of continuous update?
  • What plan for Azure DevOps will be used to support the implementation?
  • Does the implementation team understand the continuous update approach that is followed by Dynamics 365 and any other cloud services in the solution?
  • Does the planned ALM approach consider continuous update?
  • Who is responsible for managing the continuous update process?
  • Does the implementation team understand how continuous update will affect go-live events, and is a plan in place to optimize versions and updates to ensure supportability and stability during all phases?
  • Does the ALM approach include the management of configurations and extensions?

Environment and capacity strategy

Deployment architecture considers those aspects of the solution that are related to cloud infrastructure, environments, and the processes that are involved in operating the cloud solution.

This topic focuses on answering questions such as:

  • Has a determination been made about the number of production environments that will be deployed, and what are the factors that went into that decision?
  • What are the business continuance requirements for the solution, and do all solution components meet those requirements?
  • What are the master data and transactional processing volume requirements?
  • What locations will users access the solution from?
  • What are the network structures that are in place to provide connectivity to the solution?
  • Are requirements in place for mobile clients or the use of other specific client technologies?
  • Are the licensing requirements for the instances and supporting interfaces understood?

A solution blueprint is essential for an effective solution architecture; using the above guiding principles will help in this process.

Thank you for reading…

Hope this helps…

Cheers,

PMDY

Triggers not available in Custom Connectors – Quick Review

Power Platform folks rarely build new custom connectors in a project; most work with existing ones. It is often observed that triggers are missing from a custom connector; if so, below are the steps you can review…

1. Wrong Portal

If you’re building the connector in Power Apps, you won’t see trigger options. ✅ Fix: Use the Power Automate portal to define and test triggers. Only Power Automate supports trigger definitions for custom connectors.

2. Trigger Not Properly Defined

If your OpenAPI (Swagger) definition doesn’t include a valid x-ms-trigger, the trigger won’t appear.

Fix:

  • Make sure your OpenAPI definition includes a webhook or polling trigger.
  • Example: mark the trigger operation with "x-ms-trigger" ("batch" for a polling trigger, "single" for a webhook), as shown in the sketch below.
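For illustration, a minimal polling-trigger operation inside the Swagger paths section might look like this sketch (the path, summary, and operationId are hypothetical):

"/trigger/start": {
  "get": {
    "summary": "When a record is created",
    "operationId": "WhenRecordCreated",
    "x-ms-trigger": "batch",
    "responses": {
      "200": { "description": "OK" }
    }
  }
}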

3. Connector Not Refreshed

Sometimes, even after updating the connector, the UI doesn’t refresh.

Fix:

  • Delete and re-add the connector in your flow.
  • Or create a new connection in Power Automate to force a refresh.

4. Licensing or Environment Issues

If you’re in a restricted environment or missing permissions, triggers might not be available.

Fix:

  • Check if your environment allows custom connectors with triggers.
  • Ensure your user role has permission to create and use custom connectors.

5. Incorrect Host/Path in Swagger

If the host or path fields in your Swagger are misconfigured, the connector might fail silently.

Fix:

  • Ensure the host and path are correctly defined.
  • Avoid using just / as a path; use something like /trigger/start instead (see the sketch below).
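For example, the top-level Swagger fields might look like the sketch below (the host value is hypothetical):

"host": "api.contoso.com",
"basePath": "/",
"paths": {
  "/trigger/start": { ... }
}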

6. Incorrect Environment

Make sure you are in the right Power Platform environment; when juggling things around, we often mistakenly try to use connectors from the wrong environment. Do take note.

Finally, you will be able to see Triggers while creating custom connectors…

Hope reviewing these will help…

Cheers,

PMDY

Python + Dataverse Series – #02 – Use the Dataverse Web API using Python

Hi Folks,

This is in continuation of the previous blog post… if you haven’t gone through the earlier post on connecting to Dataverse using Python, please have a look here

Now, we will see how you can retrieve records from Dataverse via the Web API using Python…

  1. Follow the previous blog post for connecting to Dataverse using Python
  2. Once you get the access token (as acquired in that post), you can invoke the Dataverse Web API using the code below…
import pyodbc
import msal
import requests
import json

# Azure AD details
client_id = 'XXXX'
client_secret = 'XXXX'
tenant_id = 'XXXX'
authority = f'https://login.microsoftonline.com/{tenant_id}'
resource = 'https://XXXX.crm8.dynamics.com'

# SQL endpoint
sql_server = 'XXXX.crm8.dynamics.com'
database = 'XXXX'

# Get token with error handling
try:
    print(f"Attempting to authenticate with tenant: {tenant_id}")
    print(f"Authority URL: {authority}")
    app = msal.ConfidentialClientApplication(client_id, authority=authority, client_credential=client_secret)
    print("Acquiring token…")
    token_response = app.acquire_token_for_client(scopes=[f'{resource}/.default'])
    if 'error' in token_response:
        print(f"Token acquisition failed: {token_response['error']}")
        print(f"Error description: {token_response.get('error_description', 'No description available')}")
    else:
        access_token = token_response['access_token']
        print("Token acquired successfully and your token is: " + access_token)
        print(f"Token length: {len(access_token)} characters")
except ValueError as e:
    print(f"Configuration Error: {e}")
    print("\nPossible solutions:")
    print("1. Verify your tenant ID is correct")
    print("2. Check if the tenant exists and is active")
    print("3. Ensure you're using the right Azure cloud (commercial, government, etc.)")
except Exception as e:
    print(f"Unexpected error: {e}")

# Get 5 contacts from Dataverse using the Web API
try:
    print("Making Web API request to get contacts…")
    # Dataverse Web API endpoint for contacts
    web_api_url = f"{resource}/api/data/v9.2/contacts"
    # Set up headers with authorization token
    headers = {
        'Authorization': f'Bearer {access_token}',
        'OData-MaxVersion': '4.0',
        'OData-Version': '4.0',
        'Accept': 'application/json',
        'Content-Type': 'application/json'
    }
    # Add query parameters to get only 5 contacts with specific fields
    params = {
        '$top': 5,
        '$select': 'contactid,fullname,emailaddress1,telephone1,createdon'
    }
    # Make the GET request
    response = requests.get(web_api_url, headers=headers, params=params)
    if response.status_code == 200:
        print("Web API request successful!")
        contacts_data = response.json()
        print(f"\nFound {len(contacts_data['value'])} contacts:")
        print("-" * 80)
        for i, contact in enumerate(contacts_data['value'], 1):
            print(f"Contact {i}:")
            print(f"  ID: {contact.get('contactid', 'N/A')}")
            print(f"  Name: {contact.get('fullname', 'N/A')}")
            print(f"  Email: {contact.get('emailaddress1', 'N/A')}")
            print(f"  Phone: {contact.get('telephone1', 'N/A')}")
            print(f"  Created: {contact.get('createdon', 'N/A')}")
            print("-" * 40)
    else:
        print(f"Web API request failed with status code: {response.status_code}")
        print(f"Error details: {response.text}")
except requests.exceptions.RequestException as e:
    print(f"Request error: {e}")
except KeyError as e:
    print(f"Token not available: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")

You can use VS Code as your IDE: copy the above code into a Python file, then click Run Python File at the top of VS Code

So, once you get the access token, you can invoke the Web API using Python similar to how we did it using JavaScript…
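The same pattern works for writes. Here is a minimal sketch, reusing the resource, headers, and requests import from the script above, that creates a contact via a POST request (the field values are hypothetical):

new_contact = {
    'firstname': 'Ada',          # sample values for illustration only
    'lastname': 'Lovelace',
    'emailaddress1': 'ada@example.com'
}
response = requests.post(
    f"{resource}/api/data/v9.2/contacts",
    headers=headers,  # same headers as the GET request above
    json=new_contact
)
if response.status_code == 204:
    # Dataverse returns 204 No Content; the new record's URI is in this header
    print("Created:", response.headers.get('OData-EntityId'))
else:
    print(f"Create failed: {response.status_code} {response.text}")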

Please download the Python Jupyter Notebook if you want to work on this in VS Code

https://github.com/pavanmanideep/DataverseSDK_PythonSamples/blob/main/Python-Dataverse-WebAPI-GetData-02.ipynb

If you want to follow along in this series, please see below post

Hope this helps…

Cheers,

PMDY

Python + Dataverse Series – #01 – Console Application using Python

Hi Folks,

This series is for Pro Code Developers especially those working on Dataverse and want to know how to work with Dataverse and Python. I am starting this series as I see little to no content in this area.

So, in this post, we will first try to understand how to write a console application in Python utilizing the TDS (Tabular Data Stream) endpoint. There are many posts on the internet about connecting to Dataverse using Python, but they use more libraries and require a bit more code.

The posts below have hardcoded configurations as they are meant for initial trial purposes; going further, we will align with best practices.

import pyodbc
import msal

# Azure AD details (placeholders; never hard-code real secrets)
client_id = 'XXXX'
client_secret = 'XXXX'
tenant_id = 'XXXX'
authority = f'https://login.microsoftonline.com/{tenant_id}'
resource = 'https://ecellorsdev.crm8.dynamics.com'

# SQL endpoint
sql_server = 'ecellorsdev.crm8.dynamics.com'
database = 'ecellorsdev'

# Get token with error handling
try:
    print(f"Attempting to authenticate with tenant: {tenant_id}")
    print(f"Authority URL: {authority}")
    app = msal.ConfidentialClientApplication(client_id, authority=authority, client_credential=client_secret)
    print("Acquiring token…")
    token_response = app.acquire_token_for_client(scopes=[f'{resource}/.default'])
    if 'error' in token_response:
        print(f"Token acquisition failed: {token_response['error']}")
        print(f"Error description: {token_response.get('error_description', 'No description available')}")
    else:
        access_token = token_response['access_token']
        print("Token acquired successfully and your token is: " + access_token)
        print(f"Token length: {len(access_token)} characters")
except ValueError as e:
    print(f"Configuration Error: {e}")
    print("\nPossible solutions:")
    print("1. Verify your tenant ID is correct")
    print("2. Check if the tenant exists and is active")
    print("3. Ensure you're using the right Azure cloud (commercial, government, etc.)")
except Exception as e:
    print(f"Unexpected error: {e}")

The logic uses just two libraries in Python:

  1. pyodbc
  2. msal

The code handles all the errors for efficient tracking… Note that the snippet above only acquires the token; the sketch below shows how that token can be used with pyodbc to query the TDS endpoint.
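Here is a minimal sketch of that step, an addition for completeness (assuming ODBC Driver 18 for SQL Server is installed and the TDS endpoint is enabled for your environment; the TDS endpoint listens on port 5558, and the token goes into the driver's SQL_COPT_SS_ACCESS_TOKEN pre-connect attribute):

import struct

# Pack the token the way the SQL Server ODBC driver expects:
# UTF-16-LE bytes, prefixed with a 4-byte little-endian length
token_bytes = access_token.encode('utf-16-le')
token_struct = struct.pack(f'<I{len(token_bytes)}s', len(token_bytes), token_bytes)

SQL_COPT_SS_ACCESS_TOKEN = 1256  # driver-specific pre-connect attribute

# Dataverse TDS endpoint listens on port 5558
conn = pyodbc.connect(
    f'DRIVER={{ODBC Driver 18 for SQL Server}};SERVER={sql_server},5558;DATABASE={database}',
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct}
)
cursor = conn.cursor()
cursor.execute('SELECT TOP 5 fullname FROM contact')  # TDS queries use table logical names
for row in cursor.fetchall():
    print(row.fullname)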

You can easily work with Python using VS Code as below:

Hover over the Run option –> Click Start Debugging

You will be able to get the access token after invoking Dataverse.

Download the Python Jupyter Notebook if you want to work on this in VS Code.

https://github.com/pavanmanideep/DataverseSDK_PythonSamples/blob/main/Python-Dataverse-ConsoleApp-01.ipynb

Hope this post helps…

If you want to continue reading this series, please follow along

Cheers,

PMDY

Establishing tenant hygiene with the CoE Starter Kit – Learn COE #04

Hi Folks,

In this blog post, I am going to talk about establishing tenant hygiene using the CoE Starter Kit. In today’s world of ever-increasing Power Platform demand, organizations have matured to the point that every implementation now looks to establish some kind of governance.

If you are someone who wants to learn about implementing governance, you are in the right place.

To implement governance efficiently, we need to understand the environment strategy your current implementation has used. Of course, if you are looking for guidance, there are examples of tooling available in the CoE Starter Kit and out-of-the-box capabilities to help CoE teams effectively manage and optimize their Power Platform solutions.

Here are a few key steps to consider for maintaining this in your environment, so let’s get started…

  1. Define Environment Strategy
  • Assign your admins the Power Platform service admin or Dynamics 365 service admin role.
  • Restrict the creation of net-new trial and production environments to admins
  • Rename the default environment to ‘Personal Productivity’
  • Provision a new Production environment for non-personal apps/flows
  • Define and implement your DLP policies for your environments
  • When establishing a DLP strategy, you may need multiple environments for the same department
  • When establishing your Power Platform environment strategy, based upon your licensing, you may find that you need to provision environments without a Dataverse (previously called Common Data Service) database and also use DLP policies to restrict the use of premium connectors.
  • Establish a process for requesting access or creation of environments
  • Dev/Test/Production environments for specific business groups or application
  • Individual-use environments for Proof of Concepts and training workshops
  • Use a service account to deploy production solutions
  • Reduce the number of shared development environments
  • Share resources with Microsoft Entra Security Groups.

2. Compliance and Adoption:

The Compliance page in the CoE Starter Kit’s Compliance and adoption dashboard can help you identify apps and flows with no owners, noncompliant apps, and suspended flows.

  • Rename and secure the default environment
  • Identify unused apps, apps pending suspension, suspended cloud flows, and apps without an owner or not in solutions
  • Quarantine noncompliant apps and clean up orphaned resources
  • Enable Managed Environments and establish a data loss prevention policy
  • Apply cross-tenant isolation
  • Assign administrator roles appropriately
  • Apps and flows with duplicate names, or not compliant with DLP policies or billing policies
  • Apps shared with everyone, apps shared with more than 100 users, and apps not launched in the last month or the last quarter
  • Flows using plain-text passwords and flows using HTTP actions
  • Cross-tenant connections
  • Environments with no apps or flows
  • Custom connectors using HTTP endpoints

3. Managing Dataverse for Teams environments

If you are not using Dataverse for Teams, you can safely skip this step; otherwise, please review

The Microsoft Teams environments page in the CoE Starter Kit’s dashboard provides you with an overview of your existing Teams environments, the apps and flows in those environments, and the last-launched date of apps.


By checking for new Dataverse for Teams environments daily, organizations can ensure they’re aware of all environments in use. 

| State of Dataverse for Teams | Power Platform action |
| --- | --- |
| 83 days after no user activity | Send a warning that the environment will be disabled. Update the environment state on the Environments list page and the Environment page. |
| 87 days after no user activity | Send a warning that the environment will be disabled. Update the inactive environment state on the Environments list page and the Environment page. |
| 90 days after no user activity | Disable the environment. Send a notice that the environment has been disabled. Update the disabled environment state on the Environments list page and the Environment page. |
| 113 days after no user activity | Send a warning that the environment will be deleted. Update the disabled environment state on the Environments list page and the Environment page. |
| 117 days after no user activity | Send a warning that the environment will be deleted. Update the disabled environment state on the Environments list page and the Environment page. |
| 120 days after no user activity | Delete the environment. Send a notice that the environment has been deleted. |

Please note that a warning is displayed only when the Dataverse for Teams environment is within 7 days of disablement.

4. Highly used apps

The Power BI dashboard available out of the box with the CoE Starter Kit will provide you the necessary guidance on high-performing apps and also your most active users.

5. Communicating governance to your makers

This is one of the most important steps while setting up a CoE and governance guidelines; follow the approaches below

  • Clearly communicate the purpose and benefits of governance policies: explain how governance policies protect organizational data
  • Make governance policies and guidelines easily accessible: place the policies and guidelines in a central location that is easily accessible to all makers
  • Provide training and support: offer training sessions and resources to help makers understand and comply with governance policies.
  • Encourage open communication: foster a culture where makers can ask questions and raise concerns about governance policies.
  • Incorporate governance into the development process: for example, you can require a compliance review before deploying a solution.

6. Administration of the platform

The Power Platform Administrator Planning Tool, which comes with the CoE Starter Kit, provides guidance and best practices for administration. The planning tool can also help optimize environments, security, data loss prevention, monitoring, and reporting.

7. Securing the environments

It is critical to establish a Data Loss Prevention (DLP) strategy to control connector availability.

The DLP editor (impact analysis) tool is available for use before making changes to existing policies or creating new DLP policies. It reveals the impact of changes on existing apps and cloud flows and helps you make informed decisions.

Reference: COE Starter Kit Documentation

If you face issues using the COE Starter Kit, you can always report them at https://aka.ms/coe-starter-kit-issues

Hope this helps someone maintaining tenant governance with the CoE Starter Kit… if you have any feedback or questions, do let me know in the comments…

Cheers,

PMDY

The refresh token has expired due to inactivity when connecting to Power Pages using Power Apps CLI – Quick Fix

Hi Folks,

This post is about a quick fix for an error that occurred with the Power Apps CLI.

I was trying to connect to my organization using CLI and that’s when I encountered this error.

Prerequisites:

Power Apps CLI, Visual Studio Code

After installing the prerequisites, I was trying to connect to the Power Pages sites available in my organization from the VS Code terminal using the below command.

pac paportal list

That’s when I encountered the below error

That’s when I understood that it is failing due to inactivity…

Your Power Platform CLI connection is failing due to an expired refresh token and an ExternalTokenManagement Authentication configuration issue. Here’s how you can resolve it:

Fix:

Reauthenticate with Dataverse

pac auth clear
pac auth create --url https://orgXXX.crm8.dynamics.com --username admin@Ecellors.onmicrosoft.com --password [your password]

Creating a new authentication profile resolves this issue…
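To confirm the new profile is active before retrying, a quick check (assuming standard PAC CLI behavior, where the active profile is marked in the list):

pac auth list
pac auth select --index 1

The first command lists your authentication profiles; the second explicitly switches to the profile at index 1 if it isn’t already the active one.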

Now try to run the above command.

This should prompt a new login window to authenticate your request; provide the details and you should be able to log in.

Hope this helps..

Cheers,

PMDY

Deploy dependent assemblies easily using PAC CLI

Hi Folks,

This is another post related to Plugins in Dynamics 365 CE.

Considering medium to large-scale implementations, there isn’t a single Power Platform project that doesn’t require merging of external assemblies.

We relied on ILMerge to merge those assemblies into a single DLL. We used to search for ILMerge assemblies in NuGet and install them for use.

The plugins are then signed for several reasons, primarily related to security, assembly integrity, and versioning of the sandbox worker process.

But neither of the above is needed any longer, thanks to the Dependent Assembly feature…with a few simple steps, you can build the plugin…Interesting, isn’t it? Read on…

Prerequisites:

  • Download Visual Studio 2022 Community Edition here
  • Download VS Code from here
  • Download the Plugin Registration Tool from here
  • Download PAC CLI from here
  • Download and install NuGet Package Explorer from this link: NuGet Package Explorer

Avoid Direct Plugin Project Creation in Visual Studio

  • Never create a plugin project directly from Visual Studio or any other IDE hereafter.

Use the Microsoft Power Apps CLI instead

  • Always use the Power Apps CLI, as it is easy and only requires a single command to create the entire plugin project scaffolding.
  • This ensures a standardized and reliable development environment.
  • It automatically creates a NuGet package file that will be used to avoid ‘Could not load assemblies or its dependencies‘.

Ok, let’s begin.

Once you have downloaded all the prerequisites mentioned, make sure you have installed them on your local machine. The others are straightforward to download; for NuGet Package Explorer, you need to search the Windows Store to install it.

1. Create a local folder for the Plugins

Navigate to that folder from VS Code

Now open the terminal and run the pac command as below

Execute the following command to create the plugin project

  • Browse to the directory where you want to create the plugin project
  • Execute the command “pac plugin init” on the command line to create the plugin project

A plugin project will be created at your desired location as follows

The plugin project in the local folder will be created as below
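If you prefer to stay in the terminal, the same flow can be scripted end to end. A minimal sketch (the folder name is hypothetical; the generated package name follows your project name):

mkdir MyPlugins && cd MyPlugins
pac plugin init
dotnet build

Building the scaffolded project produces both the plugin assembly and the NuGet package (for example, bin/Debug/MyPlugins.1.0.0.nupkg) that carries the dependent-assembly information.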

That’s it, you can close VS Code for now.

Click on the .csproj file and open it in Visual Studio

By default, 2 files are automatically created when you create a plugin project, as shown above.

Now we will install Bouncy Castle, which is an external library; right-click on the plugin solution –> Manage NuGet Packages

I have added the Bouncy Castle NuGet package to my plugin project for encryption and decryption. You can add your own required NuGet package as per your need.

Build your project

After a successful build, you will get the output result as follows

Browse to the directory of your project

Open the file Plugin_Project.1.0.0.nupkg in NuGet Package Explorer by double-clicking it

Now you can see that this NuGet package file contains the information related to the added Bouncy Castle NuGet package that we want to include in our plugin project package, as follows. In your case, you will see whatever NuGet package you chose to add.

Now open up the Plugin Registration Tool

Click to create a new connection

Provide login details and log in

Click Register New Package

Browse to the directory where your NuGet package file was created automatically when you built the project, and import this file

Select the Common Data Service Default Solution and import it

Click on View and choose Display by Package

Now your plugin project is successfully registered with all dependent assemblies and is ready to use.

While this post gives you a structure for building a plugin assembly, you can add the business logic as per your need.

Conclusion:

In conclusion, navigating the intricacies of Microsoft Dynamics 365 CRM plugins demands a nuanced approach, especially when dealing with NuGet packages and dependent assemblies. This article has delved into the critical process of resolving the persistent ‘Could not load assemblies or its dependencies‘ issue, offering a comprehensive, step-by-step demonstration.

By following the recommended best practices, such as avoiding direct plugin project creation in Visual Studio and harnessing the power of the Microsoft Power Apps CLI, developers can establish a standardized and reliable development environment. The CLI’s automatic creation of a NuGet package file not only streamlines the process but also reduces errors.

To further facilitate your journey, prerequisites such as downloading and installing essential tools like the Plugin Registration Tool, the Microsoft Power Apps CLI, and NuGet Package Explorer are highlighted. The guide emphasizes the significance of these tools in ensuring a smooth plugin development experience.

By adopting these practices and incorporating the suggested steps into your workflow, you not only troubleshoot existing issues but also fortify your understanding of the entire process. Take charge of your Dynamics 365 CRM plugin development, elevate your skills, and sidestep common pitfalls by mastering the art of handling NuGet packages and dependencies seamlessly.

References:

Build and package plug-in code

Cheers,

PMDY

Fix Plugin Registration Tool Connection Issues with Multi-Factor Authentication (MFA) Enabled in Dynamics 365

Hi Folks,

It’s been a while since I posted on Dynamics 365 plugins, so this blog post shares one small tip for connecting to your Dynamics 365 instance from the Plugin Registration Tool, whether you are connecting from the standalone Plugin Registration Tool or using the Plugin Registration Tool from XrmToolBox.

If you are looking to install the Plugin Registration Tool itself, you can check the post below, or if you want to learn about all plugin-related issues at once, you can check the references at the bottom of this post; otherwise, continue reading.

If you don’t know this tip, it will be difficult, and at the least you will spend many minutes figuring out the error message you see in the Plugin Registration Tool.

This is applicable for organizations that have MFA enabled; even if you haven’t enabled it yourself, it is enabled by Microsoft by default to enforce security.

As usual, you select:

1. Office 365
2. Enable Display list of available organizations, Show Advanced
3. Provide User Name, Password
4. Click on Login

You will be prompted with this error in such a case

======================================================================================================================
Source : Microsoft.IdentityModel.Clients.ActiveDirectory
Method : MoveNext
Date : 12/4/2025
Time : 5:09:52 pm
Error : AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z
Stack Trace : at Microsoft.Xrm.Tooling.Connector.CrmWebSvc.ProcessAdalExecption(Uri serviceUrl, ClientCredentials clientCredentials, X509Certificate2 userCert, UserIdentifier& user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean isOnPrem, String authority, Uri& targetServiceUrl, AuthenticationContext& authContext, String& resource, CrmLogEntry logSink, Boolean useDefaultCreds, String& authToken, AdalException adalEx)
at Microsoft.Xrm.Tooling.Connector.CrmWebSvc.ExecuteAuthenticateServiceProcess(Uri serviceUrl, ClientCredentials clientCredentials, X509Certificate2 userCert, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean isOnPrem, String authority, Uri& targetServiceUrl, AuthenticationContext& authContext, String& resource, UserIdentifier& userIdent, CrmLogEntry logSink, Boolean useDefaultCreds, SecureString clientSecret)
at Microsoft.Xrm.Tooling.Connector.CrmWebSvc.DiscoverGlobalOrganizations(Uri discoveryServiceUri, ClientCredentials clientCredentials, X509Certificate2 loginCertificate, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean isOnPrem, String authority, UserIdentifier& userOut, CrmLogEntry logSink, Boolean useGlobalDisco, Boolean useDefaultCreds)
at Microsoft.Xrm.Tooling.Connector.CrmWebSvc.DiscoverOrganizations(Uri discoveryServiceUri, ClientCredentials clientCredentials, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean isOnPrem, String authority, UserIdentifier& userOut, CrmLogEntry logSink, Boolean useGlobalDisco, Boolean useDefaultCreds)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.QueryOAuthDiscoveryServer(Uri discoServer, ClientCredentials liveCreds, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, Boolean useGlobalDisco)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.QueryOnlineServerList(ObservableCollection`1 svrs, OrganizationDetailCollection col, ClientCredentials liveCreds, Uri trimToDiscoveryUri, Uri globalDiscoUriToUse)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.FindCrmOnlineDiscoveryServer(ClientCredentials liveCreds)
at Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager.ValidateServerConnection(CrmOrgByServer selectedOrg)
======================================================================================================================
Inner Exception Level 1 :
Source : Not Provided
Method : Not Provided
Date : 12/4/2025
Time : 5:09:52 pm
Error : Response status code does not indicate success: 400 (BadRequest).
Stack Trace : Not Provided
======================================================================================================================
Inner Exception Level 2 :
Source : Not Provided
Method : Not Provided
Date : 12/4/2025
Time : 5:09:52 pm
Error : {"error":"interaction_required","error_description":"AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access '00000007-0000-0000-c000-000000000000'. Trace ID: 7a7cac23-056c-4e77-ba82-98d50c0b7001 Correlation ID: d8b32fe6-6197-4d9a-a460-3834c8dc292a Timestamp: 2025-04-12 09:09:52Z","error_codes":[50076],"timestamp":"2025-04-12 09:09:52Z","trace_id":"7a7cac23-056c-4e77-ba82-98d50c0b7001","correlation_id":"d8b32fe6-6197-4d9a-a460-3834c8dc292a","error_uri":"https://login.microsoftonline.com/error?code=50076","suberror":"basic_action"}: Unknown error
Stack Trace : Not Provided
======================================================================================================================

Based on the above inner exception, we can clearly understand that it is looking for multi-factor authentication, so untick the Show Advanced checkbox; it will then ask for multi-factor authentication as shown below.

That’s it, with this simple step of unchecking Show Advanced, you are able to overcome this error, how cool is that…?

I have written a lot of articles with respect to the Plugin Registration Tool; you can check them below

Issues related to Plugins and Plugin Registration Tool

Hope this helps…

Cheers,

PMDY