Maximizing Your Power Platform Solution’s Reach: Essential Performance Considerations for Optimal Efficiency

Hi Folks,

This blog post covers performance considerations for your Power Platform CE projects and how you can plan for optimal application performance in your Power Apps. Let me take you through them…

Are you tired of building solutions over long engagements, only to face performance issues at the end of the project or during UAT? One of the most important non-functional requirements for a project’s success is performance. Satisfying performance requirements for your users can be a challenge: poor performance can hurt user adoption of the system and lead to project failure, so be deliberate about every decision you take while designing your solutions in the stages below.

Let’s talk about them one by one…

1. Network latency and bandwidth

A main cause of poor performance of Dynamics 365 apps is the latency of the network over which the clients connect to the organization. 

  • Bandwidth is the width or capacity of a specific communications channel.
  • Latency is the time required for a signal to travel from one point on a network to another; it is a fixed cost between two points, and a single request usually involves many of these “signals”.

Lower latencies (measured in milliseconds) generally provide better levels of performance. Even if the latency of a network connection is low, bandwidth can become a performance degradation factor if there are many resources sharing the network connection, for example, to download large files or send and receive email.

Dynamics 365 apps are designed to work best over networks that have the following elements: 

  • Bandwidth greater than 50 KBps (400 kbps)
  • Latency under 150 ms

These values are recommendations and don’t guarantee satisfactory performance. The recommended values are based on systems using out-of-the box forms that aren’t customized.

If you significantly customize the out-of-box forms, it is recommended that you test form response times to understand your bandwidth needs.

You can use the diagnostics tool to determine the latency and bandwidth:

  1. On your computer or device, start a web browser, and sign in to an organization.
  2. Enter the following URL, https://myorg.crm.dynamics.com/tools/diagnostics/diag.aspx, where myorg.crm.dynamics.com is the URL of your organization.
  3. Click Run.
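
If you prefer a scriptable check, here is a rough, unofficial sketch that approximates round-trip time from a client machine by timing lightweight HTTPS requests with Python’s requests library. The org URL is a placeholder, and this measures full request round trip (including TLS and server time) rather than pure network latency, so treat it as an estimate only.

# Rough latency probe for a Dataverse org URL (unofficial sketch).
# Assumes the `requests` package is installed; the URL is a placeholder.
import statistics
import time

import requests

ORG_URL = "https://myorg.crm.dynamics.com"  # replace with your organization URL

samples = []
for _ in range(5):
    start = time.perf_counter()
    requests.get(ORG_URL, timeout=30)  # response body is ignored; we only time the round trip
    samples.append((time.perf_counter() - start) * 1000)

print(f"median round trip: {statistics.median(samples):.0f} ms (target: under 150 ms)")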

Also, to mitigate the naturally higher latency of global rollouts, customers should invest in smart design for their applications.

2. Smart design for your application

Form design 

  • Keep the number of fields to a minimum. The more fields you have on a form, the more data must be transferred over the internet or intranet to view each record. Think about how the user will interact with the form and the amount of data that must be displayed within it.
  • Avoid including unnecessary JavaScript web resource libraries. The more scripts you add to the form, the longer they take to download. Scripts are usually cached in the browser after the first load, but the performance of the first form view often creates a lasting impression.
  • Avoid loading all scripts in the OnLoad event. If you have code that only supports OnChange events for fields or the OnSave event, set the script library on the event handler for those events instead of the OnLoad event. That way, loading those libraries can be deferred, improving performance when the form loads.
  • Use tab events to defer loading web resources. Any code required to support web resources or IFRAMEs within collapsed tabs can use handlers for the TabStateChange event, reducing the work that might otherwise have to occur in the OnLoad event.
  • Set default visibility options. Avoid form scripts in the OnLoad event that hide form elements. Instead, set form elements that might be hidden to be invisible by default when the form loads, then use scripts in the OnLoad event to show the elements you want to display. If a form element is never made visible, it should be removed from the form rather than hidden.
  • Watch out for synchronous web requests, as they can cause severe performance issues; consider moving to asynchronous requests where possible. Also, prefer the WebApi over creating XML HTTP Requests (XHR) on your own.
  • Avoid opening a new tab or window; open the window in the main form dialog instead.
  • For the command bar, keep the number of controls to a minimum. Within the command bar or the form’s ribbon, evaluate which controls are necessary and hide any you don’t need; every displayed control increases the resources that must be downloaded to the browser.
  • Use asynchronous network requests in custom rules. When using custom rules that make network requests in Unified Interface, use asynchronous rule evaluation.

Learn more: Design forms for performance in model-driven apps – Power Apps | Microsoft Learn

Latest version of SDK and APIs 

Use the latest versions of the SDK, form APIs, and Web API endpoints to benefit from the latest product features, roadmap alignment, and security updates.

API calls and custom FetchXML call velocity

Only the columns required for the information or action at hand should be included in API calls:

  • Retrieving all columns (*) creates significant overhead on the database engine when distributed across significant user load. Optimization of call velocity is key to avoid “chatty” forms that unnecessarily make repeated calls for the same information in a single interaction.
  • You should avoid retrieving all columns in a query result because of the impact on a subsequent update of records. In an update, this will set all field values, even if they are unchanged, and often triggers cascaded updates to child records. Leverage the most efficient connection mechanism (WebAPI vs SDK) and reference this doc site for guidance on the appropriate approach.

Consider reviewing periodically the Best practices and guidance when coding for Microsoft Dataverse – Power Apps | Microsoft Learn and ColumnSet.AllColumns Property (Microsoft.Xrm.Sdk.Query) | Microsoft Learn.
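
As a concrete illustration, the sketch below follows the DataverseClient pattern used in the Python series later in this blog: it requests only the two columns it needs instead of every column on the table. The org URL is a placeholder.

from azure.identity import InteractiveBrowserCredential
from PowerPlatform.Dataverse.client import DataverseClient

credential = InteractiveBrowserCredential()
client = DataverseClient("https://myorg.crm.dynamics.com", credential)  # placeholder URL

# Good: ask only for the columns the scenario needs.
account_batches = client.get(
    "account",
    select=["accountid", "name"],  # a narrow column set keeps payloads small
    top=10,
)

# Avoid: omitting `select` (or selecting everything), which retrieves all
# columns and adds avoidable load on the database engine.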

Error handling across all code-based events 

You should continue to use the ITracingService.Trace to write to the Plug-in Trace Log table when needed. If your plug-in code uses the ILogger interface and the organization does not have Application Insights integration enabled, nothing will be written. So, it is important to continue to use the ITracingService Trace method in your plug-ins. Plug-in trace logs continue to be an important way to capture data while developing and debugging plug-ins, but they were never intended to provide telemetry data.  

For organizations using Application Insights, you should use ILogger because it will allow for telemetry about what happens within a plug-in to be integrated with the larger scope of data captured with the Application Insights integration. The Application Insights integration will tell you when a plug-in executes, how long it takes to run and whether it makes any external http requests. Learn more about tracing in plugins Logging and tracing (Microsoft Dataverse) – Power Apps | Microsoft Learn.   

Use Solution Checker to analyze solution components 

As a best practice, run Solution Checker against all application code, and make it a mandatory step while you design solutions and whenever you complete developing your custom logic.

Quick Find 

For an optimal search experience for your users, consider the following:

  • All columns you expect to return results in a quick find search need to be included in the view, or your results will not load as expected.
  • Avoid using option sets as quick find columns; use view filtering for those instead.
  • Minimize the number of fields used, and avoid using composite fields as searchable columns; e.g., make first name and last name searchable rather than full name.
  • Avoid using multiple-lines-of-text fields as search or find columns.
  • Evaluate Dataverse search versus using leading-wildcard search.

3. Training

This step should be done during user training or during UAT. To ensure optimal performance of Dynamics 365, ensure that users are properly leveraging browser caching. Without caching, users can experience cold loads which have lower performance than partially (or fully) warm loads.

 Make sure to train users to: 

  • Use the application’s inline refresh rather than a browser refresh (they should not use F5).
  • Use the application’s inline back button instead of the browser’s back button.
  • Avoid InPrivate/Incognito browser modes, which cause cold loads.
  • Make users aware that running applications that consume a lot of bandwidth (like video streaming) may affect performance.
  • Do not install browser extensions unless they are necessary (this can also be blocked via policy).
  • Do use ‘Record Set’ navigation to move through records quickly without switching from the form back to the list.

4. Testing

For business processes where performance is critical, or processes with complex customizations and very high volumes, it is strongly recommended to plan for performance testing. Consider reviewing the technical talk series below, which describes important performance considerations, shares practical examples of how to set up and execute performance testing, and shows how to analyze and mitigate performance issues. Reference: Performance Testing in Microsoft Dynamics 365 TechTalk Series – Microsoft Dynamics Blog
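
As a minimal illustration (not a substitute for a proper performance-testing suite), the hedged sketch below fires a handful of concurrent Web API reads and reports latency percentiles. It assumes you already hold a valid bearer token; the org URL and token are placeholders.

# Minimal concurrent load sketch against the Dataverse Web API (illustrative only).
# Assumes `requests` is installed and ACCESS_TOKEN holds a valid bearer token.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

ORG_URL = "https://myorg.crm.dynamics.com"  # placeholder
ACCESS_TOKEN = "<bearer token>"             # placeholder

def timed_get():
    start = time.perf_counter()
    requests.get(
        f"{ORG_URL}/api/data/v9.2/accounts?$select=name&$top=5",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=60,
    )
    return (time.perf_counter() - start) * 1000

with ThreadPoolExecutor(max_workers=10) as pool:
    latencies = sorted(f.result() for f in [pool.submit(timed_get) for _ in range(50)])

print(f"p50: {statistics.median(latencies):.0f} ms, p95: {latencies[int(len(latencies) * 0.95)]:.0f} ms")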

5. Monitoring

You should define a monitoring strategy; consider using any of the tools below, depending on your needs.

  1. Monitor Dynamics 365 connectivity from remote locations continuously using network monitoring tools such as Azure Network Performance Monitor or third-party tools. These tools help identify network-related problems proactively and drastically reduce the troubleshooting time for potential issues.
  2. Application Insights, a feature of Azure Monitor, is widely used within the enterprise landscape for monitoring and diagnostics. Data collected from a specific tenant or environment is pushed to your own Application Insights environment, stored in Azure Monitor logs, and visualized in the Performance and Failures panels under Investigate on the left pane. The data is exported in the standard schema defined by Application Insights, and the support, developer, and admin personas can use this feature to triage and resolve issues. Learn more: Telemetry events for Microsoft Dataverse – Power Platform | Microsoft Learn
  3. Dataverse and Power Apps analytics in the Power Platform Admin Center. Through the plug-in dashboard in the Power Platform Admin Center you can view metrics such as average execution time, failures, most active plug-ins, and more.
  4. Dynamics 365 apps include a basic diagnostic tool (described above) that analyzes client-to-organization connectivity and produces a report.
  5. Monitor is a tool that offers makers the ability to view a stream of events from a user’s session to diagnose and troubleshoot problems. It works for both model-driven apps and canvas apps.

I hope this blog post helped you learn something new. Thank you for reading…

Cheers,

PMDY

Understanding Dataverse MCP vs Power Apps MCP – Quick Review

Hi Folks,

Model Context Protocol (MCP) has quickly become one of the hottest topics in today’s AI landscape. The excitement around it is huge, not just within the Microsoft ecosystem but across the entire industry, because it’s designed to be open and accessible to everyone.

Microsoft Power Platform is also moving fast, releasing its own MCPs. I’ve already been asked several times about the difference between them, so this post breaks down how the Dataverse MCP and the Power Apps MCP differ—and when you should use each one.

Dataverse MCP became generally available last year, and Microsoft announced the public preview of Power Apps MCP this month. So what’s the difference between the two? This post will help you discern it.

Although both use the Model Context Protocol (MCP), they serve different purposes and operate at different layers of the Power Platform.

1. Primary Role and Focus

Feature | Dataverse MCP | Power Apps MCP
Primary Role | Exposes Dataverse as an MCP server so AI agents can query tables, records, and metadata | Allows Power Apps to use MCP to call external AI models or tools
Focus | Data access & operations | AI integration inside apps
Who Uses It | Developers building AI agents or copilots that need Dataverse data | App makers building Power Apps that need AI-driven logic

2. Direction of Integration

Direction | Dataverse MCP | Power Apps MCP
MCP Server | Yes — Dataverse acts as an MCP server | No
MCP Client | No | Yes — Power Apps acts as an MCP client
Meaning | AI tools connect into Dataverse | Power Apps connects out to AI tools

Dataverse MCP: AI → Dataverse
Power Apps MCP: Power Apps → AI

3. What You Can Do

Dataverse MCP

  • Query Dataverse tables
  • Retrieve records
  • Update or create data
  • Use Dataverse as a tool inside Copilot Studio, GitHub Copilot in VS Code, Claude Desktop, etc.

Power Apps MCP

  • Call external AI models from inside a Power App
  • Build AI-driven app logic
  • Trigger workflows using AI reasoning

4. Typical Use Cases

Dataverse MCP

  • Build a Copilot that answers questions using Dataverse data
  • Let GitHub Copilot query Dataverse while coding
  • Create AI agents that read/write CRM or ERP data

Power Apps MCP

  • Add AI reasoning to a canvas or model-driven app
  • Use external AI models to classify, summarize, or generate content
  • Build intelligent forms or workflows

5. Relation to Connectors

Dataverse MCP is increasingly seen as a future alternative to connectors for AI-driven scenarios.

Power Apps MCP does not replace connectors — it extends apps with AI capabilities.

Summary

Category | Dataverse MCP | Power Apps MCP
Acts as | MCP Server | MCP Client
Used for | AI agents accessing Dataverse | Power Apps calling AI tools
Primary Benefit | Intelligent, standardized Dataverse access | AI-enhanced app logic
Typical Tools | Copilot Studio, GitHub Copilot, Claude | Canvas apps, model-driven apps

Decision Guide: When to Use Dataverse MCP vs Power Apps MCP

1. If you want AI to access Dataverse → Use Dataverse MCP

Choose Dataverse MCP when:

  • You’re building a Copilot, AI agent, or LLM-powered tool that needs:
    • Dataverse tables
    • Records
    • Metadata
    • CRUD operations
  • You want AI to reason over business data
  • You want a standardized, connector-free way for AI to talk to Dataverse
  • You’re integrating Dataverse with:
    • GitHub Copilot
    • Copilot Studio
    • Claude Desktop
    • Custom LLM agents

Typical scenarios

  • My AI assistant should answer questions using CRM data.
  • I want GitHub Copilot to autocomplete code based on Dataverse schema.
  • I’m building an AI agent that updates Dataverse records.

If the AI is the one doing the work → Dataverse MCP.

2. If you want your Power App to call AI → Use Power Apps MCP

Choose Power Apps MCP when:

  • You’re building a canvas or model-driven app that needs:
    • AI reasoning
    • AI-generated content
    • AI classification or summarization
  • You want your app to call:
    • OpenAI models
    • Azure AI models
    • Custom MCP tools
  • You want AI logic inside the app, not outside it

Typical scenarios

  • My form should summarize customer notes using AI.
  • My app should classify images or text using an external model.
  • I want to call an LLM from a button in a canvas app.

3. If you need both directions → Use both

Some solutions need AI ↔ Dataverse ↔ Power Apps.

Example

  • A Power App collects data
  • AI agent (via Dataverse MCP) analyzes historical records
  • Power App (via Power Apps MCP) calls AI to generate insights for the user

This is becoming a common pattern in enterprise AI.

4. Quick Decision Table

Goal | Use Dataverse MCP | Use Power Apps MCP
AI needs to read/write Dataverse | ✓ |
Power App needs to call AI | | ✓
Replace connectors for AI-driven data access | ✓ |
Add AI reasoning inside app UI | | ✓
Build AI copilots or agents | ✓ |
Build AI-enhanced business apps | | ✓

Hope you found this post useful…

Cheers,

PMDY

Python + Dataverse Series – #07: Running a Linear Normalization Algorithm on Dataverse Data Using Python

This is a continuation of the Dataverse SDK for Python series; if you haven’t checked out the earlier articles, I encourage you to start from the beginning of this series.

Machine learning often begins with one essential step: data preprocessing. Before models can learn patterns, the raw data must be cleaned, scaled, and transformed into a form suitable for analysis. In this example, let me demonstrate how to retrieve numerical data from Microsoft Dataverse and apply a linear normalization algorithm using Python.

Normalization is a fundamental algorithm in machine learning pipelines. It rescales numeric values into a consistent range—typically between 0 and 1—making them easier for algorithms to interpret and compare.

1. Retrieving Data from Dataverse

Using the DataverseClient and Interactive Browser authentication, we connect to Dataverse and fetch the revenue field from the Account table. This gives us a small dataset to run our algorithm on.

from azure.identity import InteractiveBrowserCredential
from PowerPlatform.Dataverse.client import DataverseClient

credential = InteractiveBrowserCredential()
client = DataverseClient("https://ecellorsdev.crm8.dynamics.com", credential)
account_batches = client.get(
    "account",
    select=["accountid", "revenue"],
    top=10,
)

We then extract the revenue values into a NumPy array.

2. Implementing the Linear Normalization Algorithm

The algorithm used here is min–max normalization, defined mathematically as:

normalized = (x - min(x)) / (max(x) - min(x))

This algorithm ensures:

  • the smallest value becomes 0
  • the largest becomes 1
  • all other values fall proportionally in between

Here’s the implementation:

import numpy as np
revenues = np.array(revenues)
min_rev = np.min(revenues)
max_rev = np.max(revenues)
normalized_revenues = (revenues - min_rev) / (max_rev - min_rev)

This is a classic preprocessing algorithm used in machine learning pipelines before feeding data into models such as regression, clustering, or neural networks.
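
To make the effect concrete, here is the same formula applied to a tiny, made-up revenue sample (the numbers are purely illustrative):

import numpy as np

revenues = np.array([50_000.0, 125_000.0, 200_000.0])  # illustrative values
normalized = (revenues - revenues.min()) / (revenues.max() - revenues.min())
print(normalized)  # [0.  0.5 1. ]

The smallest value maps to 0, the largest to 1, and 125,000 lands exactly halfway at 0.5.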

3. Visualizing the Normalized Output

To better understand the effect of the algorithm, we plot the normalized values:

import matplotlib.pyplot as plt
plt.plot(normalized_revenues, marker='o')
plt.title('Normalized Revenues from Dataverse Accounts')
plt.xlabel('Account Index')
plt.ylabel('Normalized Revenue')
plt.grid()
plt.show()

The visualization highlights how the algorithm compresses the original revenue values into a uniform scale.

4. Why Normalization Matters

Normalization is not just a mathematical trick—it’s a crucial algorithmic step that:

  • prevents large values from dominating smaller ones
  • improves convergence in optimization-based models
  • enhances the stability of distance‑based algorithms
  • makes datasets comparable across different ranges
Putting it all together, here is the complete script:

# Run a linear normalization (machine learning preprocessing) on data retrieved from Dataverse
from azure.identity import InteractiveBrowserCredential
from PowerPlatform.Dataverse.client import DataverseClient
import numpy as np

# Connect to Dataverse
credential = InteractiveBrowserCredential()
client = DataverseClient("https://ecellorsdev.crm8.dynamics.com", credential)

# Fetch account data as paged batches
account_batches = client.get(
    "account",
    select=["accountid", "revenue"],
    top=10,
)

revenues = []
for batch in account_batches:
    for account in batch:
        if "revenue" in account and account["revenue"] is not None:
            revenues.append(account["revenue"])
revenues = np.array(revenues)

# Apply a simple linear algorithm: normalize the revenues
if len(revenues) > 0:
    min_rev = np.min(revenues)
    max_rev = np.max(revenues)
    normalized_revenues = (revenues - min_rev) / (max_rev - min_rev)
    print("Normalized Revenues:", normalized_revenues)

# Visualize the result
import matplotlib.pyplot as plt
plt.plot(normalized_revenues, marker='o')
plt.title('Normalized Revenues from Dataverse Accounts')
plt.xlabel('Account Index')
plt.ylabel('Normalized Revenue')
plt.grid()
plt.show()

The use of this code is to transform raw Dataverse revenue data into normalized, machine‑learning‑ready values that can be analyzed, compared, and visualized effectively.

You can download the Python Notebook below if you want to work with VS Code

https://github.com/pavanmanideep/DataverseSDK_PythonSamples/blob/main/Python-RetrieveData-ApplyLinearAlgorithm.ipynb

Once you have opened the Python notebook, run the code cell by cell. A browser tab opens for authentication; once you’ve signed in, you should see the normalized revenues printed and the chart rendered.

Hope you found this useful… it’s going to get more interesting, so stay tuned for upcoming articles.

Cheers,

PMDY

Python + Dataverse Series – #06: Data preprocessing steps before running Machine Learning Algorithms

Hi Folks,

If you are a Power Platform consultant who is new to working with Python, I encourage you to start from the beginning of this series.

In this series we have now reached an interesting part, where machine learning algorithms are run to analyze Dataverse data. In this post we will understand why feature scaling is a critical preprocessing step for many machine learning algorithms: it ensures that all features contribute equally to the model’s outcome, prevents numerical instability, and helps optimization algorithms converge faster to the optimal solution.

Before running any machine learning algorithm, we typically need to do some data preprocessing, such as scaling the data. In this case we will use min–max normalization (feature scaling to the [0, 1] range).
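
As an aside, if you prefer a library implementation over the hand-rolled formula, scikit-learn’s MinMaxScaler produces the same result. This optional sketch assumes the scikit-learn package is installed and uses made-up values:

import numpy as np
from sklearn.preprocessing import MinMaxScaler

revenues = np.array([[50_000.0], [125_000.0], [200_000.0]])  # illustrative values, shaped (n, 1)
scaler = MinMaxScaler()
print(scaler.fit_transform(revenues).ravel())  # [0.  0.5 1. ]

The full script against live Dataverse data follows.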

# Preprocessing step before running machine learning algorithms
from azure.identity import InteractiveBrowserCredential  # using interactive login
from PowerPlatform.Dataverse.client import DataverseClient  # Python SDK for Dataverse
import numpy as np  # NumPy for the calculations

# Connect to Dataverse
credential = InteractiveBrowserCredential()
client = DataverseClient("https://ecellorsdev.crm8.dynamics.com", credential)  # creates the Dataverse client

# Fetch account data as paged batches (top 10 accounts with accountid, revenue columns)
account_batches = client.get(
    "account",
    select=["accountid", "revenue"],
    top=10,
)

revenues = []
for batch in account_batches:
    for account in batch:
        if "revenue" in account and account["revenue"] is not None:
            revenues.append(account["revenue"])
revenues = np.array(revenues)

# Normalize the revenue
if len(revenues) > 0:
    min_rev = np.min(revenues)
    max_rev = np.max(revenues)
    normalized_revenues = (revenues - min_rev) / (max_rev - min_rev)
    print("Normalized Revenues:", normalized_revenues)

# Visualize the result
import matplotlib.pyplot as plt
plt.plot(normalized_revenues, marker='o')
plt.title('Normalized Revenues from Dataverse Accounts')
plt.xlabel('Account Index')
plt.ylabel('Normalized Revenue')
plt.grid()
plt.show()

You can download the Python Notebook below if you want to work with VS Code

https://github.com/pavanmanideep/DataverseSDK_PythonSamples/blob/main/Python-PreProcessingStepBeforeMachineLearning.ipynb

Hope you found this useful…

Cheers,

PMDY

Python + Dataverse Series – How to Run Python Code in VS Code

Hi Folks,

As you folks know, Python is currently the #1 programming language, with a massive, versatile ecosystem of libraries for data science, AI, and backend web development. This post kicks off a hands‑on series about working with Microsoft Dataverse using Python. We’ll explore how to use the Dataverse SDK for Python to connect with Dataverse, automate data operations, and integrate Python solutions across the broader Power Platform ecosystem. Whether you’re building data-driven apps, automating workflows, or extending Power Platform capabilities with custom logic, this series will help you get started with practical, real‑world examples.

https://www.microsoft.com/en-us/power-platform/blog/2025/12/03/dataverse-sdk-python/

With the release of the Dataverse SDK for Python, building Python-based logic for the Power Platform has become dramatically simpler. In this post, we’ll walk through how to download Python and set it up in Visual Studio Code so you can start building applications that interact with Dataverse using Python. Sounds exciting already. Let’s dive in and get everything set up..

1. Download Python from the official website below and install it on your computer.

https://www.python.org/ftp/python/3.14.3/python-3.14.3-amd64.exe

2. Install VS Code.

Important: during the Python installation in step 1, make sure to check “Add Python to PATH”. This ensures VS Code can detect Python automatically.

3. After installation, open VS Code and install the Python extension (Microsoft’s official one). This extension enables IntelliSense, debugging, and running Python scripts.

4. That’s it — you are now able to run Python logic inside VS Code.

5. Create or open a Python file on your system; a minimal sample like the one below works.
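
If you just want something to execute, this tiny illustrative script is enough to confirm the setup:

# hello.py – a minimal script to verify the Python + VS Code setup
message = "Hello from Python in VS Code!"
print(message)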

6. To run Python programs in VS Code, follow the options below:

a. Select Start Debugging.

b. You will be prompted with a list of debug options; select the first option, which automatically runs your Python code.

This is very easy to set up…

If you want to continue reading this series, check out the next article.

Hope this helps…

Cheers,

PMDY

Python + Dataverse Series – #05: Remove PII

Hi Folks,

This is a continuation of the Python + Dataverse series; it is worth starting from the beginning of the series here.

At times, there will be a need to remove PII (Personally Identifiable Information) from your Dataverse environments. For this one-time task you can easily run the Python script below; as an example, let’s remove PII from contact fields.

from azure.identity import InteractiveBrowserCredential
from PowerPlatform.Dataverse.client import DataverseClient

# Connect to Dataverse
credential = InteractiveBrowserCredential()
client = DataverseClient("https://ecellorsdev.crm8.dynamics.com", credential)

# Redact PII fields on Dataverse records – contact records in this example
def remove_pii_from_contact(contact):
    pii_fields = ['emailaddress1', 'telephone1', 'mobilephone', 'address1_line1', 'address1_city', 'address1_postalcode']
    for field in pii_fields:
        if field in contact:
            contact[field] = '[REDACTED]'
    return contact

# Fetch contacts with PII (Dataverse client returns paged batches)
contact_batches = client.get(
    "contact",
    select=[
        "contactid",
        "fullname",
        "emailaddress1",
        "telephone1",
        "mobilephone",
        "address1_line1",
        "address1_city",
        "address1_postalcode",
    ],
    top=10,
)

# Remove PII and update contacts
for batch in contact_batches:
    for contact in batch:
        contact_id = contact.get("contactid")
        sanitized_contact = remove_pii_from_contact(contact)
        # Prepare update data (exclude contactid)
        update_data = {key: value for key, value in sanitized_contact.items() if key != "contactid"}
        # Update the contact in Dataverse
        client.update("contact", contact_id, update_data)
        print(f"Contact {contact_id} updated with sanitized data: {sanitized_contact}")

If you want to work on this, download the Python Notebook to use in VS Code…

https://github.com/pavanmanideep/DataverseSDK_PythonSamples/blob/main/Python-DataverseSDK-RemovePII.ipynb

Cheers,

PMDY

Adding intelligence to Dataverse using Dataverse AI functions

Hi Folks,

While AI-powered intelligence is being embedded into every part of the Microsoft ecosystem, it is good to know the features coming to the Power Platform space.

In this blog post, let’s see how we can use Dataverse AI Functions and their advantages: they can greatly ease summarizing, classifying, extracting, translating, assessing sentiment, or drafting a reply for common business scenarios.

To illustrate them better, I used a different AI function in each of a canvas app, a model-driven app, and Power Automate; you can follow the same pattern for the others as well.

What are Dataverse AI Functions?

Think of Dataverse AI Functions as prebuilt AI functions that add intelligence to your apps and flows without the need to collect data or build and train models. They can be used in many places, such as AI Builder, Power Automate, Power Apps, and low-code plug-ins. The following AI functions are available…

  1. AIReply – Drafts a reply to the message you provide.
  2. AISentiment – Detects the sentiment of the text you provide.
  3. AISummarize – Summarizes the text you provide.
  4. AIClassify – Classifies the text into one or more categories; you can also use this from a custom copilot.
  5. AIExtract – Extracts specified entities such as names of people, phone numbers, places, etc.
  6. AITranslate – Translates text from another language.

Let’s first see their usage in our favorite canvas apps, as they are easy to illustrate there; later in the post I will show how you can call the Dataverse AI Functions from model-driven apps and Power Automate so that you get the real essence of it…

Utilizing ‘Dataverse AI Functions’ in a Canvas App

Create a new Canvas App and add ‘Environment’ Datasource as shown below.

All the ‘Dataverse AI functions’ can be accessed via the ‘Environment’ data source, as shown below.

Let’s try the AIReply function in Canvas Apps

Add a textbox for storing the prompt or input string and a button control.

On the ‘OnSelect’ event of the button, use the following formula to store the response in the AIReplyResponse context variable; make sure the control and variable names in your formula match the names you defined in your canvas app.

UpdateContext({AIReplyResponse:Environment.AIReply({Text:AIInput.Text})})

Now add one more text control to show the response, and set its Default value to AIReplyResponse.PreparedResponse.

Try testing the app by providing inputs as below…

You should get a response from AIReply in the response field; you can try out the other functions by providing their required parameters.

Utilizing ‘Dataverse AI Functions’ in Power Automate

In Power Automate, calling a Dataverse AI function is as simple as invoking the corresponding unbound action, as below.

Passing the relevant input parameters is enough to get the output from these functions.
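
The same unbound actions can also be invoked directly over the Dataverse Web API. Below is a hedged Python sketch (mirroring the Web API CRUD post elsewhere in this blog); it assumes you already hold a bearer token, uses a placeholder org URL, and reuses the Text / PreparedResponse parameter names shown in the AIReply example above.

# Illustrative sketch: call the AIReply unbound action over the Web API.
# Assumes `requests` is installed and ACCESS_TOKEN holds a valid bearer token.
import requests

ORG_URL = "https://myorg.crm.dynamics.com"  # placeholder
ACCESS_TOKEN = "<bearer token>"             # placeholder

response = requests.post(
    f"{ORG_URL}/api/data/v9.2/AIReply",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json={"Text": "Customer asked for an update on their support ticket."},
)
response.raise_for_status()
print(response.json().get("PreparedResponse"))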

Let’s try AISentiment

Click on Test; you should get a response from Power Automate with the detected sentiment.

Utilizing ‘Dataverse AI Functions’ in Model-Driven Apps

Do you want to utilize similar Dataverse AI Functions capabilities inside your custom code, such as plug-ins and custom actions? You can invoke them as an OrganizationRequest, as shown below.

Let’s try AIClassify

var request = new OrganizationRequest("AIClassify")
{
    ["AllowMultipleCategories"] = false,
    ["Categories"] = titles,
    ["Text"] = classifyText
};
var result = service.Execute(request);
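
(In this snippet, titles is assumed to be a collection of candidate category labels, classifyText the text to classify, and service your IOrganizationService; the classification output comes back in the OrganizationResponse’s Results collection.)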

It’s pretty much similar in AI Builder as well…

Please do note that there are tenant-level quotas for using these AI functions; otherwise you might get an error like the one below. I didn’t find any official information about this from Microsoft, so I’m unsure of the details as of writing this post; I will keep this updated as I learn more.

Using Dataverse AI functions requires a bit of prompt engineering knowledge; if you’re looking to learn more about prompt engineering, check it out here.

References:

https://learn.microsoft.com/en-us/power-platform/power-fx/reference/function-ai

Cheers,

PMDY

Python + Dataverse Series – Post #03: Create, Update, Delete records via Web API

Hi Folks,

This is a continuation of the Python + Dataverse series. In this blog post we will perform full CRUD (Create, Retrieve, Update, Delete) operations in Dataverse using the Web API.

You can use the code below to make Web API calls to Dataverse.

import msal
import requests
import json
import re

# Azure AD details
client_id = 'XXXX'
client_secret = 'XXXX'
tenant_id = 'XXXX'
authority = f'https://login.microsoftonline.com/{tenant_id}'
resource = 'https://XXXX.crm8.dynamics.com'

# Get token with error handling
try:
    print(f"Attempting to authenticate with tenant: {tenant_id}")
    print(f"Authority URL: {authority}")
    app = msal.ConfidentialClientApplication(client_id, authority=authority, client_credential=client_secret)
    print("Acquiring token…")
    token_response = app.acquire_token_for_client(scopes=[f'{resource}/.default'])
    if 'error' in token_response:
        print(f"Token acquisition failed: {token_response['error']}")
        print(f"Error description: {token_response.get('error_description', 'No description available')}")
    else:
        access_token = token_response['access_token']
        print("Token acquired successfully!")  # avoid printing the token itself
        print(f"Token length: {len(access_token)} characters")
except ValueError as e:
    print(f"Configuration Error: {e}")
    print("\nPossible solutions:")
    print("1. Verify your tenant ID is correct")
    print("2. Check if the tenant exists and is active")
    print("3. Ensure you're using the right Azure cloud (commercial, government, etc.)")
except Exception as e:
    print(f"Unexpected error: {e}")

# Full CRUD operations – create, read, update, and delete a contact in Dataverse
try:
    print("Making Web API requests to perform CRUD operations on contacts…")
    # Dataverse Web API endpoint for contacts
    web_api_url = f"{resource}/api/data/v9.2/contacts"
    # Base headers with authorization token
    headers = {
        'Authorization': f'Bearer {access_token}',
        'OData-MaxVersion': '4.0',
        'OData-Version': '4.0',
        'Accept': 'application/json',
        'Content-Type': 'application/json'
    }

    # Create a new contact
    new_contact = {"firstname": "John", "lastname": "Doe"}
    print("Creating a new contact…")
    # Ask the server to return the created representation. If not supported or omitted,
    # Dataverse often returns 204 No Content and provides the entity id in a response header.
    create_headers = headers.copy()
    create_headers['Prefer'] = 'return=representation'
    response = requests.post(web_api_url, headers=create_headers, json=new_contact)
    created_contact = {}
    contact_id = None
    # If the API returned the representation, parse the JSON
    if response.status_code in (200, 201):
        try:
            created_contact = response.json()
        except ValueError:
            created_contact = {}
        contact_id = created_contact.get('contactid')
        print("New contact created successfully (body returned).")
        print(f"Created Contact ID: {contact_id}")
    # If the API returned 204 No Content, Dataverse includes the entity URL in 'OData-EntityId' or 'Location'
    elif response.status_code == 204:
        entity_url = response.headers.get('OData-EntityId') or response.headers.get('Location')
        if entity_url:
            # Extract the GUID from the entity URL
            m = re.search(r"([0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12})", entity_url)
            if m:
                contact_id = m.group(1)
                created_contact = {'contactid': contact_id}
                print("New contact created successfully (no body). Extracted Contact ID from headers:")
                print(f"Created Contact ID: {contact_id}")
            else:
                print("Created but couldn't parse entity id from response headers:")
                print(f"Headers: {response.headers}")
        else:
            print("Created but no entity location header found. Headers:")
            print(response.headers)
    else:
        print(f"Failed to create contact. Status code: {response.status_code}")
        print(f"Error details: {response.text}")

    if not contact_id:
        # Defensive: stop further CRUD if we don't have an id
        print("No contact id available; aborting read/update/delete steps.")
    else:
        # Read the created contact
        print("Reading the created contact…")
        response = requests.get(f"{web_api_url}({contact_id})", headers=headers)
        if response.status_code == 200:
            print("Contact retrieved successfully!")
            contact_data = response.json()
            print(json.dumps(contact_data, indent=4))
        else:
            print(f"Failed to retrieve contact. Status code: {response.status_code}")
            print(f"Error details: {response.text}")

        # Update the contact's email
        updated_data = {"emailaddress1": "john.doe@example.com"}
        response = requests.patch(f"{web_api_url}({contact_id})", headers=headers, json=updated_data)
        if response.status_code == 204:
            print("Contact updated successfully!")
        else:
            print(f"Failed to update contact. Status code: {response.status_code}")
            print(f"Error details: {response.text}")

        # Delete the contact
        response = requests.delete(f"{web_api_url}({contact_id})", headers=headers)
        if response.status_code == 204:
            print("Contact deleted successfully!")
        else:
            print(f"Failed to delete contact. Status code: {response.status_code}")
            print(f"Error details: {response.text}")
except requests.exceptions.RequestException as e:
    print(f"Request error: {e}")
except KeyError as e:
    print(f"Token not available: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")

You can use VS Code as your IDE: copy the above code into a Python file, then click Run Python File at the top of VS Code.

Hope this helps someone making Web API Calls using Python.

If you want to try this out, download the Python Notebook and open it in VS Code.

https://github.com/pavanmanideep/DataverseSDK_PythonSamples/blob/main/Python-Dataverse-FullCRUD-03.ipynb

If you’re following this series, don’t miss the next article.

Cheers,

PMDY

Enhancing Dataverse Plugins with Bulk Message Operations

Well, this could be a very interesting post, as we talk about optimizing Dataverse performance using bulk operation messages in Dataverse plug-in customizations. But wait: this post is not complete, because of an issue I will discuss later in the blog. First, let’s dig into this feature by actually trying it out. Generally, every business wants improved performance for any logic attached to out-of-the-box messages, so developers try to optimize their code in various ways when using Dataverse messages.

Before diving deeper into this article, let’s first understand the differences between standard and elastic tables. If you want an introduction to elastic tables, which were newly introduced last year, you can refer to my previous post on elastic tables here.

The type of table you choose to store your data has the greatest impact on how much throughput you can expect with bulk operations. You can choose between two types of tables in Dataverse; below are some key differences:

Aspect | Standard Tables | Elastic Tables
Data Structure | Defined schema | Flexible schema
Storage | Stores data in Azure SQL | Stores data in Azure Cosmos DB
Data Integrity | Ensured | Less strict
Relationship model | Supported | Limited
Performance | Predictable | Variable; preferred for unpredictable and spiky workloads
Agility | Limited | High
Personalization | Limited | Extensive

Standard and Elastic Table Differences

Plugins:

With bulk operation messages, the APIs introduced are CreateMultiple, UpdateMultiple, DeleteMultiple (only for elastic tables), and UpsertMultiple (preview). As of now you’re not required to migrate your plug-ins to use CreateMultiple and UpdateMultiple instead of the Create and Update messages: your logic for Create and Update continues to be applied when applications use CreateMultiple or UpdateMultiple.

This is mainly done to prevent maintaining two separate sets of business logic for short-running and long-running operations. Microsoft has merged the message processing pipelines for these messages (Create with CreateMultiple, and Update with UpdateMultiple), which means Create and Update continue to trigger for your existing scenarios, and when an application switches to CreateMultiple or UpdateMultiple, your existing Create and Update logic still behaves the same.

A few points for consideration:

  1. In my testing, IPluginExecutionContext still provides the execution information, while the Microsoft documentation suggests using IPluginExecutionContext4 for bulk messages in plug-ins; in my environment IPluginExecutionContext4 was still showing as null.
  2. When working with Create, Update, and Delete you would use the Target property to get the input parameter collection; when working with bulk operation messages, you need to use Targets instead of Target.
  3. Instead of checking whether the target is an Entity, you need to check for an EntityCollection, loop through it, and perform your business logic per entity.
  4. As for images in plug-ins, for bulk messages these are retrieved only through IPluginExecutionContext4 (e.g., PreEntityImagesCollection).

Below is a reference image from the Plugin Registration Tool (I have taken UpdateMultiple as the example; you can use any of the bulk operation messages).

Sample:

Below is a sample of what your bulk operation message plug-in can look like… you don’t need to use all the context interfaces; I retrieved them just to check them out.

using System;
using Microsoft.Xrm.Sdk;

namespace Plugin_Sample
{
    public class BulkMessagePlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IPluginExecutionContext2 context2 = (IPluginExecutionContext2)serviceProvider.GetService(typeof(IPluginExecutionContext2));
            IPluginExecutionContext3 context3 = (IPluginExecutionContext3)serviceProvider.GetService(typeof(IPluginExecutionContext3));
            IPluginExecutionContext4 context4 = (IPluginExecutionContext4)serviceProvider.GetService(typeof(IPluginExecutionContext4));
            ITracingService trace = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            // Verify input parameters: bulk messages use "Targets" (an EntityCollection), not "Target"
            if (context4.InputParameters.Contains("Targets") && context4.InputParameters["Targets"] is EntityCollection entityCollection)
            {
                // Verify expected entity images from step registration
                if (context4.PreEntityImagesCollection.Length == entityCollection.Entities.Count)
                {
                    int count = 0;
                    foreach (Entity entity in entityCollection.Entities)
                    {
                        EntityImageCollection entityImages = context4.PreEntityImagesCollection[count];
                        // Verify the expected entity image from step registration
                        if (entityImages.TryGetValue("preimage", out Entity preImage))
                        {
                            bool entityContainsSampleName = entity.Contains("fieldname");
                            bool entityImageContainsSampleName = preImage.Contains("fieldname");
                            if (entityContainsSampleName && entityImageContainsSampleName)
                            {
                                // Verify that the 'fieldname' values are different
                                if (!Equals(entity["fieldname"], preImage["fieldname"]))
                                {
                                    string newName = (string)entity["fieldname"];
                                    string oldName = (string)preImage["fieldname"];
                                    string message = $"\r\n - 'fieldname' changed from '{oldName}' to '{newName}'.";
                                    // If 'sample_description' is included in the update, don't overwrite it, just append to it.
                                    if (entity.Contains("sample_description"))
                                    {
                                        entity["sample_description"] = (string)entity["sample_description"] + message;
                                    }
                                    else // Not included in the update: use the current value plus the addition.
                                    {
                                        entity["sample_description"] = (string)preImage["sample_description"] + message;
                                    }
                                }
                            }
                        }
                        count++; // advance to the image collection for the next entity
                    }
                }
            }
        }
    }
}

I have posted a question to Microsoft to learn more about why IPluginExecutionContext4 is null; I am not sure whether this simply hasn’t been deployed to my region yet (my environment is in India).

Recommendations for Plugins:

  • Don’t register your logic on CreateMultiple, UpdateMultiple, or UpsertMultiple in a separate step alongside the single-message step, as that would cause the logic to fire twice: once for the Create operation and once for CreateMultiple.
  • Don’t use batch request types such as ExecuteMultipleRequest and ExecuteTransactionRequest inside plug-ins, as user experience is degraded and timeout errors can occur.
  • Instead, use bulk operation messages such as CreateMultipleRequest, UpdateMultipleRequest, and UpsertMultipleRequest.
  • There’s no need to use ExecuteTransactionRequest in synchronous plug-ins, as they already execute inside the transaction.

Hope this guidance will help someone trying to customize their Power Platform solutions using Plugins.

I will write another blog post on using Bulk operation messages for Client Applications…

Cheers,

PMDY

Another way to install the Plugin Registration Tool for Power Apps Developers, from NuGet

Hi Folks,

If you’re a Power Platform or Dynamics 365 CE developer, you will definitely need to work with the Plugin Registration Tool at some point, and having it as a local application greatly helps. In this post, I’ll show a slightly different, and very easy, way to install the Plugin Registration Tool.

This approach was especially useful to me when I got a new laptop and needed the Plugin Registration Tool for an implementation where the plug-ins were already built.

The first three ways to download the Plugin Registration Tool are probably known to everyone… but did you know there is a fourth approach as well?

  1. From XrmToolBox
  2. From https://xrm.tools/SDK
  3. Installation from CLI
  4. See below

Because these approaches have limitations, at least in my experience, I found the fourth one very useful:

  1. XrmToolBox – not very convenient for profiling and debugging your plug-ins.
  2. https://xrm.tools/SDK – the DLLs in the downloaded folder will be blocked, and you would need to manually unblock them for the tool to work properly.
  3. CLI (e.g., pac tool prt) – rarely used in practice.

Just note that this approach, while very easy, works only if you already have a plug-in project. Follow the steps below:

  1. Open the plug-in project.
  2. Right-click the solution and choose Manage NuGet Packages for Solution.
  3. Search for the Plugin Registration Tool as below.

4. Choose the plug-in project and click Install; confirm the prompt and accept the license agreement shown.

5. Once installed, go to the project folder on your local machine.

6. Navigate to the packages folder; you should see a folder for the Plugin Registration Tool.

7. There you go: you can open the Plugin Registration Tool application under the tools folder. If the project is linked to source control, you can undo the changes the package install made to the assembly references.

That’s it; how easy was that? Hope this helps someone.

Cheers,

PMDY