Python + Dataverse Series – #06: Data preprocessing steps before running Machine Learning Algorithms

Hi Folks,

If you are a Power Platform consultant who is new to working with Python, I would encourage you to start from the beginning of this series.

We have now reached an interesting part of this series where machine learning algorithms are run to analyze Dataverse data. In this post we will understand why feature scaling is a critical preprocessing step for many machine learning algorithms: it ensures that all features contribute equally to the model’s outcome, prevents numerical instability, and helps optimization algorithms converge faster to the optimal solution.

Before running any machine learning algorithm, we need to do some data preprocessing, such as scaling the data. In this case we will use min–max normalization (feature scaling to the [0, 1] range), which rescales each value as (x − min) / (max − min).

#preprocessing step before running machine learning algorithms
from azure.identity import InteractiveBrowserCredential  # interactive browser login
from PowerPlatform.Dataverse.client import DataverseClient  # Python SDK for Dataverse
import numpy as np  # NumPy for the calculations
import matplotlib.pyplot as plt  # Matplotlib to visualize the result

# Connect to Dataverse
credential = InteractiveBrowserCredential()
client = DataverseClient("https://ecellorsdev.crm8.dynamics.com", credential)  # creates the Dataverse client

# Fetch account data as paged batches (top 10 accounts with the accountid and revenue columns)
account_batches = client.get(
    "account",
    select=["accountid", "revenue"],
    top=10,
)

# Collect the non-empty revenue values
revenues = []
for batch in account_batches:
    for account in batch:
        if "revenue" in account and account["revenue"] is not None:
            revenues.append(account["revenue"])
revenues = np.array(revenues)

# Normalize the revenue to the [0, 1] range using min-max normalization
if len(revenues) > 0:
    min_rev = np.min(revenues)
    max_rev = np.max(revenues)
    # note: this assumes max_rev != min_rev; otherwise the denominator would be zero
    normalized_revenues = (revenues - min_rev) / (max_rev - min_rev)
    print("Normalized Revenues:", normalized_revenues)

    # Visualize the result
    plt.plot(normalized_revenues, marker='o')
    plt.title('Normalized Revenues from Dataverse Accounts')
    plt.xlabel('Account Index')
    plt.ylabel('Normalized Revenue')
    plt.grid()
    plt.show()
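If you prefer a library routine over the hand-written formula, scikit-learn offers the same scaling out of the box. This is a minimal sketch built on the revenues array from the snippet above, assuming scikit-learn is installed (it is not otherwise used in this series):

from sklearn.preprocessing import MinMaxScaler  # pip install scikit-learn

scaler = MinMaxScaler()  # rescales each column to the [0, 1] range
# MinMaxScaler expects a 2-D array, so reshape the 1-D revenues first
normalized = scaler.fit_transform(revenues.reshape(-1, 1)).ravel()
print("Normalized Revenues (scikit-learn):", normalized)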

You can download the Python Notebook below if you want to work with VS Code

https://github.com/pavanmanideep/DataverseSDK_PythonSamples/blob/main/Python-PreProcessingStepBeforeMachineLearning.ipynb

Hope you found this useful…

Cheers,

PMDY

Understanding MIME Types in Power Platform – Quick Review

While many people don’t know the significance of MIME types, this post gives a brief overview of them before we move on to security concepts in Power Platform in my upcoming articles.

In the Microsoft Power Platform, MIME types (Multipurpose Internet Mail Extensions) are standardized labels used to identify the format of data files. They are critical for ensuring that applications like Power Apps, Power Automate, and Power Pages can correctly process, display, or transmit files. 

Core Functions in Power Platform

  • Dataverse Storage: Tables such as ActivityMimeAttachment and Annotation (Notes) use a dedicated MimeType column to store the format of attached files alongside their Base64-encoded content (see the Python sketch after this list).
  • Security & Governance: Administrators can use the Power Platform Admin Center to block specific “dangerous” MIME types (e.g., executables) from being uploaded as attachments to protect the environment.
  • Power Automate Approvals: You can configure approval flows to fail if they contain blocked file types, providing an extra layer of security for email notifications.
  • Power Pages (Web Templates): When creating custom web templates, the MIME type field controls how the server responds to a browser. For example, templates generating JSON must be set to application/json to be parsed correctly.
  • Email Operations: When using connectors like Office 365 Outlook, you must specify the MIME type for attachments (e.g., application/pdf for PDFs) so the recipient’s client can open them properly. 
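To make the Dataverse Storage point above concrete, here is a minimal Python sketch that attaches a PDF to a note (annotation) row through the Dataverse Web API, storing the MIME type alongside the Base64-encoded content. The environment URL is reused from other posts on this blog, the file name is hypothetical, and the requests/azure-identity approach is my own choice rather than anything the platform mandates:

import base64
import requests
from azure.identity import InteractiveBrowserCredential

env_url = "https://ecellorsdev.crm8.dynamics.com"  # environment URL reused from earlier posts
token = InteractiveBrowserCredential().get_token(f"{env_url}/.default").token

with open("contract.pdf", "rb") as f:  # hypothetical local file
    document_body = base64.b64encode(f.read()).decode("ascii")

note = {
    "subject": "Signed contract",
    "filename": "contract.pdf",
    "mimetype": "application/pdf",  # the MimeType column described above
    "documentbody": document_body,  # Base64-encoded file content
}

resp = requests.post(
    f"{env_url}/api/data/v9.2/annotations",
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json=note,
)
resp.raise_for_status()
print("Note created:", resp.headers.get("OData-EntityId"))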

Common MIME Types Used

File Extension    MIME Type
.pdf              application/pdf
.docx             application/vnd.openxmlformats-officedocument.wordprocessingml.document
.xlsx             application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
.png / .jpg       image/png / image/jpeg
.json             application/json
Unknown           application/octet-stream (used for generic binary files)

Implementing MIME type handling and file restrictions ensures your Power Platform solutions are both functional and secure.

1. Programmatically Setting MIME Types in Power Automate 

When working with file content in Power Automate, you often need to define the MIME type within a JSON object so connectors (like Outlook or HTTP) understand how to process the data. 

  • Structure: Use a Compose action to build a file object with the $content-type (MIME type) and $content (Base64 data).
  • Dynamic Mapping: If you don’t know the file type in advance, you can use an expression to map extensions to MIME types or use connectors like Cloudmersive to automatically detect document type information; a small Python sketch of the same idea follows this list.
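Here is a minimal Python sketch of that idea, using the standard-library mimetypes module to guess a MIME type from a file name and then shaping the object the way the Structure bullet above describes. The file name is hypothetical, and the $content-type/$content property names simply follow that bullet:

import base64
import json
import mimetypes

file_name = "quote.docx"  # hypothetical file
with open(file_name, "rb") as f:
    content_b64 = base64.b64encode(f.read()).decode("ascii")

# Guess the MIME type from the extension; fall back to a generic binary type
mime_type, _ = mimetypes.guess_type(file_name)
file_object = {
    "$content-type": mime_type or "application/octet-stream",
    "$content": content_b64,
}
print(json.dumps(file_object)[:120])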

2. Restricting File Types in Power Apps

The Attachment control in Power Apps does not have a built-in “allowed types” property, so you must use Power Fx formulas to validate files after they are added. 

  • Validation on Add: Use the OnAddFile property of the attachment control to check the file extension and notify the user if it’s invalid.
  • Submit Button Logic: For added security, set the DisplayMode of your Submit button to Disabled if any attachment in the list doesn’t match your criteria. 

3. Global Restrictions (Admin Center)

To enforce security across the entire environment, administrators can navigate to the Power Platform Admin Center to manage blocked MIME types. Adding an extension to the blocked file extensions list prevents users from uploading those file types to Dataverse tables like Notes or email attachments. 

Hope this helps…in the next post, I will be talking about Content Security Policy and how Power Platform can be secured using different sets of configuration.

Cheers,

PMDY

Adding intelligence to Dataverse using Dataverse AI functions

Hi Folks,

While intelligence with the use of AI is being embedded into each and every part of the Microsoft ecosystem, it is good to know the features coming in the Power Platform space.

In this blog post, let’s see how we can use Dataverse AI Functions and their advantages, which can greatly ease summarizing, classifying, extracting, translating, assessing sentiment, or drafting a reply for common business scenarios.

To illustrate this better, I used a different AI function for the Canvas App, Model-Driven App, and Power Automate examples; you can follow the same approach for the others as well.

What are Dataverse AI Functions?

Think of Dataverse AI Functions as prebuilt AI functions that add intelligence to your apps and flows without the need to collect data, build, and train models. They can be used in many places such as AI Builder, Power Automate, Power Apps, and low-code plug-ins. The following AI functions are available…

  1. AIReply – Drafts a reply to the message you provide.
  2. AISentiment – Detects the sentiment of the text you provide.
  3. AISummarize – Summarizes the text you provide.
  4. AIClassify – Classifies the text into one or more categories; you can use this from a custom copilot.
  5. AIExtract – Extracts specified entities like names of people, phone numbers, places, etc.
  6. AITranslate – Translates text from another language.

Let’s see their usage in our favorite Canvas Apps first, as illustration is easy there. Later in the post, I will mention how you can call the Dataverse AI Functions from Model-Driven Apps and Power Automate so that you can get the real essence of it…

Utilizing ‘Dataverse AI Functions’ in a Canvas App

Create a new Canvas App and add ‘Environment’ Datasource as shown below.

All the ‘Dataverse AI functions’ can be accessed by ‘Environment‘ as shown below.

Let’s try the AIReply function in Canvas Apps

Add a textbox for storing the prompt or input string and a button control.

On the ‘OnSelect‘ event of the button, use the following formula to store the response in the AIReplyResponse context variable; make sure you name the variables appropriately in your formula as per the naming defined in your canvas app.

UpdateContext({AIReplyResponse:Environment.AIReply({Text:AIInput.Text})})

Now add one more text box to show the response and change its Default value to AIReplyResponse.PreparedResponse.

Try testing the app by providing inputs as below…

You should get a response from AIReply in the response field; you can try out the other functions by providing the necessary parameters they require.

Utilizing ‘Dataverse AI Functions’ in Power Automate

In Power Automate, you call a Dataverse AI Function by calling the corresponding unbound action, as shown below.

Passing the relevant input parameters is enough to get the output from these functions.

Let’s try AISentiment

Click on Test; you should get a response from Power Automate with the sentiment.
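Because this is just an unbound Dataverse action, you can also exercise it outside Power Automate, for example from Python over the Web API. The sketch below assumes the action is exposed as AISentiment with a Text parameter, matching what the designer shows; since the name of the response property can vary, the whole payload is printed for inspection:

import requests
from azure.identity import InteractiveBrowserCredential

env_url = "https://ecellorsdev.crm8.dynamics.com"  # environment URL reused from earlier posts
token = InteractiveBrowserCredential().get_token(f"{env_url}/.default").token

resp = requests.post(
    f"{env_url}/api/data/v9.2/AISentiment",  # unbound action name as shown in the designer (assumption)
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={"Text": "The delivery was late and the support experience was frustrating."},
)
resp.raise_for_status()
print(resp.json())  # inspect the response for the sentiment value returned by the action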

Utilizing ‘Dataverse AI Functions’ in Model-Driven Apps

Do you want to utilize similar capabilities of Dataverse AI Functions inside your custom code, such as plug-ins and actions? Here is how.

Let’s try AIClassify

var request = new OrganizationRequest("AIClassify")
{
    ["AllowMultipleCategories"] = false, // return only the single best-matching category
    ["Categories"] = titles,             // collection of candidate category names
    ["Text"] = classifyText              // the text to classify
};
var result = service.Execute(request);

It is pretty much similar in AI Builder as well…

Please do note that there are tenant-level quotas for using these AI functions; otherwise you might get an error like the one below. I didn’t get any information regarding this from Microsoft, so I am unsure about the details as of writing this post; I will keep this updated once I know more.

Using Dataverse AI functions needs a bit of prompt engineering knowledge; if you are looking to learn more about prompt engineering, check it out here.

References:

https://learn.microsoft.com/en-us/power-platform/power-fx/reference/function-ai

Cheers,

PMDY

Dataverse – Git Integration – Preview – Quick Review

Hi Folks,

This post is about Dataverse and Git integration, one of the most sought-after features in today’s automation era. This is a preview feature; you would need to create a new environment with early access enabled, or use an existing US Preview environment, to test it out.

Every MDA (Model-Driven Application) and its components can be safely moved across environments using solutions with the help of Azure DevOps pipelines. However, when it came to integrating Power Platform solutions with Azure DevOps, we had to manually export and download the solution every single time we wanted to commit the solution artifacts to an Azure DevOps repo.

With this new Preview feature we can directly integrate the Power Platform Solutions to Azure DevOps.

Let’s see this in action…wait a moment, there are some prerequisites to be considered…

  1. The environment should be a Managed Environment, and you need to be an admin for the environment.
  2. An Azure DevOps subscription and license should be available to set this up, along with permission to read source files and commits from a repo (you should be a member of the Contributors group in Azure DevOps).
  3. The email address you use for Azure DevOps and for Power Platform should be the same.

Setup:

Connecting Dataverse with Azure DevOps is easy but requires a bit of understanding of the Binding options available.

Well, there are two types of binding options:

  1. Environment Binding – Single root folder binds to all the unmanaged solutions in the environment
  2. Solution Binding – Each solution binds to a different root folder in Azure DevOps

Note: Once the binding is set up, there isn’t a way to change it, so set this up carefully; otherwise you may need to delete the folder and create a new one in Azure DevOps.

Let’s see them one by one…for demo purposes, I have created two projects in my Azure DevOps instance.

  1. Solution Binding: When we use this, all the components will be available as pending changes
  2. Environment Binding: When we use this, all the unmanaged solution components will be mapped to one Azure DevOps root folder. Let’s set this up.

Solution binding is what we are currently able to use fully, as Environment binding doesn’t show any changes to be committed yet, but there is a catch here.

We can still set up Environment binding and verify whether the solution components get marked as pending changes. Do note that setting up the binding is a one-time activity for the environment; once set up, it can’t be changed from one type to another.

Open https://make.powerapps.com, navigate to Solutions, and click on the ellipsis as shown below.

Once you click on Connect to Git…

Since we are using Environment binding here, let’s select the Connection Type as Environment.

Then click on Connect. Once connected, you should see an alert message at the top of the Power Apps maker portal.

Now create a new solution as below named ecellors Solution

Verify the integration by clicking on Git Integration as below

It should show as below

Now let’s add a few components to the solution we created.

Once added, let’s publish the unmanaged solution and verify it..

Look closely: you should see a Source Control icon, highlighted in yellow here for illustration.

Also, you should see a commit option available at the top

You should now be able to commit the solution components as if you are committing the code changes.

It also specifies the branch to which we are committing…

Unlike a normal code push to Azure DevOps, pushing the changes takes a few minutes, depending on the number of solution components you are pushing. Once it is done, it will show a commit message like the one below…

Now let’s verify our Azure DevOps repo…for this you can go back to the main Solutions page and click on Git Connection at the top.

After clicking on Git Connection, click on the link to Microsoft Azure DevOps as below

Then you should be navigated to Azure DevOps folder as below where all the solution files will be tracked component wise.

Now we will move back to Power Apps maker portal and make some changes to any of the components inside the solution…

Let’s say, I just edited the flow name and created a new connection reference, saved and published the customizations.

If you make some changes at the Azure DevOps repo level, you can come back and click on Check for updates; if there are any conflicts between changes done in Azure DevOps and a component in the solution, they will be shown as conflicts.

We now have 3 component changes, and all of them are listed here…you can click on Commit.

As soon as the changes are committed, you should see a message saying Commit Successful and 0 Changes, 0 Updates, 0 Conflicts.

You have now successfully integrated Dataverse solution components with Azure DevOps, with no manual intervention required when deploying solutions using Azure DevOps pipelines.

Hope you learned something new today…the feature is still in preview and only available for early release, and a couple of issues still need to be fixed by Microsoft.

I have tested this feature by creating an environment in the US Preview region. It will be of good value to projects using automation, and the resulting solution repository can be further deployed to other environments using Azure DevOps pipelines.

This will be rolled out broadly next year.

Cheers,

PMDY

Using Preferred Solution in Power Apps saves you time..Quick Review

Hi Folks,

Today, I will be pointing out the advantages of using a preferred solution and the consequences of using or removing it. While the feature has been out there for quite a few months, many Power Platform projects are still not utilizing it. It can reduce your hassles when many people are working together in a team, and it lets you make sure everyone’s changes go to the same solution.

Here we will understand what a preferred solution means to makers. First, in order to use this effectively, turn on the preview feature to create canvas apps and cloud flows in solutions, as suggested below, from https://admin.powerplatform.com. This is not a mandatory step, but it is better because it allows Power Automate flows and canvas apps to be added to the solution. Click Save.

Next navigate to https://make.powerapps.com –> Solutions –> Set preferred solution

If no preferred solution is set, it will show the Common Data Service Default Solution as the default; if you wish to set another solution, you can select the respective solution from the drop-down.

Enable/Disable the toggle to show Preferred Solution option in the Solutions Page.

Just click on Apply.

Advantages:

  1. Once a preferred solution is set, any components added by makers will by default go to the preferred solution, so makers need not worry about choosing the right solution while creating Power Platform components.
  2. There is no need to worry about components ending up in the default solution, as new components will be added to the preferred solution automatically.

Limitations:

  1. A preferred solution can only be set in the modern designer.
  2. Components created in the classic designer won’t go to the preferred solution.
  3. Some component types are not added to the preferred solution: custom connectors, connections, dataflows, canvas apps created from an image or Figma design, copilots/agents, and gateways.

You can always remove your preferred solution so that other makers can set their own, but do this with caution so that neither your team members’ work nor your own gets impacted.

Hope this saves few seconds of your valuable time…

Cheers,

PMDY

Microsoft.Xrm.RemotePlugin.Grpc.SandboxFabricGrpcClient error from Plugin in Dynamics 365 – Quick Review

Hi Folks,

I had been encountering a strange error for the past few weeks. If you search for this error on the Internet, you find nothing…the detailed error message obtained from the Plugin Trace Log is as below.

System.ServiceModel.FaultException`1[Microsoft.Xrm.Sdk.OrganizationServiceFault]: Exception occured ... at Microsoft.Xrm.RemotePlugin.Grpc.SandboxFabricGrpcClient.ExecutePluginInternal(IRemotePluginRequest pluginRequest, ExecuteRequest executeRequest, Guid executionId, ISandboxFabricDuplexCommunicationHandler communicationHandler, Boolean returnTraceInfo, Guid organizationId, SandboxFabricCallTracker sandboxFabricCallTracker) +0x5d0
at Microsoft.Xrm.RemotePlugin.Grpc.SandboxFabricGrpcClient.ExecutePlugin(IRemotePluginRequest pluginRequest, IPluginExecutionContext executionContext, IPluginTracingService pluginTracingService, ISandboxFabricDuplexCommunicationHandler communicationHandler, ISet`1 earlySerializedPropertiesList, SandboxFabricCallTracker sandboxFabricCallTracker, ISandboxMemoryStreamProvider memoryStreamProvider) +0x2cd
at Microsoft.Xrm.RemotePlugin.Grpc.SandboxFabricCodeUnit.Execute(ILifetimeScope scope, IExecutionContext context, SandboxFabricCallTracker& sandboxFabricCallTracker, ISandboxMemoryStreamProvider memoryStreamProvider) +0x6e
at Castle.Proxies.Invocations.ISandboxFabricCodeUnit_Execute.InvokeMethodOnTarget() +0x13
at Castle.DynamicProxy.AbstractInvocation.Proceed() +0x2d
at Microsoft.Xrm.RemotePlugin.Client.Interceptors.SandboxFabricPluginTraceInterceptor.Intercept(IInvocation invocation, IExecutionContext context, SandboxFabricCallTracker sandboxFabricCallTracker) +0x1f

The error message looked strange to me and I couldn’t get any idea of what was happening. I initially thought there was some problem with the plugin code, that it was executing for more than 2 minutes and hence causing an error related to the Dynamics 365 sandbox service. I was executing this logic, placed inside an action, from Power Automate…it took a couple of hours to figure out what was happening…

With no clue, I started changing the plugin code in the following ways…

  1. Change the synchronous plugin to an asynchronous plugin
    • It no longer shows an error in Power Automate, but the Plugin Trace Log still records the error
  2. Add a try-catch block
    • Adding a try-catch block made me realize that the plugin was throwing an exception due to a type-casting issue in my logic. This worked…

FYI, I have removed some sensitive information from the below error message.

Microsoft recommends using try-catch blocks for efficient error handling in the first place, so always use proper error handling while developing plugins, actions, or custom workflow activities in Dynamics 365 to avoid such errors.

If you face this kind of error, it is an issue within your code and has nothing to do with Microsoft services; there is no need to raise a Microsoft support ticket to resolve it.

Hope this helps someone facing the same issue…

Cheers,

PMDY

Creating In-App Notifications in Model Driven Apps in an easier way – Quick Review

Hi Folks,

In-app notifications are trending these days, with many customers showing interest in implementing them for their businesses.

So, in this blog post, I am going to show you the easiest way to generate an in-app notification using XrmToolBox in a few clicks. Use the tool below to generate one.

So, let me walk you through step by step

Step 1: Open In App Notification Builder in XrmToolBox

Step 2: In-app notification is a setting that is enabled at the app level, meaning that if you have developed a few model-driven apps, you can enable in-app notifications individually for each one of them.

Step 3: In the above snapshot, we should be able to select the respective app for which we want to enable in-app notifications. The red bubble beside it indicates that in-app notifications are not enabled for this app.

So, we need to enable it by clicking on the red icon itself; you should then get a prompt like the one below.

Step 5: Upon confirming the dialog box, in-app notifications will be enabled for that app and the red button turns green as below, indicating that in-app notifications are enabled.

Now that the In App notification is enabled in the App, we will proceed with the remaining setup.

Step 6: You can proceed to give a meaningful title and body for your in-app notification. Also mention the required toast type and specify the expiry duration and icon. Then click on the Add icon and choose the action to be performed when the in-app notification is clicked.

Step 9: You can even choose the type of action to be performed…

For example, let’s open it as a dialog and show a list view.

Your screen should look something like below

Step 10: Once done, click on Create and that’s it, you have now created an in-app notification. Now let’s test this with a user who has privileges to access this app.

If not, you will face this error..

Log in with user account for which the In App Notification is triggered.

Hurray!!!! That’s it, see how easy it was to create an in-app notification in a low-code manner.

You can even get the code behind this as well…

However, there are other ways to trigger in-app notifications from a pro-code angle; let’s discuss those as well.

In this case you first need to manually turn the in-app notification feature on by going to the settings of the model-driven app, as below.

Notifications can be sent using the SendAppNotification message via the SDK.

You can trigger it from client scripting, a plug-in/SDK call, or Power Automate; choose whichever is convenient to trigger a similar notification.

Client Scripting

var systemuserid = '<user-guid>';
// Action payload: opens the account list view in a dialog when the notification action is clicked
var data = {
    "actions": [
        {
            "data": {
                "url": "?pagetype=entitylist&etn=account&viewid=00000000-0000-0000-00aa-000010001002",
                "navigationTarget": "dialog"
            },
            "title": "Link to list of notifications"
        }
    ]
};
var notificationRecord = {
    'title': 'Learning In App Notification',
    'body': 'In-App Notifications in Model-Driven Apps are messages or alerts designed to notify users of important events or actions within the app. These notifications appear directly inside the application, providing a seamless way to deliver information without relying on external methods such as emails.',
    'ownerid@odata.bind': '/systemusers(' + systemuserid + ')',
    'icontype': 100000003, // Warning
    'toasttype': 200000000, // Timed
    'ttlinseconds': 1209600,
    'data': JSON.stringify(data)
};
Xrm.WebApi.createRecord('appnotification', notificationRecord).then(
    function success(result) {
        console.log('notification created with single action: ' + result.id);
    },
    function (error) {
        console.log(error.message);
        // handle error conditions
    }
);

Plugin/SDK

var notification = new Entity("appnotification")
{
    ["title"] = @"Learning In App Notification",
    ["body"] = @"In-App Notifications in Model-Driven Apps are messages or alerts designed to notify users of important events or actions within the app. These notifications appear directly inside the application, providing a seamless way to deliver information without relying on external methods such as emails.",
    ["ownerid"] = new EntityReference("systemuser", new Guid("00000000-0000-0000-0000-000000000000")),
    ["icontype"] = new OptionSetValue(100000003), // Warning
    ["toasttype"] = new OptionSetValue(200000000), // Timed
    ["ttlinseconds"] = 1209600,
    // Optional action payload: opens the account list view in a dialog when clicked
    ["data"] = @"{
        ""actions"": [
            {
                ""data"": {
                    ""url"": ""?pagetype=entitylist&etn=account&viewid=00000000-0000-0000-00aa-000010001002"",
                    ""navigationTarget"": ""dialog""
                },
                ""title"": ""Link to list of notifications""
            }
        ]
    }"
};
service.Create(notification);

Power Automate:

You should design your Power Automate flow something like the one below to trigger a similar notification.

Note: Currently, in-app notifications are triggered only for model-driven apps.
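For completeness, the same appnotification row can also be created from Python over the Dataverse Web API, reusing the column names from the client scripting and plug-in examples above. This is only a sketch: the environment URL is reused from earlier posts, the recipient GUID is a placeholder, and the entity-set name is assumed to be the plural appnotifications:

import json
import requests
from azure.identity import InteractiveBrowserCredential

env_url = "https://ecellorsdev.crm8.dynamics.com"  # environment URL reused from earlier posts
token = InteractiveBrowserCredential().get_token(f"{env_url}/.default").token

notification = {
    "title": "Learning In App Notification",
    "body": "In-app notifications notify users of important events directly inside the model-driven app.",
    "ownerid@odata.bind": "/systemusers(00000000-0000-0000-0000-000000000000)",  # placeholder recipient
    "icontype": 100000003,   # Warning
    "toasttype": 200000000,  # Timed
    "ttlinseconds": 1209600,
    "data": json.dumps({
        "actions": [{
            "data": {
                "url": "?pagetype=entitylist&etn=account&viewid=00000000-0000-0000-00aa-000010001002",
                "navigationTarget": "dialog"
            },
            "title": "Link to list of notifications"
        }]
    }),
}

resp = requests.post(
    f"{env_url}/api/data/v9.2/appnotifications",  # entity-set name assumed from the appnotification table
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json=notification,
)
resp.raise_for_status()
print("Notification created:", resp.headers.get("OData-EntityId"))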

Reference:

In App Notification Documentation

Hope this saves some of your time…

Cheers,

PMDY

Whitepaper on Power Automate Best Practices released

Hi Folks,

Last week, the Microsoft Power CAT team released a white paper on Power Automate best practices, aimed mainly at Power Automate developers who want to scale up their flows in enterprise implementations.

It has been extremely useful and insightful, so I thought of sharing it with everyone.

Please find it attached below.

Hope you find it useful too…

Cheers,

PMDY

Understanding Dataverse search in Dynamics 365 – Quick Review

Hi Folks,

One of my colleagues asked about Dataverse search, hence this article on Dataverse search in Dynamics 365; at the end, I will compare the different search options available in Dynamics 365.

Dataverse Search:

In layman’s terms, Dataverse search is a powerful search tool that helps you find information quickly across your organization’s data in Microsoft Dataverse, the underlying data platform for apps like Power Apps, Dynamics 365, and more. It shows you all the related information from across different tables and records in one place.

In short, Dataverse search is the evolved version of Relevance Search, offering a more robust, faster, and user-friendly search experience, including search results for text in documents stored in Dataverse such as PDF, Microsoft Office documents, HTML, XML, ZIP, EML, plain text, and JSON file formats. It also searches text in notes and attachments. Before enabling it, just note that once Dataverse search is enabled, it affects all your model-driven apps.

It is on by default; here is where you can turn Dataverse search on or off:

1. Navigate to https://admin.powerplatform.com
2. Click on Environments –> Choose your required environment –> Settings –> Features
3. Disable/Enable the Dataverse search feature.

Once enabled, we need to configure the tables for Dataverse search so that indexing is performed at the backend. In order to do this:

1. Navigate to https://make.powerapps.com, select your desired solution –> Click on Overview as shown below.

Now choose Manage search index and select your desired tables and fields. There isn’t a limit on the number of tables you can configure, but there is a limit on the number of fields per environment: a maximum of 1,000 fields is permitted, including both system and custom fields. 50 fields are used by the system, so you can configure up to 950 fields.

Just note that some field types are treated as multiple fields in the Dataverse search index, as indicated in the table below.

Field type                                            Fields used in the Dataverse search index
Lookup (customer, owner, or Lookup type attribute)    3
Option Set (state or status type attribute)           2
All other types of fields                             1

At the bottom of the snapshot above, you can see the percentage of columns indexed in this environment.

When Dataverse search is enabled, the search box is always available at the top of every page in your app. You can start a new search and quickly find the information that you’re looking for.

When Dataverse search is turned on, it becomes your default and only global search experience for all of your model-driven apps. You won’t be able to switch to quick find search, also known as categorized search.

You can also enable quick actions, as shown in the table below.

Table          Quick actions
Account        Assign, Share, Email a link
Contact        Assign, Share, Email a link
Appointment    Mark complete, Cancel, Set Regarding, Assign, Email a link
Task           Mark complete, Cancel, Set Regarding, Assign, Email a link
Phone Call     Mark complete, Cancel, Set Regarding, Assign, Email a link
Email          Cancel, Set Regarding, Email a link

Here is a short comparison of all the types of searches in Dynamics 365…

Enabled by default?
  • Dataverse search: Yes. Note: for non-production environments, an administrator must manually enable it.
  • Quick Find: Yes, for the table grid. No, for multiple-table quick find (categorized search); an administrator must first disable Dataverse search before multiple-table quick find can be enabled.
  • Advanced Find: Yes.

Single-table search scope
  • Dataverse search: Not available in a table grid. You can filter the search results by a table on the results page.
  • Quick Find: Available in a table grid.
  • Advanced Find: Available in a table grid.

Multi-table search scope
  • Dataverse search: There is no maximum limit on the number of tables you can search.
  • Quick Find: Searches up to 10 tables, grouped by table.
  • Advanced Find: Multi-table search not available.

Search behavior
  • Dataverse search: Finds matches to any word in the search term in any column in the table.
  • Quick Find: Finds matches to all words in the search term in one column in a table; however, the words can be matched in any order in the column.
  • Advanced Find: Query builder where you can define search criteria for the selected row type. Can also be used to prepare data for export to Office Excel so that you can analyze, summarize, or aggregate data, or create PivotTables to view your data from different perspectives.

Searchable columns
  • Dataverse search: Text columns like Single Line of Text, Multiple Lines of Text, Lookups, and Option Sets. Doesn’t support searching in columns of Numeric or Date data type.
  • Quick Find: All searchable columns.
  • Advanced Find: All searchable columns.

Search results
  • Dataverse search: Returns the search results in order of their relevance, in a single list.
  • Quick Find: For single-table, returns the search results in a table grid. For multi-table, returns the search results grouped by categories, such as accounts, contacts, or leads.
  • Advanced Find: Returns search results of the selected row type with the columns you have specified, in the sort order you have configured.

Hope you learned something today…if you have any questions, do let me know in the comments…

Cheers,

PMDY

Polymorphic Lookup in Dynamics 365: Streamlining Your CRM with Flexible Relationships

In Dynamics 365, a Polymorphic Lookup is a powerful feature that allows you to associate a single lookup field with multiple different entities. This feature is particularly useful when you want a field to reference multiple related entities, providing greater flexibility and efficiency in your CRM applications.

What is a Polymorphic Lookup?

A Polymorphic Lookup is a special type of lookup field that can refer to multiple entities rather than just one. For example, a single “Related Entity” field can refer to either a Contact, Account, or Opportunity, making it versatile for various business scenarios. This capability is referred to as “polymorphism” because the lookup field can resolve to different types of entities at runtime.

Example Scenario:

Consider a sales scenario where a “Related Entity” can be a Customer, but the customer could be either an Account or a Contact. Rather than having two separate lookup fields (one for Account and another for Contact), you can create a polymorphic lookup field, which makes your user interface simpler and more streamlined.

How Does Polymorphic Lookup Work in Dynamics 365?

In Dynamics 365, polymorphic lookup fields are implemented as part of the Relationship between entities. The key concept here is the EntityReference, which dynamically resolves to the appropriate entity type (e.g., Account, Contact, etc.) based on the actual value selected by the user.

1. Field Definition:
  • When defining a lookup field, you define a Relationship where the field can refer to multiple target entities.
  • The system uses the Type and Id to determine the related entity.
2. Lookup Resolution:
  • At runtime, when a user selects a value in the polymorphic lookup field, the system dynamically resolves which type of entity to link to.
  • The field displays the appropriate name (e.g., Account or Contact) based on the entity that the user selects.

Creating Polymorphic Lookups in Dynamics 365

Polymorphic lookup fields are typically used in the following types of scenarios:

• Custom Relationships: When you need to create a lookup that can reference multiple different entities.
• Shared Relationship: For cases where one relationship applies to more than one entity, such as a lookup that could refer to either a Contact or an Account.

Steps to Create a Polymorphic Lookup Field:
1. Navigate to the Customization Area:
  • Go to the Settings area in Dynamics 365 and select Customizations.
  • Select Customize the System to open the solution where you want to add the polymorphic lookup field.
2. Create a New Field:
  • In the relevant entity, click on Fields, and then select New.
  • Choose the Lookup data type for the field.
3. Define the Polymorphic Lookup:
  • Under the Related Entity section, select Custom to define the multiple entities this lookup should support.
  • Select the Entity Relationships where this lookup should point to multiple entities.
4. Save and Publish:
  • Save the field and publish your customizations to apply the changes.

Example: Setting Up Polymorphic Lookup for Customer

Suppose you’re designing a custom Case entity and you want to add a lookup for the Customer. Instead of creating separate lookups for Contact and Account, you can create a polymorphic lookup that links to either an Account or Contact as the Customer.

Steps:
• Create a Customer Lookup field in the Case entity.
• Define the Customer Lookup field to support both Account and Contact entities.
• After publishing the field, the user will see the lookup field and will be able to choose either an Account or Contact as the Customer.
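As a quick pro-code aside, once such a customer-style lookup exists, this is roughly how it is populated through the Dataverse Web API from Python. The customerid_account / customerid_contact navigation-property names follow the standard pattern for the customer lookup on the case table, the environment URL is reused from other posts on this blog, and the GUID is purely illustrative:

import requests
from azure.identity import InteractiveBrowserCredential

env_url = "https://ecellorsdev.crm8.dynamics.com"  # environment URL reused from earlier posts
token = InteractiveBrowserCredential().get_token(f"{env_url}/.default").token

case = {
    "title": "Billing question",
    # Bind the polymorphic Customer lookup to an account row;
    # use "customerid_contact@odata.bind": "/contacts(<guid>)" to point it at a contact instead.
    "customerid_account@odata.bind": "/accounts(00000000-0000-0000-0000-000000000001)",  # illustrative GUID
}

resp = requests.post(
    f"{env_url}/api/data/v9.2/incidents",
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json=case,
)
resp.raise_for_status()
print("Case created:", resp.headers.get("OData-EntityId"))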

Use Cases for Polymorphic Lookup

1. Consolidating Related Data:
  • Polymorphic lookups help streamline user experience by consolidating multiple lookups into a single field, especially when dealing with common relationships across different entities.
2. Reducing Redundancy:
  • Rather than having separate lookup fields for Account and Contact in every related form, you can reduce redundancy by using polymorphic lookups, which allows referencing both entities in one field.
3. Improved Reporting and Analytics:
  • When data is related across multiple entities, using a polymorphic lookup can make it easier to pull reports and perform analysis without requiring multiple joins or complex queries.

Considerations and Limitations

While polymorphic lookups are powerful, they come with certain limitations:

• Limited to Certain Fields: Polymorphic lookups are supported only in certain system fields (like Regarding in activities), but may not be available for every custom scenario.
• API Handling: When working with the Dynamics 365 Web API, the polymorphic lookup is handled through special attributes that require careful parsing to identify the correct entity type (see the sketch after this list).
• UI Considerations: Although polymorphic lookups streamline the user interface, they can also confuse users who are unfamiliar with the concept. It’s important to have clear documentation and training for users on how to use these fields.
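To make the API Handling point concrete, here is a minimal Python sketch that reads a polymorphic lookup through the Web API and uses the OData annotations to identify which table the value resolved to. The environment URL is reused from earlier posts, and the incident/customerid names are illustrative:

import requests
from azure.identity import InteractiveBrowserCredential

env_url = "https://ecellorsdev.crm8.dynamics.com"  # environment URL reused from earlier posts
token = InteractiveBrowserCredential().get_token(f"{env_url}/.default").token

resp = requests.get(
    f"{env_url}/api/data/v9.2/incidents?$select=title,_customerid_value&$top=1",
    headers={
        "Authorization": f"Bearer {token}",
        # ask Dataverse to include lookup annotations in the response
        "Prefer": 'odata.include-annotations="*"',
    },
)
resp.raise_for_status()
records = resp.json()["value"]
if records:
    record = records[0]
    print(record["_customerid_value"])  # GUID of the related row
    # The annotation tells you whether the lookup points to an account or a contact:
    print(record["_customerid_value@Microsoft.Dynamics.CRM.lookuplogicalname"])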

Conclusion

Polymorphic lookups in Dynamics 365 provide an elegant solution for scenarios where a lookup field needs to refer to multiple entity types. By understanding and using polymorphic lookups effectively, you can streamline your CRM solutions, reduce redundancy, and improve your application’s flexibility. It’s important to consider the limitations and ensure that users are properly guided in utilizing these fields within your system.

You can easily create a polymorphic lookup from XrmToolBox as well…

https://pascalcase.com/Home/Blog/understanding-and-using-polymorphic-lookups-in-dynamics-365-with-xrmtoolbox

Hope this helps.

Cheers,

PMDY