Maximizing Your Power Platform Solution’s Reach: Essential Performance Considerations for Optimal Efficiency

Hi Folks,

This blog post covers performance considerations for your Power Platform CE projects and how you can plan to optimize application performance for your Power Apps. Let me take you through them…

Are you tired of spending months building a solution, only to hit performance issues at the end of the project or during UAT? Performance is one of the most important non-functional requirements for a project’s success, and satisfying the performance expectations of your users can be a challenge. Poor performance can hurt user adoption of the system and lead to project failure, so you need to be deliberate about every decision you take while designing your solutions across the stages below.

Let’s talk about them one by one..

1. Network Latency and bandwidth

A main cause of poor performance of Dynamics 365 apps is the latency of the network over which the clients connect to the organization. 

  • Bandwidth is the width or capacity of a specific communications channel.
  • Latency is the time required for a signal to travel from one point on a network to another and is a fixed cost between two points. A single request usually involves many of these signals.

Lower latencies (measured in milliseconds) generally provide better levels of performance. Even if the latency of a network connection is low, bandwidth can become a performance degradation factor if there are many resources sharing the network connection, for example, to download large files or send and receive email.

Dynamics 365 apps are designed to work best over networks that have the following elements: 

  • Bandwidth greater than 50 KBps (400 kbps)
  • Latency under 150 ms

These values are recommendations and don’t guarantee satisfactory performance. The recommended values are based on systems using out-of-the box forms that aren’t customized.

If you significantly customize the out-of-box forms, it is recommended that you test the form response to understand bandwidth needs.

You can use the diagnostics tool to determine the latency and bandwidth:

  1. On your computer or device, start a web browser, and sign in to an organization.
  2. Enter the following URL, https://myorg.crm.dynamics.com/tools/diagnostics/diag.aspx, where myorg.crm.dynamics.com is the URL of your organization.
  3. Click Run.

Also, to mitigate the naturally higher latency of global rollouts, customers should design their applications smartly so Dynamics 365 apps can still be used successfully.

2. Smart design for your application

Form design 

  • Keep the number of fields to a minimum. The more fields you have on a form, the more data must be transferred over the internet or intranet to view each record. Think about the interaction the user will have with the form and the amount of data that must be displayed within it.
  • Avoid including unnecessary JavaScript web resource libraries. The more scripts you add to the form, the longer it takes to download them. Scripts are usually cached in the browser after they are loaded the first time, but the performance of the first form load often creates a lasting impression.
  • Avoid loading all scripts in the OnLoad event. If you have code that only supports OnChange events for fields or the OnSave event, set the script library with the event handler for those events instead of the OnLoad event. This way, loading those libraries can be deferred, improving performance when the form loads.
  • Use tab events to defer loading web resources. Any code that is required to support web resources or IFRAMEs within collapsed tabs can use event handlers for the TabStateChange event, reducing code that would otherwise have to run in the OnLoad event (see the sketch after this list).
  • Set default visibility options. Avoid using form scripts in the OnLoad event that hide form elements. Instead, set form elements that might be hidden to not be visible by default when the form loads, then use scripts in the OnLoad event to show the elements you want to display. If a form element is never made visible, it should be removed from the form rather than hidden.
  • Watch out for synchronous web requests, as they can cause severe performance issues. Consider moving these web requests to asynchronous calls, and prefer Xrm.WebApi over creating XML HTTP Requests (XHR) on your own.
  • Avoid opening a new tab or window; open the window in the main form dialog instead.
  • Keep the number of command bar controls to a minimum. Within the command bar or the ribbon for the form, evaluate which controls are necessary and hide any you don’t need. Every control that is displayed increases the resources that need to be downloaded to the browser.
  • Use asynchronous network requests in custom rules. When using custom rules that make network requests in Unified Interface, use asynchronous rule evaluation.
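To make the deferral concrete, here is a minimal sketch (the tab name, table, and columns are placeholders of my own, not from the original post) of registering a TabStateChange handler in OnLoad so supporting data is only fetched asynchronously when the user expands the tab:

function onFormLoad(executionContext) {
    "use strict";
    var formContext = executionContext.getFormContext();
    // Register a handler instead of loading everything up front in OnLoad
    formContext.ui.tabs.get("tab_related_details").addTabStateChange(function () {
        var tab = formContext.ui.tabs.get("tab_related_details");
        if (tab.getDisplayState() === "expanded") {
            // Fetch supporting data asynchronously only when the tab is actually opened
            Xrm.WebApi.retrieveMultipleRecords("contact", "?$select=fullname&$top=10").then(
                function success(result) {
                    console.log("Loaded " + result.entities.length + " related contacts");
                },
                function (error) {
                    console.log(error.message);
                }
            );
        }
    });
}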

Learn more: Design forms for performance in model-driven apps – Power Apps | Microsoft Learn

Latest version of SDK and APIs 

The latest versions of the SDK, form APIs, and Web API endpoints should be used to support the latest product features, roadmap alignment, and security.

API calls and custom FetchXML call velocity

Only the columns required for the information or action at hand should be included in API calls:

  • Retrieving all columns (*) creates significant overhead on the database engine when distributed across significant user load. Optimization of call velocity is key to avoid “chatty” forms that unnecessarily make repeated calls for the same information in a single interaction.
  • You should avoid retrieving all columns in a query result because of the impact on a subsequent update of records. An update will set all field values, even if they are unchanged, and often triggers cascaded updates to child records. Leverage the most efficient connection mechanism (Web API vs. SDK) and reference this doc site for guidance on the appropriate approach (a minimal sketch of column selection follows this list).
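As a quick illustration (the table and column names below are just examples, not from any specific project), requesting only the columns you need with Xrm.WebApi keeps both the payload and the database work small:

// Select only the columns the feature actually uses, and cap the page size
Xrm.WebApi.retrieveMultipleRecords("account", "?$select=name,telephone1&$top=50").then(
    function success(result) {
        result.entities.forEach(function (account) {
            console.log(account.name + " / " + account.telephone1);
        });
    },
    function (error) {
        console.log(error.message);
    }
);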

Consider periodically reviewing Best practices and guidance when coding for Microsoft Dataverse – Power Apps | Microsoft Learn and ColumnSet.AllColumns Property (Microsoft.Xrm.Sdk.Query) | Microsoft Learn.

Error handling across all code-based events 

You should continue to use the ITracingService.Trace to write to the Plug-in Trace Log table when needed. If your plug-in code uses the ILogger interface and the organization does not have Application Insights integration enabled, nothing will be written. So, it is important to continue to use the ITracingService Trace method in your plug-ins. Plug-in trace logs continue to be an important way to capture data while developing and debugging plug-ins, but they were never intended to provide telemetry data.  

For organizations using Application Insights, you should use ILogger because it will allow for telemetry about what happens within a plug-in to be integrated with the larger scope of data captured with the Application Insights integration. The Application Insights integration will tell you when a plug-in executes, how long it takes to run and whether it makes any external http requests. Learn more about tracing in plugins Logging and tracing (Microsoft Dataverse) – Power Apps | Microsoft Learn.   

Use Solution Checker to analyze solution components 

Best practice is to run Solution Checker against all application code and include it as a mandatory step while you design solutions, or at the latest when you finish developing your custom logic.

Quick Find 

For an optimal search experience for your users, consider the following:

  • All columns you expect to return results in a quick find search need to be included in the view or your results will not load as expected.
  • It is recommended not to use option sets as quick find columns; try using view filtering for these instead.
  • Minimize the number of fields used and avoid using composite fields as searchable columns. E.g., use first and last name as searchable vs full name.
  • Avoid using multiple lines of text fields as search or find columns.
  • Evaluate Dataverse search vs using leading wildcard search

3. Training

This step should be done during user training or during UAT. To ensure optimal performance of Dynamics 365, make sure users properly leverage browser caching. Without caching, users experience cold loads, which perform worse than partially (or fully) warm loads.

 Make sure to train users to: 

  • Use the application’s inline refresh rather than a browser refresh (don’t use F5).
  • Use the application’s inline back button instead of the browser’s back button.
  • Avoid InPrivate/Incognito browser modes, which cause cold loads.
  • Make users aware that running applications that consume a lot of bandwidth (like video streaming) may affect performance.
  • Do not install browser extensions unless they are necessary (this might also be blocked via policy).
  • Do use ‘Record Set’ navigation to move through records quickly without switching from the form back to the list.

4. Testing

For business processes where performance is critical, or processes with complex customizations and very high volumes, it is strongly recommended to plan for performance testing. Consider reviewing the TechTalk series below, which describes important performance considerations and shares practical examples of how to set up and execute performance tests, and how to analyze and mitigate performance issues. Reference: Performance Testing in Microsoft Dynamics 365 TechTalk Series – Microsoft Dynamics Blog

5. Monitoring

You should define a monitoring strategy and consider using any of the tools below, based on what suits you.

  1. Monitor Dynamics 365 connectivity from remote locations continuously using network monitoring tools such as Azure Network Performance Monitor or third-party tools. These tools help identify network-related problems proactively and drastically reduce the troubleshooting time for any potential issue.
  2. Application Insights, a feature of Azure Monitor, is widely used within the enterprise landscape for monitoring and diagnostics. Data that has already been collected from a specific tenant or environment is pushed to your own Application Insights environment. The data is stored in Azure Monitor logs by Application Insights and visualized in the Performance and Failures panels under Investigate on the left pane. The data is exported to your Application Insights environment in the standard schema defined by Application Insights. Support, developer, and admin personas can use this feature to triage and resolve issues. Learn more: Telemetry events for Microsoft Dataverse – Power Platform | Microsoft Learn
  3. Dataverse and Power Apps analytics in the Power Platform admin center. Through the Plug-in dashboard in the Power Platform admin center you can view metrics such as average execution time, failures, most active plug-ins, and more.
  4. Dynamics 365 apps include a basic diagnostic tool that analyzes the client-to-organization connectivity and produces a report.
  5. Monitor is a tool that offers makers the ability to view a stream of events from a user’s session to diagnose and troubleshoot problems. It works for both model-driven apps and canvas apps.

I hope this blog post helped you learn something new… thank you for reading…

Cheers,

PMDY

Opening a Custom page to capture entity details for Case Rejection in Model Driven Apps

Hi Folks,

In this blog post, I will talk about implementing a custom page for your implementations.

In our use case, the customer wants a pop-up dialog to reject cases from a button: when Reject is clicked, a dialog should capture the rejection reason and comments and update them back on the record. For this we implemented a custom page and called it from a ribbon button. If you just want to show an alert, you can implement it very easily in JavaScript with the help of the OOB alert dialog…

Xrm.Navigation.openAlertDialog(alertStrings,alertOptions).then(closeCallback,errorCallback);
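For example, a minimal sketch of that alert dialog (the text, title, and sizes below are placeholders of my own) could look like this:

var alertStrings = {
    confirmButtonLabel: "OK",
    text: "The case has been rejected.",
    title: "Case Rejection"
};
var alertOptions = { height: 120, width: 260 };
Xrm.Navigation.openAlertDialog(alertStrings, alertOptions).then(
    function closeCallback() {
        // Runs when the user closes the alert
    },
    function errorCallback(error) {
        console.log(error.message);
    }
);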

But when the user wants to update entity details, such as an option set field, directly from the pop-up, you should consider the approach we used: a custom page.


All we used is JavaScript, Ribbon Workbench and a custom page… The first step is to design the custom page in https://make.powerapps.com/.

The optionset for Reject Reason is bound to the Reject Reason combo box using the below property.

On app start, we set the parameter with the value supplied from the ribbon on-click function.

On the OnSelect property of the Save button, we can use the below function

Function:

If(
    IsBlank(RejectReasondrp.Selected) Or IsBlank(txtRemarks.Value),
    Set(varmsg, "Fill both the values") && Set(varmsgpopup, true),
    Patch(
        Cases,
        LookUp(Cases, Case = GUID(CaseId.Text)),
        { Comments: txtRemarks.Value, 'Rejection Reason': RejectReasondrp.Selected.Value }
    )
);
If(
    !IsBlank(RejectReasondrp.Selected) && !IsBlank(txtRemarks.Value),
    Set(varShowpopup, true),
    ""
)

Here’s the js code for the button OnClick Event…

//On Click of Reject button
onClickOfRejectRibbonButton: function (executionContext) {
    "use strict";
    // executionContext here is expected to be the form context (e.g., the PrimaryControl parameter passed from the ribbon)
    var formContext = executionContext;
    var recordId = formContext.data.entity.getId();
    recordId = recordId.replace("{", "").replace("}", "");
    // Customer (contact) id from the lookup, with the braces stripped
    var contactId = formContext.getAttribute("customerid").getValue()[0].id;
    contactId = contactId.replace("{", "").replace("}", "");
    // Custom page to open and the record it should act on
    var pageInput = {
        pageType: "custom",
        name: "new_custompage_7e429",
        entityName: "incident",
        recordId: recordId
    };
    // Open the page as a centered dialog
    var navigationOptions = {
        target: 2,
        position: 1,
        height: 400,
        width: 700,
        title: "Case Resolution Confirmation" // Enter Title Of Your Choice
    };
    //Using navigateTo Client API.
    Xrm.Navigation.navigateTo(pageInput, navigationOptions).then(
        function success() {
            // Run code on success
            //formContext.data.refresh();
        },
        function error() {
            // Handle errors
            formContext.data.refresh();
        }
    );
}

Here is the Ribbon Workbench customization that was added…

Finally, publish the customizations and add the custom page to the model-driven app. Don’t forget this step: adding the page to your app is mandatory for the page to be authorized. Otherwise you will see the error below in your browser’s developer tools, and no custom page will open…

That’s it… when Reject is clicked, you should see a page as below…

Upon entering the details as above, you will be shown a confirmation screen as below..

Once you click on Close, the selected details will be updated back in the record.

Hope this helps someone implementing a custom page for a similar requirement…

Cheers,

PMDY

The refresh token has expired due to inactivity when connecting to Power Pages using Power Apps CLI – Quick Fix

Hi Folks,

This post is about a quick fix for an error that occurred with the Power Apps CLI.

I was trying to connect to my organization using CLI and that’s when I encountered this error.

Prerequisites:

Power Apps CLI, Visual Studio Code

After installing the prerequisites, I was trying to list the Power Pages websites available in my organization from the VS Code terminal using the command below.

pac paportal list

That’s when I encountered the error below.

That’s when I understood it was failing due to inactivity…

The Power Platform CLI connection was failing due to an expired refresh token and an ExternalTokenManagement authentication configuration issue. Here’s how you can resolve it:

Fix:

Reauthenticate with Dataverse

pac auth clear
pac auth create --url https://orgXXX.crm8.dynamics.com --username admin@Ecellors.onmicrosoft.com --password [your password]

Creating a new authentication profile resolves this issue…

    Now try to run the above command.

    This should prompt a new login window to authenticate your request; provide the details and you should be able to log in.

    Hope this helps..

    Cheers,

    PMDY

    Connecting to your Dataverse instance to run SQL Queries without using XrmToolBox

    Hi Folks,

    Did you know that you can connect to your Dataverse database right from your old toolbox, SSMS? An Express edition is more than enough to try this out. We possibly didn’t think of it, but yes, we can… so let’s see that in this blog post.

    Open SSMS..

    1. Select Server type as Database Engine.

    2. Set Server name to the environment URL from your Power Platform admin center, as below.

    3. Key in those details as below, making sure to select the authentication method Azure Active Directory – Universal with MFA.

    Once you click on Connect, you will be prompted for authentication via browser.

    Once your sign-in is successful, you will see something like the below.

    That’s it; that’s how simple it is to connect to your Dataverse instances…

    Having said that it’s easy to connect to Dataverse, not all operations available in normal Transact-SQL are supported by Dataverse SQL. You can see it says Read Only beside the instance name, which means you don’t have any capability to modify data from SQL.

    That’s because Dataverse SQL is a subset of Transact-SQL. If you want to see which statements are supported and which are not, go to this link to find out.

    This opens a whole lot of opportunities to explore, so don’t forget to check this out.

    References:

    Dataverse SQL and Transact SQL

    Cheers,

    PMDY

    Cloning feature branch from Azure DevOps repository doesn’t get you the latest changes..?

    Hi Folks,

    This blog post is just an observation from my experience of getting the latest version of code from a remote development feature branch that was cloned from the main branch. I didn’t notice this at first sight, and because of a couple of other issues I overlooked it, spent over half an hour on it, and had to giggle once I figured it out.

    As you may be aware, Azure DevOps and Visual Studio have long been integrated to support seamless code collaboration and version control.

    So in the day-to-day activities of any developer working on the Microsoft technology stack, pulling, pushing, cloning and merging an Azure DevOps repository directly from Visual Studio is quite common.

    Usually, to clone a repository from Azure DevOps, you follow the below steps.

    Step 1: Open Visual Studio (any version, preferably VS 2017 or later).

    Step 2: Click on Clone a repository.

    Step 3: Enter the Azure DevOps repository URL and provide the local path in the prompt.

    Step 4: Select your respective repository and click Sign in.

    Step 5: Once you are done, click Clone; all your source code is now available in your IDE (Visual Studio).

    There might be cases where you check and find that you aren’t getting the latest changes from your feature branch; they are present in the repo but not in your Visual Studio. Closing Visual Studio and redoing the cloning process didn’t help. Then I thought it could be Visual Studio’s cache on my PC, so I tried clearing the cache by following my favorite blog post written earlier on this blog. That didn’t help either. Thanks to my buddy Mallikarjun C, who gave me the clue, and here it goes.

    Whenever you clone a repository using the above approach, you will be checked out to the main branch, not the feature branch you were expecting, because main is set as the default branch.

    If you look below, it wasn’t checked out to Develop; instead it was main. With this approach, you will be checked out to the main branch by default.

    Hence you were seeing the changes of the main branch itself and not the Develop branch.

    Instead, as I learned, I suggest you clone directly into your favorite IDE from Azure DevOps itself in a few clicks.

    Step 1: While you are in your respective branch in Azure DevOps, click on Clone option as highlighted below.

    Step 2: It will then ask you to choose the IDE to which you can download the source code.

    Hope this helps someone figuring this out..

    Cheers,

    PMDY

    Avoiding Parallelism in Dynamics 365 Plugins/Custom Workflows: Unraveling the Pitfalls and Maximizing Efficiency


    Power Platform Requests usage… check it out in a no-code way now in the Admin Center (Preview)

    Hi Folks,

    As a Power Platform admin/consultant… do you often worry about your Power Platform request limits and how much usage is left? Do you receive warning messages from Microsoft that your database usage has been exceeded? Want to see which custom plug-in errors were encountered while using your model-driven app against Dataverse, then consolidate them and forward them to your team to look into without much effort? Then you are in the right place…

    Just log in with your credentials to the admin page https://admin.powerplatform.microsoft.com/.

    Expand Resources on the left… to find the Capacity menu.

    If you just want to know the data usage, you can go ahead and click Download, as shown above, to get a report.

    Want an in-depth analysis? Then click Details, as highlighted in the same snap above.

    This page shows your Database usage/File usage and respective categorizations by table as below..

    These are the reports I was able to extract from my trial environment; however, not all the reports are currently available in my region. This is expected, as the feature is still in preview and not recommended for production projects as of now. Definitely in the future…

    Note:

    1. Many people, including me until now, thought that plug-ins, or at least any operation performed within a model-driven app, would not be counted against the API request limits. But…
    2. CRUD, assign, and share-type requests all count, except internal requests. For classic workflows, this includes actions such as checking conditions, starting child workflows, or stopping workflows.
    3. You should never use third-party tools for integration just because you are facing request limit issues.
    4. Request limits are applied differently for licensed and non-licensed users.
    5. You can add more capacity for any of your products by assigning it to your environment on the manage add-ons page.

    Hope you found this post helpful…

    Cheers,

    PMDY

    Stop using the OData v2.0 endpoint going forward in your implementations…!

    Hi Folks,

    This blog is just to let you know why you should stop implementing OData calls using the v2.0 endpoint. I am pretty sure almost every Dynamics CE project out there has used these OData calls in their implementations for quite some time. While some newer implementations have replaced the logic with the Web API, some people still use OData v2.0 calls to build their functionality in JavaScript.

    Microsoft had actually planned to remove this endpoint on April 30, 2023, but deferred it because many projects weren’t yet prepared for the removal, and to help customers prepare for the transition to the Web API endpoint.

    Identify whether you are still using the OData v2.0 endpoint. The Organization Data Service is an OData v2.0 endpoint that was introduced with Dynamics CRM 2011; it was deprecated way back in Dynamics 365 CE version 8.0.

    So now, how do you identify all the places where you are using the OData v2.0 endpoint in your code? You shouldn’t assume that existing code will work with only minor changes, or that this work can be taken up at a later stage. This is a high-priority warning from Microsoft announcing the removal, so I urge all of you to prepare for it soon so you aren’t caught by surprise.

    So where to change…..?

    Below are the places where you should change your implementation and align with Microsoft (a before/after sketch follows this list)…

    1. JavaScript that calls the Organization Data Service endpoint /XRMServices/2011/OrganizationData.svc; you can find it with the help of the solution checker rule web-avoid-crm2011-service-odata. This can be code making OData calls to perform CRUD operations on the current table or a related table.
    2. Check any other code, including PowerShell scripts, that sends requests to this endpoint: /xrmservices/2011/organizationdata.svc.
    3. Cross-check your Power BI reports or Excel data sources that may be using this endpoint.
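    Below is a rough before/after sketch (the table and column names are placeholders of my own) of what this migration typically looks like in JavaScript: a hand-rolled request URL against the 2011 OData v2.0 endpoint versus the equivalent Xrm.WebApi call.

    // Old style: hand-rolled request URL against the deprecated OData v2.0 endpoint
    var oldUrl = Xrm.Utility.getGlobalContext().getClientUrl() +
        "/XRMServices/2011/OrganizationData.svc/AccountSet?$select=Name&$top=1";

    // New style: let the platform build and authenticate the Web API call
    Xrm.WebApi.retrieveMultipleRecords("account", "?$select=name&$top=1").then(
        function success(result) {
            if (result.entities.length > 0) {
                console.log("First account: " + result.entities[0].name);
            }
        },
        function (error) {
            console.log(error.message);
        }
    );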

    Note:

    This announcement does not cover the deprecated Organization Service SOAP endpoint, i.e., using the Organization Service in plug-ins. At this time, no date has been announced for the removal of that endpoint. At the time of writing this blog post, Microsoft hadn’t announced whether this removal applies only to Online or also to on-premises versions.

    References:

    How to use Application Insights to identify usage of the OrganizationData.svc endpoint?

    OData v2.0 Service removal date announcement

    The Clock is Ticking on Your Endpoint

    Do not use the OData v2.0 endpoint

    Hope this saves time and effort implementing your Dynamics CE Solutions…

    Cheers,

    PMDY

    Xrm.WebApi with Promise for synchronous calls in JavaScript

    Hi Folks,

    Here is how I quickly achieved a synchronous-style retrieve multiple call using the Web API and Promises in JavaScript. I don’t want to make the post too detailed, but I would like to share the approach.

    All I want to do is restrict saving a new contact if the postal code entered is not present in the system. This call should behave synchronously, because the message should be shown immediately if the postal code is not found, and the contact record should be prevented from saving. All you need to do is call the function below on change of the postal code field on the contact.

    Here, in place of XMLHttpRequest, I have used Xrm.WebApi so that it won’t show a critical warning in Solution Checker.

    ValidatePostalCode: function (executionContext) {
        "use strict";
        var formContext = executionContext.getFormContext();
        var postalcode = formContext.getAttribute(Resident.Fields.address1_postalcode).getValue();
        var message = "Please enter a valid Postal code; Refer to Postal Code Mappings";
        var uniqueId = "cnt_postalcodenotpresent";
        return new Promise(function (resolve, reject) {
            Xrm.WebApi.retrieveMultipleRecords("new_postalcodes", "?$select=new_postalcode&$filter=hsg_postalcode eq '" + postalcode + "' ").then(
                function success(result) {
                    var isNotFound = false;
                    if (result !== undefined)
                        isNotFound = result.entities.length === 0 ? true : false;
                    if (isNotFound) {
                        // No mapping found: surface a form notification and keep the save blocked
                        var errorMessage = "Postal Code Mapping is not present for the given postal code";
                        formContext.ui.setFormNotification(errorMessage, "ERROR", uniqueId);
                    }
                    else {
                        // Mapping found: clear the notification and save the record
                        Resident.isValidationNeeded = false;
                        formContext.ui.clearFormNotification(uniqueId);
                        formContext.data.entity.save();
                    }
                    // return true or false
                    resolve(isNotFound);
                },
                function (error) {
                    reject(error.message);
                    //console.log(error.message);
                }
            );
        });
    }
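    As a follow-up, here is one possible way to consume the returned promise from the form’s OnSave handler. This wiring is my own illustration, not part of the original post; it assumes the same Resident object and its isValidationNeeded flag.

    Resident.onSaveOfContact = function (executionContext) {
        "use strict";
        if (Resident.isValidationNeeded) {
            // Cancel this save attempt; ValidatePostalCode saves the record itself once the code is valid
            executionContext.getEventArgs().preventDefault();
            Resident.ValidatePostalCode(executionContext).then(
                function (isNotFound) {
                    // isNotFound === true means the form notification is shown and the save stays blocked
                },
                function (message) {
                    console.log(message);
                }
            );
        }
    };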

    References:

    What is Promise?

    Web API Retrieve Multiple

    Action based on Async Operation

    Cheers,

    PMDY

    Impersonation – Level Up Hidden Feature that every Power Platform CE Developer ought to know

    Hi Folks,

    By this time, almost every Dynamics 365 developer has at some point used Level Up in their Customer Engagement consulting career.

    Everyone out in the Dynamics CRM space knows about the widely popular God Mode available in Level Up, which helps in the day-to-day administration and maintenance of CE applications.

    Today in this blog post, let’s see how we can achieve impersonation using Level Up. We all know about impersonation in Dynamics CRM, typically using plug-ins via the Plugin Registration Tool or directly through code (a small sketch of the code route follows below).
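    As a quick aside, here is a minimal sketch of the “directly through code” route (the user id below is only a placeholder): the Dataverse Web API allows impersonation by sending the target user’s systemuser id in the MSCRMCallerID header.

    // Create an account on behalf of another user from within a model-driven app session
    var impersonatedUserId = "00000000-0000-0000-0000-000000000000"; // placeholder systemuserid
    fetch(Xrm.Utility.getGlobalContext().getClientUrl() + "/api/data/v9.2/accounts", {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
            "MSCRMCallerID": impersonatedUserId
        },
        body: JSON.stringify({ name: "Created on behalf of another user" })
    }).then(function (response) {
        // createdby on the new record will reflect the impersonated user
        console.log("Impersonated create returned status " + response.status);
    });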

    But did you know that, using Level Up, you don’t need to write a single piece of code and can impersonate any user in the system within a few clicks, right from your browser? If this interests you, just follow along…

    Just open the Level Up Chrome extension. This is how your home screen looks once it’s opened… To use this feature, just click Impersonate, as highlighted in the image below. For all this activity, I am logged in to the system as an admin user.

    Next, search for the user you want to impersonate; here I would like to impersonate my own user account. You can search for and impersonate any user in the system.

    Once you click on SEARCH USER

    Up next, click on the Impersonate button, available as below.

    It opens a new window impersonating the user account chosen for the impersonation.

    The screen now shows the user pavan as the logged-in user…

    Now you can perform all the operations that user pavan can… how cool is that? This feature is a perfect fit when you want to test application security roles: you can change a user’s role, assign it, and test it without needing the other team member’s presence. Cool to know, isn’t it?

    References:

    You can install the extension on Edge, Firefox and Chrome.

    Browser store links:
    Chrome: https://chrome.google.com/webstore/detail/level-up-for-dynamics-crm/bjnkkhimoaclnddigpphpgkfgeggokam
    Firefox: https://addons.mozilla.org/en-US/firefox/addon/level-up-for-d365-power-apps/
    Edge: https://microsoftedge.microsoft.com/addons/detail/level-up-for-dynamics-365/mdjlgdkgmhlmcikdmeehcecolehipicf

    GitHub for source code and project documentation

    Cheers to Nataraj Yegnaraman who developed this cool tool…

    Thanks for reading…

    PMDY