Opening a Custom page to capture entity details for Case Rejection in Model Driven Apps

Hi Folks,

In this blog post, I will walk through implementing a custom page and opening it from a ribbon button.

In our use case, the customer wanted a pop-up dialog for rejecting cases from a button: when Reject is clicked, a dialog should capture the rejection reason and comments and update them back on the record. To achieve this, we implemented a custom page and called it from a ribbon button. If you only need to show an alert, you can do that very easily in JavaScript with the out-of-the-box alert dialog:

Xrm.Navigation.openAlertDialog(alertStrings,alertOptions).then(closeCallback,errorCallback);
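
For reference, here is a minimal sketch of that call; the labels, text, and sizes below are illustrative placeholders, not values from the actual implementation:

// Alert dialog content: a single OK-style button plus message text
var alertStrings = {
    confirmButtonLabel: "OK",
    text: "The case has been rejected.",
    title: "Case Rejection"
};
// Optional sizing for the dialog
var alertOptions = { height: 140, width: 300 };
Xrm.Navigation.openAlertDialog(alertStrings, alertOptions).then(
    function () {
        // Runs after the user closes the dialog
        console.log("Alert dialog closed");
    },
    function (error) {
        console.log(error.message);
    }
);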

But when the user needs to update entity details, such as an option set field, directly from the pop-up, you should consider the custom page approach we used here.


All we used is JavaScript, Ribbon Workbench and a custom page. The first step is to design the custom page in https://make.powerapps.com/.

The option set for Reject Reason is bound to the Reject Reason combo box using the property below.

On the app's OnStart, we set a variable from the value supplied by the ribbon on-click function (the record ID passed through navigateTo is typically read with Param("recordId")).

On the OnSelect property of the Save button, we can use the below function

Function:

// Validate the inputs, then update the case record
If(
    IsBlank(RejectReasondrp.Selected) Or IsBlank(txtRemarks.Value),
    Set(varmsg, "Fill both the values") && Set(varmsgpopup, true),
    Patch(
        Cases,
        LookUp(Cases, Case = GUID(CaseId.Text)),
        { Comments: txtRemarks.Value, 'Rejection Reason': RejectReasondrp.Selected.Value }
    )
);
// Show the confirmation screen only when both values were supplied
If(!IsBlank(RejectReasondrp.Selected) && !IsBlank(txtRemarks.Value), Set(varShowpopup, true), "");

Here's the JavaScript code for the button's OnClick event:

//On click of the Reject button. The ribbon command is expected to pass PrimaryControl,
//so the parameter received here is already the form context.
onClickOfRejectRibbonButton: function (executionContext) {
    "use strict";
    var formContext = executionContext;

    // Strip the braces from the case record GUID
    var recordId = formContext.data.entity.getId();
    recordId = recordId.replace("{", "").replace("}", "");

    // Strip the braces from the customer (contact) GUID
    var contactId = formContext.getAttribute("customerid").getValue()[0].id;
    contactId = contactId.replace("{", "").replace("}", "");

    var pageInput = {
        pageType: "custom",
        name: "new_custompage_7e429",
        entityName: "incident",
        recordId: recordId
    };

    var navigationOptions = {
        target: 2,   // open as a dialog
        position: 1, // centered
        height: 400,
        width: 700,
        title: "Case Resolution Confirmation" // enter a title of your choice
    };

    //Using the navigateTo Client API
    Xrm.Navigation.navigateTo(pageInput, navigationOptions).then(
        function success() {
            // Run code on success
            //formContext.data.refresh();
        },
        function error() {
            // Handle errors
            formContext.data.refresh();
        }
    );
}

Here is the Ribbon Workbench customization we added: a Reject button whose command calls the function above (passing PrimaryControl as the CRM parameter so the form context is available).

Finally, publish the customizations and add the custom page to the model-driven app. Don't skip this step: adding the page to your app is mandatory for the page to be authorized. Otherwise no custom page opens, and you will see the error below in your browser's developer tools.

That's it. When Reject is clicked, you should see a page like the one below.

Upon entering the details as above, you will be shown a confirmation screen as below.

Once you click Close, the selected details are updated back on the record.

Hope this helps someone implementing a custom page for a similar requirement.

Cheers,

PMDY

Stop using the OData v2.0 endpoint in your implementations going forward!

Hi Folks,

This blog post explains why you should stop implementing OData calls against the v2.0 endpoint. I am pretty sure almost every Dynamics CE project out there has used these OData calls at some point. While many newer implementations have replaced that logic with the Web API, some people still build their JavaScript functionality using OData v2.0 calls.

Microsoft had originally planned to remove this endpoint on April 30, 2023, but deferred the removal because many projects weren't yet prepared for it, giving customers more time to transition to the Web API endpoint.

First, identify whether you are still using the OData v2.0 endpoint. The Organization Data Service is an OData v2.0 endpoint that was introduced with Dynamics CRM 2011 and has been deprecated since Dynamics 365 CE version 8.0.

So how do you identify all the places where your code uses this OData endpoint? Don't assume the existing code will keep working with only minor changes, or that this work can be taken up at a later stage. Microsoft issued a high-priority warning about the removal, so I urge you to prepare for it soon and not be caught by surprise.

So, where should you change?

Below are the places where you should change your way of implementation and align with Microsoft…

  1. The Organization Data Service in JavaScript, which uses the endpoint /XRMServices/2011/OrganizationData.svc. You can identify this usage with the help of the solution checker rule web-avoid-crm2011-service-odata. This is typically code making OData calls to perform CRUD operations on the current table or a related table (see the Web API sketch after this list).
  2. Any other code, including PowerShell scripts, that sends requests to this endpoint: /xrmservices/2011/organizationdata.svc.
  3. Power BI reports or Excel data sources that may be using this endpoint.
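
As an illustration, here is a hedged before-and-after sketch of moving a simple read from the 2011 endpoint to the Web API via Xrm.WebApi; the table, GUID, and column names are placeholders:

// Old pattern: hand-rolled request against the 2011 OData (v2.0) endpoint
var clientUrl = Xrm.Utility.getGlobalContext().getClientUrl();
var req = new XMLHttpRequest();
req.open("GET", clientUrl + "/XRMServices/2011/OrganizationData.svc/AccountSet(guid'00000000-0000-0000-0000-000000000000')?$select=Name", true);
req.setRequestHeader("Accept", "application/json");
req.onload = function () { /* parse this.responseText */ };
req.send();

// New pattern: the Web API through the Xrm.WebApi client API
Xrm.WebApi.retrieveRecord("account", "00000000-0000-0000-0000-000000000000", "?$select=name").then(
    function (result) {
        console.log(result.name);
    },
    function (error) {
        console.log(error.message);
    }
);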

Note:

This announcement does not involve the deprecated Organization Service SOAP endpoint, that is, using the Organization Service in plug-ins; at this time no removal date has been announced for that endpoint. At the time of writing, Microsoft hasn't announced whether this removal applies only to Online or also to on-premises versions.

References:

How to use Application Insights to identify usage of the OrganizationData.svc endpoint?

OData v2.0 Service removal date announcement

The Clock is Ticking on Your Endpoint

Do not use the OData v2.0 endpoint

Hope this saves time and effort implementing your Dynamics CE Solutions…

Cheers,

PMDY

Maximizing Your Power Platform Solution’s Reach: Essential Performance Considerations for Optimal Efficiency

Hi Folks,

This blog post is all about performance considerations for your Power Platform CE Projects and how you can plan to optimize application performance for your Power Apps. So I just want to take you through them…

Are you tired of building solutions over long durations, only to face performance issues at the end of the project or during UAT? Performance is one of the most important non-functional requirements for a project's success, and satisfying your users' performance expectations can be a challenge. Poor performance may cause failures in user adoption of the system and lead to project failure, so be careful with every decision you take while designing your solution in the stages below.

Let’s talk about them one by one..

1. Network Latency and bandwidth

A main cause of poor performance of Dynamics 365 apps is the latency of the network over which the clients connect to the organization. 

  • Bandwidth is the width or capacity of a specific communications channel.
  • Latency is the time required for a signal to travel from one point on a network to another and is a fixed cost between two points. A single request usually involves many such round trips.

Lower latencies (measured in milliseconds) generally provide better levels of performance. Even if the latency of a network connection is low, bandwidth can become a performance degradation factor if there are many resources sharing the network connection, for example, to download large files or send and receive email.

Dynamics 365 apps are designed to work best over networks that have the following elements: 

  • Bandwidth greater than 50 KBps (400 kbps)
  • Latency under 150 ms

These values are recommendations and don't guarantee satisfactory performance. The recommended values are based on systems using out-of-the-box forms that aren't customized.

If you significantly customize the out-of-the-box forms, it is recommended that you test the form response to understand bandwidth needs.

You can use the diagnostics tool to determine the latency and bandwidth:

  1. On your computer or device, start a web browser, and sign in to an organization.
  2. Enter the following URL, https://myorg.crm.dynamics.com/tools/diagnostics/diag.aspx, where myorg.crm.dynamics.com is the URL of your organization.
  3. Click Run.

Also, to mitigate the naturally higher latency of global rollouts, customers should apply smart design to their applications so they can use Dynamics 365 apps successfully.

2. Smart Design for your application

Form design 

  • Keep the number of fields to a minimum. The more fields you have on a form, the more data must be transferred over the internet or intranet to view each record. Think about the interaction the user will have with the form and the amount of data that must be displayed within it.
  • Avoid including unnecessary JavaScript web resource libraries. The more scripts you add to the form, the longer they take to download. Scripts are usually cached in the browser after they load the first time, but the performance of the first form view often creates a significant impression.
  • Avoid loading all scripts in the OnLoad event. If you have code that only supports OnChange events for fields or the OnSave event, set the script library on the event handler for those events instead of the OnLoad event. Loading those libraries can then be deferred, improving performance when the form loads.
  • Use tab events to defer loading web resources. Any code required to support web resources or IFRAMEs within collapsed tabs can use handlers for the TabStateChange event, reducing code that would otherwise have to run in the OnLoad event (see the sketch after this list).
  • Set default visibility options. Avoid form scripts in the OnLoad event that hide form elements; instead, set form elements that might be hidden to not be visible by default when the form loads, and use OnLoad scripts only to show the elements you want to display. If form elements are never made visible, remove them from the form rather than hiding them.
  • Watch out for synchronous web requests, as they can cause severe performance issues; consider moving these web requests to asynchronous ones. Also, choose the Web API over creating XML HTTP requests (XHR) on your own.
  • Avoid opening a new tab or window; open the window in the main form dialog instead.
  • For the command bar, keep the number of controls to a minimum. Within the command bar or the ribbon for the form, evaluate which controls are necessary and hide any you don't need; every displayed control increases the resources that must be downloaded to the browser. When custom rules make network requests in Unified Interface, use asynchronous rule evaluation.
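
To make the tab-event and asynchronous-request points concrete, here is a hedged sketch (the tab, table, and column names are placeholders). It defers a Web API read until a collapsed tab is expanded and uses the asynchronous Xrm.WebApi instead of a hand-rolled synchronous XHR:

// Form OnLoad: only register the handler; don't fetch anything yet
function onFormLoad(executionContext) {
    var formContext = executionContext.getFormContext();
    var detailsTab = formContext.ui.tabs.get("tab_details"); // placeholder tab name
    if (detailsTab) {
        detailsTab.addTabStateChange(onDetailsTabStateChange);
    }
}

// Runs only when the tab's display state changes
function onDetailsTabStateChange(executionContext) {
    var formContext = executionContext.getFormContext();
    var tab = formContext.ui.tabs.get("tab_details");
    if (tab.getDisplayState() !== "expanded") {
        return;
    }
    // Asynchronous Web API call, requesting only the columns we need
    var recordId = formContext.data.entity.getId().replace(/[{}]/g, "");
    Xrm.WebApi.retrieveRecord("account", recordId, "?$select=name,telephone1").then(
        function (result) {
            console.log(result.name);
        },
        function (error) {
            console.log(error.message);
        }
    );
}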

Learn more: Design forms for performance in model-driven apps – Power Apps | Microsoft Learn

Latest version of SDK and APIs 

The latest versions of the SDK, form APIs and Web API endpoints should be used to benefit from the latest product features, roadmap alignment and security.
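
For example, in client scripting, prefer the executionContext/formContext pattern over the long-deprecated Xrm.Page object. A minimal sketch (the attribute name is a placeholder):

// Deprecated pattern (Xrm.Page), avoid in new code:
// var name = Xrm.Page.getAttribute("name").getValue();

// Current pattern: the handler receives the execution context and derives the form context
function onNameChange(executionContext) {
    var formContext = executionContext.getFormContext();
    var name = formContext.getAttribute("name").getValue();
    console.log(name);
}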

APIs calls and custom FetchXML call velocity 

Only the columns required for information or action should be included in API calls.

  • Retrieving all columns (*) creates significant overhead on the database engine when multiplied across significant user load. Optimizing call velocity is key to avoiding "chatty" forms that unnecessarily make repeated calls for the same information in a single interaction (see the sketch after this list).
  • You should also avoid retrieving all columns in a query because of the impact on a subsequent update of records: the update would set all field values, even unchanged ones, and often triggers cascaded updates to child records. Leverage the most efficient connection mechanism (Web API vs. SDK) and reference this doc site for guidance on the appropriate approach.
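
As an illustration, here is a hedged sketch of a "narrow" query that requests only the columns it needs; the table, columns, and filter are placeholders:

// Request only the columns the UI or logic actually uses; never select *
Xrm.WebApi.retrieveMultipleRecords(
    "account",
    "?$select=name,telephone1&$filter=statecode eq 0&$top=50"
).then(
    function (result) {
        result.entities.forEach(function (row) {
            console.log(row.name);
        });
    },
    function (error) {
        console.log(error.message);
    }
);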

Consider periodically reviewing Best practices and guidance when coding for Microsoft Dataverse – Power Apps | Microsoft Learn and ColumnSet.AllColumns Property (Microsoft.Xrm.Sdk.Query) | Microsoft Learn.

Error handling across all code-based events 

You should continue to use the ITracingService.Trace to write to the Plug-in Trace Log table when needed. If your plug-in code uses the ILogger interface and the organization does not have Application Insights integration enabled, nothing will be written. So, it is important to continue to use the ITracingService Trace method in your plug-ins. Plug-in trace logs continue to be an important way to capture data while developing and debugging plug-ins, but they were never intended to provide telemetry data.  

For organizations using Application Insights, you should use ILogger because it will allow for telemetry about what happens within a plug-in to be integrated with the larger scope of data captured with the Application Insights integration. The Application Insights integration will tell you when a plug-in executes, how long it takes to run and whether it makes any external http requests. Learn more about tracing in plugins Logging and tracing (Microsoft Dataverse) – Power Apps | Microsoft Learn.   

Use Solution Checker to analyze solution components 

A best practice is to run the solution checker on all application code and make it a mandatory step while designing solutions, or at least run it whenever you complete developing your custom logic.

Quick Find 

For an optimal search experience for your users, consider the following:

  • All columns you expect to return results in a quick find search must be included in the view, or your results will not load as expected.
  • It is recommended not to use option sets as quick find columns; use view filtering for these instead.
  • Minimize the number of fields used and avoid using composite fields as searchable columns, e.g., use first name and last name as searchable rather than full name.
  • Avoid using multiple-lines-of-text fields as search or find columns.
  • Evaluate Dataverse search versus using a leading wildcard search.

3. Training

This step should be done during user training or during UAT. To ensure optimal performance of Dynamics 365, ensure that users are properly leveraging browser caching. Without caching, users can experience cold loads which have lower performance than partially (or fully) warm loads.

 Make sure to train users to: 

  • Use the application's inline refresh instead of a browser refresh (don't use F5).
  • Use the application's inline back button instead of the browser's back button.
  • Avoid InPrivate/Incognito browser modes, which cause cold loads.
  • Make users aware that running applications that consume a lot of bandwidth (like video streaming) may affect performance.
  • Do not install browser extensions unless they are necessary (this might also be blocked via policy).
  • Do use 'Record Set' navigation to move through records quickly without switching from the form back to the list.

4. Testing

For business processes where performance is critical, or processes with complex customizations and very high volumes, it is strongly recommended to plan for performance testing. Consider reviewing the technical talk series below, which describes important performance considerations and shares practical examples of how to set up and execute performance testing and how to analyze and mitigate performance issues. Reference: Performance Testing in Microsoft Dynamics 365 TechTalk Series – Microsoft Dynamics Blog

5. Monitoring

You should define a monitoring strategy; consider using any of the tools below, whichever suits you best.

  1. Monitor Dynamics 365 connectivity from remote locations continuously using network monitoring tools such as Azure Network Performance Monitor or third-party tools. These tools help identify network-related problems proactively and drastically reduce the troubleshooting time for any potential issue.
  2. Application Insights, a feature of Azure Monitor, is widely used within the enterprise landscape for monitoring and diagnostics. Data already collected from a specific tenant or environment is pushed to your own Application Insights environment, stored in Azure Monitor logs, and visualized in the Performance and Failures panels under Investigate on the left pane. The data is exported in the standard schema defined by Application Insights. Support, developer, and admin personas can use this feature to triage and resolve issues. Learn more: Telemetry events for Microsoft Dataverse – Power Platform | Microsoft Learn.
  3. Dataverse and Power Apps analytics in the Power Platform admin center. Through the plug-in dashboard in the admin center you can view metrics such as average execution time, failures, most active plug-ins, and more.
  4. Dynamics 365 apps include a basic diagnostics tool that analyzes client-to-organization connectivity and produces a report.
  5. Monitor is a tool that lets makers view a stream of events from a user's session to diagnose and troubleshoot problems. It works for both model-driven apps and canvas apps.

I hope this blog post helped you learn something new. Thank you for reading.

Cheers,

PMDY

Power Platform Pipelines to Deploy Managed Solutions

Hi Folks,

As you all know, Application Lifecycle Management (ALM) is very important for a project to succeed in this automation era. The faster your iteration speed for deploying your solution to production, the healthier your project is, and the happier your stakeholders are. This kind of automation is usually made possible with CI/CD Azure Pipelines, and CI/CD against a Git repo in Azure DevOps has been popular for quite some time. You might have heard that Microsoft released this kind of CI/CD capability for Power Platform in preview last year, and it is now Generally Available (GA).

But if you haven't had hands-on experience yet and are new to bringing this into your project, then this blog post is for you. You don't need to know Azure DevOps, or even be technical, to use this new capability; administrators can use it too. All you need to do is follow along. You can download the presentation I delivered at the Singapore User Group here.

First of all, let’s see what are the prerequisites to create a Power Platform Pipeline.

  • Four environments are recommended, but you can use as few as three environments to create a pipeline.
  • A Dataverse database is required in every environment used with pipelines, including the host environment.
  • Dataverse for Teams environments aren’t supported for use with pipelines.
  • You must have a Power Platform administrator or system administrator role.
  • Pipelines are a feature of Managed Environments. As such, the development and target environments used in a pipeline must be enabled as a managed environment. Standalone licenses won’t be required when you use developer or trial environments with pipelines. More information: Managed Environments.
  • If you want to share these pipelines, grant access via security roles in the host environment. The Deployment Pipeline User security role grants access to run one or more pipelines.

Now let’s see how you can set up Power Pipelines…

You need to identify which of your environments you want to configure the pipeline for. For a healthy pipeline you need at least three environments, i.e. Dev, Test and Prod; there is no upper cap.

So for configuring our pipeline we need one more environment in addition to the above: the host environment, which stores all of the pipeline's configuration.

So below are our environments we will be using in this tutorial…

Host Environment (Required)

Development Environment (Required)

QA Environment (Optional)

Production Environment (Required)

You can navigate to https://admin.powerplatform.microsoft.com/ to create an environment with Dataverse database or follow this.

The first thing you have to do is set up the host environment, which holds your pipeline's configuration. The configuration is quite easy and intuitive to follow. Make sure you choose the same region for all your environments and select the type as Production for all environments except the developer environment. The host environment also needs a Dataverse database, since the pipelines configuration app is installed there, but it is otherwise used only to store configuration.

In the same way, I have already created the host environment, and I also have a few trial environments: one will serve as my test environment and one as my developer environment. We could configure another environment as well, but for brevity I am leaving one unconfigured. Make sure to check Create a database for this environment when creating each of the environments.

For setting up the pipeline, you need to have the environment IDs of the environments you would like to configure at hand. Follow this link if you don't know how to get them.

Copy and paste them in a notepad for your quick reference during configuration.

Open the host environment from the admin portal, click on Resources, and then click on Dynamics 365 apps.

In the next window, click on Install app, choose Power Platform Pipelines from the list of apps, click Next, and install the app after agreeing to the terms of service.

Now open the host environment from https://make.powerapps.com to configure the pipeline; you should see an app called Deployment Pipeline Configuration, as below.

Before moving further let’s understand the table structure used for the pipeline.

Open this app as below and configure the environments you want to use in your pipeline. Set up the development environment from which you would like to deploy changes to your target. Make sure you select Development Environment as the environment type and provide the respective environment ID you copied to your notepad earlier.

Similarly, configure the other target environments, this time selecting Target Environment as the environment type. Once system validation is done, the validation status is shown as successful and the environment setup will look as below.

Now create a pipeline for the configured environments; once it's saved, link the development environment you configured above.

Next, configure the deployment stages for the target environments you would like to deploy your solutions to: from the quick create form, give each stage a name, choose the Previous Deployment Stage lookup, and select the target environment to deploy to from the lookup.

Once setup, your power pipeline should look something as below.

Note: while configuring the first stage, which deploys to the Test environment, you need to leave the Previous Deployment Stage empty.

Now go back to your development environment and include any Power Platform component you would like to deploy to the next environment; for simplicity, I added one canvas app to the solution, as below.

With the canvas app component added, as soon as you select it (or even before), you should see a pipelines icon, as highlighted below.

Note: this icon appears only if you have set everything up correctly. If you still don't see it when you try to deploy a component from Dev, re-check the configuration you set up in the host environment; either you supplied the wrong environment ID or something was configured incorrectly.

Once the icon is shown, you are good to go. When you click the pipelines icon, it takes you to a new screen showing your pipeline, as below.

All you need to do here is click Deploy, wait a couple of minutes for the solution to be prepared for the next environment (Test), and you should see the screen below. Then click Deploy.

Once the deployment is successful, you should see deployment to the next environment (Prod) enabled as well.

That's it. Let's check whether our solution has been deployed to our Test and Prod environments.

Prod Environment:

Tips:

Use environment names that indicate their purpose. I have used trial environments for demo purposes, but this feature is Generally Available, so you can try it out in your actual projects. Note that the deployed solution will be managed and not editable, as below.

Limitations:

  1. Deleting the host environment deletes all pipelines and run data. Use caution and understand the impact of data and configuration loss as well as maker access to pipelines hosted in the environment.
  2. With General Availability, the environments used in a pipeline are automatically enabled as Managed Environments, so you don't need to worry much about setting up the environments as managed yourself.
  3. Licensing is also not a problem for the maker creating the pipeline: assign the Deployment Pipeline Administrator and Deployment Pipeline User roles to the maker/user and share the pipeline with them so they can run it.

Hope you found this post helpful; do consider incorporating this feature in your projects to deploy managed solutions from one environment to another. How cool is that?

Cheers,

PMDY

DAX Studio – Great tool to debug your DAX Queries for Power BI Projects

Hi Folks,

In this digital era, every Power Platform professional wants to do more with their data, so they are obviously going to use Power BI to provide great insights from it.

When it comes to reporting, everyone knows how to create visuals in Power BI Desktop and publish them to the Power BI service in order to view Power BI dashboards in Dataverse. But if you go a bit deeper and want to build more complex Power BI visuals, you are definitely going to use DAX (Data Analysis Expressions). If you are new to DAX, look at this tutorial to learn more about how to write these expressions.

When you write DAX expressions, whether in measures or calculations (including calculated tables and calculated columns), you can't just get them right on the first attempt and publish your reports. It definitely takes time and effort to write them and format them syntactically so everyone can understand, and you may need to debug your DAX expressions before you actually use them in your reports. That's where DAX Studio comes to the rescue; it can be integrated with Power BI Desktop in a few simple steps.

The first step is to download DAX Studio and install it. As soon as you are done, you should be able to see it in the External Tools ribbon tab.

When you have a Power BI report open, you can connect to it directly from DAX Studio.

You can then open your DAX queries in DAX Studio, run and debug them, view performance statistics, and more.

DAX Studio is an open-source tool that every BI developer can leverage to improve productivity on their projects.

Reference:

Video Reference

Web reference

Hope this helps….

Cheers,

PMDY

Power Fx Formula Data type – your new companion in addition to Calculated fields in Dataverse [Insight]

Hello Folks,

I believe every Power Platform professional working on Dataverse has, at one time or another, had a chance to work with calculated fields. They provide an easy way to perform calculations on the supported data types and have been available since the CRM 2015 Update 1 release.

Here is a very simple calculation example to get your fx data type up and running in a few seconds. Follow along.

Navigate to https://make.powerapps.com/

Open your solution and navigate to the columns of any table; for simplicity, I am using the Account table as an example.

Now create a new column as below.

Key in the values for the field, making a note that the fx (formula) data type is selected.

I already have two fields on the form, shown below, for calculating the annual revenue per employee from the company's annual revenue.

So now let's write a simple Power Fx formula to calculate the Annual Revenue per Employee; the expression is along the lines of 'Annual Revenue' / Value('Number of Employees'), using your own column display names.

Annual Revenue is a currency field and Number of Employees is a single line of text field. As soon as you save, the system automatically identifies the result data type as Decimal Number, as shown above. Click save and publish the form.

Let's see it in action on the form: as soon as you enter values for Annual Revenue and Number of Employees and save, the value of the calculated revenue per employee field is computed by the Power Fx expression.

Hope this will be useful in future for your implementations…

Points to keep in view:

  1. Formula columns are in preview at the time of writing this blog post.
  2. Currently, formula columns can't be used in rollup fields or with plug-ins.
  3. You can use the following operators in a formula column:
    +, -, *, /, %, ^, in, exactin, &
  4. Microsoft documentation says the Currency data type isn't currently supported, but in practice it works.
  5. The Text and Value functions only work with whole numbers, where no decimal separator is involved.

Ref: Formula Column

Cheers,

PMDY

Email templates showing XML – Quick Tip

Hi Folks,

We recently came across a situation where new and existing email templates kept showing raw XML, as below.

This was annoying, since the existing templates had been created using the rich email template editor. On first check we verified our Dev and SIT environments and, unfortunately, saw the same behaviour in both. We double-checked that no changes had been made to the OOB email template form, so we suspected something was wrong with our environments. Luckily, we had one more environment where the email templates worked fine, which confirmed the problem was related to the email template form. Also, when we tried to open the existing email templates in the new designer from https://make.powerapps.com, they opened without any issue.

Fix: open the model-driven app in your custom solution by double-clicking it, and verify the forms selected for the Email Template entity.

The fix is quite obvious: in order for email templates to render properly, you should select the Default UCI Template type form.

Voilà, it's back, as below.

Hope this is useful.

Cheers,

PMDY

Power Automate performance improvement considerations

Hi Folks,

Thanks for visiting my blog. Have you ever faced a situation where your flow keeps running with no output? Today I will list the possible scenarios in which you may be stuck with slow flow performance.

Consider the below actions to make your flow execute smoothly.

Remediation steps/Actions to take to make your flow run efficiently:

  1. Understand the throttling limits for your connectors and data sources.
  2. Check the request limits based on user licenses.
  3. Cross-verify the throughput limits.
  4. See the minimum number of actions that the Power Automate service allows for each plan on the Request limits and allocations page.
  5. Verify whether you are using any on-premises connectors.
  6. Check whether you are hitting a throttling limit in Power Automate.
  7. Redesign your flow to use fewer actions and less data.
  8. Reduce the number of loop iterations in 'Do until' and 'For each item' loops.
  9. Filter your data to retrieve only what is necessary (filtering with OData).
  10. Consider reducing the frequency of scheduled cloud flows.
  11. Reduce the size of any files being accessed, if your flow uses them.
  12. Consider using variables for frequently accessed information in your flow.
  13. Use Compose and variable actions to inspect the data at any point.
  14. Consider purchasing a Per User or Per Flow license from the pricing page.
  15. The Per Flow plan may provide the best performance quotas available.
  16. Enable concurrency control for your 'Apply to each' actions.
  17. Consider creating a custom retry policy.
  18. Use Select actions.
  19. Check your system jobs in Dataverse to confirm asynchronous service performance is normal.
  20. Reduce the flow's complexity.
  21. Consider using Process Advisor for Power Automate.
  22. Verify whether you are hitting the 2-minute timeout in Dataverse when calling a bound or unbound action from the flow.

Hope this helps someone who’s looking to optimize their flow.

Cheers,

PMDY

What is Solution Checker and App Checker in Power Apps – Quick recap

Hi Folks,

It's been quite some time since Microsoft shipped the solution checker and app checker, tools that help developers validate the solutions they build before moving them to higher environments. It is always advisable to run the solution checker once your solution is developed, as it helps you achieve better performance by following Power Platform best practices. Previously we used to send code to senior folks for review, but with this tool even a junior developer working at the ground level can easily understand the findings and make the necessary tweaks to the solution.

The solution checker serves as a static analysis tool for developers to check for platform-related issues.

The solution checker analyzes these solution components:

  • Dataverse custom workflow activities
  • Dataverse web resources (HTML and JavaScript)
  • Dataverse configurations, such as SDK message steps

Note: the solution checker won't analyze plug-ins in solutions. Plug-in validation is being modernized, with the focus eventually shifting to native plug-in authoring time, which will help you detect and fix issues earlier. So if you are looking for improvements to plug-in code, this tool will not help you.

Once the solution checker starts running, it is shown as below, with a loading symbol on the solution checker button.

It takes a few minutes to complete the process, depending on the size of the solution. Once the process is complete, you should be able to download the results or view them as below.

If we open the results file, it shows the potential issues and improvements along with their severity, which helps us prioritize the issues we need to work on.

The report can also be downloaded as an Excel file containing the analysis, shipped in ZIP format.

Now that we have seen what the solution checker is, let's see what the app checker is, along with its pros and cons.

App Checker:

  1. The app checker is now available to help provide a clear list of formula issues in your app, and items to fix to make your app accessible.
  2. This helps make debugging, performance, and best-practice decisions an easier and more guided experience.
  3. It is an ideal way to check the formulas you wrote in your canvas apps.
  4. There isn't any way to download the app checker results, but you can analyze them on the fly in canvas apps.

To conclude, you can think of the solution checker as a tool for checking model-driven apps and the app checker as a tool for canvas apps. I hope you will use these great features to improve your solutions and design according to best practices.

Reference:

Solution checker from MS Learn

Apps checker from MS Learn

Cheers,

PMDY

Power Automate Pagination – Simple approach to retrieve 5000+ records from Dataverse

Hi Folks,

When it comes to Power Automate, I see lots of articles, of varying complexity, about pagination for retrieving more than 5,000 records from Dataverse. They can be difficult to understand at first, especially if you don't have much exposure to working with functions and variables in Power Automate.

This post gives you a very easy way to retrieve more than 5,000 records from Dataverse using a Page Number variable.

The flow looks like the one below.

You just need two variables:

  1. Record count
  2. Page number

Let's create it.

For simplicity, create a manual trigger and initialize the two variables we need.

Now create a do until loop to run until Page Number is 0.

Now define a Scope action, as above, containing your fetch criteria to retrieve the records from Dataverse, with the FetchXML updated as below.

Add the Page Number variable you created into the fetch (its page attribute), so that each iteration requests the next page of results.

Add a condition, as above, to check whether the length of the List rows output is still greater than zero.

If yes, increment the Page Number variable; otherwise, set Page Number to 0 so the Do until loop exits.

In each iteration, also increment the Record count variable by the number of rows returned. Here the Residents entity is nothing but Contacts.

Just save the flow and run it; you will find the total number of records in the table (Contact/Resident), as below.

That's it, so simple, right?

Some other related articles or references below:

Retrieve 5k + records using Pagination using Paging cookie

Retrieve 100k+ records using Skip Token

Hope this helps someone looking for an easy approach to pagination with Dataverse.

Cheers,

PMDY