This blog post explains why you should stop implementing OData calls against the V2.0 endpoint. Almost every Dynamics CE project out there has used these OData calls in its implementation at some point. While newer implementations have replaced this logic with the Web API, some people still build their JavaScript functionality on OData V2.0 calls.
Microsoft had originally planned to remove this endpoint on April 30, 2023, but deferred the removal because many projects weren't yet prepared for it, giving customers time to transition to the Web API endpoint.
First, identify whether you are still using the OData V2.0 endpoint. The Organization Data Service is an OData V2.0 endpoint that was introduced with Dynamics CRM 2011 and deprecated way back in Dynamics 365 CE version 8.0.
So how do you identify everywhere you are using this OData endpoint in your code? Don't assume that existing code will keep working with only minor changes, or that this work can be deferred to a later stage. Microsoft issued a high-priority warning announcing the removal, so I urge all of you to prepare for it very soon rather than be caught by surprise.
So where should you make the changes?
Below are the places where you should change your implementation to align with Microsoft's guidance…
JavaScript that calls the Organization Data Service endpoint /XRMServices/2011/OrganizationData.svc. You can identify such code with the Solution Checker rule web-avoid-crm2011-service-odata. This is typically code that makes OData calls to perform CRUD operations on the current table or a related table (see the before/after sketch after this list).
Check any other code, including PowerShell scripts, that sends requests to this endpoint: /xrmservices/2011/organizationdata.svc.
Cross-check your Power BI reports or Excel data sources that may be using this endpoint.
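To make the migration concrete, here is a hedged before/after sketch (the entity, columns, and query options are illustrative, not taken from any specific project). The first call uses the deprecated 2011 Organization Data Service; the second is the Web API equivalent through Xrm.WebApi.

```javascript
// Before: deprecated OData V2.0 (Organization Data Service) call.
var req = new XMLHttpRequest();
req.open(
    "GET",
    Xrm.Utility.getGlobalContext().getClientUrl() +
        "/XRMServices/2011/OrganizationData.svc/AccountSet?$select=Name&$top=1",
    true
);
req.setRequestHeader("Accept", "application/json");
req.onreadystatechange = function () {
    if (this.readyState === 4 && this.status === 200) {
        // OData V2.0 wraps results in a "d" envelope.
        console.log(JSON.parse(this.responseText).d.results);
    }
};
req.send();

// After: the supported Web API equivalent.
Xrm.WebApi.retrieveMultipleRecords("account", "?$select=name&$top=1")
    .then(function (result) { console.log(result.entities); });
```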
Note:
This announcement does not cover the deprecated Organization Service SOAP endpoint (that is, using the Organization Service in plugins); no date has been announced for the removal of that endpoint. At the time of writing, Microsoft had not announced whether this removal applies only to Online or also to On-Premises versions.
This blog post isn't about any specific use case; rather, it highlights the importance and benefits of designing your data model around your reporting requirements. Every Power BI developer should consider this in the first place.
Data modelling is the first step whenever you design Power BI dashboards or reports, and it plays a key role. As for schemas, there are two common ones: the star schema and the snowflake schema. This blog post focuses mainly on the star schema for your Power BI report/dashboard design.
With a star schema, Power BI data models are optimized for performance and usability. While every consultant tries to create stunning visuals, they also need to focus on their data model before spending time on report design.
A star schema generally revolves around two types of tables: fact tables and dimension tables (which describe the business entities).
The fact table is the central table in a star schema. Dimension tables are connected to the fact table through one-to-many relationships (one dimension row can relate to many fact rows). Generally, dimension tables contain a relatively small number of rows. Fact tables, on the other hand, can contain a very large number of rows and continue to grow over time. Now let's see what a star schema looks like, using an example taken from the Adventure Works sample.
The main point to note here is normalization versus denormalization, two concepts that help you understand how a star schema can improve the performance of your dataset.
A star schema uses denormalized dimension tables, whereas a snowflake schema keeps its tables normalized. The design fits well with star schema principles:
Dimension tables support filtering and grouping
Fact tables support summarization
You can visualize the relationship as per the below diagram…
These concepts include the topics below, which I will briefly cover; they are not widely discussed yet are must-knows for designing an efficient Power BI dataset.
Measures: calculations defined on a fact table that are used to summarize data in Power BI.
Surrogate keys: a surrogate key is simply a unique identifier added to a table, typically used to relate dimension and fact rows.
Recently in one of our custom pages, we were getting the below error
“Invalid data from the network”
We had the “Formula-level error management” setting turned on; switching it off merely hid the error on the page.
To find the root cause, we removed the fields one by one. Eventually, we saw that the error went away when we removed one of the option set fields from the gallery or commented out its formula. Removing that field and adding it back to the gallery fixed the error for us.
Recently, while working with Power Apps in the Edge browser, I faced this issue with no clue at first sight, and I saw community members reaching out on the Power Apps forum. So I thought of sharing a simple workaround.
I was using https://make.powerapps.com and clicked Sign out for my currently logged-in account in the maker portal. The next time I tried to log in, it kept prompting me with a message saying I needed to sign up. It gave me nowhere to re-enter my credentials, and opening https://make.powerapps.com/ in another tab didn't help either.
As per the Microsoft article referenced below, I simply cleared my browser cookies and cache and was soon able to open the maker portal without any issue.
Here I would like to share some tips on Power BI errors that anyone working with Power BI dashboards, trying to understand existing dashboards, or integrating them with model-driven apps will encounter in their everyday job. You aren't alone… so let's get started…
Error 1: Unable to open document
Fix:
You are not on the latest version of Power BI Desktop, i.e., the report was authored with a newer version than the one installed. You can quickly check your version by navigating to Help –> About.
You should see an update icon, as shown below.
Clicking it takes you to the page hosting the latest update, from where you can install it.
Error 2: Class not registered error
Fix:
This can happen when you try to open an existing dashboard, whether created by you or shared with you, because you have two versions of Power BI Desktop installed on your machine, as shown below.
Make sure you choose the same version you used earlier to develop the dashboard so that you won't run into any issues.
Error 3: Workspace deleted
Fix:
This error can happen when you access a workspace while creating a Power BI dashboard in Dynamics CE from an existing workspace. Make sure the workspace exists and that the logged-in user has access to it.
Granting access to the workspace will fix the issue.
Error 4: One or more Cloud Datasources have been deleted
Fix:
You encounter this problem most often when you deploy a new Power BI report or dashboard to the Power BI service. You might also receive the error after making a change to a dataset or taking over ownership of one. To resolve it:
1. Click the “Recreate cloud data sources” button.
2. Reenter credentials in the “Data Source Credentials” section for your data sources
Make sure the dataset is connected to the gateway, in case you have one configured.
Error 5: There is no gateway to access the datasource
Fix:
The first thing to look at when you have a missing gateway is to make sure you created a data source for that gateway. Installing the On-Premises Data Gateway is not enough. That simply registers the gateway with the Power BI service and lists you as an admin for that gateway.
You then need to go into the Power BI service and create a data source for that gateway.
You can create a data source by doing the following.
Select the gear icon within Power BI
Select Manage Gateways.
Select your gateway and then select Add Data Source.
When you are done entering the information, select Add.
Here is how I quickly achieved a synchronous-style Retrieve Multiple call using the Web API and Promises in JavaScript. I don't want to make this post too detailed, but I would like to share the approach.
All I want to do is restrict saving a new contact if the postal code entered is not present in the system. The call needs to behave synchronously: the message should be shown immediately if the postal code is not found, and saving of the contact record should be prevented. All you need to do is call the function below on change of the Postal Code field on the contact form.
Here, in place of XMLHttpRequest, I have used Xrm.WebApi so that Solution Checker won't raise a critical warning.
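Below is a minimal sketch of the approach, assuming a hypothetical custom table new_postalcode that stores the valid postal codes (the table and column names are placeholders, not from the original implementation). The OnChange handler validates the entered value through Xrm.WebApi and raises a form notification, while an OnSave handler blocks the save until a valid code is confirmed.

```javascript
// Minimal sketch; the "new_postalcode" table and its columns are assumptions.
var postalCodeIsValid = true;

// Register this on the OnChange event of the Postal Code field on Contact.
function onPostalCodeChange(executionContext) {
    var formContext = executionContext.getFormContext();
    var postalCode = formContext.getAttribute("address1_postalcode").getValue();
    if (!postalCode) { return; }

    Xrm.WebApi.retrieveMultipleRecords(
        "new_postalcode",
        "?$select=new_postalcodeid&$filter=new_name eq '" + postalCode + "'&$top=1"
    ).then(
        function (result) {
            postalCodeIsValid = result.entities.length > 0;
            if (!postalCodeIsValid) {
                formContext.ui.setFormNotification(
                    "The entered Postal Code was not found in the system.",
                    "ERROR",
                    "postalcodecheck"
                );
            } else {
                formContext.ui.clearFormNotification("postalcodecheck");
            }
        },
        function (error) { console.log(error.message); }
    );
}

// Register this on the form's OnSave event: it prevents saving the record
// while the entered postal code is not present in the system.
function onContactSave(executionContext) {
    if (!postalCodeIsValid) {
        executionContext.getEventArgs().preventDefault();
    }
}
```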
By now, almost every Dynamics 365 developer has used Level Up at some point in their Customer Engagement consulting career.
Everyone in the Dynamics CRM space knows about the widely popular God Mode available in Level Up, which helps with the day-to-day administration and maintenance of CE applications.
Today, in this blog post, let's see how we can achieve impersonation using Level Up. We all know about impersonation in Dynamics CRM through plugins via the Plugin Registration tool, or directly through code, as sketched below.
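For context on the code-based option, here is a hedged sketch of Web API impersonation from JavaScript: the MSCRMCallerID header carries the systemuserid (GUID) of the user to impersonate, and the calling user needs the Act on Behalf of Another User privilege. The organization URL and GUID below are placeholders.

```javascript
// Hedged sketch of code-based impersonation; URL and GUID are placeholders.
fetch("https://yourorg.crm.dynamics.com/api/data/v9.2/accounts?$select=name&$top=1", {
    method: "GET",
    headers: {
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
        // systemuserid (GUID) of the user to impersonate
        "MSCRMCallerID": "00000000-0000-0000-0000-000000000000"
    }
}).then(function (response) { return response.json(); })
  .then(function (data) { console.log(data.value); });
```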
But did you know that with Level Up you don't need to write a single line of code and can impersonate any user in the system within a few clicks, right from your browser? If this interests you, just follow along…
Just open the Level Up Chrome extension. This is how the home screen looks once opened. To use this feature, click Impersonate, as highlighted in the image below. Throughout this walkthrough, I am logged in to the system as an admin user.
Next, search for the username of the user you want to impersonate; here, I would like to impersonate my own user account. You can search for and impersonate any user in the system.
Once you click on SEARCH USER
Next, click on the Impersonate button shown below.
It opens a new window impersonating the user account chosen for the impersonation.
The screen now shows the user pavan as the logged-in user…
Now you can perform all operations with the same permissions the user pavan has… how cool is that? This feature is a perfect fit when you want to test application security roles: you can change a user's role, assign it to another team member, and test it without needing their presence. Cool to know, isn't it?
References:
You can install the extension on Edge, Firefox and Chrome.
I recently came across the above error in one of my Dynamics 365 plugins… this blog post covers a quick fix.
While debugging our plugin logic line by line to understand why it wasn't working, I observed this error on messages like RetrieveMultiple and Retrieve whenever I made any Organization Service call.
This was a .NET version downgrade issue caused by ILMerge, as I had downgraded one of the DLLs from version 4.7.1 to 4.6.2. Even if you see this issue without having downgraded your DLL, you can use this fix.
After some research, I came across this article and applied the same fix to my assembly, which resolved the issue. I added these lines to my AssemblyInfo.cs class file:
Have you or someone on your team accidentally deleted some of your flows and want them back? Then this post is for you…
There are two ways to restore deleted flows, and either way works for up to 20 days after deletion; after 20 days, even Microsoft Support can't recover your flows.
In this blog post, we will see how to recover your flows using the first approach, the Power Automate Management connector, which is much simpler.
All you need is a button flow with two Power Automate management connector actions—List Flows as Admin and Restore Deleted Flows as Admin.
So let’s see…
I have a very simple flow as below…
But it had been failing due to an authentication issue, so I deleted it…
Later, I understood the root cause of the authentication issue and wanted the deleted flow back. No problem: follow these steps to quickly get your deleted flow back… using another flow…
Choose a manual trigger…
Next, add an action and choose the operation by searching for List Flows as Admin.
Select your environment in the connector
Save your flow and test it…
Once your flow runs successfully, verify the outputs: scroll to find your flow, check its display name, and note down the associated name.
This connector lists all the flows created in the selected environment; expand the raw OUTPUTS/value to see them. Note the name, which is simply the GUID of your flow.
Next, add the Restore Deleted Flows as Admin action from the Power Automate Management connector by searching for it as shown below.
In the Flow field, enter the name value, i.e., the GUID noted in the previous step:
9c4be0d4-b54f-4ac3-969e-01154b7f98bb
Save the flow, and test/run it.
You can see that your flow is back; you can find it under My Flows, but in a disabled state.
Note:
The steps in this article apply to both non-solution and solution flows.
Flows that were deleted more than 20 days ago can't be recovered. Neither restore method (the PowerShell script or the Power Automate Management connector) nor Microsoft Support can restore them.
After you restore a flow, it defaults to the disabled state. You must manually enable the flow, per your requirements.
This blog post is all about performance considerations for your Power Platform CE projects and how you can plan to optimize application performance for your Power Apps. Let me take you through them…
Are you tired of building solutions over long durations, only to face performance issues at the end of the project or during UAT? One of the most important non-functional requirements for a project's success is performance. Satisfying your users' performance requirements can be a challenge: poor performance may hurt user adoption and lead to project failure, so be careful with every decision you take while designing your solutions across the stages below.
Let’s talk about them one by one..
1. Network Latency and bandwidth
A main cause of poor performance of Dynamics 365 apps is the latency of the network over which the clients connect to the organization.
Bandwidth is the width or capacity of a specific communications channel.
Latency is the time required for a signal to travel from one point on a network to another; it is a fixed cost between two points, and a single request usually involves many such signals.
Lower latencies (measured in milliseconds) generally provide better levels of performance. Even if the latency of a network connection is low, bandwidth can become a performance degradation factor if there are many resources sharing the network connection, for example, to download large files or send and receive email.
Dynamics 365 apps are designed to work best over networks that have the following elements:
Bandwidth greater than 50 KBps (400 kbps)
Latency under 150 ms
These values are recommendations and don’t guarantee satisfactory performance. The recommended values are based on systems using out-of-the box forms that aren’t customized.
If you significantly customize the out-of-box forms, it is recommended that you test the form response to understand bandwidth needs.
You can use the diagnostics tool to determine the latency and bandwidth:
On your computer or device, start a web browser and sign in to your organization, then open the diagnostics page by appending /tools/diagnostics/diag.aspx to your organization URL and run the test.
Also, to mitigate the naturally higher latency in global rollouts, customers should design their applications smartly so they can use Dynamics 365 apps successfully.
2. Smart Design for your application
Form design
Keep the number of fields to a minimum. The more fields you have in a form, the more data that needs to be transferred over the internet or intranet to view each record. Think about the interaction the user will have with the form and the amount of data that must be displayed within it.
Avoid including unnecessary JavaScript web resource libraries. The more scripts you add to the form, the more time it will take to download them. Usually, scripts are cached in your browser after they are loaded the first time, but the performance the first time a form is viewed often creates a significant impression.
Avoid loading all scripts in the OnLoad event. If you have code that only supports OnChange events for fields or the OnSave event, make sure to set the script library with the event handler for those events instead of the OnLoad event. This way, loading those libraries can be deferred, improving performance when the form loads.
Use tab events to defer loading web resources. Any code that is required to support web resources or IFRAMEs within collapsed tabs can use event handlers for the TabStateChange event, reducing code that might otherwise have to run in the OnLoad event (see the sketch after this list).
Set default visibility options. Avoid using form scripts in the OnLoad event that hide form elements. Instead set the default visibility options for form elements that might be hidden to not be visible by default when the form loads. Then, use scripts in the OnLoad event to show those form elements you want to display. If the form elements are never made visible, they should be removed from the form rather than hidden.
Avoid opening a new tab or window; instead, open the window in the main form dialog.
For the command bar, keep the number of controls to a minimum. Within the command bar or the form's ribbon, evaluate which controls are necessary and hide any you don't need; every displayed control increases the resources that must be downloaded to the browser. When using custom rules that make network requests in Unified Interface, use asynchronous rule evaluation.
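As a sketch of the tab-event tip above (the tab name, control name, and URL are placeholders), the IFRAME's URL is set only once its tab is expanded, instead of during OnLoad:

```javascript
// Defer loading an IFRAME until its tab is expanded; names are placeholders.
function onFormLoad(executionContext) {
    var formContext = executionContext.getFormContext();
    var detailsTab = formContext.ui.tabs.get("tab_details");
    detailsTab.addTabStateChange(function () {
        if (detailsTab.getDisplayState() === "expanded") {
            formContext.getControl("IFRAME_report")
                .setSrc("https://example.com/embedded-report");
        }
    });
}
```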
Only the columns required for information or action should be included in API calls:
Retrieving all columns (*) creates significant overhead on the database engine when distributed across significant user load. Optimization of call velocity is key to avoid “chatty” forms that unnecessarily make repeated calls for the same information in a single interaction.
You should avoid retrieving all columns in a query result because of the impact on subsequent updates of records. Such an update would set all field values, even unchanged ones, and often triggers cascaded updates to child records. Leverage the most efficient connection mechanism (Web API vs. SDK) and reference this doc site for guidance on the appropriate approach. A minimal example of a narrow query follows below.
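As a minimal illustration of narrow queries (the table and columns are placeholders), request only what the form or script actually needs:

```javascript
// Narrow query: only the columns needed for display are requested.
Xrm.WebApi.retrieveMultipleRecords(
    "account",
    "?$select=name,accountnumber&$top=50"
).then(function (result) {
    result.entities.forEach(function (account) {
        console.log(account.name, account.accountnumber);
    });
});
```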
You should continue to use the ITracingService.Trace to write to the Plug-in Trace Log table when needed. If your plug-in code uses the ILogger interface and the organization does not have Application Insights integration enabled, nothing will be written. So, it is important to continue to use the ITracingService Trace method in your plug-ins. Plug-in trace logs continue to be an important way to capture data while developing and debugging plug-ins, but they were never intended to provide telemetry data.
For organizations using Application Insights, you should use ILogger because it allows telemetry about what happens within a plug-in to be integrated with the larger scope of data captured by the Application Insights integration. The Application Insights integration will tell you when a plug-in executes, how long it takes to run, and whether it makes any external HTTP requests. Learn more about tracing in plug-ins: Logging and tracing (Microsoft Dataverse) – Power Apps | Microsoft Learn.
Use Solution Checker to analyze solution components
Best practice is to run Solution Checker on all application code, either as a mandatory step while you design solutions or as a check when you complete developing your custom logic.
Quick Find
For an optimal search experience for your users, consider the following:
All columns you expect to return results in a quick find search need to be included in the view or your results will not load as expected.
It is recommended not to use option sets as quick find columns; try using view filtering for these instead.
Minimize the number of fields used and avoid using composite fields as searchable columns. E.g., use first and last name as searchable vs full name.
Avoid using multiple lines of text fields as search or find columns.
Evaluate Dataverse search vs using leading wildcard search
3. Training
This step should be done during user training or during UAT. To ensure optimal performance of Dynamics 365, ensure that users are properly leveraging browser caching. Without caching, users can experience cold loads which have lower performance than partially (or fully) warm loads.
Train users to not clear the browser cache manually.
Check if devices have enough space allocated for browser cache.
Make sure to train users to:
Use application inline refresh over browser refresh (should not use F5)
Use the application's inline back button instead of the browser's back button.
Avoid InPrivate/Incognito browser modes, which cause cold loads.
Make users aware that running applications that consume a lot of bandwidth (like video streaming) may affect performance.
Do not install browser extensions unless they are necessary (this might be also blocked via policy)
Do use ‘Record Set’ to navigate records quickly without switching from form back to the list.
4. Testing
For business processes where performance is critical, or processes with complex customizations and very high volumes, it is strongly recommended to plan for performance testing. Consider reviewing the technical talk series below, which describes important performance considerations, shares practical examples of how to set up and execute performance testing, and shows how to analyze and mitigate performance issues. Reference: Performance Testing in Microsoft Dynamics 365 TechTalk Series – Microsoft Dynamics Blog
5. Monitoring
You should define a monitoring strategy and can consider any of the tools below, whichever suits you best.
Monitor Dynamics 365 connectivity from remote locations continuously using network monitoring tools such as Azure Network Performance Monitor or third-party tools. These tools help identify network-related problems proactively and drastically reduce the troubleshooting time for any potential issue.
Application Insights, a feature of Azure Monitor, is widely used in the enterprise landscape for monitoring and diagnostics. Data collected from a specific tenant or environment is pushed to your own Application Insights environment, stored in Azure Monitor logs, and visualized in the Performance and Failures panels under Investigate on the left pane. The data is exported in the standard schema defined by Application Insights. Support, developer, and admin personas can use this feature to triage and resolve issues. See Telemetry events for Microsoft Dataverse – Power Platform | Microsoft Learn.
Dataverse and Power Apps analytics in the Power Platform Admin Center. Through the Plug-in dashboard in the Power Platform Admin Center, you can view metrics such as average execution time, failures, most active plug-ins, and more.
Dynamics 365 apps include a basic diagnostic tool that analyzes the client-to-organization connectivity and produces a report.
Monitor is a tool that gives makers the ability to view a stream of events from a user's session to diagnose and troubleshoot problems. It works for both model-driven apps and canvas apps.
I hope this blog post has helped you learn something new… thank you for reading…