Exploring Power Platform and Beyond: Features, Experiences, Challenges, Solutions all in one place
Author: Pavan Mani Deep Y
Passionate about Power Platform. A technology geek who loves sharing learnings, quick tips, and new features on Dynamics 365 and related tools and technologies. An Azure IoT and Quantum Computing enthusiast...
While everything is being automated, we will learn how effectively you can handle errors while automating a process. When a failure happens in a Power Automate cloud flow, the default behavior is to stop processing. You might want to handle errors and roll back earlier steps in case of failure. Here are three basic rules worth implementing without a second thought.
Run after
Errors are handled by changing the run after settings on the steps in the flow, as shown in the following image.
Parallel branches
When using the run after settings, you can have different actions for success and failure by using parallel branches.
Changesets
If your flow needs to perform a series of actions on Dataverse data, and you must ensure that all steps work or none of them work, then you should use a changeset.
If you define a changeset, the operations run in a single transaction. If any of the steps fail, the changes made by the prior steps are rolled back.
Special mentions:
Using Scopes – Try, Catch, Finally
Retry policies – specify how a request should be retried when it fails.
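For reference, the run after configuration described above lives in the flow's underlying JSON definition (you can see it via Peek code). Here is a minimal sketch; the action names Create_record and Handle_failure are illustrative assumptions:

```json
"Handle_failure": {
  "type": "Scope",
  "actions": {},
  "runAfter": {
    "Create_record": [ "Failed", "TimedOut" ]
  }
}
```

The array lists the outcomes of the previous action that allow this branch to run, which is exactly what the run after checkboxes in the designer toggle.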
In today’s world of no-code and AI, while most apps are developed using a low-code approach, sometimes we have to fall back on traditional development to handle integrations with other systems.
When we hand someone a command line script and ask them to execute it, they would immediately open the Search bar at the bottom of the Windows screen, type cmd, and run the command in the Command Prompt window that appears.
But what if we need to execute command line commands from C# code? In this blog post, I will show you how easily you can call command line commands, with a simple example. Let’s get started…
To showcase this, I will take a basic command line command and run it from C#.
Everyone knows the ipconfig command, which shows the Internet Protocol configuration when entered in the command line, as below.
To execute it from a console application using C#, we need the System.Diagnostics namespace. You can use the C# code below.
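The snippet in the original post was embedded from a gist; here is a minimal sketch of the idea, a console program that runs ipconfig through cmd.exe and prints its output:

```csharp
using System;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        // Launch cmd.exe, run ipconfig, and capture its output.
        // "/c" tells cmd to execute the command and then exit.
        var startInfo = new ProcessStartInfo
        {
            FileName = "cmd.exe",
            Arguments = "/c ipconfig",
            RedirectStandardOutput = true,
            UseShellExecute = false,   // required to redirect output
            CreateNoWindow = true      // do not flash a console window
        };

        using (var process = Process.Start(startInfo))
        {
            string output = process.StandardOutput.ReadToEnd();
            process.WaitForExit();
            Console.WriteLine(output);
        }
    }
}
```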
When we execute this program, it shows exactly the same output as we saw above with the command line.
In the same way, we can call any command line command from C#. I had to use this approach in a Power Platform integration to decrypt PGP-encrypted messages; I found it very helpful and thought of sharing it with all of you. If you are looking for a decryption program, you can check out my previous blog post here.
Are you debugging Dynamics 365 plugins using the Plug-in Profiler? Did you ever hit the problem where you are unable to persist the profile so as to debug your plugin? Did you get frustrated because you couldn’t capture the profile even after many tries at installing and uninstalling the profiler? Just read on. I am writing this blog post after fixing a similar situation with one of my plugins.
First of all, I would advise you to check the following:
Check the plugin trace log under Settings –> Plugin Trace Log.
Check whether your plugin is being called multiple times.
Check whether the filtering attributes of your plugin are causing it to go into an infinite loop.
If you have registered an image, did you select the respective attributes of the image?
Did you add sufficient depth checks to prevent infinite loop executions? (See the sketch after this list.)
At what stage is your plugin running: PreOperation or PostOperation? In case you are throwing an error, change it to the PreValidation stage and check.
If you are using the “Persist to Entity” option while debugging, try changing it to the “Exception” (throw error) option and see.
If you notice that the system becomes unresponsive and you are not able to download the log file, then your logic is definitely being called multiple times. Please re-verify.
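As a minimal sketch of the depth check mentioned in the list above (the class name is illustrative):

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class MyPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // Depth is 1 for the original platform request and increments
        // every time a plugin triggers a further message; bail out
        // early to avoid infinite loops.
        if (context.Depth > 1)
            return;

        // ...actual plugin logic goes here...
    }
}
```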
Once you have verified these, you should be able to find the exact root cause of the issue…I will leave that to you.
Thank you for visiting my blog today. This is another post about the SSIS Data Flow Task, covering an issue I encountered while performing data loading tasks using SSIS that I would like to share with everyone.
Does your Visual Studio keep not responding when you open the data flow tasks of SSIS packages that you or your team created, as shown in the image below? If you always end up closing it from the task bar because you can’t work, and it keeps frustrating you, then this tip is absolutely for you.
The problem is actually with your Connection Manager. Your data flow task might have OLE DB connections which the package uses to write information when there are failures in the data flow. In my case, I was writing to a SQL table using an OLE DB Destination component.
If you cross-check the SQL Server availability under Start –> Services on the PC, you may see that the SQL Server (Your Instance) service is stopped. In my case, I was using SQL Server (SQLEXPRESS01) in the SSIS package, as below.
Because the SQL Server service is stopped, Visual Studio is not able to acquire the connection needed to open the package. You are almost there…
Just start the service you were using and voilà… your Visual Studio should open the package normally.
Thank you for visiting my blog today. This post is all about improving the performance of the SSIS Data Flow Task, which I would like to share with everyone.
Did you know that you can easily improve your SSIS Data Flow Task just by setting the AutoAdjustBufferSize property of the data flow task? When this property is set to True, the data flow engine automatically sizes the buffers so that the DefaultBufferMaxRows setting is honored, rather than being capped by DefaultBufferSize. If you already know this, you can skip further reading.
I had already placed Balanced Data Distributors in my SSIS job, but the performance of the KingswaySoft CDS/CRM component was still not promising and far too low.
Thank you for visiting my blog today… I believe many consultants and Power Platform professionals out there may not know about the HashSet<T> collection, available in .NET since version 3.5.
By the way, what is a HashSet? Here is a brief overview.
HashSet is a data structure that many of us might not have come across; neither had I, until I implemented one of my requirements. It offers several benefits over other data structures for specific use cases. Here is a use case where a HashSet can be more useful than the other data structures available, followed by its advantages and disadvantages.
Scenario: I had a requirement to send an email to the owners of records using a custom workflow when a record is updated. Many records have the same owner, so the same email addresses were being added to the To activity party, which I wanted to prevent. That is when I searched and found HashSet.
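The code in the original post was embedded from a gist; here is a minimal sketch of the idea (EmailRecipientHelper and BuildToParties are illustrative names):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Xrm.Sdk;

public static class EmailRecipientHelper
{
    // Builds a deduplicated "To" activity party list from record owners.
    // 'records' is assumed to come from a RetrieveMultiple call.
    public static Entity[] BuildToParties(EntityCollection records)
    {
        var ownerIds = new HashSet<Guid>();   // owners already added
        var toParties = new List<Entity>();

        foreach (Entity record in records.Entities)
        {
            var owner = record.GetAttributeValue<EntityReference>("ownerid");

            // HashSet<T>.Add returns false if the value is already present,
            // so each owner ends up in the To party exactly once.
            if (owner != null && ownerIds.Add(owner.Id))
            {
                var party = new Entity("activityparty");
                party["partyid"] = owner;
                toParties.Add(party);
            }
        }

        return toParties.ToArray();
    }
}
```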
In this way, you can get the owner of each record and add it to the HashSet as shown above. A HashSet also helps prevent adding duplicate records, making it an ideal choice in certain scenarios.
Advantages:
Fast Lookup: It is efficient for tasks that involve frequent lookups, such as membership checks.
Uniqueness: All elements are unique. It automatically handles duplicates and maintains a collection of distinct values. This is useful when you need to eliminate duplicates from a collection.
No Order: It does not maintain any specific order of elements. If the order of elements doesn’t matter for your use case, using a HashSet can be more efficient than other data structures like lists or arrays, which need to maintain a specific order.
Set Operations: It supports set operations like union, intersection, and difference efficiently, which is beneficial when you need to compare or combine sets of data, as it can help avoid nested loops and improve performance (see the sketch after this list).
Hashing: It relies on hashing to store and retrieve elements. Hashing allows for quick data access and is suitable for applications where fast data retrieval is crucial.
Scalability: It typically scales well with a large number of elements, as long as the hash function is well-distributed, and collisions are minimal.
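As a small sketch of those set operations (the sample email addresses are illustrative):

```csharp
using System;
using System.Collections.Generic;

var current = new HashSet<string> { "a@x.com", "b@x.com", "c@x.com" };
var updated = new HashSet<string> { "b@x.com", "c@x.com", "d@x.com" };

// Intersection: addresses present in both sets.
var common = new HashSet<string>(current);
common.IntersectWith(updated);              // b@x.com, c@x.com

// Difference: addresses present only in the first set.
var removed = new HashSet<string>(current);
removed.ExceptWith(updated);                // a@x.com

// Union: all distinct addresses across both sets.
current.UnionWith(updated);                 // a, b, c, d @x.com

Console.WriteLine(string.Join(", ", common));
```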
Limitations include:
Lack of order: If you need to maintain the order of elements, then a HashSet is not a good candidate for your implementation.
Space usage: It is memory-intensive and is not recommended when memory optimization is a priority.
Limited Metadata: It primarily stores keys (elements), which means you have limited access to associated metadata or values. If you need to associate additional data with keys, consider other data structures such as Dictionary<TKey, TValue> or custom classes.
I hope this gives an overview of using HashSet. However, you can’t use a HashSet in all scenarios; it depends on your use case, so please check the disadvantages too before using it. If you have any questions, don’t hesitate to ask…
I am a big fan of Power Automate… but this post is not about flows; it is about custom workflows in Dynamics 365 CE.
Did you ever come across a problem where you were not able to debug a custom workflow extension? I did, and this blog post is all about it. I had successfully registered my custom workflow, but it was not triggering at all.
So, I needed to debug it to see what the exact issue was, as I was encountering this error.
The error message says Duplicate workflow activity group name: ‘EcellorsDemo.Cases(1.0.0.0) (Profiled)‘. So I checked my code, plugin steps, and activated plugins, but couldn’t find any duplicates.
Usually, while debugging a custom workflow using the profiler, your workflow goes into draft mode and another copy of the workflow gets created with “(Profiled)” appended to the name. In my case, however, I didn’t see this behavior; at the same time, I was unable to use the Profiler after the first profiling session, and it gave me the error shown above.
To resolve this, just delete the leftover profiled plugin assemblies, which you can find in the default solution, as highlighted below…
Once you have deleted them, try to debug the custom workflow again, and voilà!
Hope this helps someone troubleshooting custom workflows!
In the Dynamics 365 world, it’s all about efficiently handling user requests. Whenever you add a user to the environment, the system applies the default personal settings to that user. You might have processes in your system which depend on the user’s time zone, so setting the time zone correctly is very important. It is tedious to update the personal settings manually by going to each user’s profile every time.
Of course, XrmToolBox has a wonderful tool with which you can set user personal settings in bulk, updating all users in one go. But what if you want to automate this process, i.e. whenever you add a new user to the Dynamics 365 environment, the user’s time zone is set automatically without any manual intervention?
There you go… this post is for you, then. You can do it simply using a plugin or Power Automate. In this blog post, we will see how to use a plugin, as it is the more effective approach.
You need to write a plugin on the Associate message.
Just use this piece of code to set the personal settings…
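The code in the original post was embedded from a gist; here is a minimal sketch of the idea, assuming the plugin is registered on the Associate message (for example, on the relationship that adds the user to a team or role). The class name and the time zone code 190 are illustrative; look up the code for the zone you need:

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class SetUserTimeZonePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider
            .GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // On the Associate message, the related records arrive in the
        // RelatedEntities input parameter as an EntityReferenceCollection.
        if (context.MessageName != "Associate" ||
            !context.InputParameters.Contains("RelatedEntities") ||
            !(context.InputParameters["RelatedEntities"] is EntityReferenceCollection related))
        {
            return;
        }

        foreach (EntityReference reference in related)
        {
            if (reference.LogicalName != "systemuser")
                continue;

            // The usersettings record shares its id with the system user.
            var settings = new Entity("usersettings", reference.Id);
            settings["timezonecode"] = 190; // assumed example value
            service.Update(settings);
        }
    }
}
```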
Update the personal settings as per your needs in this request. You can easily find all the attributes of the user settings table by using FetchXML Builder; a query like the one below lists a few of them.
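For instance, this FetchXML sketch returns a sample of the user settings attributes (the attribute list is just an illustration):

```xml
<fetch top="10">
  <entity name="usersettings">
    <attribute name="systemuserid" />
    <attribute name="timezonecode" />
    <attribute name="uilanguageid" />
    <attribute name="localeid" />
  </entity>
</fetch>
```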
This post is for everyone working on D365 model-driven apps, and mainly on plugins.
Yes, you saw it right: in this blog post, we will see how to debug a plugin without using our favorite Plug-in Profiler, which has been very widely used for quite some time by everyone working on plugins for Dynamics 365. All of this is done with a tool called Dataverse Browser, which is not yet in XrmToolBox. Please note that there are some limitations, as detailed in the Limitations section below.
Here are four simple steps to follow:
Install Dataverse Browser
Attach the Debugger
Run your actual operation.
Step into your code and debug it.
The tool embeds a web browser based on Chromium. It works by translating Web API requests into SDK requests. It then checks whether plugin steps are registered on the message, loads them, and runs them locally. All other requests are sent to Dataverse, so the plugins interact with the real database.
Download the latest release of Dataverse Browser here.
Next, extract the downloaded zip file and open the Dataverse.Browser application, as highlighted below.
In the popup window, click on More info as highlighted below…
Then run the application anyway… you will be presented with a window where you can select the environment. Going forward, any time you want to open Dataverse Browser, just open Dataverse.Browser.exe and choose the environment, as below.
Click on New and key in the details, as below.
Enter the settings of your environment:
A name meaningful for you
The host name of your instance (without the https://)
The path to the plugins assembly file (the dll). For a better experience, it should be compiled in debug mode with the pdb file generated.
Then click Go.
You just need to authenticate to your instance.
Once authenticated, all the Web API requests sent to Dataverse from the respective model-driven app will be shown as below.
I have the following plugin libraries registered.
The next step is to choose the instance and perform the operation which triggers the plugin. Here, I will perform an update to the Account entity from the Dataverse Browser, which triggers the plugin.
Once the update is performed, a Web API request gets recorded in the Dataverse Browser, as highlighted below.
Since the plugin runs in PostOperation, i.e. stage number 40, just expand the PATCH request. You should see two operations, on stages 30 and 40, but the area of interest here is the plugin registered on stage 40.
Make sure Visual Studio is open, then perform the below steps from Dataverse Browser.
Attach the debugger from Dataverse Browser by clicking on the plug symbol, as below, which shows the list of debugger options available for you to select from. Here I have selected Execute plugins, so the plugin will be invoked. You can select any of the three options presented below.
1. Do not execute plugins – recommended when you want to inspect requests without actually triggering your plugin logic; with this approach you can even check behavior in a production environment.
2. Execute plugins / Execute plugins with auto break – recommended when you want to debug by triggering your actual plugin logic, for example when your plugin code has changed recently; best suited to development environments.
Just select Ecellors Demo – Microsoft Visual Studio: Visual Studio Professional 2022, which brings up the existing Visual Studio 2022 instance in break mode, as below. Next, click on Continue as highlighted below, or press F5 on your keyboard.
When you navigate back to Dataverse Browser, a dialog confirms that the debugger has been attached and asks you to place your breakpoints.
Now place breakpoints in your code in Visual Studio, then go back to Dataverse Browser and click OK on the dialog box.
Perform the operation which triggers the plugin from Dataverse Browser itself; this will hit the breakpoint in Visual Studio, from where you can debug your plugin.
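For illustration, the plugin being debugged here could be as simple as the sketch below (the class name is illustrative); the breakpoint goes inside Execute:

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class AccountPostUpdatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // A breakpoint on the next line is hit when Dataverse Browser
        // runs this step locally for the PATCH (Update) request.
        var target = (Entity)context.InputParameters["Target"];

        // ...inspect or modify the incoming attributes here...
    }
}
```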
As you might have observed, your code need not throw an exception in order to be debugged; you can work much the same way you would with the Profiler, except that you don’t need to deploy the latest code to Dataverse just for debugging purposes.
This gives a lot more flexibility and eases the way you debug plugins.
Limitations:
There is no support for transactions.
When plugins are triggered because of a server-side operation, they will not be run locally.
For many reasons, behavior will never be perfectly similar to the one when plugins are executed on server side.
Happy debugging, I hope you found this post useful…