Power Platform Solution Blueprint Review – Quick Recap

The Solution blueprint review covers all required topics. The workshop can also be conducted remotely; when it is, the review is typically divided into several sessions over several days.

The following sections cover the top-level topics of the Solution blueprint review and provide a sampling of the types of questions that are covered in each section.

Program strategy

Program strategy covers the process and structures that will guide the implementation. It also reviews the approach that will be used to capture, validate, and manage requirements, and the plan and schedule for creation and adoption of the solution.

This topic focuses on answering questions such as:

  • What are the goals of the implementation, and are they documented, well understood, and measurable?
  • What is the methodology being used to guide the implementation, and is it well understood by the entire implementation team?
  • What is the structure that is in place for the team that will conduct the implementation?
  • Are roles and responsibilities of all project roles documented and understood?
  • What is the process to manage scope and changes to scope, status, risks, and issues?
  • What is the plan and timeline for the implementation?
  • What is the approach to managing work within the plan?
  • What are the external dependencies and how are they considered in the project plan?
  • What are the timelines for planned rollout?
  • What is the approach to change management and adoption?
  • What is the process for gathering, validating, and approving requirements?
  • How and where will requirements be tracked and managed?
  • What is the approach for traceability between requirements and other aspects of the implementation (such as testing, training, and so on)?
  • What is the process for assessing fits and gaps?

Test strategy

Test strategy covers the various aspects of the implementation that deal with validating that the implemented solution works as defined and will meet the business need.

This topic focuses on answering questions such as:

  • What are the phases of testing and how do they build on each other to ensure validation of the solution?
  • Who is responsible for defining, building, implementing, and managing testing?
  • What is the plan to test performance?
  • What is the plan to test security?
  • What is the plan to test the cutover process?
  • Has a regression testing approach been planned that will allow for efficient uptake of updates?

Business process strategy

Business process strategy considers the underlying business processes (the functionality) that will be implemented on the Microsoft Dynamics 365 platform as part of the solution and how these processes will be used to drive the overall solution design.

This topic focuses on answering questions such as:

  • What are the top processes that are in scope for the implementation?
  • What is currently known about the general fit for the processes within the Dynamics 365 application set?
  • How are processes being managed within the implementation, and how do they relate to subsequent areas of the solution such as user stories, requirements, test cases, and training?
  • Is the business process implementation schedule documented and understood?
  • Are requirements established for offline implementation of business processes?

Based on the processes that are in scope, the solution architect who is conducting the review might ask a series of feature-related questions to gauge complexity or understand potential risks or opportunities to optimize the solution based on the future product roadmap.

Application strategy

Application strategy considers the various apps, services, and platforms that will make up the overall solution.

This topic focuses on answering questions such as:

  • Which Dynamics 365 applications or services will be deployed as part of the solution?
  • Which Microsoft Azure capabilities or services will be deployed as part of the solution?
  • What new external application components or services will be deployed as part of the solution?
  • What legacy application components or services will be deployed as part of the solution?
  • What extensions to the Dynamics 365 applications and platform are planned?

Data strategy

Data strategy considers the design of the data within the solution and the design for how legacy data will be migrated to the solution.

This topic focuses on answering questions such as:

  • What are the plans for key data design issues like legal entity structure and data localization?
  • What is the scope and planned flow of key master data entities?
  • What is the scope and planned flow of key transactional data entities?
  • What is the scope of data migration?
  • What is the overall data migration strategy and approach?
  • What are the overall volumes of data to be managed within the solution?
  • What are the steps that will be taken to optimize data migration performance?

Integration strategy

Integration strategy considers the design of communication and connectivity between the various components of the solution. This strategy includes the application interfaces, middleware, and the processes that are required to manage the operation of the integrations.

This topic focuses on answering questions such as:

  • What is the scope of the integration design at an interface/interchange level?
  • What are the known non-functional requirements, like transaction volumes and connection modes, for each interface?
  • What are the design patterns that have been identified for use in implementing interfaces?
  • What are the design patterns that have been identified for managing integrations?
  • What middleware components are planned to be used within the solution?

Business intelligence strategy

Business intelligence strategy considers the design of the business intelligence features of the solution. This strategy includes traditional reporting and analytics. It includes the use of reporting and analytics features within the Dynamics 365 components and external components that will connect to Dynamics 365 data.

This topic focuses on answering questions such as:

  • What are the processes within the solution that depend on reporting and analytics capabilities?
  • What are the sources of data in the solution that will drive reporting and analytics?
  • What are the capabilities and constraints of these data sources?
  • What are the requirements for data movement across solution components to facilitate analytics and reporting?
  • What solution components have been identified to support reporting and analytics requirements?
  • What are the requirements to combine enterprise data from multiple systems/sources, and what does that strategy look like?

Security strategy

Security strategy considers the design of security within the Dynamics 365 components of the solution and the other Microsoft Azure and external solution components.

This topic focuses on answering questions such as:

  • What is the overall authentication strategy for the solution? Does it comply with the constraints of the Dynamics 365 platform?
  • What is the design of the tenant and directory structures within Azure?
  • Do unusual authentication needs exist, and what are the design patterns that will be used to solve them?
  • Do extraordinary encryption needs exist, and what are the design patterns that will be used to solve them?
  • Are data privacy or residency requirements established, and what are the design patterns that will be used to solve them?
  • Are extraordinary requirements established for row-level security, and what are the design patterns that will be used to solve them?
  • Are requirements in place for security validation or other compliance requirements, and what are the plans to address them?

Application lifecycle management strategy

Application lifecycle management (ALM) strategy considers those aspects of the solution that are related to how the solution is developed and how it will be maintained given that the Dynamics 365 apps are managed through continuous update.

This topic focuses on answering questions such as:

  • What is the preproduction environment strategy, and how does it support the implementation approach?
  • Does the environment strategy support the requirements of continuous update?
  • What plan for Azure DevOps will be used to support the implementation?
  • Does the implementation team understand the continuous update approach that is followed by Dynamics 365 and any other cloud services in the solution?
  • Does the planned ALM approach consider continuous update?
  • Who is responsible for managing the continuous update process?
  • Does the implementation team understand how continuous update will affect go-live events, and is a plan in place to optimize versions and updates to ensure supportability and stability during all phases?
  • Does the ALM approach include the management of configurations and extensions?

Environment and capacity strategy

Environment and capacity strategy considers those aspects of the solution that are related to cloud infrastructure, environments, and the processes that are involved in operating the cloud solution.

This topic focuses on answering questions such as:

  • Has a determination been made about the number of production environments that will be deployed, and what are the factors that went into that decision?
  • What are the business continuance requirements for the solution, and do all solution components meet those requirements?
  • What are the master data and transactional processing volume requirements?
  • What locations will users access the solution from?
  • What are the network structures that are in place to provide connectivity to the solution?
  • Are requirements in place for mobile clients or the use of other specific client technologies?
  • Are the licensing requirements for the instances and supporting interfaces understood?

A solution blueprint is essential for an effective solution architecture, and the guiding principles above will help in this process.

Thank you for reading…

Hope this helps…

Cheers,

PMDY

Visualize this view – what does this mean for developers and end users…?

Hi Folks,

Have you noticed the Visualize this view button in the app bar of any grid view of Dynamics 365?

Here is a dashboard built within a couple of minutes. This can greatly help end users visualize the data present in the system. So, in this post, let’s understand this capability in a bit more detail, including some of the features it leaves behind.

Let’s understand how this is generated, along with its capabilities and disadvantages compared to a traditional Power BI dashboard, from both a developer and an end-user perspective. Please note that this is my understanding.

For Developers:

a. Visualize this view uses a PCF control which calls the Power BI REST API, generates an embed token for the report, and embeds it into an iframe.

b. It then uses the Power BI JavaScript API to handle user interactions with the embedded report, such as filtering or highlighting data points.

c. When Power BI first generates your report, it looks through your data to identify patterns and distributions and picks a couple of fields to use as starting points for creating the initial set of visuals when no data is preselected.

d. Any change to the data fields calls the updateView method of the PCF control, thereby passing the updated data fields to the REST API, which then refreshes the visuals.

e. Visuals are created with both the selected fields and non-selected fields that are related to them in the data pane.

For End Users & Developers:

Advantages:

  1. Visuals are generated even when no data is selected
  2. Cross-highlighting is possible
  3. Click on the report to see the Personalize this visual option
  4. People with the Contributor, Member, or Admin role assigned can save the report to a workspace
  5. Users with no access to Power BI can’t view this feature, but they can request a free Power BI license
  6. Free-license users can save the report to their personal workspace
  7. Users get build permission when any role above Contributor is assigned and reshare permission is given
  8. The report is saved as DirectQuery with single sign-on enabled and honours Dataverse settings
  9. Show data table presents a list of tables if the model comprises multiple tables
  10. You can specify the aggregation for each of the fields in the model

Disadvantages:

  1. You can only export summarized data from visuals; the full data in a table can be exported from Show data table
  2. Only visual-level filters are available; there are no page-level or report-level filters
  3. When these reports are created, the model is configured to use DirectQuery with single sign-on
  4. Embedding a report on a Dataverse form requires modifying the XML of the solution
  5. Reports published to the workspace are available to download, but downloaded reports can’t be customized further in Power BI Desktop because they are built using native queries
  6. If the page is kept idle for a long time or the user navigates to another browser window, the session and report will be lost

Considerations & Limitations:

  1. A Power BI Pro license is required to create these reports
  2. While this is wonderful for end users to visualize the data, it is not an alternative to building reports using Power BI Desktop.

Hope this helps.

Cheers,

PMDY

Call Custom Actions in Dataverse using Web API – Quick Recap

Hi Folks,

Here is how you can quickly call an action using the Web API. With this method you can execute a single action, function, or CRUD operation. In the example below, let’s see how you can call an action. Here is the function to achieve this:

// Form event handler (for example, registered on the NRIC field's OnChange event)
function validateNRIC(executionContext) {
    var formContext = executionContext.getFormContext();
    var message = "Please enter a valid NRIC Number";
    var uniqueId = "nric_valid";
    var NRIC = formContext.getAttribute("new_nric").getValue();
    if (NRIC !== null) {
        var execute_ValidateNRIC = {
            NRIC: NRIC, // Input parameter; the action is called only when NRIC is non-null
            getMetadata: function () {
                return {
                    boundParameter: null,
                    parameterTypes: {
                        NRIC: { typeName: "Edm.String", structuralProperty: 1 }
                    },
                    operationType: 0,
                    operationName: "new_ValidateNRIC",
                    outputParameterTypes: {
                        IsValid: { typeName: "Edm.Boolean" }
                    }
                };
            }
        };
        Xrm.WebApi.execute(execute_ValidateNRIC).then(
            function success(response) {
                if (response.ok) {
                    response.json().then(function (data) {
                        if (!data.IsValid) {
                            formContext.getControl("new_nric").setNotification(message, uniqueId);
                        } else {
                            formContext.getControl("new_nric").clearNotification(uniqueId);
                        }
                    }).catch(function (error) {
                        Xrm.Navigation.openAlertDialog({ text: "Error occurred from Validate NRIC: " + error });
                    });
                }
            }
        ).catch(function (error) {
            Xrm.Navigation.openAlertDialog({ text: error.message });
        });
    }
}

This example deals with an unbound action, which is not tied to any entity. For a bound action, however, you specify the entity name for the bound parameter instead of null, and you need to define the metadata accordingly for your action. Let’s understand the syntax first…

Xrm.WebApi.online.execute(request).then(successCallback, errorCallback);

Parameters

  • request – the request object to execute; it must include the input parameters and a getMetadata function describing the operation (as in the example above).
  • successCallback – a function to call when the operation succeeds; it receives the response object.
  • errorCallback – a function to call when the operation fails; it receives the error object.

Is your plugin not running? Have you debugged? The plugin doesn’t run, but your operation is successful when debugging… then try this out

Hi Folks,

The last few weeks were very busy for me, and I missed interacting with the community.

Here I would like to share one tip which can greatly help your debugging…

Just to give a little background: I was recently working on plugins for Dynamics 365 that call an external API. The plugin seemed to work fine when debugged using the Profiler, and I also tested the piece of plugin code in a console application, where it worked too, but the plugin was not working when the action which triggers it was actually fired. I scratched my head – what was the problem?

Just then, I tried the below block of code, replacing the catch block of the plugin code with it.

catch (WebException ex)
{
    string stringResponse = string.Empty;
    int statusCode;
    using (WebResponse response = ex.Response)
    {
        HttpWebResponse httpResponse = (HttpWebResponse)response;
        statusCode = (int)httpResponse.StatusCode;
        using (Stream data = response.GetResponseStream())
        using (var reader = new StreamReader(data))
        {
            stringResponse = reader.ReadToEnd();
        }
    }
    // Surface the detailed error body instead of a generic failure
    throw new InvalidPluginExecutionException(
        $"The request failed with status code {statusCode}: {stringResponse}");
}
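For context, here is a minimal sketch of where such a catch block sits; the URL and request handling are placeholders I’ve added, not the original integration code:

try
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://example.com/api/endpoint"); // placeholder URL
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string body = reader.ReadToEnd(); // process the successful response
    }
}
catch (WebException ex)
{
    // ... the detailed-error handling shown above ...
    throw;
}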

From the detailed error message produced by the code above, I soon observed that it was failing because of a version problem: the referenced DLL version was not supported by my assembly.

I then referenced the correct DLL version in my plugin, which fixed the issue. No further debugging was needed.

Hope this helps…

Cheers,

PMDY

Another way to install the Plugin Registration Tool for Dynamics 365 CE from NuGet

Hi Folks,

If you are a Power Platform or Dynamics 365 CE developer, you will definitely need to work with the Plugin Registration Tool at some point, and having it as a local application greatly helps. In this post, I will show a slightly different, and very easy, way to install the Plugin Registration Tool.

Well, this approach was especially useful to me when I got a new laptop and needed to work with the Plugin Registration Tool on plugins already built for the implementation.

The first three ways of downloading the Plugin Registration Tool might be known to everyone… but do you know there is a fourth approach as well?

  1. From XrmToolBox
  2. From https://xrm.tools/SDK
  3. Installation from CLI
  4. See below

Because there were limitations to these approaches, at least in my experience, I found the fourth one very useful.

  1. XrmToolBox – Not quite convenient for profiling and debugging your plugins
  2. https://xrm.tools/SDK – DLLs in the downloaded folder will be blocked, and you would need to manually unblock them for the tool to work properly
  3. CLI – People rarely use this.

Just note that this approach is very easy and works only if you already have a plugin project. Please follow the steps below.

  1. Just open the plugin project.
  2. Right-click on the solution and choose Manage NuGet Packages for Solution.
  3. Search for the Plugin Registration Tool (the Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool package) as below.

4. Choose the plugin project and click Install, confirm the prompt, and accept the license agreement shown.

5. Once installed, go to the project folder on your local machine.

6. Navigate to the packages folder; you should see a folder for the Plugin Registration Tool as below.

7. There you go – you can open the Plugin Registration application under the tools folder. You can undo the changes to the assembly if it is linked to source control.

That’s it – how easy was that? Hope this helps someone.

Cheers,

PMDY

Dataverse Accelerator | API playground (Preview)

Hi Folks,

In this post, I will talk briefly about the features of the Dataverse accelerator. The Microsoft Dataverse accelerator is a model-driven application that provides access to select preview features and tooling related to Dataverse development. It is totally different from the Dataverse industry accelerators.

The Dataverse accelerator app is automatically available in all new Microsoft Dataverse environments. If your environment doesn’t already have it, you can install it by going to Power Platform admin center –> Environments –> Dynamics 365 apps –> Install app –> Choose Dataverse Accelerator.

You can also refer to my previous blog post on installing it here if you prefer

Once installed, you should see something like below under Apps.

On selecting the Dataverse Accelerator app, you should see something like below. Do note that you must have app-level access to the Dataverse accelerator model-driven app, such as the System Customizer role or direct access via a security role.

Now let’s quickly see what features are available with the Dataverse accelerator.

  • Low-code plug-ins – Reusable, real-time workflows that execute a specific set of commands within Dataverse. Low-code plug-ins run server-side and are triggered by personalized event handlers, defined in Power Fx.
  • Plug-in monitor – A modern interface to surface the existing plug-in trace log table in Dataverse environments, designed for developing and debugging Dataverse plug-ins and custom APIs. (Remember viewing plug-in trace logs from customizations? You no longer need the System Administrator role to view trace logs; access to this app will do, and everything else remains the same.)
  • API Playground – A preauthenticated software testing tool which helps to quickly test and play with the Dataverse APIs.

I wrote a blog post earlier on using low-code plug-ins, which you may check out here; using the Plug-in monitor is pretty straightforward.

You can find my blog post on using Postman to test Dataverse APIs here.

Now let’s see how we can use the API Playground. Basically, you will be able to test the request types below from the API Playground, similar to Postman. All you need to do is open the API Playground from the Dataverse accelerator; you will be preauthenticated while using it.

  • Custom API – Includes any Dataverse Web API actions/functions from Microsoft, or any public user-defined custom APIs registered in the working environment.
  • Instant plug-in – Instant plug-ins are classified as any user-defined workflows registered as a custom API in the environment with a related Power Fx expression.
  • OData request – Allows more granular control over the request inputs to send OData requests.

Custom API, Instant plug-in – Select the relevant request in the drop-down available in the API Playground and provide the necessary input parameters, if required, for your request.

OData request – Select OData as your request, provide the plural name of the entity (for example, accounts), and hit Send.

After a request is sent, the response is displayed in the lower half of your screen, which would look something like below.

OData response

I will update this post as these features get released in my region (APAC), because at the time of writing, the API Playground feature was being rolled out globally and was still in preview.

The Dataverse accelerator isn’t available in GCC or GCC High environments.

Hope you learned something about the Dataverse accelerator.

Cheers,

PMDY

Using Bulk Operations messages – #01 (Plugins)

Well, this could be a very interesting post, as we talk about optimizing Dataverse performance using bulk operation messages – and that too using Dataverse plugin customizations. But wait: this post is not complete, because of an issue which I will talk about later in the blog. First, let’s dig into this feature by actually trying it out. Generally, every business wants improved performance for any logic attached to out-of-the-box messages, so developers try to optimize their code in various ways when using Dataverse messages.

Before diving deeper into this article, let’s first understand the differences between standard and elastic tables. If you want an introduction to elastic tables, which were newly introduced last year, you can refer to my previous post on elastic tables here.

The type of table you choose to store your data has the greatest impact on how much throughput you can expect with bulk operations. You can choose between two types of tables in Dataverse; below are some key differences for reference:

  • Data structure – Standard tables: defined schema; Elastic tables: flexible schema
  • Storage – Standard tables: Azure SQL; Elastic tables: Azure Cosmos DB
  • Data integrity – Standard tables: ensured; Elastic tables: less strict
  • Relationship model – Standard tables: supported; Elastic tables: limited
  • Performance – Standard tables: predictable; Elastic tables: variable, preferred for unpredictable and spiky workloads
  • Agility – Standard tables: limited; Elastic tables: high
  • Personalization – Standard tables: limited; Elastic tables: extensive

Standard and Elastic Table Differences

Plugins:

With bulk operation messages, the APIs being introduced are CreateMultiple, UpdateMultiple, DeleteMultiple (only for elastic tables), and UpsertMultiple (preview). As of now, you’re not required to migrate your plug-ins to use CreateMultiple and UpdateMultiple instead of the Create and Update messages. Your logic for Create and Update continues to be applied when applications use CreateMultiple or UpdateMultiple.

This is mainly done to prevent maintaining two separate sets of business logic for short-running and long-running activities. Microsoft has merged the message processing pipelines for these messages (Create and CreateMultiple; Update and UpdateMultiple), which means your Create and Update logic continues to trigger for your existing implemented scenarios and still behaves the same when callers move to CreateMultiple and UpdateMultiple.

A few points for consideration:

  1. In my testing, only IPluginExecutionContext provides the information; I have noted that the Microsoft documentation suggests using IPluginExecutionContext4 for bulk messages in plugins, but it is still showing up as null for me.
  2. While working with Create, Update, and Delete, you could use the Target property to get the input parameters; when working with bulk operation messages, you need to use Targets instead of Target.
  3. Instead of checking whether the target is an Entity, you need to check for an EntityCollection, loop through it, and perform your desired business logic.
  4. Coming to images in plugins, these will be retrieved only when you use IPluginExecutionContext4.

Below is the image from the Plugin Registration Tool for reference (I have taken UpdateMultiple as the example; you can utilize any of the bulk operation messages).

Sample:

Below is a sample of how your bulk operation message plugin can look. You don’t need to use all the contexts; I have used them just to check them out.

using System;
using Microsoft.Xrm.Sdk;

namespace Plugin_Sample
{
    public class BulkMessagePlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IPluginExecutionContext2 context2 = (IPluginExecutionContext2)serviceProvider.GetService(typeof(IPluginExecutionContext2));
            IPluginExecutionContext3 context3 = (IPluginExecutionContext3)serviceProvider.GetService(typeof(IPluginExecutionContext3));
            IPluginExecutionContext4 context4 = (IPluginExecutionContext4)serviceProvider.GetService(typeof(IPluginExecutionContext4));
            ITracingService trace = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            // Verify input parameters: bulk operation messages carry an EntityCollection in "Targets"
            if (context4.InputParameters.Contains("Targets") && context4.InputParameters["Targets"] is EntityCollection entityCollection)
            {
                // Verify expected entity images from step registration (one image collection per target)
                if (context4.PreEntityImagesCollection.Length == entityCollection.Entities.Count)
                {
                    int count = 0;
                    foreach (Entity entity in entityCollection.Entities)
                    {
                        EntityImageCollection entityImages = context4.PreEntityImagesCollection[count];

                        // Verify expected entity image from step registration
                        if (entityImages.TryGetValue("preimage", out Entity preImage))
                        {
                            bool entityContainsSampleName = entity.Contains("sample_name");
                            bool entityImageContainsSampleName = preImage.Contains("sample_name");
                            if (entityContainsSampleName && entityImageContainsSampleName)
                            {
                                // Verify that the 'sample_name' values are different
                                if (!entity["sample_name"].Equals(preImage["sample_name"]))
                                {
                                    string newName = (string)entity["sample_name"];
                                    string oldName = (string)preImage["sample_name"];
                                    string message = $"\r\n - 'sample_name' changed from '{oldName}' to '{newName}'.";

                                    // If 'sample_description' is included in the update, don't overwrite it, just append to it.
                                    if (entity.Contains("sample_description"))
                                    {
                                        entity["sample_description"] = (string)entity["sample_description"] + message;
                                    }
                                    else // Not included in the update: overwrite with the current value plus the addition.
                                    {
                                        entity["sample_description"] = (string)preImage["sample_description"] + message;
                                    }
                                }
                            }
                        }
                        count++;
                    }
                }
            }
        }
    }
}

I have posted this question to Microsoft to get more details on why IPluginExecutionContext4 is null; I am still not sure whether this is simply not yet deployed to my region – my environment is in India.

Recommendations for Plugins:

  • Don’t try to introduce CreateMultiple, UpdateMultiple, or UpsertMultiple in a separate step, as that would cause the logic to fire twice – once for the Create operation and again for CreateMultiple.
  • Don’t use batch request types such as ExecuteMultipleRequest, ExecuteTransactionRequest, CreateMultipleRequest, UpdateMultipleRequest, or UpsertMultipleRequest in plugins, as user experience is degraded and timeout errors can occur.
  • Instead, use bulk operation messages like CreateMultipleRequest, UpdateMultipleRequest, and UpsertMultipleRequest from your client applications (see the sketch after this list).
  • There is no need to use ExecuteTransactionRequest in synchronous plugins, as they already execute within the transaction.
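To illustrate the client-side recommendation above, here is a minimal sketch, assuming a hypothetical sample_item table and an already-authenticated IOrganizationService named service; it replaces a loop of individual Create calls with one CreateMultipleRequest:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

// Assumes 'service' is an authenticated IOrganizationService and
// 'sample_item' is a hypothetical table with a 'sample_name' column.
var targets = new EntityCollection() { EntityName = "sample_item" };
for (int i = 0; i < 100; i++)
{
    var row = new Entity("sample_item");
    row["sample_name"] = $"Bulk record {i}";
    targets.Entities.Add(row);
}

// One round trip creates all rows; plugin steps registered on Create
// still run because the message pipelines are merged.
var request = new CreateMultipleRequest { Targets = targets };
var response = (CreateMultipleResponse)service.Execute(request);
Console.WriteLine($"Created {response.Ids.Length} records.");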

Hope this guidance will help someone trying to customize their Power Platform solutions using plugins.

I will write another blog post on using bulk operation messages for client applications…

Cheers,

PMDY

Start Transitioning your Dynamics 365 Client Applications to use the Dataverse Client

Hi Folks,

This blog post deals with what you need to do for your client applications, specifically to use the Dataverse ServiceClient API instead of the existing CrmServiceClient (core assemblies) API.

Below are three reasons cited by Microsoft for why we need to be aware of this move.

1. Cross-platform application support: with the introduction of Microsoft.PowerPlatform.Dataverse.Client, the new Dataverse ServiceClient supports cross-platform development.

2. MSAL authentication: the new Dataverse ServiceClient API uses MSAL, while the older CrmServiceClient API uses ADAL. ADAL.NET is no longer supported.

3. Performance and functional benefits: we can have one authentication handler per web service connection instead of just one per process. The Dataverse ServiceClient class supports a smaller interface surface, inline authentication by instance, and Microsoft.Extensions.Logging.ILogger.

What’s the impact?

  • Plug-ins or custom workflow activities – no changes
  • New or existing online applications – changes are needed, but not immediately…
  • On-premises applications – this article is not for you, yet

So it impacts online client applications only. You really don’t need to worry much about this: the class member signatures of ServiceClient and CrmServiceClient are the same, except for the class names themselves being slightly different. Application code should not need any significant changes.

As of now, no changes to your code are required, but keep in mind that in the future the CRM 2011 service endpoint will be deprecated, and this change will become mandatory.

So, what should you do to incorporate this change?

Use the Microsoft.PowerPlatform.Dataverse.Client assemblies from NuGet instead of CrmSdk.CoreAssemblies.

Add a using statement for Microsoft.PowerPlatform.Dataverse.Client.

Use ServiceClient instead of CrmServiceClient; ServiceClient gives you your IOrganizationService.
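As a minimal sketch (the connection string values below are placeholders I’ve added; substitute your own environment URL and app registration details), the switch looks something like this:

using System;
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Hypothetical connection details - replace with your environment's values.
string connectionString =
    "AuthType=ClientSecret;" +
    "Url=https://yourorg.crm.dynamics.com;" +
    "ClientId=00000000-0000-0000-0000-000000000000;" +
    "ClientSecret=your-client-secret";

// ServiceClient replaces CrmServiceClient and implements IOrganizationService.
using (var serviceClient = new ServiceClient(connectionString))
{
    IOrganizationService service = serviceClient;
    EntityCollection accounts = service.RetrieveMultiple(
        new QueryExpression("account") { ColumnSet = new ColumnSet("name"), TopCount = 5 });
    foreach (Entity account in accounts.Entities)
    {
        Console.WriteLine(account.GetAttributeValue<string>("name"));
    }
}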


Be strategic to minimize the impact to your apps.

Cheers,

PMDY

Update user personal settings automatically when a new user gets added to Dynamics 365 environments

Hi Folks,

In the Dynamics 365 world, it’s all about efficiently handling user requests. Whenever you add a user to an environment, the system applies the default personal settings for that user. You may have processes in your system that depend on the user’s time zone, so setting the time zone is very important, and it is tedious to update the personal settings manually by going to each user’s profile every time.

In case you want to do it for all users at one time during initial setup, you can follow my blog post Update Model Driven App Personal Settings from XrmToolBox.

Of course, you have a wonderful tool in XrmToolBox with which you can set user personal settings in bulk and update all users in one go. But what if you want to automate this process, i.e. whenever you add a new user to the Dynamics 365 environment, set that person’s time zone automatically without any manual intervention?

There you go… this post is for you then. You can do it simply using a plugin or Power Automate. In this blog post, we will see how to utilize a plugin, as it is the more effective approach.

You need to write a plugin on the Associate message.

Just use this piece of code to set the personal settings…

using System;
using System.ServiceModel;
using Microsoft.Xrm.Sdk;
using Microsoft.Crm.Sdk.Messages;

namespace Ecellors_Demo
{
    public class Demo : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the tracing service
            ITracingService tracingService =
                (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            // Obtain the execution context from the service provider.
            IPluginExecutionContext context = (IPluginExecutionContext)
                serviceProvider.GetService(typeof(IPluginExecutionContext));

            IOrganizationServiceFactory serviceFactory =
                (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            if (context.InputParameters.Contains("Relationship"))
            {
                var relationshipName = context.InputParameters["Relationship"].ToString();
                try
                {
                    // Only react to security role (dis)associations
                    if (relationshipName != "systemuserroles_association.")
                    {
                        return;
                    }

                    // The Target of the Associate/Disassociate message is assumed
                    // here to be the system user whose settings should be updated.
                    var targetUser = (EntityReference)context.InputParameters["Target"];

                    if (context.MessageName == "Associate")
                    {
                        // Logic when a role is added
                        var updateUserSettingsRequest = new UpdateUserSettingsSystemUserRequest();
                        updateUserSettingsRequest.Settings = new Entity("usersettings");
                        updateUserSettingsRequest.UserId = targetUser.Id;
                        updateUserSettingsRequest.Settings.Attributes["timezonecode"] = 215; // Singapore time zone
                        service.Execute(updateUserSettingsRequest);
                    }
                    else if (context.MessageName == "Disassociate")
                    {
                        // Logic when a role is removed
                        var updateUserSettingsRequest = new UpdateUserSettingsSystemUserRequest();
                        updateUserSettingsRequest.Settings = new Entity("usersettings");
                        updateUserSettingsRequest.UserId = targetUser.Id;
                        updateUserSettingsRequest.Settings.Attributes["timezonecode"] = 0; // UTC time zone
                        service.Execute(updateUserSettingsRequest);
                    }
                }
                catch (FaultException<OrganizationServiceFault> ex)
                {
                    throw new InvalidPluginExecutionException("An error occurred in UserSettingsPlugin.", ex);
                }
                catch (Exception ex)
                {
                    tracingService.Trace("UserSettingsPlugin: {0}", ex.ToString());
                    throw;
                }
            }
        }
    }
}

Update the personal settings as per your needs in this request. You can easily find all the columns of the user settings table by using FetchXML Builder.
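For instance, a couple of other commonly adjusted usersettings columns can be set in the same request; the values below are illustrative, so verify the locale IDs appropriate for your organization:

// Illustrative additions to the same UpdateUserSettingsSystemUserRequest.
updateUserSettingsRequest.Settings.Attributes["localeid"] = 1033;     // number/date formats: English (United States)
updateUserSettingsRequest.Settings.Attributes["uilanguageid"] = 1033; // UI language (must be provisioned in the org)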

Hope this helps someone.

Cheers,

PMDY

Debug Plugins with Dataverse Browser – Quick Recap

Hi Folks,

This post is for everyone working on D365 model-driven apps and, mainly, plugins.

Yes, you saw it right: in this blog post, we will see how to debug a plugin without using our favourite plugin profiler, which has been very widely used for quite some time by everyone working on plugins for Dynamics 365. All of this is done by a tool called Dataverse Browser, which is not yet on XrmToolBox. Please note that there are some limitations, as detailed in the limitations section below.

Here are four simple steps to follow:

1. Install Dataverse Browser.
2. Attach the debugger.
3. Run your actual operation.
4. Step into your code and debug it.

The tool embeds a web browser based on Chromium. It works by translating Web API requests into SDK requests. It then analyzes whether plugin steps are registered on the message, loads them, and makes them run locally. All other requests are sent to Dataverse, so the plugins interact with the real database.

Download the latest release of Dataverse Browser here.

Next, extract the downloaded zip file and open the Dataverse.Browser application as highlighted below.

In the popup window, click on More info as highlighted below…

Then run the application anyway… you will be presented with a window where you can select the environment. Going forward, any time you want to open Dataverse Browser, just open Dataverse.Browser.exe and choose the environment as below.

Click on New and key in the details as above.

• Enter the settings of your environment:
  • A name meaningful to you
  • The host name of your instance (without the https://)
  • The path to the plugins assembly file (the dll). For a better experience, it should be compiled in debug mode with the pdb file generated.

Then click Go.

You just need to authenticate to your instance.

Once authenticated to the respective model-driven app, all the Web API requests sent to Dataverse will be shown as below.

I have the following plugin libraries registered.

The next step is to choose the instance and perform the operation which triggers the plugin. So here, I will perform an update to the Account entity from the Dataverse Browser, which triggers the plugin.

Once an update is performed, a Web API request gets recorded in the Dataverse Browser, as highlighted below.

The plugin here is registered in the post-operation stage, i.e. stage number 40.

Just expand the Patch request; you should see two operations, on stages 30 and 40, but the area of interest here is the plugin which was registered on stage 40.

Make sure you open Visual Studio, and then perform the below steps from Dataverse Browser.

Attach the debugger from Dataverse Browser by clicking on the plug symbol as below, which will show the list of debugger options available for you to select from. Here I have selected Execute plugins, so the plugin will be invoked. You can select any of the three options presented below.

1. Do not execute plugins – recommended when you want to debug without actually triggering your plugin logic, i.e. with this approach you can even check the code in a production environment.

2. Execute plugins / Execute plugins with auto break – recommended when you want to debug by triggering your actual plugin; this is advisable when your plugin code has changed recently, and in development environments.

Just select the Ecellors Demo – Microsoft Visual Studio: Visual Studio Professional 2022 option, which will launch the existing Visual Studio 2022 instance in break mode, as below. Next, click on Continue as highlighted below or press F5 on your keyboard.

When you navigate back to Dataverse Browser, a dialog asking you to place your breakpoints shows that the debugger has been attached.

Now just place breakpoints in your code in Visual Studio, go back to Dataverse Browser, and click OK on the dialog box.

Perform the operation which triggers the plugin from Dataverse Browser itself; this will hit the breakpoint in Visual Studio, from where you can debug your plugin.

As you might have observed, your code doesn’t need to throw an exception in order to debug; you can work just as you would when debugging with the Profiler, except that you don’t need to deploy the latest code to Dataverse just for debugging purposes.

This gives a lot more flexibility and eases the way you debug plugins.

Limitations:

• There is no support for transactions.
• When plugins are triggered because of a server-side operation, they will not run locally.
• For many reasons, behavior will never be perfectly similar to plugins executed on the server side.

Happy debugging, I hope you found this post useful…

References:

Dataverse Dev Browser

Cheers,

PMDY