All you need to know for migrating your Power Platform environments from one region to another

Geo Migration is a great capability offered by Microsoft for customers who wish to move to the region closest to their operations, even though their Power Platform environment was based in a different region when they signed up. I searched online but couldn't find a good reference blog article on it yet, hence this post.

I will make this post a detailed but comprehensive one, so anyone can understand the migration. Customers who need to store data in multiple geographies to satisfy data residency requirements can also opt for Multi-Geo. If you don't know where your Power Platform environment resides, you can check it from the Power Platform Admin Center.

If you were not already aware, Microsoft Azure offers services in more regions than either AWS (Amazon Web Services) or GCP (Google Cloud Platform). The Geo Migration feature allows customers to seamlessly move environments within a single tenant from one region to another; e.g., for Singapore, it is as below.

Important:

  1. Geo Migration is not generally available, so please exercise caution.
  2. You may reach out to your TAM (Microsoft Technical Account Manager) quoting your request.
  3. There are several limitations; see the references below for more details.

Mandatory Pre-Migration Checklist:

  1. Any Power Apps and Power Automate flows should be manually exported prior to the migration. Custom connectors aren't supported as of now; they must be manually reconfigured or recreated in the new environment. You can export them individually or as a group (see the sketch after this list).
  2. Canvas apps, custom pages, and code components such as PCF controls and libraries should be deleted from the environment before your migration activity starts; otherwise they might be left in a corrupted state after the migration.
  3. If any of your apps are not solution-aware for any reason (e.g., an app calls a Power Automate flow when a button is clicked), you may need to explicitly export it and take a backup.
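
If you prefer scripting the backup over the maker portal UI, here is a minimal sketch using the Power Platform CLI (pac); the org URL, solution name, and output path are placeholders for your own values:

:: authenticate against the source environment (placeholder URL)
pac auth create --url https://yourorg.crm.dynamics.com
:: export one solution to a local zip for safekeeping (placeholder name/path)
pac solution export --name MySolution --path C:\backup\MySolution.zip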

Post-Migration Checklist:

  1. After the migration, import all the packages you backed up during pre-migration (see the sketch after this list). Import the ones that were not solution-aware manually.
  2. If you have Power Apps portals or Power Virtual Agents bots, those should be exported and re-imported explicitly.
  3. Make sure you test all functionality so that end users are not impacted.
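
Again as a Power Platform CLI sketch, assuming the same placeholder path as above:

:: authenticate against the target environment, then import the backup
pac solution import --path C:\backup\MySolution.zip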

Notes:

You don't need to build apps and flows from scratch. The Dynamics 365 Marketing app is not supported yet. There could be some configuration changes needed post-migration.

While I have tried to put this information together to the best of the available Microsoft sources, it may change over time, and the specifics will vary since each customer has different workloads and dependencies on other services; please read the references carefully before proceeding. Contact Microsoft Support or your TAM as necessary.

Hope this helps give you a sneak peek into the migration process.

References:

Where is your data stored?

Multi-Geo Architecture

Dynamics 365 & Power Platform new regions

Advanced Data Residency Move Program

Geo to Geo Migrations

Cheers,

PMDY

Installing GnuPG – Your open-source software companion to encrypt/decrypt files for your Power Platform Integrations

What’s GnuPG?

GnuPG is a complete and free implementation of the OpenPGP standard. GnuPG allows you to encrypt and sign your data and communications; it features a versatile key management system, along with access modules for all kinds of public key directories. GPG can use both symmetric and asymmetric encryption to encrypt and decrypt.

So, now let's talk about the tool Gpg4win. Gpg4win is an email and file encryption package for most versions of Microsoft Windows and Microsoft Outlook, which utilizes the GnuPG framework for symmetric and public-key cryptography: data encryption, digital signatures, hash calculations, etc. It's a free, open-source tool that is widely used across encryption implementations. So, let's see how you can install the GnuPG software.

You can navigate to this GnuPG Download link for the official download page and download the latest version; as of writing this blog, Gpg4win 4.2.0 is the latest.

Gpg4win 4.2.0 mainly contains the following; the rest of the components aren't of interest for this blog:

1. GnuPG 2.4.3: the actual software used to encrypt and decrypt.

2. Kleopatra 3.1.28: a certificate manager and GUI for GnuPG; it stores all your certificates and keys.

Choose $0 (the download is donation-supported) and proceed to download.

This downloads the Gpg4win software. Once it finishes, click it to start your installation and choose the components you require.

You can proceed to select only GnuPG, only Kleopatra, or both; GnuPG is the command-line core, and Kleopatra is a Windows GUI utility.

If you choose not to install Kleopatra, that's OK; you will still be able to encrypt and decrypt, but only from the command line. If you have Kleopatra, you can use the GUI for encryption and decryption.

Once you have installed GnuPG, just open Command Prompt and start entering gpg commands.

You can also check the home folder where all your keyrings will be stored (by default, %APPDATA%\gnupg on Windows).

With gpg now set up on your PC, you will be able to encrypt and decrypt using gpg command-line scripts.
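
For example, a minimal sketch of a full round trip; the recipient ID and file names are placeholders:

:: generate a key pair interactively (one-time setup)
gpg --gen-key
:: encrypt a file with a recipient's public key (asymmetric)
gpg --encrypt --recipient "user@example.com" secret.txt
:: decrypt it back to plain text
gpg --output plain.txt --decrypt secret.txt.gpg
:: or use symmetric (passphrase-based) encryption instead
gpg --symmetric secret.txt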

OK, now everything is good. But what about other people who log into this PC: will they be able to use the gpg commands to encrypt or decrypt? Of course not; for that you need to follow the steps below.

All you need to do is set an environment variable (GNUPGHOME) with user scope and set the home location where gpg will look for keys on that machine.

Once you have set this, the home location of gpg is changed, so any user who has access to this path will be able to encrypt or decrypt without issues.
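
A sketch of doing this from Command Prompt; the shared folder path is a placeholder:

:: GNUPGHOME is the variable gpg honors for its home directory;
:: setx writes a user-scoped variable (takes effect in new sessions;
:: run it per user, or use setx /M for machine scope)
setx GNUPGHOME "C:\Shared\gnupg"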

You can check the modified location by using this command:
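
One way to verify it (the Home line in the output reflects the active gpg home directory):

gpg --version
:: the last lines of the output include e.g.  Home: C:\Shared\gnupg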

I hope you have learned something. Just below this blog post, I have added a link to the post where we will see how you can encrypt and decrypt files using the gpg command-line utility called from C#. Any questions, do let me know in the comments.

Happy integrating Power Platform with 3rd-party applications!

Cheers,

PMDY

Does your Visual Studio stop responding when opening Data Flow Tasks in SSIS packages on your local development machine? – Quick Tip

Hi Folks,

Thank you for visiting my blog today. This is another post about the SSIS Data Flow Task, covering an issue I encountered while performing data-loading tasks using SSIS, and I would like to share it with everyone.

Does your Visual Studio keep not responding when you open the Data Flow Tasks of SSIS packages you or your team created, as shown in the image below? If you always end up closing it from the task bar because you can't work, and it keeps frustrating you, then this tip is absolutely for you.

The problem is actually with your connection manager. Your Data Flow Task might have OLE DB connections which the package uses, for instance to write information when there are failures in the data flow. In my case, I was writing to a SQL table using an OLE DB Destination component.

If you cross-check that SQL Server's availability, you may see the SQL Server (your instance) service is stopped when you check Start → Services on the PC. In my case, I was using SQL Server (SQLEXPRESS01) in the SSIS package, as below.

And since the SQL Server service is in stopped mode, Visual Studio is not able to acquire the connection to open the package. You are almost there.

Just start the service you were using and voilà, your Visual Studio should open normally (see the sketch below for the command-line route).
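
A sketch of starting it from an elevated Command Prompt; named instances use the MSSQL$<instance> service name, so for SQLEXPRESS01:

:: start the stopped SQL Server named instance
net start MSSQL$SQLEXPRESS01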

Thank you for reading….

Cheers,

PMDY

Unable to profile Custom Workflow using Profiler – Quick Fix

Hi Folks,

I am a big fan of Power Automate, but this post is not about flows; it is about custom workflows in Dynamics 365 CE.

Did you ever come across a problem where you were not able to debug a custom workflow extension? I did, and this blog post is all about it. I had successfully registered my custom workflow, but it was not triggering at all.

So, I needed to debug it to see what the exact issue was, as I was encountering this error.

The error message says Duplicate workflow activity group name: ‘EcellorsDemo.Cases(1.0.0.0) (Profiled)‘. So, I checked my code, plugin steps, and activated plugins, but couldn't find any duplicates.

Usually, while debugging your custom workflow using the profiler, the workflow goes into draft mode and another copy of it gets created with (Profiled) appended to its name. In my case, however, I didn't see that behavior; at the same time, I was unable to use the profiler after the first profiling session, and it gave me the error shown above.

To resolve this, just delete the leftover profiled plugin assemblies, which you can find in the Default Solution, as highlighted below.

Once you have deleted them, try to debug the custom workflow again, and voilà!

Hope this helps someone troubleshooting custom workflows!

Cheers,

PMDY

Xrm.WebApi with Promises for synchronous calls in JavaScript

Hi Folks,

Here is how I quickly achieved a synchronous retrieve-multiple call using the Web API and Promises with the help of JavaScript. I don't want to make my post too detailed, but I would like to share the approach.

All I want to do is restrict saving on contact creation if the postal code entered is not present in the system. This call should behave synchronously, as the message should be shown immediately in case the postal code is not found, and saving of the contact record should be prevented. All you need to do is call the function below on change of Postal Code on the Contact form.

Here, in place of XMLHttpRequest, I have used Xrm.WebApi so that it won't show a critical warning in Solution Checker.

ValidatePostalCode: function (executionContext) {
    "use strict";
    var formContext = executionContext.getFormContext();
    // Resident is this web resource's namespace object (field constants + state)
    var postalcode = formContext.getAttribute(Resident.Fields.address1_postalcode).getValue();
    var uniqueId = "cnt_postalcodenotpresent";
    return new Promise(function (resolve, reject) {
        // note: the original filter used the "hsg_" prefix; normalized here to the
        // "new_" prefix used by the entity name and the $select clause
        Xrm.WebApi.retrieveMultipleRecords("new_postalcodes",
            "?$select=new_postalcode&$filter=new_postalcode eq '" + postalcode + "'").then(
            function success(result) {
                var isNotFound = result !== undefined && result.entities.length === 0;
                if (isNotFound) {
                    // block the save and tell the user why
                    var errorMessage = "Postal Code Mapping is not present for the given postal code";
                    formContext.ui.setFormNotification(errorMessage, "ERROR", uniqueId);
                }
                else {
                    // postal code exists: clear the notification and save for real
                    Resident.isValidationNeeded = false;
                    formContext.ui.clearFormNotification(uniqueId);
                    formContext.data.entity.save();
                }
                // resolves true when the postal code is missing
                resolve(isNotFound);
            },
            function (error) {
                reject(error.message);
            }
        );
    });
}
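
For completeness, here is a minimal sketch of how this could be wired to the form's OnSave event so the initial save is cancelled until validation passes. The Resident.isValidationNeeded flag and handler names follow the pattern above, but this wiring is an assumption, not part of the original post:

// hypothetical OnSave handler registered on the Contact form
OnSave: function (executionContext) {
    "use strict";
    var eventArgs = executionContext.getEventArgs();
    if (Resident.isValidationNeeded) {
        // cancel this save; ValidatePostalCode calls save() again once the
        // postal code is confirmed and isValidationNeeded is set to false
        eventArgs.preventDefault();
        Resident.ValidatePostalCode(executionContext);
    }
}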

References:

What is Promise?

Web API Retrieve Multiple

Action based on Async Operation

Cheers,

PMDY

Power Platform plugin design mistakes which often make you mad!

Hello friends,

The blog below details common plugin design mistakes which often make you spend hours troubleshooting.

  1. Make sure the filtering attributes on your step are configured properly and none are missing.
  2. Don't unnecessarily include all the attributes in the pre/post images.
  3. Don't retrieve all the data in your fetch queries; select only the columns you need.
  4. Make sure your operation doesn't take a long time when you are performing any synchronous operation.
  5. Check whether any other operation is blocking your action or pushing it back in the queue.
  6. Use the trace log if you are able to reach the plugin (see the sketch after this list).
  7. In some cases, if you have issues profiling with the Plug-in Registration Tool, you can alternatively use this method of logging (ITracingService) to trace out the issue.
  8. Use the Depth property to prevent infinite loop execution.
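
A minimal sketch illustrating points 6–8; the class name and trace messages are illustrative only:

using System;
using Microsoft.Xrm.Sdk;

public class DemoPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        // point 8: bail out when the plugin re-enters itself (e.g. an update triggering another update)
        if (context.Depth > 1)
        {
            tracing.Trace("Skipping execution: Depth = {0}", context.Depth);
            return;
        }

        // points 6-7: Trace output appears in the Plug-in Trace Log when tracing is enabled
        tracing.Trace("Executing for message {0} on entity {1}", context.MessageName, context.PrimaryEntityName);
    }
}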

Hopefully you are now at least able to understand why your plugin doesn't trigger.

Thank you, that's it for today.

Cheers,

PMDY

addPreSearch and addCustomFilter to your lookups in Dynamics 365

Using these methods, we can now easily filter lookups in Dynamics 365.

Using addPreSearch, we can attach a handler to the PreSearch event. Inside the handler, we specify the FetchXML filter to be used for filtering. The filter applied in the FetchXML will be combined with any previously added filter as an 'AND' condition.

To remove the handler, we can use the removePreSearch method (see the sketch after the example below).

formContext.getControl(arg).addPreSearch(myFunction)

  1. Pass the execution context.
  2. Specify the argument, which is simply the lookup field you want to add the PreSearch functionality to.

Example:

formContext.getControl("csz_consumedproduct").addPreSearch(filterConsumedProductLookup);

function filterConsumedProductLookup(executionContext) {
    // exclude products whose name contains "Accompanied"
    var entityLogicalName = "product";
    var filter = "<filter>" +
        "<condition attribute='name' operator='not-like' value='%Accompanied%' />" +
        "</filter>";
    executionContext.getFormContext().getControl("csz_consumedproduct").addCustomFilter(filter, entityLogicalName);
}
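
And to detach the handler later (for example, once the filter is no longer needed), a short sketch using removePreSearch with the same handler reference:

// remove the previously registered PreSearch handler
formContext.getControl("csz_consumedproduct").removePreSearch(filterConsumedProductLookup);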

Cheers,

PMDY
