Power Fx Formula Data type – your new companion in addition to Calculated fields in Dataverse [Insight]

Hello Folks,

I believe every Power Platform professional working on Dataverse has, at one time or another, had a chance to work with calculated fields. They provide an easy way to perform calculations on the supported data types, and they have been around since they were introduced with CRM 2015 Update 1.

Here is a very simple calculation example to get your Fx data type up and running in a few seconds….follow along….

Navigate to https://make.powerapps.com/

Open your solution and navigate to the columns in any table….for simplicity, I am taking the Account table as an example…

Now create a new column as below

Key in the values for the field, and make a note that the data type (Fx) is selected

I already have the two fields below on the form for calculating the Annual Revenue per Employee from the company's Annual Revenue…

So now let’s write a simple Power Fx formula to calculate the Annual Revenue per Employee…the expression goes as below…
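The expression from the screenshot boils down to something like this (a minimal sketch; the column display names are assumed from the standard Account table, and Value converts the text column to a number before dividing):

```
'Annual Revenue' / Value('Number of Employees')
```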

Annual Revenue is a currency field and Number of Employees is a single line of text. As soon as you enter the formula, the system automatically identifies the data type as Decimal Number as shown above. Click on save and publish the form…

Let’s see it in action on the form…as soon as you enter values for Annual Revenue and Number of Employees and save, the value of the calculated Revenue per Employee field is computed by the Power Fx expression.

Hope this will be useful in future for your implementations…

Points to keep in view:

  1. Formula columns are in preview at the time of writing this blog post.
  2. Currently, formula columns can’t be used in roll-up fields or with plugins.
  3. You can use the following operators in a formula column:
    +, -, *, /, %, ^, in, exactin, &
  4. Microsoft documentation says that the Currency data type isn’t currently supported, but in practice it works.
  5. The Text and Value functions only work with whole numbers, where no decimal separator is involved.

Ref: Formula Column

Cheers,

PMDY

Setting up a Postman Environment to test the Dataverse APIs – Quick Tip

Hi Folks,

Today in this blog post, I would like to share how we can quickly set up a Postman environment to test the Dataverse APIs right away. It’s very easy and doesn’t require registering a client ID and client secret in Azure AD for authorization if you follow these steps, because the client ID provided below works for all Dataverse environments. Let me take you through it.

You just need a Dataverse environment that you can connect to and the Postman desktop app on your machine (preferably Windows).

  1. Launch the Postman desktop application.
  2. Create a new environment by clicking the + icon in the Environments tab.
  3. Enter a name for your environment, for example, Blog Environment as below.
  4. Get the Web API endpoint URL for your environment by going to Developer resources in make.powerapps.com, then copy the Web API endpoint URL as below…

The next step is to add the following key-value pairs in Postman for connecting to Dataverse. Please make sure you use the same client ID (51f81489-12ee-4a9e-aaae-a2591f45987d); it is the same for connecting to any Dataverse environment.

Variable / Initial value:

  • url: https://<your org name>.api.crm.dynamics.com
  • clientid: 51f81489-12ee-4a9e-aaae-a2591f45987d
  • version: 9.2
  • webapiurl: {{url}}/api/data/v{{version}}/
  • callback: https://localhost
  • authurl: https://login.microsoftonline.com/common/oauth2/authorize?resource={{url}}

Your updated configuration should look something as below in the Postman.

Click on save to save your newly created environment as highlighted below..

Now all you need is to generate an access token to authenticate with your Dataverse environment using OAuth 2.0.

Follow the simple steps below..

Click on the newly created environment, then click the + symbol beside it as highlighted below.

The following pane appears. Select the Authorization tab.

Set the Type to OAuth 2.0 and set “Add authorization data to” to Request Headers. If you scroll down a bit, you will see the Configure New Token option as below.

In the Configure New Token pane, set the following values:

Name / Value / Action:

  • Grant Type: implicit (choose implicit from the drop-down)
  • Callback URL: {{callback}} (copy the value)
  • Auth URL: {{authurl}} (copy the value)
  • Client ID: {{clientid}} (copy the value)

The settings should appear as below

Tip: If you use Postman to connect to multiple Dataverse instances, make sure you clear the cookies in Postman before switching.

Click the Get New Access Token button. Within a moment, an Azure Active Directory sign-in dialog box appears in the browser to authenticate your login. Enter your username and password, and then click Sign In. Once authentication completes, get the token with a few more steps as below.

  1. Wait for authentication to complete.
  2. After the authentication dialogue automatically closes in a few seconds, the Manage Access Tokens pane appears. Click Use Token.
  3. The newly generated token will automatically appear in the text box below the Available Tokens drop-down.

Test your connection

The following shows how to test your connection using WhoAmI:

  1. Select GET as the HTTP method and enter {{webapiurl}}WhoAmI in the address bar.
  2. Click Send to send the request.
  3. If your request is successful, you will see the data returned from the WhoAmI endpoint, like below.
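For reference, the raw request Postman sends looks something like this, and a successful WhoAmI response always returns three GUIDs (values shortened here), which makes it a handy smoke test:

```
GET https://<your org name>.api.crm.dynamics.com/api/data/v9.2/WhoAmI HTTP/1.1
Accept: application/json
Authorization: Bearer <access token>

HTTP/1.1 200 OK
{
  "@odata.context": ".../api/data/v9.2/$metadata#Microsoft.Dynamics.CRM.WhoAmIResponse",
  "BusinessUnitId": "<guid>",
  "UserId": "<guid>",
  "OrganizationId": "<guid>"
}
```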

Hope you found this post useful…it should help whenever you are working with the Dataverse APIs.

Cheers,

PMDY

Run PCF Code Components in browser, deploy to Dataverse easily – Quick Recap

Hi Folks,

In this blog post, I will detail how you can work with code components; it takes only a few minutes of your valuable time. Nowadays everyone is moving from traditional HTML web resources to PCF code components. I used to be a pro-HTML developer, and I always want to know how my code runs in the browser. Usually every developer wants to try out how their code works locally before proceeding further. So here we will see how you can run your component locally…and once tested, we can deploy it to Dataverse.

So without any further ado, let’s get into it…

First, you can download these code components from this link; now let’s see how we can use these components in our apps. Follow along with a few simple steps below..

1. Install Microsoft Power Platform CLI.

2. Navigate to the folder and extract the zip file.

3. Open Visual Studio Code and navigate to that folder location (many people suggest using the Visual Studio Developer Command Prompt, but believe me, this is a lot easier).

The component’s run time can be found by navigating to the respective folder of the component..

4. Open a new terminal and execute the npm install command (I am assuming you have Node.js installed on your machine). This adds all the dependencies to the component folder, and it should look something like below..

5. As this control is a prebuilt one, there is no need to execute the build command.

6. Create a new folder using the command mkdir <folder name> inside the sample component folder and navigate into it using the command cd <folder name>, something like below…here I have named the folder IncrementComponent.

7. Now we will create a new solution project inside the same folder using the following command: pac solution init --publisher-name <Name of the publisher> --publisher-prefix <Publisher prefix>

It should look as below

You should see solution folder components being updated as below

8. After the new solution project is created, refer to the location where the sample component is located. You can add the reference using the following command: pac solution add-reference --path <Path to the root of the sample component>

It should look as below
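Putting steps 6 through 8 together, the terminal session looks roughly like this (a sketch; the folder name, publisher name, and prefix are placeholder values):

```
mkdir IncrementComponent
cd IncrementComponent
pac solution init --publisher-name SampleDev --publisher-prefix dev
# the sample component root is one level up from the new solution folder
pac solution add-reference --path ..\
```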

9. Now you have to generate a zip file from your solution project by building it with the following command: msbuild /t:restore

Oops, you may get an error as below…

If you reverse engineer it a little, the error says that ‘msbuild’ is not recognized and instructs you to check the PATH variable.

In order to fix this, I followed this blog and, with minor tweaks, was able to resolve the issue: if you have the 64-bit Visual Studio 2022 version, you have to use the path below for the environment variable instead of the one specified in that blog.

Path: %ProgramFiles%\Microsoft Visual Studio\2022\Community\MSBuild\Current\Bin
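If you prefer fixing it from the terminal for the current session only, something like this should work (PowerShell syntax; the path assumes the Community edition, so adjust it for Professional or Enterprise):

```
# make msbuild resolvable in this PowerShell session
$env:Path += ";$env:ProgramFiles\Microsoft Visual Studio\2022\Community\MSBuild\Current\Bin"
```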

Then you should be able to get past the above error…and you should see a screen something like below.

The next step is to run your code component in the local test harness and see how it behaves before actually pushing it to Dataverse…so run the command npm start while making sure the terminal points to the component folder…

Now the component is running; you just need to verify it by browsing to the following URL on your local machine.

http://localhost:8181

There it is, your component running in your local browser window….

These code components can be used in Canvas Apps, Model-Driven Apps, and Power Portals, and they add much more flexibility than customizing with HTML web resources.

Now run msbuild /t:restore followed by msbuild to create the zip file, and we are good to use the PCF control by importing it into Dataverse (CDS).

Limitations:

  1. Microsoft Dataverse dependent APIs, including WebAPI, are not available for Power Apps canvas applications yet.
  2. Code components should bundle all the code including external library content into the primary code bundle. To see an example of how the Power Apps command line interface can help with bundling your external library content into a component-specific bundle, see Angular flip component example. 
  3. Code components should not use the HTML web storage objects, like window.localStorage and window.sessionStorage, to store data. Data stored locally on the user’s browser or mobile client is not secure and not guaranteed to be available reliably.

You can learn more about PCF Here…hope this helps….

Additional Resources to try out Code Components

Cheers,

PMDY

Automation Kit for Power Platform – Quick Review

Hi Folks,

Have you ever wished for a tool where you can review all your scheduled flows at once in one dashboard? Then I am glad to introduce you to the latest capability from Microsoft, which is none other than the Automation Kit.

Getting into detail, the Automation Kit is a set of tools that accelerates the use and support of Power Automate for desktop for automation projects. HEAT (Holistic Enterprise Automation Techniques) is guidance designed to help you deploy the automation platform and manage the entire lifecycle of an automation project.

The key features of the Automation Kit:

  1. The ability to view the schedule of recurring cloud flows
  2. View the schedule by day, week, month, and schedule view
  3. View the status of scheduled flows (Success, Failure, or Scheduled)
  4. View the duration of a cloud flow run
  5. View the details of any errors

The key element of the solution is the Power Platform main environment.

There are usually several satellite production environments that run your automation projects. Depending on your environment strategy, these could also be development or test environments.

Between these environments there is a near-real-time synchronization process that includes cloud or desktop flow telemetry, machine and machine group usage, and audit logs. The Power BI dashboard for the Automation Kit displays this information.

Automation Kit components

The Automation Kit supports an automation CoE with the following components:

  1. Automation Project: This project is a canvas app that supports requesting automation projects and submitting them for approval.
  2. Automation Center: This is a model-driven app that organizations can use to create and maintain automation assets, such as master data records, map resources and environments, and assign roles to employees.
  3. Automation Solution Manager: This is a canvas app in satellite environments that enables the metering of solutions and their artifacts.
  4. Cloud flows: These cloud flows use Dataverse tables to sync data from satellite environments, in near real time, to the main environment.
  5. A Power BI dashboard that provides insights and monitors your automation assets.

The components in the kit are packaged in two solutions:

  • The main solution, which you deploy to the main environment.
  • The satellite solution, which you deploy in each satellite environment.

Limitations:

  1. Only Power Automate desktop flows and cloud flows contained within a solution are displayed.
  2. At least one Power Automate desktop flow must have been registered and executed.

Reference:

Automation Kit for Power Automate

Automation adoption best practices overview

Learn More

I am glad to help you learn about this latest Power Platform capability…

Cheers,

PMDY

Track the Power Platform blog feed…Insight

Hi Folks,

Hope you are having a fantastic year….

By the way, in this blog post, I would like to explain in a bit of detail how to receive new posts from any tech blog, mainly your favorite Power Platform ones coming from Power Platform tech gurus, directly in your Teams channel without navigating away. This helps you upskill or learn a new topic fast. Pshhh..at least for some.

All you need is a Teams license and an RSS-enabled blog address to get the feed…all WordPress blogs are RSS-enabled by default.

Ok, let’s get started….

Open Microsoft Teams

In the left pane, select the Apps icon and search for RSS.

Click on RSS in the search and choose Add to a team as below.

Next step is to select the channel..

Then choose Set up a connector as below

Key in the details of the feed…for the sake of simplicity, I am using this blog’s feed here…see the snap below. At the same time, you can set the frequency at which you get updates from your feed.
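If you are pointing at a WordPress blog, the feed address is usually just the site URL with /feed appended, for example (hypothetical blog address):

```
https://yourblog.wordpress.com/feed/
```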

The next step is to save the feed details….once saved, you should see something like below…

That’s it, you will be notified each time a blog post is published, directly in your Teams channel, and all the feed items for the blog will be available right in the Teams channel as below. How great is that….

This is not a Power Platform related blog post as such, but this tip can greatly enhance your learning in your workplace without getting distracted.

Hope this helps…

Cheers,

PMDY

Email templates showing Xml – Quick Tip

Hi Folks,

We recently came across a situation where new and existing email templates kept showing raw XML as below.

This kept annoying us, as the existing templates were created using the rich email template editor. At first check, we verified our Dev and SIT environments and, sadly, saw that the behavior was the same in both. We double-checked that no changes were made to the OOB email template form, so we suspected something was wrong with our environment. Luckily, we had one more environment where the email templates worked fine, which confirmed the issue was related to the email template form. Also, when we tried to open the existing email templates in the new designer from https://make.powerapps.com, they opened without any issue.

Fix: Open the model-driven app in your custom solution and launch it by double-clicking it. Verify the forms selected for the Email Template entity….

The fix is quite obvious, and there you are..in order for email templates to show properly, you should select the Default UCI Template type form.

Voila, it’s back as below.

Hope this is useful…

Cheers,

PMDY

ChatGPT – Insights

Hi Folks,

In this modern era, where AI/ML is ruling the world and automating every possible day-to-day human activity, let’s talk about the latest buzzword since November 2022: ChatGPT.

So, what is ChatGPT, and why has it become so popular?

Officially, the GPT in ChatGPT stands for Generative Pre-trained Transformer; it’s an AI-powered chatbot created by OpenAI. ChatGPT was fine-tuned on top of GPT-3.5 using supervised learning as well as reinforcement learning, trained on Microsoft’s Azure AI infrastructure. OpenAI’s ChatGPT uses powerful machine learning algorithms to generate coherent and natural-sounding responses to user queries. To some extent, it’s giving tough competition to its competitor, Google DeepMind’s Sparrow. The differences between the two can be found here.

The AI race between Microsoft and Google has definitely been ignited. Microsoft’s recent $10 billion investment into OpenAI, the startup behind their popular Chat GPT chatbot, has increased competition for AI supremacy.

Ok, that’s enough of the intro. Let’s see how to work with ChatGPT.

It can be accessed here

Once you click the above link, you need to log in with either your Google or Microsoft credentials. You will be presented with the screen below.

In the text box available at the bottom, you can enter your input for ChatGPT to process and return a result.

I just tried asking how we can integrate OpenAI’s ChatGPT with Power Automate, and the results are presented in a manner that is quite easy to understand, as below.

The interesting part, compared to Power Platform’s Power Virtual Agents, is that ChatGPT has the capability to remember previous responses within a session under your login, which is a great enhancement. With this, problems can be analyzed and solutions generated iteratively. By the way, the GPT-3 model in particular has 175 billion parameters, making it one of the largest language models ever trained.

In the upcoming blogs, I will try to elaborate on how ChatGPT can be used for real-world Power Platform scenarios and how we can integrate it with Power Automate step by step…. Till then, stay tuned…and keep rocking….

Cheers,

PMDY

Power Automate performance improvement considerations

Hi Folks,

Thanks for visiting my blog…have you ever faced a situation where your flow keeps running with no output? In this blog post, I will list the possible remedies for when you are stuck with slow performance of a flow.

Consider the below actions to make your flow execute smoothly.

Remediation steps/Actions to take to make your flow run efficiently:

  1. Understand the throttling limits for your connectors and data sources.
  2. Check the request limits based on user licenses.
  3. Cross-verify the throughput limits.
  4. See the minimum number of actions that the Power Automate service allows for each plan on the Request limits and allocation page.
  5. Verify whether you are using any on-premises connectors.
  6. Check whether you are hitting a throttling limit in Power Automate.
  7. Redesign your flow to use fewer actions and less data.
  8. Reduce the number of loop iterations in ‘Do until’ and ‘Apply to each’ actions.
  9. Filter your data to retrieve only what is necessary, e.g. filtering with OData (see the sketch after this list).
  10. Consider reducing the frequency of scheduled cloud flows.
  11. Reduce the size of any files your flow accesses, if possible.
  12. Consider using variables for frequently accessed information in your flow.
  13. Use Compose and variable actions to view the data at any time.
  14. Consider purchasing a Per User or Per Flow license from the pricing page.
  15. The Per Flow plan may provide the best performance quotas available.
  16. Enable concurrency control for your ‘Apply to each’ action.
  17. Consider creating a custom retry policy.
  18. Use Select actions.
  19. Check your system jobs in Dataverse to confirm asynchronous service performance is normal.
  20. Reduce the flow complexity.
  21. Consider using Process Advisor for Power Automate.
  22. Verify whether you are hitting the 2-minute time limit in Dataverse when calling a bound or unbound action from a flow.
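As an illustration of item 9, the sketch below narrows a Dataverse Web API query to just the columns and rows a flow actually needs instead of pulling an entire table (the entity and column names are hypothetical):

```
GET {{webapiurl}}accounts?$select=name,revenue&$filter=revenue gt 1000000&$top=100
```

The same $select, $filter, and $top options map to the ‘List rows’ action’s Select columns, Filter rows, and Row count parameters.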

Hope this helps someone who’s looking to optimize their flow.

Cheers,

PMDY


What are the Solution Checker and App Checker in Power Apps – Quick recap

Hi Folks,

While it’s been quite some time since Microsoft shipped the Solution checker and App checker, these tools help developers validate the solutions they build before moving them to higher environments. It is always advisable to run the solution checker once your solution is developed, as it can help you achieve better performance by following Power Platform best practices. Previously we used to send solutions to senior folks for code review, but now with this tool, even a junior developer working at ground level can easily understand and make the necessary tweaks in the solution.

Solution Checker serves as a static analysis tool for developers to check for platform-related issues.

The solution checker analyzes these solution components:

  • Dataverse custom workflow activities
  • Dataverse web resources (HTML and JavaScript)
  • Dataverse configurations, such as SDK message steps

Note: Solution checker won’t analyze plugins in solutions. Plugin validation is being modernized, and the focus will eventually be on native plugin authoring time, which will help you detect and fix issues earlier. So if you are looking for improvements in plugin code, this tool will not help you.

Once the solution checker starts running, it is shown as below with a loading symbol.

The process takes a few minutes to complete, depending on the size of the solution. Once it is complete, you should be able to download the results or view them as below.

If we open the results file, it shows the potential issues or improvements along with their severity, which helps us prioritize the issues we need to work on.

The report can also be downloaded as an Excel file containing the analysis, shipped in ZIP format.
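By the way, you don’t have to run the checker from the maker portal alone; the Power Platform CLI can run it from a terminal as well. A hedged sketch (the org URL and file names are placeholders; verify the flags with pac solution check --help):

```
# authenticate against your environment, then run the checker on a solution zip
pac auth create --url https://yourorg.crm.dynamics.com
pac solution check --path .\MySolution.zip --outputDirectory .\checker-results
```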

Now that we have seen what the solution checker is, let’s see what the App checker is, along with its pros and cons.

App Checker:

  1. The App checker is now available to provide a clear list of formula issues in your app, and to list items to fix to make your app accessible.
  2. It helps make debugging, performance, and best-practice decisions an easier and more guided experience.
  3. It is an ideal way to check the formulas you wrote for your canvas apps.
  4. There isn’t any way to download the App checker results, but you can analyze them on the fly in canvas apps.

To conclude, you can think of the Solution checker as a tool to check model-driven apps and the App checker as a tool for canvas apps. Hope you will use these great features to improve your solutions and design according to best practices.

Reference:

Solution checker from MS Learn

App checker from MS Learn

Cheers,

PMDY


Environment variables in Power Platform make your life easier – Quick recap…

Often there is a need for some kind of configuration for your customizations to work across environments in Power Platform, or for storing third-party URLs such as SharePoint sites and other API services…previously there were various ways to store your config, but environment variables now provide an efficient way of interacting with your configuration across environments.

One environment variable can be used across many different solution components, whether they’re the same type of component or different ones; for example, a Power App and a flow can use the same variable in the environment. Environment variables store parameter keys and values, which then serve as input to various other application objects.

Additionally, if you need to retire a data source in production environments, you can simply update the environment variable values with information for the new data source. The apps and flows don’t require modification and will start using the new data source.

Environment variables can be unpacked and stored in source control. You may also store different environment variable value files for the separate configuration needed in different environments. Solution Packager can then accept the file corresponding to the environment the solution will be imported to. Thanks @Reza Dorrani for the video…

The following environment variables are available as of today…

Use cases:

  1. Access environment variables in plugins.
  2. Get and set environment variables using JavaScript (a minimal sketch follows below).
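Here is a minimal sketch for use case 2, reading an environment variable’s current value from a model-driven app client script via the Dataverse Web API (it uses the standard environmentvariabledefinition and environmentvariablevalue tables; the schema name in the usage line is hypothetical):

```javascript
// Retrieve the current value of an environment variable by schema name,
// falling back to the definition's default value when no current value exists.
async function getEnvironmentVariable(schemaName) {
    const query =
        "?$select=defaultvalue,schemaname" +
        "&$filter=schemaname eq '" + schemaName + "'" +
        "&$expand=environmentvariabledefinition_environmentvariablevalue($select=value)";
    const result = await Xrm.WebApi.retrieveMultipleRecords("environmentvariabledefinition", query);
    if (result.entities.length === 0) return null;

    const definition = result.entities[0];
    const values = definition.environmentvariabledefinition_environmentvariablevalue;
    return (values && values.length > 0) ? values[0].value : definition.defaultvalue;
}

// Usage (hypothetical schema name):
// getEnvironmentVariable("new_SharePointSiteUrl").then(url => console.log(url));
```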

A few advantages:

  1. Environment variables are supported in custom connectors.
  2. You can provide new parameter values while importing solutions to other environments.
  3. They are supported by Solution Packager and DevOps tools, enabling continuous integration and continuous delivery (CI/CD).
  4. It’s very easy to maintain data sources in environment variables.
  5. They are ideal for passing parameters to bound or unbound actions from Power Automate.

Nevertheless, there are a few limitations:

  1. When environment variable values are changed directly within an environment instead of through an ALM operation like solution import, flows will continue using the previous value until the flow is either saved or turned off and on again.
  2. If an environment variable value is changed, it may take up to an hour for the updated value to fully publish.
  3. If you make the same mistake I did, importing a managed solution without a current value and then adding a current value for the first time, you cannot edit that new current value anymore; it looks like a potential risk.

Hope this post helped in some way…please let me know if you have any questions….

Cheers,

PMDY