Setting Up Your First Canvas App – Step-by-Step Guide for Beginners

Canvas App Series – #02

This is the second post in the Canvas Apps blog series, where you can learn and grow from zero to hero in Canvas Power Apps. In this post, we will talk about the different ways you can get started with creating canvas apps.

Introduction

Power Apps Canvas Apps allow users to build powerful applications with a drag-and-drop interface, requiring little to no coding. Whether you’re a beginner or an experienced user, setting up your first Canvas App is a straightforward process. This guide walks you through each step.

Prerequisites

Before getting started, ensure you have:

  • A Microsoft Power Apps account (Sign up here).
  • Access to Power Apps Studio.
  • Basic knowledge of what you want to build (e.g., a simple data entry form).

Step 1: Accessing Power Apps Studio

There are different ways you can create a Canvas App:

  1. You can create a canvas app by describing your requirement to Copilot, which will in turn build the app for you.

  2. You can design one using any of the existing templates available.

  3. You can also design your app using the Plan designer, the latest feature released and still in preview; you need to enable it for your environment first.

For this, you need to have a plan available.

You can click on the See more plans option and create new plans if necessary.

You have to state your business problem. This is pretty much the same as using Copilot in the old experience, but here you just describe the problem you are trying to solve by creating the app, and that's it.

I entered Tracking Student Attendance as my problem, and within a minute it designed the whole data model, which you can either accept or propose changes to.

Once you accept, it will go ahead and start preparing the necessary data.

After you accept this, it will start designing the user experience.

Once everything is done, it will ask you to save your work in a solution; this way all your changes are saved to a solution that can be safely moved across environments.

And that's it, your fully functional app is ready in a few minutes.

Step 2: Designing Your App

Once inside the Power Apps Studio:

  1. Drag and drop controls from the left-side panel to the canvas.
  2. Add labels, text inputs, buttons, and galleries as needed.
  3. Resize and align elements for a clean layout.

Below is the sample Power App screen in Studio containing the components.

Step 3: Connecting to a Data Source

  1. Click on Data in the left panel.
  2. Select Add data and choose a source like SharePoint, Dataverse, or Excel.
  3. Connect your app to the data source.

Step 4: Adding Functionality with Formulas

Power Apps uses Excel-like formulas to add functionality. Example:

  • To navigate to another screen: Navigate(Screen2, ScreenTransition.Fade)
  • To filter data: Filter(Orders, Status="Pending")
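For example, if you are building the simple data-entry form mentioned in the prerequisites, a Submit button could write the values back to your data source and then move on. This is a minimal Power Fx sketch with hypothetical names (a SharePoint list called StudentAttendance with Title and AttendanceDate columns, plus controls named txtStudentName and dtpDate), so adjust them to match your own app:

// OnSelect of a "Submit" button: create a new record in the connected list
Patch(
    StudentAttendance,
    Defaults(StudentAttendance),
    {
        Title: txtStudentName.Text,
        AttendanceDate: dtpDate.SelectedDate
    }
);
// Give the user feedback and move to the next screen
Notify("Attendance saved", NotificationType.Success);
Navigate(Screen2, ScreenTransition.Fade)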

Step 5: Previewing and Testing Your App

  1. Click on the Play button in the top-right corner.
  2. Test the app’s functionality.
  3. Fix any layout or data issues as needed.

Step 6: Saving and Publishing

  1. Click File > Save As.
  2. Choose Cloud as the storage option.
  3. Click Publish to make your app available.

Conclusion

Congratulations! You’ve built your first Canvas App. You can continue refining it by adding more features, integrating AI, or automating workflows.

Are you ready to explore more? Share your first Canvas App experience in the comments!

Do you want a step-by-step guided walkthrough? Then check out the App in a Day workshop from Microsoft, where you can start from scratch and build a fully functional Canvas App.

Cheers,

PMDY

Update Power Automate Flow from Code – Quick Review

Hi Folks,

This is a post related to Power Automate. I will try to keep it short, starting with a bit of background.

Recently we faced an issue with Power Automate: we had created a flow that uses the 'When a HTTP Request is received' trigger, but the request method name was not specified in the trigger.

So we needed to update the existing flow without generating a new one, because saving the flow without a method name gave an error that could not be fixed later from the editor. There was a way to do it from the code, but not from the Power Automate editor, so here we will update the flow from code. I will show you two approaches after walking through the existing flow steps.

Step 1:

Navigate to https://make.powerautomate.com and create an instant Cloud Flow

Next choose the trigger

Click Create and, up next, choose an action.

Note that I haven't selected any method for this request.

I have used a simple JSON as below for the trigger

{
  "name": "John Doe",
  "age": 30,
  "isStudent": false,
  "courses": ["Mathematics", "Physics", "Computer Science"]
}

I have added a Parse JSON step next, using the same JSON schema, so now I can save the flow.

Upon saving, I got the flow URL generated as below

https://prod2-27.centralindia.logic.azure.com:443/workflows/a1e51105b13d40e991c4084a91daffa5/triggers/manual/paths/invoke?api-version=2016-06-01

You can take a look at the code generated for each step in the Code View, as below.

It is read-only and you can't modify it from https://make.powerautomate.com.

The only way is to update the flow from the backend, so here are two approaches:

  1. Using the Chrome extension: Power Automate Tools
  2. Export & Import method

Using Chrome Extension:

Install the tool from https://chromewebstore.google.com/detail/power-automate-tools/jccblbmcghkddifenlocnjfmeemjeacc

Once installed, open the Power Automate flow you want to edit; once you are on its page, click on the extension –> Power Automate Tools.

You can just modify the code and add the step needed wherever required;

here I will add a method name to my HTTP trigger.

I will add the POST method here:

 "method": "POST",

It will look roughly like this:
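As a hedged sketch (the exact surrounding properties vary by flow), the trigger definition in the code view should end up looking something like this once the method is added:

{
  "type": "Request",
  "kind": "Http",
  "inputs": {
    "schema": {
      "type": "object",
      "properties": {
        "name": { "type": "string" },
        "age": { "type": "integer" },
        "isStudent": { "type": "boolean" },
        "courses": { "type": "array", "items": { "type": "string" } }
      }
    },
    "method": "POST"
  }
}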

You get a chance to validate and then click on Save; you even get the same IntelliSense you would have on https://make.powerautomate.com.

Upon saving your Power Automate flow, an alert will be shown to you, and the flow will be updated.

Just refresh your Power Automate flow and check

That’s it, your flow is now updated.

Well, if your tenant has policies that prevent you from using the Power Automate Tools extension, you can follow the next approach instead, which is easy as well.

To demonstrate it, I will remove the POST method name from the flow again, save it, and then update the flow using the method below.

Export & Import method

Here you need to export the flow; we will use the export via Package (.zip) method.

In the next step of the export, you will be prompted to key in details as below; just copy the flow name from Review Package Content and paste it in the Name field. Entering the flow name is enough.

Then click on export

The package will be exported to your local machine

We need to look for the definition file; there will be a few JSON files in the exported package.

Navigate to the last subfolder available.

Open the JSON file using your favorite IDE (I prefer Visual Studio Code). Once opened, you will see something like this.

Press Ctrl + A to select all the text, then right-click and choose Format Document so the JSON is properly aligned.

Look for the

triggerAuthenticationType

property, then copy and paste the code for the method:

 "method": "POST",

Your code should now look like the snippet below. Hit Save As and save the file to a different folder, since we can't overwrite the existing zip package directly.
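As a rough sketch only (the exact property names and their order in the exported definition may differ), the manual trigger section of the definition file ends up carrying the method alongside the schema, along these lines:

"triggers": {
  "manual": {
    "type": "Request",
    "kind": "Http",
    "inputs": {
      "schema": {
        ...
      },
      "method": "POST"
    }
  }
}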

Now once again navigate to the last subfolder and delete the definition file present there. Once deleted, copy the saved file from your folder into this subfolder, so your subfolder looks exactly the same as below.

Now navigate to https://make.powerautomate.com, click on the Import option and choose Import Package (Legacy).

Choose your import package.

The package import will be successful.

Now click on the item where the red symbol is shown and choose the resource type shown at the right of the page.

Scroll down to the bottom and click on the Save option.

Then click on the Import option.

You will be shown a message that the import is successful.

Refresh your Power Automate flow and check.

That's it, your flow is now updated from the backend.

Hope this helps; you are now able to update a flow from code.

That's it, it's easier than you think and can save your time when needed.

Cheers,

PMDY

Is your plugin not running? Have you debugged it? The plugin doesn't run, but the operation succeeds when debugging… then try this out

Hi Folks,

The last few weeks were very busy for me, and I missed interacting with the community.

Here I would like to share one tip which can greatly help your debugging…

Just to give a little background: I was recently working with Dynamics 365 plugins that call an API. The plugin seemed to work fine when debugged using the Profiler, and the same piece of plugin code also worked when I tested it in a console app, but the plugin was not working when the action that triggers it was actually fired. I scratched my head: what is the problem…

Just then, I tried the block of code below, replacing the catch block of the plugin code with it.

// Assumes System.IO, System.Net and Microsoft.Xrm.Sdk are referenced, and that an
// ITracingService instance named tracingService is available in the plugin context.
catch (WebException ex)
{
    // If there is no HTTP response at all (e.g. a connection/DNS failure), just rethrow.
    if (ex.Response == null)
    {
        throw;
    }

    string stringResponse = string.Empty;
    int statusCode;

    using (WebResponse response = ex.Response)
    {
        var httpResponse = (HttpWebResponse)response;
        statusCode = (int)httpResponse.StatusCode;

        // Read the full response body so the real error details are not swallowed.
        using (Stream data = response.GetResponseStream())
        using (var reader = new StreamReader(data))
        {
            stringResponse = reader.ReadToEnd();
        }
    }

    // Surface the status code and body in the trace log and in the thrown exception,
    // so the actual cause shows up even when the Profiler run looks fine.
    tracingService.Trace("Web request failed. Status code: {0}. Response: {1}", statusCode, stringResponse);
    throw new InvalidPluginExecutionException(
        string.Format("Web request failed with status code {0}: {1}", statusCode, stringResponse), ex);
}

Soon, from the detailed error message surfaced by the code above, I observed that it was failing because of a versioning problem: the referenced DLL version was not supported by my assembly.

I was then able to reference the correct DLL version in my plugin, which fixed the issue. No further debugging was needed.

Hope this helps…

Cheers,

PMDY

Another way to install Plugin Registration Tool for Dynamics 365 CE from Nuget

Hi Folks,

If you are a Power Platform or Dynamics 365 CE developer, you will definitely need to work with the Plugin Registration Tool at some point, and having it as a local application greatly helps… In this post, I will show a slightly different, and very easy, way to install the Plugin Registration Tool.

Well, this approach was especially useful to me when I got a new laptop and needed the Plugin Registration Tool for an implementation where the plugins were already built.

The first three ways to download the Plugin Registration Tool might be known to everyone… but did you know there is a fourth approach as well?

  1. From XrmToolBox
  2. From https://xrm.tools/SDK
  3. Installation from CLI
  4. See below

Because there are limitations to these approaches, at least in my experience, I found the fourth one very useful.

  1. XrmToolBox – Not quite convenient for profiling and debugging your plugins
  2. https://xrm.tools/SDK – DLLs in the downloaded folder will be blocked, and you would need to manually unblock them for the tool to work properly
  3. CLI – People rarely use this.

Just note that this approach is very easy but works only if you already have a plugin project. Please follow the steps below.

  1. Just open the plugin project.
  2. Right-click on the solution and choose Manage NuGet Packages for Solution.
  3. Search for the Plugin Registration Tool as below.

4. Choose the plugin project and click Install, confirm the prompt, and accept the license agreement shown (a Package Manager Console alternative is sketched after these steps).

5. Once installed, go to the project folder on your local machine.

6. Navigate to the packages folder; you should see a folder for the Plugin Registration Tool, as below.

7. There you go, you can open the Plugin Registration application under the tools folder. You can undo the package changes afterwards if the project is linked to source control.
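If you prefer typing a command over the NuGet UI, the same thing can be done from the Package Manager Console. This is a minimal sketch; it assumes the commonly used package ID Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool, so double-check the exact ID and version on nuget.org before running it:

# Visual Studio: Tools -> NuGet Package Manager -> Package Manager Console
Install-Package Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool
# The tool then lands under the solution's packages folder, e.g.
# packages\Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool.<version>\tools\PluginRegistration.exe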

That's it, how easy was that? Hope this helps someone.

Cheers,

PMDY

Polymorphic Lookup in Dynamics 365: Streamlining Your CRM with Flexible Relationships

In Dynamics 365, a Polymorphic Lookup is a powerful feature that allows you to associate a single lookup field with multiple different entities. This feature is particularly useful when you want a field to reference multiple related entities, providing greater flexibility and efficiency in your CRM applications.

What is a Polymorphic Lookup?

A Polymorphic Lookup is a special type of lookup field that can refer to multiple entities rather than just one. For example, a single “Related Entity” field can refer to either a Contact, Account, or Opportunity, making it versatile for various business scenarios. This capability is referred to as “polymorphism” because the lookup field can resolve to different types of entities at runtime.

Example Scenario:

Consider a sales scenario where a “Related Entity” can be a Customer, but the customer could be either an Account or a Contact. Rather than having two separate lookup fields (one for Account and another for Contact), you can create a polymorphic lookup field, which makes your user interface simpler and more streamlined.

How Does Polymorphic Lookup Work in Dynamics 365?

In Dynamics 365, polymorphic lookup fields are implemented as part of the Relationship between entities. The key concept here is the EntityReference, which dynamically resolves to the appropriate entity type (e.g., Account, Contact, etc.) based on the actual value selected by the user.

  1. Field Definition:
    • When defining a lookup field, you define a Relationship where the field can refer to multiple target entities.
    • The system uses the Type and Id to determine the related entity.
  2. Lookup Resolution:
    • At runtime, when a user selects a value in the polymorphic lookup field, the system dynamically resolves which type of entity to link to.
    • The field displays the appropriate name (e.g., Account or Contact) based on the entity that the user selects.

Creating Polymorphic Lookups in Dynamics 365

Polymorphic lookup fields are typically used in the following types of scenarios:

  • Custom Relationships: When you need to create a lookup that can reference multiple different entities.
  • Shared Relationship: For cases where one relationship applies to more than one entity, such as a lookup that could refer to either a Contact or an Account.

Steps to Create a Polymorphic Lookup Field:
  1. Navigate to the Customization Area:
    • Go to the Settings area in Dynamics 365 and select Customizations.
    • Select Customize the System to open the solution where you want to add the polymorphic lookup field.
  2. Create a New Field:
    • In the relevant entity, click on Fields, and then select New.
    • Choose the Lookup data type for the field.
  3. Define the Polymorphic Lookup:
    • Under the Related Entity section, select Custom to define the multiple entities this lookup should support.
    • Select the Entity Relationships where this lookup should point to multiple entities.
  4. Save and Publish:
    • Save the field and publish your customizations to apply the changes.

Example: Setting Up Polymorphic Lookup for Customer

Suppose you’re designing a custom Case entity and you want to add a lookup for the Customer. Instead of creating separate lookups for Contact and Account, you can create a polymorphic lookup that links to either an Account or Contact as the Customer.

Steps:
  • Create a Customer Lookup field in the Case entity.
  • Define the Customer Lookup field to support both Account and Contact entities.
  • After publishing the field, the user will see the lookup field and will be able to choose either an Account or Contact as the Customer.
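To make the "either Account or Contact" behaviour concrete, here is a minimal SDK-style sketch. The names are illustrative (the standard Case table with its customerid column, an IOrganizationService called service, and accountId/contactId holding GUIDs of existing records); it simply shows that the same polymorphic column accepts an EntityReference of whichever table applies:

// Requires Microsoft.Xrm.Sdk; 'service' is an IOrganizationService from your context.
var caseRecord = new Entity("incident");
caseRecord["title"] = "Printer not working";

// The same Customer column accepts either table type:
caseRecord["customerid"] = new EntityReference("account", accountId);    // ...an Account
// caseRecord["customerid"] = new EntityReference("contact", contactId); // ...or a Contact

Guid caseId = service.Create(caseRecord);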

Use Cases for Polymorphic Lookup

  1. Consolidating Related Data:
    • Polymorphic lookups help streamline user experience by consolidating multiple lookups into a single field, especially when dealing with common relationships across different entities.
  2. Reducing Redundancy:
    • Rather than having separate lookup fields for Account and Contact in every related form, you can reduce redundancy by using polymorphic lookups, which allows referencing both entities in one field.
  3. Improved Reporting and Analytics:
    • When data is related across multiple entities, using a polymorphic lookup can make it easier to pull reports and perform analysis without requiring multiple joins or complex queries.

Considerations and Limitations

While polymorphic lookups are powerful, they come with certain limitations:

  • Limited to Certain Fields: Polymorphic lookups are supported only in certain system fields (like Regarding in activities), but may not be available for every custom scenario.
  • API Handling: When working with the Dynamics 365 Web API, the polymorphic lookup is handled through special attributes that require careful parsing to identify the correct entity type.
  • UI Considerations: Although polymorphic lookups streamline the user interface, they can also confuse users who are unfamiliar with the concept. It’s important to have clear documentation and training for users on how to use these fields.
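To illustrate the Web API point above, a retrieved record exposes the lookup as a _<column>_value attribute plus annotations identifying which table it resolved to (returned when you request annotations via the Prefer: odata.include-annotations header). A rough sketch of such a response, with placeholder GUIDs and values:

{
  "incidentid": "00000000-0000-0000-0000-000000000001",
  "_customerid_value": "00000000-0000-0000-0000-000000000002",
  "_customerid_value@Microsoft.Dynamics.CRM.lookuplogicalname": "account",
  "_customerid_value@OData.Community.Display.V1.FormattedValue": "Contoso Ltd"
}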

Conclusion

Polymorphic lookups in Dynamics 365 provide an elegant solution for scenarios where a lookup field needs to refer to multiple entity types. By understanding and using polymorphic lookups effectively, you can streamline your CRM solutions, reduce redundancy, and improve your application’s flexibility. It’s important to consider the limitations and ensure that users are properly guided in utilizing these fields within your system.

You can easily create this Polymorphic Lookup from XrmToolBox as well…

https://pascalcase.com/Home/Blog/understanding-and-using-polymorphic-lookups-in-dynamics-365-with-xrmtoolbox

Hope this helps.

Cheers,

PMDY

Master Canvas Power Apps – #Canvas Apps Learn Series

Hi Folks,

This is a blog post series on Canvas Apps where you can learn and grow from zero to hero in Canvas Power Apps… and boost your knowledge along the way.

  1. Introduction to Canvas Apps – What they are, why they matter, and real-world use cases.
  2. Setting Up Your First Canvas App – Step-by-step guide for beginners.
  3. Understanding Screens and Navigation – How to structure an app with multiple screens.
  4. Working with Data Sources – Connecting to SharePoint, Dataverse, Excel, and other sources.
  5. Forms and Galleries – Displaying and capturing data effectively.
  6. Mastering Power Fx – Key formulas and best practices.
  7. User Experience and UI Design – Creating a responsive and user-friendly interface.
  8. Using Components for Reusability – Making apps scalable and maintainable.
  9. Working with Media and Attachments – Adding images, videos, and file uploads.
  10. Performance Optimization Tips – Best practices to make apps faster.
  11. Offline Capabilities in Canvas Apps – How to work with apps when offline.
  12. Integrating Power Automate with Canvas Apps – Automating processes.
  13. AI and Copilot Features in Canvas Apps – Adding intelligence to apps.
  14. Advanced Security and Role-Based Access – Controlling user access and permissions.
  15. Publishing and Managing Your Canvas Apps – Deployment, versioning, and governance.

Firstly, let's start with a simple introduction for this post…

What Are Canvas Apps?

Canvas Apps are a powerful low-code development tool within Microsoft Power Platform that allows users to build custom business applications with a drag-and-drop interface. Unlike model-driven apps, which rely on structured data models, Canvas Apps provide full control over the user interface, enabling developers and business users to design highly customized applications tailored to specific business needs.

Canvas Apps can be used to create simple applications for internal business processes or sophisticated applications with multiple screens, data interactions, and integrations with other Microsoft and third-party services. Users can design these apps using Power Apps Studio, a web-based development environment that provides a range of components, such as buttons, galleries, forms, and media controls, to create intuitive and responsive applications.

Why Are Canvas Apps Important?

Canvas Apps bring significant value to businesses and developers by providing:

  1. Low-Code Development – Build applications with minimal coding, making app development accessible to both developers and non-developers. Power Fx, a formula-based language, enables business logic implementation with ease.
  2. Customization & Flexibility – Unlike model-driven apps that follow a predefined data structure, Canvas Apps allow users to freely design screens, layouts, and controls, ensuring the app meets unique business requirements.
  3. Seamless Data Integration – Connect to 800+ data sources, including SharePoint, Dataverse, Excel, SQL Server, and third-party APIs, ensuring seamless access to enterprise data.
  4. Cross-Platform Compatibility – Run apps on web browsers, mobile devices (iOS & Android), and embedded within Microsoft Teams, SharePoint, and Dynamics 365.
  5. Integration with Power Platform – Enhance apps with Power Automate for automation workflows, Power BI for data visualization, and AI Builder for AI-driven insights and intelligent automation.
  6. Rapid Prototyping & Deployment – With the drag-and-drop interface and prebuilt templates, businesses can quickly prototype and deploy applications without long development cycles.
  7. Security & Compliance – Apps built using Canvas Apps inherit Microsoft’s security infrastructure, allowing role-based access control (RBAC) and compliance with enterprise security standards.

Real-World Use Cases

Canvas Apps can be leveraged across industries to improve efficiency and streamline operations. Some common real-world use cases include:

  • Expense Management App – Employees can submit expenses with receipts, managers can approve them, and finance teams can generate reports.
  • Inventory Management System – Track stock levels, reorder inventory, and generate reports in real-time.
  • Incident Reporting App – Employees can report workplace incidents with photos, location, and real-time status updates.
  • Customer Feedback App – Collect customer feedback through mobile-friendly forms and analyze responses with Power BI.
  • Field Service Management – Field workers can access work orders, update job statuses, and capture customer signatures through mobile devices.
  • HR Onboarding App – Manage the onboarding process for new employees with guided forms, policy documents, and task checklists.

Getting Started with Canvas Apps

To start building a Canvas App, follow these steps:

  1. Sign in to Power Apps (https://make.powerapps.com)
  2. Click on ‘Create’ and select ‘Canvas App from Blank’
  3. Choose a layout (Tablet or Mobile) based on your app’s intended use
  4. Design your app using Power Apps Studio:
    • Add Screens: Home screen, forms, galleries, etc.
    • Insert Controls: Buttons, text inputs, dropdowns, and images
    • Connect Data Sources: Link to Dataverse, SharePoint, SQL, etc.
    • Apply Business Logic: Use Power Fx formulas to create dynamic interactions
    • Test the App: Use Preview mode to validate functionality
  5. Publish and Share Your App: Deploy the app and control access using Microsoft Entra ID (Azure AD)
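As a small illustration of step 4 ("Apply Business Logic"), much of the logic in a canvas app is just short Power Fx property formulas. The names here are hypothetical (a data source called Expenses with SubmittedBy and Title columns, and a text input named txtSearch), so treat this as a sketch:

// Items property of a gallery: show only this user's expenses that match the search box
SortByColumns(
    Filter(
        Expenses,
        SubmittedBy = User().Email,
        StartsWith(Title, txtSearch.Text)
    ),
    "Title",
    SortOrder.Ascending
)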

Best Practices for Building Canvas Apps

  1. Plan Your App Structure – Define screens, navigation, and key functionalities before starting.
  2. Optimize Performance – Reduce unnecessary data calls and use delegation-friendly queries.
  3. Use Components for Reusability – Create custom components for commonly used UI elements.
  4. Ensure Responsive Design – Design layouts that work across multiple device sizes.
  5. Leverage Power Automate for Automation – Automate approvals, notifications, and data processing.
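For point 2, delegation is the main thing to watch: a delegable query is executed by the data source, while a non-delegable one makes Power Apps download only the first rows (up to the delegation limit of 500–2,000) and filter locally. A hedged Power Fx comparison, assuming a large SharePoint list named Orders with a Status column:

// Delegable against SharePoint: the filter runs on the server, so all rows are considered
Filter(Orders, Status = "Pending")

// Not delegable against SharePoint: the "in" substring operator forces local filtering,
// so results beyond the delegation limit can be silently missed
Filter(Orders, "Pend" in Status)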

What’s Next?

In the next post, we’ll walk through setting up your first Canvas App from scratch, covering app layout, adding controls, and connecting to a data source.

Stay tuned! Don’t forget to follow along…

Cheers,

PMDY

Filter data with single date slicer when multiple dates in fact table fall in range without creating relationship in Power BI

Hi Folks,

After a while, I am back with another interesting way to solve this type of problem in Power BI. It took me quite a lot of time to figure out the best approach, so this post suggests a different way of solving it. The post is a bit lengthy, but I will try to explain it as well as I can.

Here is the problem: I have date fields from two fact tables, and I have to filter them using a single date slicer connected to a calendar table, showing the data when any of the dates in a particular row falls within the slicer range. I initially thought this was an easy one that could be solved by creating relationships between the two fact tables and the calendar table and then slicing and dicing the data, since I was able to filter the data from one fact table when it was connected to the calendar table.

I was unable to do that because there were multiple date fields in one fact table and I needed to consider dates from two tables. I tried to get the slicer value using a calculated column, since I had to do row-by-row checking. I later understood that the slicer values can be read in a calculated column, but they will not change when the slicer dates change; this is because calculated columns use row context and are only evaluated when the data is loaded or the user explicitly refreshes. Instead, we have to use a measure, which is evaluated using filter context.

The interesting point here is that if a measure is simply added to the visual, it returns the same value for each row; a measure calculates values over the current filter context rather than per stored row, which makes it ideal when you want to perform aggregations.

I tried the approach from the great blog post by the legends of Power BI (Marco Russo and Alberto Ferrari), but it looked overly complex for my scenario and I didn't really need it. If you still wish to check it out, here is the link:

https://www.sqlbi.com/articles/filtering-and-comparing-different-time-periods-with-power-bi/

So, then I tried to calculate the maximum and minimum date for each row in my fact table using the MAXX and MINX functions.

MaxxDate =
VAR Date1 = FactTable[Custom Date1]
VAR Date2 = FactTable[Custom Date2]
RETURN
    MAXX (
        { Date1, Date2 },
        [Value]
    )

MinXDate =
VAR Date1 = FactTable[Custom Date1]
VAR Date2 = FactTable[Custom Date2]
RETURN
    MINX (
        { Date1, Date2 },
        [Value]
    )

After merging the two tables into a single one, I created two slicers connected to the maximum date and minimum date for each row. I thought my problem was solved, but it wasn't: I was only able to filter rows whose maximum or minimum date was selected in the slicer, so any date value that fell inside the range was still being missed.

So I was back to the same situation again.

This blog post really helped me get the idea:

https://community.fabric.microsoft.com/t5/Desktop/How-to-return-values-based-on-if-dates-are-within-Slicer-date/m-p/385603

Below is the approach I used.

1. Create a date table using the DAX below.

Date =
VAR MinDate = DATE ( 2023, 03, 01 )
VAR MaxDate = TODAY ()
VAR Days = CALENDAR ( MinDate, MaxDate )
RETURN
    ADDCOLUMNS (
        Days,
        "UTC Date", [Date],
        "Singapore Date", [Date] + TIME ( 8, 0, 0 ),
        "Year", YEAR ( [Date] ),
        "Month Number", MONTH ( [Date] ),
        "Month", FORMAT ( [Date], "mmmm" ),
        "Year Month Number", YEAR ( [Date] ) * 12 + MONTH ( [Date] ) - 1,
        "Year Month", FORMAT ( [Date], "mmmm yyyy" ),
        "Week Number", WEEKNUM ( [Date] ),
        "Week Number and Year", "W" & WEEKNUM ( [Date] ) & " " & YEAR ( [Date] ),
        "WeekYearNumber", YEAR ( [Date] ) * 100 + WEEKNUM ( [Date] ),
        "Is Working Day", TRUE ()
    )

2. Here I didn't create any relationship between the fact and dimension tables; you can leave them disconnected, as below.

3. All you need is a simple measure which calculates whether any of the dates in the fact table fall within the slicer date range; here is the piece of code.

MEASURE =
IF (
    (
        SELECTEDVALUE ( 'Text file to test'[Date] ) > MIN ( 'Date'[Date] )
            && SELECTEDVALUE ( 'Text file to test'[Date] ) < MAX ( 'Date'[Date] )
    )
        || (
            SELECTEDVALUE ( 'Text file to test'[Custom Date1] ) > MIN ( 'Date'[Date] )
                && SELECTEDVALUE ( 'Text file to test'[Custom Date1] ) < MAX ( 'Date'[Date] )
        )
        || (
            SELECTEDVALUE ( 'Text file to test'[Custom Date2] ) > MIN ( 'Date'[Date] )
                && SELECTEDVALUE ( 'Text file to test'[Custom Date2] ) < MAX ( 'Date'[Date] )
        ),
    1,
    0
)

4. Then filter the table visual with this measure value.

That's it, you should be able to see the table values changing based on the date slicer.

Hope this helps save at least a few minutes of your valuable time.

Cheers,

PMDY

Dataverse Accelerator | API playground (Preview)

Hi Folks,

In this post, I will be talking briefly about the features of the Dataverse accelerator. The Microsoft Dataverse accelerator is an application that provides access to select preview features and tooling related to Dataverse development. This is totally different from the Dataverse industry accelerators.

The Dataverse accelerator app is automatically available in all new Microsoft Dataverse environments. If your environment doesn't already have it, you can install it by going to Power Platform Admin Center –> Environments –> Dynamics 365 Apps –> Install App –> Choose Dataverse Accelerator.

You can also refer to my previous blog post on installing it here if you prefer.

Once installed, you should see something like the below under Apps.

On selecting the Dataverse Accelerator app, you should see something like the below. Do note that you must have app-level access to the Dataverse accelerator model-driven app, such as the System Customizer role or direct access from a security role.

Now let's quickly see what features are available with the Dataverse accelerator.

  • Low-code plug-ins – Reusable, real-time workflows that execute a specific set of commands within Dataverse. Low-code plug-ins run server-side and are triggered by personalized event handlers, defined in Power Fx.
  • Plug-in monitor – A modern interface to surface the existing plug-in trace log table in Dataverse environments, designed for developing and debugging Dataverse plug-ins and custom APIs. Do you remember viewing plug-in trace logs from Customizations? Now you don't need the System Administrator role to view trace logs; access to this app will do, and everything else remains the same.
  • API Playground – A preauthenticated software testing tool which helps you quickly test and play with the Dataverse APIs.

I wrote a blog post earlier on using low-code plug-ins, which you may check out here, while using the plug-in monitor is pretty straightforward.

You can find my blog post on using Postman to test Dataverse APIs here.

Now let's see how we can use the API Playground. Basically, you will be able to test the request types below from the API Playground, similar to Postman. All you need to do is open the API Playground from the Dataverse accelerator; you will be preauthenticated while using it.

  • Custom API – This includes any Dataverse Web API actions or functions from Microsoft, or any public user-defined custom APIs registered in the working environment.
  • Instant plug-in – Instant plug-ins are classified as any user-defined workflows registered as a custom API in the environment with related Power Fx expressions.
  • OData request – Allows more granular control over the request inputs to send OData requests.

Custom API, Instant plug-in – Select the relevant request in the dropdown available in the API Playground and provide the necessary input parameters, if required, for your request.

OData request – Select OData as your request type, provide the plural name of the entity, and hit Send.
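For example, a simple OData request against the standard Account table would look like the sketch below (the entity set name and columns are just examples; adjust them to your own tables). The API Playground supplies the environment URL and authentication for you:

GET /api/data/v9.2/accounts?$select=name,telephone1&$top=5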

After a request is sent, the response is displayed in the lower half of the screen, something like the below.

OData response

I will update this post as these features are released in my region (APAC); at the time of writing this blog post, the API Playground feature was being rolled out globally and was still in preview.

The Dataverse accelerator isn't available in GCC or GCC High environments.

Hope you learned something about the Dataverse accelerator.

Cheers,

PMDY

My top 3 favorite features released at Build 2024 for Canvas Apps…

Hi Folks,

This will sound good to pro developers and also to those working in fusion teams (pro + low code). Just note that all these features are preview or experimental features and are available in the US Preview region now, as they were just released at Microsoft Build 2024 last week. Public preview of these features is expected in June 2024, so you can then try them out in your region as well. If you want to check them out now, spin up a trial in the US Preview region.

These are the top new features:

1. View the Code behind the Canvas Apps
2. Use Copilot to generate comments and expressions for your Canvas Apps
3. Integrate Canvas Apps with GitHub

Feature #1: View the Code behind the Canvas Apps

Now you can view the code behind your Canvas Apps: beside the screen where the components reside, click on the ellipsis as below.

You should be able to see the YAML source code for your Canvas App. The code is currently read-only; you can click on the Copy code option at the bottom of the page in the above screen.

Make the necessary changes to the YAML code, create a new blank screen, and you can then paste the YAML code into it to recreate the screen you copied it from, if you wish.

Here I will copy the code for the container inside, then I will create a new blank screen.

Once the blank screen is added, expand it so that it occupies the entire size of the app, then click on Paste as below.

Give it a minute and your new screen is ready with the container inside, as below (e.g. here it was Screen3); just rename it accordingly.

How easy was that… just make sure you paste it into the relevant item type, meaning if you copied the code of a container, you can only paste it into another container and not a screen.
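To give a feel for what the copied YAML looks like, here is a rough, purely illustrative sketch of a screen with a couple of controls. The exact schema has shifted between preview releases, so treat the control and property layout as approximate rather than authoritative:

Screens:
  Screen1:
    Children:
      - Label1:
          Control: Label
          Properties:
            Text: ="Welcome"
      - Button1:
          Control: Button
          Properties:
            Text: ="Next"
            OnSelect: =Navigate(Screen2, ScreenTransition.Fade)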

Feature #2: Use Copilot to generate comments and expressions for your Canvas Apps

Do you want to generate comments for the expressions you wrote? Or have you forgotten the logic you wrote for a Canvas App a long time back? Don't worry, use this approach.

Let's say I am choosing the OnSelect property, where I have the below formula.

Let's ask Copilot what this means: click on the Copilot icon available as below.

Choose to explain the formula.

Now click on the Copy option available and paste the explanation above your formula; this serves as a comment for your expression, and you can try it for any complex expression you wrote in your app. This improves the readability of your app, and makers can use the existing knowledge to quickly get up to speed, minimize errors, and build fast the next time they work on the same app.

You can also generate Power Fx using Copilot: start typing what you need in natural language as a comment, and as soon as you stop typing, it shows as generating, as below… You can use either // or /* */, and comments can remain in the formula bar as documentation, just like with traditional code.

It generates the Power Fx for your input as below; then you press the Tab key on your keyboard and it will show something like below.

And finally, you can see the output as below.

You can apply these two tips to complex formulas as well.
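Purely as an illustration of the comment-then-formula pattern (the Orders data source and its columns are hypothetical), the end result in the formula bar looks something like this:

// Show only pending orders, newest first
SortByColumns(
    Filter(Orders, Status = "Pending"),
    "OrderDate",
    SortOrder.Descending
)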

Feature #3: Integrate Canvas Apps with GitHub

Did you ever notice that if a canvas app is opened by one user and another user tries to open the same app, they see a warning message and need to explicitly click Override to take it forward? Meaning that, at any point in time, only one person can work on the Canvas App.

Now we can use first-class DevOps: with the GitHub integration feature enabled, many people can work on the same canvas app at the same time and also commit the Canvas App code to Git. Let's see this.

Prerequisites:

1. You need to have a GitHub repo created; creating branches is optional, as we can use the main branch otherwise.
2. Enable the experimental feature as below.

Then you should see:

Next, you need to configure Git version control as below. You can use either GitHub or Azure DevOps for this; I want to create a new directory for storing my canvas app, like GitHub Test, which does not yet exist in my GitHub account.

You need to go to your GitHub account settings to create a new token as below.

For the personal access token, give repo-level scope and click Generate token.

Copy the personal access token into the Connect to a Git repository window; once authenticated, you should see a message like below.

Click Yes, and you should see something like below.

Within a minute, you should see the below screen.

You should then see the code files being created in your GitHub account as below.

Now your team can make changes in GitHub; since GitHub allows multiple people to work on the repo, the latest commit will be reflected whenever you open the Canvas App from the maker portal. This helps developers build and commit to source control without leaving the maker portal. Whenever you open this app, it will ask for your Git account credentials.

Do note that these features are currently available in the US Preview region, as they were just released last week at Build, and will be released to other regions in the near future, possibly in June 2024.

Hope you learned something new about what's coming next month or sooner…

That's it for today…

Cheers,

PMDY

Setup Copilot in a Model-driven app – Quick Review

Hi Folks,

Wondering how you can enable Copilot in a Dynamics 365 model-driven app…? Then you have come to the right place. A few days ago I was trying to use it but couldn't, hence this blog post from my experience.

There are a few things to configure for Copilot to respond to your queries, so I will be talking about that in this blog post today. Let's get started…

Copilot in model-driven Power Apps has been in preview since July 2023.

Prerequisite: You must have a non-production environment with a Dataverse database, apps, and data.

Step 1: Go to Power Platform Admin Center –> Select the environment –> Settings –> Product –> Features –> Select On for the AI-powered experience as highlighted below. If you are an app maker and want to try it for yourself, you also need to check the option in yellow below.

Step 2: Go to Power Platform Admin Center –> Select the environment –> Settings –> Product –> Behaviour –> Select Monthly channel or Auto for the Model-driven app release channel option and click Save.

Step 3: Well, this step is important; in this task, we configure a Dataverse table and its columns for Copilot.

Go to Power Apps and make sure that you have selected the correct environment.

Select Tables and navigate to the table for which you want to enable the Copilot capability.

Step 4: Here I am using the out-of-the-box Account entity; you can choose whichever entity you wish to set up.

Step 5: Navigate to Properties for the Account table as below.

Step 6: Choose the settings highlighted below and click on Save.

Step 7: Open the Account table and go to Views.

Step 8: In this step, we need to configure the Quick Find view and add the necessary fields to it so they are searchable by Copilot. Add the fields your users would be searching for in Copilot.

Step 9: Make sure the fields are added to the view, then save and publish.

That's it, the configuration is done.

Step 10: In this step, we will test Copilot by opening the app in which the configured entity is available. Click on the Copilot icon as highlighted below; this shows the Copilot chat window.

Step 11:

Test 1: Prompt: "How many Accounts are there which Primary Contact starting with H?" Well, it answered correctly, as below.

Test 2: Prompt: "Show Accounts whose Annual Revenue is more than 300,000?" It answered correctly, as below.

Hope this helps you set up Copilot for your model-driven apps. I will leave it to you to try this out.

Make sure you give all the details in the prompt itself; it will not be able to remember the previous response, meaning you can't continue the conversation by providing information in bits and pieces. You can set up the same for your custom entities as well; just make sure you add the fields to the Quick Find view of that entity.

It is not recommended for production environments as it is still a preview feature. In case a response is not accurate, you can report it to Microsoft by hitting thumbs up or thumbs down and providing the relevant feedback.

A lot more to come in the upcoming days; learning the different aspects of Copilot has become a necessity these days.

That's it for today… hope this helps…

Cheers,

PMDY