Choosing The Right Data Source for Power Apps & Power Automate

One of the biggest challenges we face when building apps and automations is deciding where to store our data. Sometimes this choice is dictated to us by licensing and architectural factors; however, if you do have a choice of options, then this blog post is for you.

Think about the way that cars are advertised. Most car manufacturers have a super mini, family hatchback, crossover, and SUV offering, and from an adoption perspective each one acts as a stepping stone towards the next model up as your wants and needs become more sophisticated! Great for us, but also great marketing for the supplier!

We all want the Audi RSQ8, but right now we might only be able to afford the Audi S1, or we might not want to commit the investment of the most expensive one right now. Note: Other car manufacturers are available of course!

Anyway, back to the technology. The most commonly used data sources for Power Apps and Power Automate are Excel, SharePoint, Dataverse for Teams, and Dataverse, so let’s compare the options and understand how our needs can be met within Microsoft 365.

Microsoft Excel, the super-mini.

I have an ‘I 💖 Spreadsheets’ mug for my morning tea, and for some reason the world just can’t get enough of spreadsheets! Many organisations around the world are run on spreadsheets and nothing else. Excel was revolutionary when it was first released.

A screenshot of Excel being used to create a shopping list.
A shopping list swiftly created via Excel.

Microsoft Excel is a hugely popular and fulfilling software tool, providing us with a quick way to format lists, calculate information, and visualise data. Its formula functionality is so successful that Power Fx, Microsoft’s language for developing Power Platform components, was inspired by it!

Excel is not a relational database though, and unless you have already defined a digital adoption strategy, spreadsheets tend to be saved locally on team members’ devices, leading to a loss of business data over time.

When to use spreadsheets:

  • Quick lists
  • Personal recording of information
  • Extraction of data from another system into a universal format

When to seek one of the alternatives in this post:

  • When data is shared across multiple people or departments
  • When data repeats the same information multiple times, such as contact details

Lists (SharePoint), the hatchback.

Microsoft Lists has become a more prominent feature of SharePoint, rebranded to position the product primarily for use within Microsoft Teams. Lists combine the familiarity of Microsoft Excel with concepts from relational databases and the centralisation of information, helping to increase data quality significantly in comparison.

A screenshot of Microsoft Lists on the mobile and on a tablet device.
Microsoft Lists has the same features available regardless of what device you use.

Microsoft Lists can allow categorisation, links to users, and formatted fields with extraordinarily little effort. They won’t solve every problem, but they will help to keep sight of business data, and you can even generate apps and automation from Lists directly too.

Microsoft Lists is a well-received solution to our hybrid working scenarios where Microsoft Teams plays a huge part in operations.

When to use Lists:

  • Track information and progress within a team
  • Organise work and assign owners
  • Indirect benefit of preparing our business for modern cloud solutions that integrate across all of Microsoft 365

When to seek one of the alternatives in this post:

  • When sensitive data requires better security considerations
  • When your data needs to flow into another process in another system, or with another department

Dataverse for Teams, the crossover.

Following the release of Dataverse (previously the Common Data Service, or the on-premises Dynamics CRM SQL database to some of us older folk!), Microsoft also released Dataverse for Teams. This has been a fantastic middle ground, offering organisations a step into the world of relational databases within the Power Platform without having to commit to a licensing investment up front. The benefits of taking a relational database approach here are huge.

A screenshot of creating a Power App in Teams, which will lead to the creation of a Dataverse for Teams environment.
Creating a Power App in Teams will lead to the creation of a Dataverse for Teams environment.

There are some significant caveats in comparison to Dataverse, but you can set up your own database within a Microsoft Team and build apps and automations on top of it to serve your end users. Remember, this comes at no additional cost!

When to use Dataverse for Teams:

  • Small operational processes that require a team scope that can be defined within a Microsoft Team
  • When you need to build appetite for further Power Platform delivery in the future to demonstrate the art of the possible with little investment
  • When you plan to invest in Dataverse in the future, as the upgrade path from Dataverse for Teams to Dataverse is much more seamless than a migration project

When to seek Dataverse as an alternative:

  • When you need to retrieve data from sources outside of the Microsoft Team your Dataverse for Teams environment lives in
  • When you need to deliver Application Lifecycle Management (ALM) and utilise the concept of ‘development’ and ‘production’ for your solutions.
  • When you want to start utilising Dynamics 365 apps using the same database as your custom solutions.

Dataverse, the SUV.

We now look towards our final data source for review. I love describing Dataverse as the SUV. We see a nice car on the motorway with all the extras, we look up to it for inspiration on our next purchase, and one day we finally get to buy the dream car, and it just works.

Dataverse is the same: it’s a full database offering with a comprehensive set of features that require no expertise in SQL, just a theoretical understanding of relational databases and normalisation.

A screenshot of a Developer Dataverse environment.
A standard database environment configured for developer use.

Dataverse helps us to create a single source of truth, and it helps us to share data from one record and relate it to others. Over time, as multiple users build upon the quality of the data, you gain a significantly better understanding of how your business operates, which will help you to further improve your efficiency and services in the future. It’s worth the investment, and it’s worth setting up a free developer account to explore the possibilities if you haven’t already.
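To make the ‘relating one record to others’ idea concrete, here’s a minimal Power Fx sketch. It assumes the standard Dataverse Account table with its out-of-the-box Account Name and Main Phone columns; the account name used here is purely hypothetical.

// Hypothetical sketch: the phone number is stored once on the Account record,
// so any app or automation can look it up rather than repeating it elsewhere.
LookUp(Accounts, 'Account Name' = "Contoso Ltd", 'Main Phone')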

Conclusion

As we’ve discovered, there are so many tools at our disposal, even just within the Microsoft 365 stack when we’re delivering apps and automations.

My recommendation would always be to ‘climb down’ rather than ‘climb up’. We all know how easy it is to set up a spreadsheet and often we talk ourselves out of using another tool.

If we step back for a moment and consider our audience, our data model, and the impact across the organisation, it may be far better to rule out Dataverse first, rather than having to justify its purpose three levels away from our currently proposed ‘easy’ solution, which could cause maintenance issues in the future.

Consistent Y-Axis in Model-Driven App Charts

This week I faced a very old-school problem with model-driven apps, dating back to the days of working on-premises with Dynamics CRM: I needed to show two different series on the same chart, but every time I viewed the chart it would show me two different scales on the same y-axis!

As this chart was intended for comparing values, the out-of-the-box behaviour makes it meaningless: in some respects, smaller numbers look bigger than their counterparts, as shown below with test data in my development environment.

A screenshot of a model-driven app chart with two different Y-Axis scales for the same type of data.
Note that the y-axis reference on the left has a higher increment for each bar in comparison to the right y-axis.

We can resolve this with a few steps by editing code, and here’s how.

Step 1: Back up your environment

Before we get started, please note that Microsoft has gone a very long way to make solutions a no-code option for deploying components. Editing solution files is not recommended unless the requirement cannot be fulfilled any other way and you are absolutely confident in how solutions are composed and deployed.

Always back up your database before significant operations, and seek support from peers if you are uncertain. It’s also worth considering how impactful this change is, and whether the effort vs. benefit stacks up in the correct way.

Step 2: Create a temporary solution file

To make sure that our changes persist from development to production, we will need to re-import our code changes back into the source environment once complete. This way, the changes are recognised every time we export from source and deploy to target; otherwise, you would need to repeat the change for every future deployment, which carries risk due to the frequency of that activity.

A screenshot of a user creating a new temporary solution file.
This helps us to target the components that we need to change that otherwise live in another solution, and therefore reduces risk considerably.

Step 3: Add your chart(s)

When we add existing components, we’ll need to locate the Table and its associated Chart components for change.

A screenshot of a user selecting multiple existing charts to add to their solution.

Again, let’s make sure that we only add what we need here.

Step 4: Export an unmanaged copy of the solution

As we need to edit the code that sits within the solution, we must export it so that we can make the changes and re-import it here later.

A screenshot of a user selecting the Unmanaged option for the solution export.

By selecting Unmanaged, we retain full control over the customisations once the solution is re-imported, which is important as we probably want to keep the changes but remove the solution file later on.

Step 5: Unzip your file and make the change

Solution files download as .zip archives, so we need to extract them before we can work on the contents. Once extracted, you’ll see three files:

  • [Content_Types].xml
  • customizations.xml
  • solution.xml

We now need to open customizations.xml in our favourite code editor, preferably Notepad++ or Visual Studio Code. Search for the word ‘Secondary’ within the code and remove the YAxisType attribute. You will have one of these tags for each chart you’ve added that has multiple y-axes, and in this instance I have two due to the two chart components I selected earlier.

A screenshot of a user editing the customizations.xml file in Visual Studio Code.

Original code:

<Series ChartType="Line" IsValueShownAsLabel="True" BorderWidth="3" MarkerStyle="Square" MarkerSize="9" MarkerColor="37, 128, 153" MarkerBorderColor="37, 128, 153" YAxisType="Secondary" />

Amended code:

<Series ChartType="Line" IsValueShownAsLabel="True" BorderWidth="3" MarkerStyle="Square" MarkerSize="9" MarkerColor="37, 128, 153" MarkerBorderColor="37, 128, 153" />

Step 6: Zip up the files and re-import

This is where we need to be extremely careful: select the three extracted files and compress them into a .zip file.

At this point, Windows will ask you for a name for the .zip file. Make sure that this name is exactly the same as the original file name that you exported. As long as you zip these files up anywhere other than the location that you downloaded the original .zip file to, you will have no naming conflicts, and you’ll then be able to go back to your browser to import the newly compressed .zip file.
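As a quick sanity check before importing, the re-compressed archive should mirror the original export, with the three files sitting at the root of the .zip rather than inside a sub-folder. For example, assuming your temporary solution was exported as ChartFix_1_0_0_0.zip (a made-up name for illustration):

ChartFix_1_0_0_0.zip
  ├── [Content_Types].xml
  ├── customizations.xml
  └── solution.xml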

A screenshot of the newly modified .zip file being uploaded to the environment.

And there you have it! I have mentioned this just a few times before, but remember that this is a relatively complex and risky operation that should be executed with focus and confidence. I had the luxury of working on on-premises versions of Dynamics "CRM" well before some Power Apps developers were out of secondary school, but if you aren’t so sure, please do reach out to me or to someone else who may be able to help with the more technical elements of this activity.

A screenshot of the final result, showing one y-axis for both lines in the graph.

Now that you’ve seen the results, you are safe to carefully remove each component from your temporary solution, before finally removing the solution itself.

Reduce Columns Created in a Collection in Canvas Apps

One of the first lessons I learned when getting to grips with Canvas Apps was that you should always use Collections where possible to reduce the number of calls to the original data source, and with any luck, you may see a performance increase as a result too. However, I often find that collecting from a data source brings back a number of columns that I am never going to use in the Canvas App itself.

Let’s take the example of listing Account records from Dataverse using a simple Power Fx statement:

ClearCollect(ListOfAccounts, Accounts);

As you can see below, there are a significant number of columns that I don’t plan to use relating to various relationships across the Dataverse database.

A screenshot showing a Collection in Canvas Apps returning all fields from the data source.

These columns are extremely important for the database and we shouldn’t underestimate their criticality, but they are not necessarily important when building a Canvas App, as I just want to retrieve the Account Name and the Account ID.

We can make a small change to the original Power Fx statement by expressing exactly which columns to keep, such as:

ClearCollect(ListOfAccounts, ShowColumns(Accounts, "name", "accountid"));

This in turn produces a Collection that is much more refined, as shown below.

A screenshot showing a Collection in Canvas Apps returning a more defined list of columns based on my needs for the app.
This won’t necessarily change the code that you write within your app, other than reducing the collection’s size; however, when you start to write Power Fx within your components, you’ll see a much shorter and more focused list of available attributes when retrieving data from your collection!
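As a quick illustration of how the trimmed collection might then be used, here’s a hedged sketch of a gallery’s Items property; txtSearch is an assumed Text Input control, and "name" is one of the logical column names kept by ShowColumns above:

// Hypothetical example: filter and sort the in-memory collection for a gallery.
SortByColumns(
    Filter(ListOfAccounts, StartsWith(name, txtSearch.Text)),
    "name",
    SortOrder.Ascending
)

Because the collection already lives in memory, your component formulas work against a short, intentional list of columns rather than everything the data source can offer.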

Translating Unknowns into Tangible Requirements

For me, the most exciting part of a project is the challenge of figuring out exactly what a client is asking for based on a very short brief provided in an introductory call.

This challenge is increased in my industry when you move from Dynamics 365-based projects to pure Power Platform projects, because you move away from a functionally built system to a set of tools that enable the capability. Not only do we now have to qualify the tool, but we also need to qualify the business process at an earlier stage than we typically used to, as well as the full data model.

For example, a “helpdesk replacement tool” screams Dynamics 365 Customer Service, and consultants in the industry typically understand the core operational processes before they speak to a customer. On the Power Platform, however, no two ‘self-serve chatbot’ projects would ever be the same, and there’s no functional process that you can align to this.

So how do we quantify projects with so many unknowns when we need to fully design the data model, the user interface, and the functional process? One way to start is to look for three themes:

  • Trends
  • Assumptions
  • Caveats

Trends

The first consideration I make is whether there are any repeatable components for any given high-level requirement.

Whilst this doesn’t necessarily give us the full requirement ready to build, it does give us an idea of the size of the scope in contrast to a solution that is easier to estimate. Let’s take the idea of implementing a chatbot for a client on their website.

As a website user, I want to be able to engage with a chatbot, so that I can easily find out store opening times and current stock levels.

Within the industry I work in, we know that a configurable Power Virtual Agents for Teams solution that only uses Entities is relatively straightforward and doesn’t require code. The interface used to build the solution is entirely controlled by Microsoft, so we also have confidence that it works! Let’s now put our original requirement into context by using known unknowns:

  • We know that the client cannot deploy this through Teams, but we don’t necessarily know exactly how to deploy it through a website that we don’t control just yet.
  • We are not being asked to build their website and we don’t know what their data source is, but we do know that we can take advantage of data and automation services that we can control to make this easier, perhaps Microsoft Dataverse with some sort of movement of data via Power Automate?

We have now broken down the requirement into tangible considerations, and we can justify risk and complexity based on what we do know and what we can control, so we should factor this into our estimate right from the beginning.

As a website user, I want to be able to engage with a chatbot, so that I can easily find out store opening times and current stock levels.

Trends:

1. Power Virtual Agents for intelligent chatbot functionality.

2. Power Automate to drive dynamic data interactions between end user and data source.

3. Dataverse to assist with controlling data where necessary.

Assumptions

Next up, assumptions. We are often taught that making assumptions is a bad thing, and in most cases that is correct, but assumptions can be extremely powerful when defining a requirement if used correctly.

Taking our earlier example of a chatbot being deployed via a client’s website, we really don’t want to be developing the website in unfamiliar territory, nor do we want to run into any bumps if their data source isn’t fit for purpose. For now, we can set assumptions against our requirement to portray what we would typically expect within the client’s landscape, and if any of these are found not to be true, then we can justify a change in direction for the requirement through a change of scope, estimate, and change request!

As a website user, I want to be able to engage with a chatbot, so that I can easily find out store opening times and current stock levels.


Assumptions:
1. Assumes that the client’s existing data model is fit for purpose, and if any changes should be made, the client will take responsibility for these.

2. Assumes that the solution can be deployed using an embedded HTML code snippet, as per Microsoft’s standard approach (see the illustrative sketch below).
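For illustration only, the kind of embed snippet referred to above typically looks something like the sketch below; the URL path and identifiers are placeholders rather than real values, so treat this as an assumption to validate against the embed code Microsoft publishes for your bot:

<!-- Hypothetical embed snippet: environment-id and bot-id are placeholders -->
<iframe
  src="https://web.powerva.microsoft.com/environments/{environment-id}/bots/{bot-id}/webchat"
  frameborder="0"
  style="width: 100%; height: 600px;">
</iframe>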

Caveats

And last but not least, we have caveats. Clients may see these as the supplier creating ‘get out of jail free’ cards, but in reality, these are to ensure that everyone involved understands what should happen in the event that one of these factors occurs. Caveats are usually based on assumptions, but can extend further than this to cover typical project factors too.

As a website user, I want to be able to engage with a chatbot, so that I can easily find out store opening times and current stock levels.


Caveats:
1. If the data source should change after delivery, the client will be responsible for a change request for any errors that may occur with this solution if they wish to continue using the functionality.

2. If the client’s website cannot support HTML snippets for any given reason, the project may need to be delivered via a Power Apps Portal, which would incur extra cost to ensure the delivery is built to the correct standard.

Summary

When I describe this way of working with my team, I reference a phrase that may be familiar to some: it’s about the journey, not the destination. Imagine you have a 100-mile journey to make with no map, digital or print. What would be your first move?

Success isn’t just the destination, or the solution in this case, it’s the route to it and the service provided along the way that counts. This continues to be a significant theme throughout the whole lifecycle of the project, and it can make or break the final engagement with the software.

Expect Dataverse Deployments To Fail First Time

Whilst the process of deployment hasn’t changed too much since the days of Dynamics CRM, one thing that has changed significantly is the volume of possible components that can be included in a solution file.

Not only is this due to an increase in readily available functionality from Microsoft, but also to the ability for end users to install their own components, which in turn creates more dependencies on what we think is our small solution of configuration changes to be deployed from one environment to another. This can increase the number of failures that occur during delivery, and often the error shown to the end user isn’t very helpful.

A generic error provided by the Power Platform when trying to deploy a solution.

Solution deployment failures don’t have to be a problem; in fact, we should expect them.

In this blog post I will help you understand how to troubleshoot a failed deployment so that you can solve the issue in an informed way.

Step 1: Download A Code Editor

We want to ensure that the output from the failure is in a readable format, and for this we need a code editor that recognises XML formatted files. My preference as a functional consultant who needs to open the occasional file is Notepad++. It’s free, and it has an XML Tools plugin which allows you to ‘pretty format’ any XML files. You can also use Visual Studio or Visual Studio Code – I suspect some of you reading this will already have one of these installed!

Step 2: Download The Solution’s Log File

Whenever someone approaches me with a failed deployment, the first thing I ask them for is the log file. When you open this file in Notepad++, use ctrl+alt+shift+B, which will ‘pretty format’ your XML file. It’ll look something like this:

A screenshot of Notepad++ with XML Tools plugin installed. The file shown here is using 'pretty format' to make the code readable.

It looks difficult to decipher to the untrained eye, but we can quickly start to understand why the solution is failing with a few tips when we break down the file.

Step 3: Understand The Dependency

Let’s take a look at the first dependency, defined by the <MissingDependency> XML tags.

A snippet of code showing a missing dependency.

You’ll notice a <Required> line and a <Dependent> line which both include a Type. This, alongside the schema name, is the most important part of the dependency, as the two combined tell us what we’re looking for.

Fortunately we don’t need to remember all of the types as Microsoft provide a handy reference guide here.

We simply need to cross-reference the numbers in our dependency, and we now know that to complete the deployment we need to include the “Offering” entity (table) for the “Service” System Form.
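For context, here is a simplified and purely illustrative sketch of what such an entry can look like; the schema and display names are placeholders based on this example, and the type numbers (1 for an Entity, 60 for a System Form) are worth double-checking against the reference guide mentioned above:

<!-- Illustrative only: names and values below are placeholders, not taken from a real log -->
<MissingDependency>
  <Required type="1" schemaName="new_offering" displayName="Offering" solution="Active" />
  <Dependent type="60" schemaName="Service Information Form" displayName="Service" />
</MissingDependency>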

Step 4: Modify Your Solution

We have two choices here:

  1. Remove the Service System Form from the solution, or,
  2. Add the Offering entity (table) into the solution.

In this particular instance it would make more sense to add the Offering into the solution, but sometimes you may challenge whether the component is really needed within your deployable solution, in which case, you’d remove the System Form.

Step 5: Rinse & Repeat

Not all dependencies will be resolved within one solution modification, but that’s OK; you may need to repeat steps 3 and 4 multiple times before you have a solution file that can be successfully deployed. The key is to remember that failures can be expected, and that they don’t always have to be a problem!