Choosing The Right Data Source for Power Apps & Power Automate

One of the biggest challenges that we face when building apps and automations is deciding where to store our data. Sometimes this choice may be dictated to us by licensing and architectural factors; however, if you do have a choice of options, then this blog post is for you.

Think about the way that cars are advertised. Most car manufacturers have a super-mini, family hatchback, crossover, and SUV offering, and from an adoption perspective each one acts like a stepping stone towards the next model up as your wants and needs become more sophisticated! Great for us, but also great marketing for the supplier!

We all want the Audi RSQ8, but right now we might only be able to afford the Audi S1, or we might not want to commit the investment of the most expensive one right now. Note: Other car manufacturers are available of course!

Anyway, back to the technology, the most used data sources for Power Apps and Power Automate are usually Excel, SharePoint, Dataverse for Teams, and Dataverse, so let’s compare the options and understand how our needs can be met within Microsoft 365.

Microsoft Excel, the super-mini.

I have an ‘I 💖 Spreadsheets’ mug for my morning tea, and for some reason the world just can’t get enough of spreadsheets! Many organisations around the world are run on spreadsheets and nothing else. It was revolutionary at the time it was released.

A screenshot of Excel being used to create a shopping list.
A shopping list swiftly created via Excel.

Microsoft Excel is a hugely popular and capable software tool, providing us with a quick way to format lists, calculate information, and visualise data. Its formula functionality is so successful that Power Fx, Microsoft’s language for developing Power Platform components, was inspired by it!

Excel is not a relational database though, and unless you have already defined a digital adoption strategy, spreadsheets tend to be saved locally on team members’ devices, leading to a loss of business data over time.

When to use spreadsheets:

  • Quick lists
  • Personal recording of information
  • Extraction of data from another system into a universal format

When to seek one of the alternatives in this post:

  • When data is shared across multiple people or departments
  • When data repeats the same information multiple times, such as contact details

Lists (SharePoint), the hatchback.

Microsoft Lists has become a more prominent feature of SharePoint and has been rebranded to position the product primarily for use within Microsoft Teams. Lists combines the familiarity of Microsoft Excel with concepts from relational databases and the centralisation of information, helping to increase data quality significantly in comparison.

A screenshot of Microsoft Lists on the mobile and on a tablet device.
Microsoft Lists has the same features available regardless of what device you use.

Microsoft Lists allows categorisation, links to users, and formatted fields with extraordinarily little effort. It won’t solve every problem, but it will help you to keep sight of business data, and you can even generate apps and automations from Lists directly too.

Microsoft Lists is a well-received solution to our hybrid working scenarios where Microsoft Teams plays a huge part in operations.

When to use Lists:

  • Track information and progress within a team
  • Organise work and assign owners
  • Indirect benefit of preparing our business for modern cloud solutions that integrate across all of Microsoft 365

When to seek one of the alternatives in this post:

  • When sensitive data requires better security considerations
  • When your data needs to flow into another process in another system, or with another department

Dataverse for Teams, the crossover.

Following the release of Dataverse (previously the Common Data Service, or the on-premises Dynamics CRM SQL database to some of us older folk!), Microsoft also released Dataverse for Teams. This has been a fantastic middle ground, offering organisations a step into the world of relational databases within the Power Platform without having to commit to a licensing investment up front. The benefits of taking a relational database approach are huge.

A screenshot of creating a Power App in Teams, which will lead to the creation of a Dataverse for Teams environment.
Creating a Power App in Teams will lead to the creation of a Dataverse for Teams environment.

There are some significant caveats in comparison to Dataverse, but you can set up your own database within a Microsoft Team and build apps and automations on top of it to serve your end users. Remember, this comes at no extra cost with qualifying Microsoft 365 licences!

When to use Dataverse for Teams:

  • Small operational processes that require a team scope that can be defined within a Microsoft Team
  • When you need to build appetite for further Power Platform delivery in the future to demonstrate the art of the possible with little investment
  • When you plan to invest in Dataverse in the future, as the upgrade path from Dataverse for Teams to Dataverse is much more seamless than a migration project

When to seek Dataverse as an alternative:

  • When you need to retrieve data from sources outside of the Microsoft Team your Dataverse for Teams environment lives in
  • When you need to deliver Application Lifecycle Management (ALM) and utilise the concept of ‘development’ and ‘production’ for your solutions.
  • When you want to start utilising Dynamics 365 apps using the same database as your custom solutions.

Dataverse, the SUV.

We now look towards our final data source for review. I love describing Dataverse as the SUV: we see a nice car on the motorway with all the extras, we look to it for inspiration on our next purchase, and one day we can finally buy this dream car, and it just works.

Dataverse is the same: it’s a full database offering with a comprehensive set of functions that require no expertise in SQL, just a theoretical understanding of relational databases and normalisation.

A screenshot of a Developer Dataverse environment.
A standard database environment configured for developer use.

Dataverse helps us to create a single source of truth, sharing data from one record and relating it to others. Over time, as multiple users build upon the quality of the data, you gain a significantly better understanding of how your business operates, which will help you to further improve your efficiency and services in the future. It’s worth the investment, and it’s worth setting up a free developer account to explore the possibilities if you haven’t already.

Conclusion

As we’ve discovered, there are so many tools at our disposal, even just within the Microsoft 365 stack when we’re delivering apps and automations.

My recommendation would always be to ‘climb down’ rather than ‘climb up’. We all know how easy it is to set up a spreadsheet and often we talk ourselves out of using another tool.

If we step back for a moment and consider our audience, our data model, and the impact across the organisation, it may be far better to rule out Dataverse first than to justify its purpose three levels away from our currently proposed ‘easy’ solution, which could cause maintenance issues in the future.

Consistent Y-Axis in Model-Driven App Charts

This week I faced a very old-school problem with model-driven apps, from the days of working on-premises with Dynamics CRM: I needed to show two different series on the same graph, but every time I viewed the chart it showed me two different scales on the same y-axis!

As this chart was for the purpose of comparing values, the out-of-the-box chart becomes meaningless: in some respects, smaller numbers look bigger than their counterparts, as shown below with test data in my development environment.

A screenshot of a model-driven app chart with two different Y-Axis scales for the same type of data.
Note that the y-axis reference on the left has a higher increment for each bar in comparison to the right y-axis.

We can resolve this with a few steps by editing code, and here’s how.

Step 1: Back up your environment

Before we get started, please note that Microsoft has gone a very long way to make solutions a no-code option for deploying components. It is not recommended to edit solution files unless the requirement cannot be fulfilled any other way and you are absolutely confident in how solutions are composed and deployed.

Always back up your database before significant operations, and seek support from peers if you are uncertain. It’s also worth considering how impactful this change is, and whether the effort vs. benefit stacks up in the correct way.

Step 2: Create a temporary solution file

To make sure that our changes persist from development to production, we will need to re-import our code changes back into the source environment once complete. This ensures the changes are recognised every time we export from source and deploy to target in the future; otherwise, you would need to repeat the change every time you deploy to a new environment, which carries risk due to the frequency of that activity.

A screenshot of a user creating a new temporary solution file.
This helps us to target the components that we need to change that otherwise live in another solution, and therefore reduces risk considerably.

Step 3: Add your chart(s)

When we add existing components, we’ll need to locate the Table and its associated Chart components for change.

A screenshot of a user selecting multiple existing charts to add to their solution.

Again, let’s make sure that we only add what we need here.

Step 4: Export an unmanaged copy of the solution

As we need to edit the code that sits within the solution, we must export it so that we can make the changes and re-import it here later.

A screenshot of a user selecting the Unmanaged option for the solution export.

By selecting Unmanaged, we retain full control over the customisations once the solution is re-imported, which is important as we probably want to keep the changes but remove the solution file later on.

Step 5: Unzip your file and make the change

Solution files download as .zip files, so we need to extract the files before we can work on them. When you extract the files, you’ll see three files:

  • [Content_Types].xml
  • customizations.xml
  • solution.xml

We now need to open up customizations.xml in our favourite code editor, such as Notepad++ or Visual Studio Code. Search for the word ‘Secondary’ within the file, and remove the YAxisType attribute. You will have one of these tags for each chart that you’ve added with multiple y-axes, and in this instance I have two, due to the two chart components that I selected earlier.

A screenshot of a user editing the customizations.xml file in Visual Studio Code.

Original code:

<Series ChartType="Line" IsValueShownAsLabel="True" BorderWidth="3" MarkerStyle="Square" MarkerSize="9" MarkerColor="37, 128, 153" MarkerBorderColor="37, 128, 153" YAxisType="Secondary" />

Amended code:

<Series ChartType="Line" IsValueShownAsLabel="True" BorderWidth="3" MarkerStyle="Square" MarkerSize="9" MarkerColor="37, 128, 153" MarkerBorderColor="37, 128, 153" />

Step 6: Zip up the files and re-import

Now this is where we need to be extremely careful: select the three extracted files and compress them into a .zip file.

At this point, your Windows device will ask for a name for the .zip file. Make sure that the name of the new file is exactly the same as the original file name that you exported. Also make sure that you compress the three files themselves rather than the folder containing them, so that the XML files sit at the root of the archive. As long as you zip these files up anywhere other than the same location that you downloaded the original .zip file to, you will have no problems doing this, and you’ll then be able to go back to your browser to import the newly compressed .zip file.

A screenshot of the newly modified .zip file being uploaded to the environment.

And there you have it! I have mentioned this just a few times before, but remember that this is a relatively complex and risky operation that should be executed with focus and confidence. I had the luxury of working on on-premises versions of Dynamics “CRM” well before some Power Apps developers were out of secondary school, but if you aren’t so sure, please do reach out to me or to someone else who may be able to help with the more technical elements of this activity.

A screenshot of the final result, showing one y-axis for both lines in the graph.

Now that you’ve seen the results, you are safe to carefully remove each component from your temporary solution, before finally removing the solution itself.

Reduce Columns Created in a Collection in Canvas Apps

One of the first lessons I learned when getting to grips with Canvas Apps was that you should always use Collections where possible to reduce the number of calls to the original data source, and with any luck you may see a performance increase as a result too. However, I often find that the data source I’m using returns a number of columns that I am never going to use in the Canvas App itself.

Let’s take the example of listing Account records from Dataverse using a simple Power Fx statement:

ClearCollect(ListOfAccounts, Accounts);

As you can see below, there are a significant number of columns that I don’t plan to use relating to various relationships across the Dataverse database.

A screenshot showing a Collection in Canvas Apps returning all fields from the data source.

These columns are extremely important for the database and we shouldn’t underestimate their criticality, but they are not necessarily important for me when building a Canvas App, as I just want to retrieve the Account Name and the Account ID.

We can make a small change to the original Power Fx statement, by expressing exactly which columns to use, such as:

ClearCollect(ListOfAccounts, ShowColumns(Accounts, "name", "accountid"));

Which in turn produces a Collection that is much more refined, shown below.

A screenshot showing a Collection in Canvas Apps returning a more defined list of columns based on my needs for the app.

This won’t necessarily change the code that you write within your app, other than the collection’s size itself; however, when you start to write Power Fx within your components, you’ll see a much shorter and more defined list of available attributes when retrieving data from your collection!
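If the remaining logical names are unwieldy, ShowColumns pairs nicely with RenameColumns. Here’s a minimal sketch, assuming the same Accounts source as above (the friendly name AccountName is hypothetical):

ClearCollect(
    ListOfAccounts,
    RenameColumns(
        ShowColumns(Accounts, "name", "accountid"),
        "name", "AccountName"
    )
);

Your components can then refer to AccountName rather than the logical column name.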

How to convert UTC into Your Local Timezone in Canvas Apps

One of the technical challenges we have in the UK is that for half of the year we are in the UTC time zone that we’re all familiar with, and the other half we’re in British Summer Time (BST). Those lucky few that keep the same time zone all year don’t know how easy they have it!

It can be quite confusing, as some digital solutions (including Dynamics 365) list UTC and our local time as two separate time zone options yet label both of them UTC, while others don’t make this distinction at all, and you may have seen data that you just submitted appear with a date stamp of ‘1 hour ago’. This is easily done if you’re non-technical. Why would you ever consider having to change your time zone if you can already see ‘UTC’ in the dropdown?

This doesn’t have a major material impact until you’re working with date values without times, particularly if the solution you’re using only allows you to control the date entry from the front end, and not the time entry. The difficulty we face in this scenario is that an application could even show yesterday’s date!

Yesterday’s date? Are you sure?

Well, submitting data at 2pm during your workday doesn’t cause too much of an issue; you might see a data entry from 1pm instead. But what if you submit a ‘date only’ value, or (hopefully you’re not working at this time) submit at some point between 00:00 and 00:59?! In this instance, the application can often confuse the user and present the data back as yesterday’s date instead!

How do I prevent this?

Fortunately, we don’t have any problems when submitting data, as values are always submitted in UTC and converted appropriately.

The issue we face occurs when we are trying to retrieve data from a data source, where (for example) the database stores the date as 30/07/22 00:00:00, but our Canvas App reads this from the data source as 29/07/22 23:00:00 due to the database storing our submitted date in UTC.
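To illustrate with those values, here’s a minimal Power Fx sketch (the literal date assumes a UK locale for DateTimeValue):

// TimeZoneOffset() returns -60 during BST, so -TimeZoneOffset() adds 60 minutes
DateAdd(DateTimeValue("29/07/2022 23:00"), -TimeZoneOffset(), Minutes)

In a BST session, this returns the local date and time that was originally stored: 30/07/2022 00:00.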

I discovered this when using the Outlook Tasks Connector to pull in today’s To Do items into a Collection, rather than using the Today() function to compare dates.

Check out the example below:

DateAdd(DateTimeValue(DueDateTime.DateTime), -TimeZoneOffset(), Minutes) = Today()

“Add the negative of my local time zone offset in minutes to the stored date, and then show me all of the To Do items where the converted DueDateTime.DateTime value is equal to today’s date.”

Note: For this particular connector I needed to explicitly specify DateTimeValue as the format, but you don’t need to do this for all Connectors.

That’s all. Fortunately, Power Fx allows us to grab the offset for the time zone I am currently in, but we must be aware that this value is negative for time zones ahead of UTC, and therefore we need to negate it in order to add the correct number of minutes. I’ll be using this in every Canvas App I build now, particularly as I work in an organisation that spans multiple time zones!
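Putting it together, a minimal sketch of collecting only today’s To Do items (the collection and source names here are hypothetical; DueDateTime follows the Outlook Tasks Connector shape used above):

ClearCollect(
    TodaysTasks,
    Filter(
        MyOutlookTasks,
        DateAdd(DateTimeValue(DueDateTime.DateTime), -TimeZoneOffset(), Minutes) = Today()
    )
);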

Modify An Owner’s Connection References in Power Automate

No matter how amazing an organisation may be, there will unfortunately always be the possibility of someone leaving. When it comes to Power Automate, this means you can be stuck with a Cloud Flow whose original Owner has left the organisation, where Connection References eventually error, leading to the failure of an automated process that may be critical to business systems.

Referenced Forever?!

At present, there is no way to remove the original Owner of a Cloud Flow, even if you manage to establish yourself as a Co-Owner, and Connection References cannot necessarily be deleted either!

Workaround

In this example I’ll use the Centre of Excellence Starter Kit environment that I inherited from a previous colleague, and for demo purposes I’m going to modify the Dataverse Legacy Connector as it is currently in the correct state to demo.

Let’s make our Connection References valid, and eventually fix the Cloud Flow by following the below steps for each Connection Reference:

A screenshot of a Power Platform environment, looking at Connection References within the Default Solution.
Your screen should look similar to the above screenshot at this stage.
  • Open the Connection Reference that you wish to modify. Hint: Filter by Owner to get to your reference quickly if you have many to search through.
  • Click on Edit.
  • Select the Dropdown with the existing Connection and re-point it to an existing valid Connection or create a new one.
  • Repeat those same steps for every invalid Connection Reference.

But Wait!…

Now there are some caveats to this approach which you should consider during this process:

  1. This does not remove the Owner from the flow, but it stops the Owner’s account from being used as a Connection Reference when using a data source in your Cloud flow.
  2. In my instructions I asked you to navigate to the Default Solution. For the consultants among us, with great power comes great responsibility. Be careful here, and use the original Unmanaged solution if you can. In most circumstances, you will be presented with Managed solutions and will be forced to use the Default solution.
  3. To ensure that you can see the full scope of your solution and automations, you ideally need to be a System Administrator to complete this exercise.

Find & Use Microsoft To Do For Your Personal Account in Power Automate

Way before Microsoft had a fully-fledged Outlook and Microsoft To Do app for iOS and Android, there were two apps that tightly integrated with each other to form an absolute machine in productivity – Sunrise and Wunderlist.

Tasks would show as ‘All Day’ items at the top of your calendar, with ticks next to each one completed as a frequent reminder of progress as you check your calendar for the seventeenth time during the working day.

A digitally produced image of Sunrise Calendar with Wunderlist Integration on an iPad
Sunrise Calendar with Wunderlist Integration on an iPad

Microsoft bought both of those products, which is how we arrived at Microsoft’s eventual Outlook Tasks replacement and the ability to add third-party calendars to Outlook with ease. Not all features were migrated easily though, and I have always wanted a replacement but never found one.

By using the Power Platform, we now have the ability to bring together the capabilities of personal Microsoft To Do with Outlook and other services, all hidden within the Outlook Tasks Connector in Power Automate!

Simply search for the Outlook Tasks when creating a flow, and once you’ve chosen your trigger or action, you’ll be able to see your tasks.

A screenshot showing the selection of a Microsoft To Do list in Power Automate via the Outlook Tasks Connector
Selecting a Microsoft To Do list in Power Automate via the Outlook Tasks Connector

I’m unsure when exactly this feature became available for personal accounts, but Microsoft To Do with business accounts has been available for a while under its own Connector.

What’s the catch?

As with a lot of early Connectors that have since had iterative updates in Power Automate, not all actions are built consistently.

A screenshot showing a list of some of the available Actions within the Outlook Tasks Connector.
A list of some of the available Actions within the Outlook Tasks Connector.

We also have to bear in mind that Microsoft To Do and Outlook Tasks are built on entirely different architectures where functionality has merged over the years, and therefore there are several fields available that may not directly align to what you expect, particularly when trying to use the data you’ve received in another Connector.

Having said all of the above, once you have established the correct Dynamic Values and the correct Actions to use, the connector is extremely reliable and hasn’t failed me yet in any working examples.

References

Microsoft Docs: Outlook Tasks Connector

Microsoft Docs: Microsoft To Do (Business) Connector

Delegation in Canvas Apps

A couple of weeks ago I found an empty slot in my diary, and I (dangerously) thought “I know, I’ll brush up on my Canvas app skills!”.

In my role I find myself looking across multiple Dynamics 365 apps, Excel spreadsheets, and Power BI reports daily, and I set myself the task of bringing all of this together into one place so that I could access all of the data I need with one or two clicks instead of manually transforming data and keeping several browser tabs permanently open.

This was going great, until I saw the dreaded ‘delegation’ warning that all Canvas App novices see very quickly in their careers.

“Delegation warning. The Filter part of this formula might not work on large data sets.”

What is Delegation?

Simply put, delegation is an instruction from the target application to the data source to carry out a query before returning only the subset of results wanted by the target application itself.

This means that we only ever receive the desired data in the target application, and in turn, performance is increased as a result.

When you compare the processing required in this scenario with retrieving every piece of data and then filtering it in the target application, you see a measurable performance increase by using delegation. Without it, you’re also wasting resources by pulling back data into the target application that you want to throw away immediately.

Cause

The wording of this warning can be considered a little misleading. The warning is actually telling us that there will be a lack of delegation to the data source: in this instance, the data source does not have the ability to carry out the conditional logic with its capabilities, and therefore it needs the Canvas App to carry out the query instead.

For example, Power Fx provides the ability to retrieve a day, month, or year value from a Date field, but Dataverse cannot evaluate this within a delegated query; it can only evaluate date ranges such as ‘on or before [Date]’. When querying a ‘month’ in this scenario, you would receive the delegation warning because the delegation cannot happen.

As a result, the full data set from the data source has to be retrieved by the target application, only for the target application to filter the data once it has all been received. This lowers the performance of the app, but it could be worse than that: if the data set exceeds the app’s row limit for non-delegable queries, the remaining rows never arrive at all, leaving you with incomplete results, no error, and a low-quality solution.

Solution

The biggest lesson learned whilst working on delegation recently was from a colleague – there is always a workaround.

Whilst you can’t “fix” the warning with the same piece of code, you can use combinations of delegated conditional logic in order to achieve the same results.

A classic example steps back into using dates in Canvas Apps. In Power Fx I can express “Month = 1”, but Dataverse only supports date ranges, so the Canvas App would need to bring back the full data set to work out whether the month is January. I can’t quite express “in January this year” using delegated logic, so instead I combine two ranges that Dataverse can recognise: “Created On must be on or after 1st January 2021” and “Created On must be on or before 31st January 2021” to obtain the right data at source.
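Expressed in Power Fx, a sketch of that delegable version of “in January 2021” (assuming a Dataverse table with a ‘Created On’ column) looks like this:

// Both conditions are simple date ranges, so Dataverse can filter at source
Filter(
    Accounts,
    'Created On' >= Date(2021, 1, 1) &&
    'Created On' < Date(2021, 2, 1)
)

Using “before 1st February” rather than “on or before 31st January” also catches records created later in the day on the 31st, since ‘Created On’ carries a time component.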

Some examples can be more complicated than this, but a top tip for Dataverse specifically is that if you can achieve it using Advanced Find, then you can be certain that the logic can be delegated!

Have you worked in this space before and found any cool workarounds? Leave a comment below!

Quickly Enable Migrated Power Apps Portal & Power Pages Configuration

Microsoft’s documentation goes to great lengths to explain how we can migrate Power Apps Portal data from one environment to another using the Configuration Migration Tool, but it doesn’t quite go as far as explaining how to re-point the already-provisioned portal to your newly migrated data upon first deployment.

Follow the below steps once you’ve moved your data in order to see your changes come to life!

1a. Locate via Dataverse

Navigate to Apps and find your Portal app from the list. Click on the three dots, and choose ‘Settings‘.

A screenshot of make.powerapps.com highlighting Apps and Administration.

Select the ‘Administration‘ option which will open a new tab.

1b. Locate via Power Platform Admin Centre

Navigate to the Resources tab which will expand to show a Portals option, and find your Portal app from the list.

A screenshot of the Power Platform admin centre, highlighting the Portal and Manage options.

Click on the three dots, and choose ‘Manage‘.

2. Update Portal Bindings

A screenshot of the Power Apps portals admin centre, showing the Update Portal Binding option.

Stay on the ‘Portal Details‘ tab and scroll down to ‘Update Portal Binding‘ and choose the newly migrated Website Record from the list.