Choosing The Right Data Source for Power Apps & Power Automate

One of the biggest challenges we face when building apps and automations is deciding where to store our data. Sometimes this choice is dictated to us by licensing and architectural factors; however, if you have a choice of options, then this blog post is for you.

Think about the way that cars are advertised. Most car manufacturers have a supermini, family hatchback, crossover, and SUV offering, and from an adoption perspective each one acts as a stepping stone towards the next model up as your wants and needs become more sophisticated! Great for us, but also great marketing for the supplier!

We all want the Audi RSQ8, but right now we might only be able to afford the Audi S1, or we might not want to commit the investment of the most expensive one right now. Note: Other car manufacturers are available of course!

Anyway, back to the technology. The most used data sources for Power Apps and Power Automate are Excel, SharePoint, Dataverse for Teams, and Dataverse, so let’s compare the options and understand how our needs can be met within Microsoft 365.

Microsoft Excel, the supermini.

I have an ‘I 💖 Spreadsheets’ mug for my morning tea, and for some reason the world just can’t get enough of spreadsheets! Many organisations around the world are run on spreadsheets and nothing else. Excel was revolutionary when it was released.

A screenshot of Excel being used to create a shopping list.
A shopping list swiftly created via Excel.

Microsoft Excel is a hugely popular and versatile software tool, providing us with a quick way to format lists, calculate information, and visualise data. Its formula functionality is so successful that Power Fx, Microsoft’s language for developing Power Platform components, was inspired by it!

Excel is not a relational database though, and unless you have already defined a digital adoption strategy, spreadsheets tend to be saved locally on team members’ devices, leading to a loss of business data over time.

When to use spreadsheets:

  • Quick lists
  • Personal recording of information
  • Extraction of data from another system into a universal format

When to seek one of the alternatives in this post:

  • When data is shared across multiple people or departments
  • When data repeats the same information multiple times, such as contact details

Lists (SharePoint), the hatchback.

Microsoft Lists is the modern evolution of SharePoint lists, rebranded and repositioned as a product in its own right, primarily for use within Microsoft Teams. Lists combine the familiarity of Microsoft Excel with concepts from relational databases, and by centralising information they significantly improve data quality in comparison to scattered spreadsheets.

A screenshot of Microsoft Lists on the mobile and on a tablet device.
Microsoft Lists has the same features available regardless of what device you use.

Microsoft Lists allows categorisation, links to users, and formatted fields with extraordinarily little effort. Lists won’t solve every problem, but they will help you keep sight of business data, and you can even generate apps and automations directly from a List too.

Microsoft Lists is a well-received solution to our hybrid working scenarios where Microsoft Teams plays a huge part in operations.

When to use Lists:

  • Track information and progress within a team
  • Organise work and assign owners
  • Indirect benefit of preparing our business for modern cloud solutions that integrate across all of Microsoft 365

When to seek one of the alternatives in this post:

  • When sensitive data requires better security considerations
  • When your data needs to flow into another process in another system, or with another department

Dataverse for Teams, the crossover.

Following the release of Dataverse (previously the Common Data Service, or to some of us older folk the on-premises Dynamics CRM SQL database!), Microsoft also released Dataverse for Teams. This has been a fantastic middle ground, offering organisations a step into the world of relational databases within the Power Platform without an initial licensing commitment. The benefits of taking a relational database approach here are huge.

A screenshot of creating a Power App in Teams, which will lead to the creation of a Dataverse for Teams environment.
Creating a Power App in Teams will lead to the creation of a Dataverse for Teams environment.

There are some significant caveats in comparison to Dataverse, but you can set up your own database within a Microsoft Team and build apps and automations on top of it to service your end users. Remember, this is included with qualifying Microsoft 365 licences at no additional cost!

When to use Dataverse for Teams:

  • Small operational processes that require a team scope that can be defined within a Microsoft Team
  • When you need to build appetite for further Power Platform delivery in the future to demonstrate the art of the possible with little investment
  • When you plan to invest in Dataverse in the future, as the upgrade path from Dataverse for Teams to Dataverse is much more seamless than a migration project

When to seek Dataverse as an alternative:

  • When you need to retrieve data from sources outside of the Microsoft Team your Dataverse for Teams environment lives in
  • When you need to deliver Application Lifecycle Management (ALM) and utilise the concept of ‘development’ and ‘production’ for your solutions
  • When you want to start utilising Dynamics 365 apps using the same database as your custom solutions

Dataverse, the SUV.

We now look towards our final data source for review. I love describing Dataverse as the SUV: it’s the nice car we see on the motorway with all the extras, the one we look up to for inspiration on our next purchase, and when we can finally afford this dream car, it just works.

Dataverse is the same: a full database offering with a comprehensive set of features that require no expertise in SQL, just a theoretical understanding of relational databases and normalisation.

A screenshot of a Developer Dataverse environment.
A standard database environment configured for developer use.

Dataverse helps us to create a single source of truth, allowing us to share data from one record and relate it to others. Over time, as multiple users build upon the quality of the data, you gain a significantly better understanding of how your business operates, which will help you to further improve your efficiency and services in the future. It’s worth the investment, and it’s worth setting up a free developer account to explore the possibilities if you haven’t already.

Conclusion

As we’ve discovered, there are so many tools at our disposal, even just within the Microsoft 365 stack when we’re delivering apps and automations.

My recommendation would always be to ‘climb down’ rather than ‘climb up’. We all know how easy it is to set up a spreadsheet and often we talk ourselves out of using another tool.

If we step back for a moment and consider our audience, our data model, and the impact across the organisation, it may be far better to rule out Dataverse first than to have to justify its purpose three levels away from our currently proposed ‘easy’ solution, which could cause maintenance issues in the future.

Consistent Y-Axis in Model-Driven App Charts

This week I faced a very old-school problem with model-driven apps, dating from the days of working on-premises with Dynamics CRM: I needed to show two different series on the same graph, but every time I viewed the chart it showed me two different scales on the same Y-axis!

As this chart was for the purpose of comparing values, this makes the out-of-the-box chart meaningless: in some respects, smaller numbers look bigger than their counterparts, as shown below with test data in my development environment.

A screenshot of a model-driven app chart with two different Y-Axis scales for the same type of data.
Note that the y-axis reference on the left has a higher increment for each bar in comparison to the right y-axis.

We can resolve this with a few steps by editing code, and here’s how.

Step 1: Back up your environment

Before we get started, please note that Microsoft has gone a very long way to make solutions a no-code option for deploying components. Editing solution files is not recommended unless the requirement cannot be fulfilled any other way and you are absolutely confident in how solutions are composed and deployed.

Always back up your database before significant operations, and seek support from peers if you are uncertain. It’s also worth considering how impactful this change is, and whether the effort vs. benefit stacks up in the correct way.

Step 2: Create a temporary solution file

To make sure that our changes persist from development to production, we will need to re-import our code changes back into the source environment once complete. This way, the changes are recognised every time we export from source and deploy to target in the future; otherwise, you would need to repeat this change for every future deployment, which carries risk due to the frequency of that activity.

A screenshot of a user creating a new temporary solution file.
This helps us to target the components that we need to change that otherwise live in another solution, and therefore reduces risk considerably.

Step 3: Add your chart(s)

When we add existing components, we’ll need to locate the Table and its associated Chart components for change.

A screenshot of a user selecting multiple existing charts to add to their solution.

Again, let’s make sure that we only add what we need here.

Step 4: Export an unmanaged copy of the solution

As we need to edit the code that sits within the solution, we must export it so that we can make the changes and re-import it later.

A screenshot of a user selecting the Unmanaged option for the solution export.

By selecting Unmanaged, we retain full control over the customisations once the solution is re-imported, which is important as we probably want to keep the changes but remove the solution file later on.

Step 5: Unzip your file and make the change

Solution files download as .zip files, so we need to extract them before we can work on them. When you extract the archive, you’ll see three files:

  • [Content_Types].xml
  • customizations.xml
  • solution.xml

We now need to open customizations.xml in our favourite code editor, such as Notepad++ or Visual Studio Code. Search for the word ‘Secondary’ within the code, and remove the YAxisType attribute. You will have one of these tags for each chart with multiple Y-axes; in this instance I have two, due to the two chart components that I selected earlier.

A screenshot of a user editing the customizations.xml file in Visual Studio Code.

Original code:

<Series ChartType="Line" IsValueShownAsLabel="True" BorderWidth="3" MarkerStyle="Square" MarkerSize="9" MarkerColor="37, 128, 153" MarkerBorderColor="37, 128, 153" YAxisType="Secondary" />

Amended code:

<Series ChartType="Line" IsValueShownAsLabel="True" BorderWidth="3" MarkerStyle="Square" MarkerSize="9" MarkerColor="37, 128, 153" MarkerBorderColor="37, 128, 153" />

Step 6: Zip up the files and re-import

Now this is where we need to be extremely careful: select the three extracted files and compress them into a .zip file.

At this point, your Windows device will ask for a name for the .zip file. Make sure that the name is exactly the same as the original file name you exported. As long as you zip these files up anywhere other than the folder you downloaded the original .zip file to, there will be no naming conflict, and you’ll then be able to go back to your browser to import the newly compressed .zip file.

A screenshot of the newly modified .zip file being uploaded to the environment.

And there you have it! I have mentioned this just a few times before, but remember that this is a relatively complex and risky operation that should be executed with focus and confidence. I had the luxury of working on on-premises versions of Dynamics “CRM” well before some Power Apps developers were out of secondary school, but if you aren’t so sure, please do reach out to me or to someone else who can help with the more technical elements of this activity.

A screenshot of the final result, showing one y-axis for both lines in the graph.

Now that you’ve seen the results, you can safely remove each component from your temporary solution, before finally removing the solution itself.

Reduce Columns Created in a Collection in Canvas Apps

One of the first lessons I learned when getting to grips with Canvas Apps was that you should use Collections where possible to reduce the number of calls to the original data source, and with any luck you may see a performance increase as a result too. However, I often find that a collection gathers a number of columns that I am never going to use in the Canvas App itself.

Let’s take the example of listing Account records from Dataverse using a simple Power Fx statement:

ClearCollect(ListOfAccounts, Accounts);

As you can see below, there are a significant number of columns that I don’t plan to use relating to various relationships across the Dataverse database.

A screenshot showing a Collection in Canvas Apps returning all fields from the data source.

These columns are extremely important for the database and we shouldn’t underestimate their criticality, but they are not necessarily important when building a Canvas App; here, I just want to retrieve the Account Name and the Account ID.

We can make a small change to the original Power Fx statement by expressing exactly which columns to use:

ClearCollect(ListOfAccounts, ShowColumns(Accounts, "name", "accountid"));

This in turn produces a Collection that is much more refined, as shown below.

A screenshot showing a Collection in Canvas Apps returning a more defined list of columns based on my needs for the app.
This won’t necessarily change the code that you write within your app, other than the size of the collection itself. However, when you start to write Power Fx within your components, you’ll see a much shorter and more defined list of available attributes when retrieving data from your collection!
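
As a quick illustration of the payoff, here’s a minimal sketch of consuming the trimmed collection; the control names are hypothetical:

// Items property of a gallery or dropdown, now limited to the two columns we kept:
ListOfAccounts

// Retrieve a single account's name from the collection, based on a hypothetical gallery selection:
LookUp(ListOfAccounts, accountid = Gallery1.Selected.accountid, name)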

Delegation in Canvas Apps

A couple of weeks ago I found an empty slot in my diary, and I (dangerously) thought “I know, I’ll brush up on my Canvas app skills!”.

In my role I find myself looking across multiple Dynamics 365 apps, Excel spreadsheets, and Power BI reports daily, and I set myself the task of bringing all of this together into one place so that I could access all of the data I need with one or two clicks instead of manually transforming data and keeping several browser tabs permanently open.

This was going great, until I saw the dreaded ‘delegation’ warning that all Canvas app novices see very early in their career.

“Delegation warning. The Filter part of this formula might not work on large data sets.”

Expanding the warning gives a little more detail, but to really understand what’s happening, we first need to define delegation itself.

What is Delegation?

Simply put, delegation is an instruction from the target application to the data source to carry out a query and return only the subset of results wanted by the target application itself.

This means that the target application only ever receives the desired data, and performance increases as a result.

When you compare this to retrieving every piece of data and then filtering it in the target application, you see a measurable performance increase by using delegation; without it, you waste processing and bandwidth by pulling data into the target application that you want to throw away immediately.

Cause

The wording of this warning can be considered a little misleading. The warning is actually telling us that there will be a lack of delegation to the data source. In this instance, the data source does not have the capability to carry out the conditional logic, and therefore it needs to request that the Canvas app carries out the query instead.

For example, Power Fx provides the ability to retrieve a day, month, or year value from a Date field, but Dataverse cannot do this! Dataverse can only query date ranges, such as ‘on or before [Date]’. When querying a ‘month’ in this scenario, you would receive the delegation warning because the query cannot be delegated; a sketch of such a formula is shown below.
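
To make this concrete, here’s a minimal sketch of a formula that triggers the warning, assuming a Dataverse Accounts table with a ‘Created On’ column:

// Month() cannot be delegated to Dataverse, so this filter runs locally in the app:
Filter(Accounts, Month('Created On') = 1)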

As a result, the full data set from the data source has to be retrieved by the target application, only for the target application to filter the data once it has all been received. This lowers the performance of the app, but it could be worse than that: if you exceed the ‘large data set’ threshold (the data row limit, 500 records by default and configurable up to 2,000), the remaining records are silently dropped, leaving you with incomplete results, no error, and a low-quality solution.

Solution

The biggest lesson I learned whilst working on delegation recently came from a colleague: there is always a workaround.

Whilst you can’t “fix” the warning with the same piece of code, you can use combinations of delegated conditional logic in order to achieve the same results.

A classic example steps back into using dates in Canvas Apps. In Power Fx I can express “Month = 1”, but Dataverse only allows date ranges, so the Canvas App needs to bring back the full data set to work out whether “Month = 1”. As a result, I can’t quite express “in January this year” using delegated logic, so instead I need to combine two ranges using something that Dataverse can recognise. In this example, I would combine “Created On must be on or after 1st January 2021” and “Created On must be on or before 31st January 2021” to obtain the right data at source.
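
Here’s a minimal sketch of that delegable January filter, again assuming a Dataverse Accounts table with a ‘Created On’ column:

// Both range comparisons can be delegated, so Dataverse returns only January's records.
// Note: 'Created On' is a date/time column, so a strict 'less than 1st February' comparison
// captures the whole of 31st January regardless of the time component.
Filter(
    Accounts,
    'Created On' >= Date(2021, 1, 1) && 'Created On' < Date(2021, 2, 1)
)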

Some examples can be more complicated than this, but a top tip for Dataverse specifically is that if you can achieve it using Advanced Find, then you can be certain that the logic can be delegated!

Have you worked in this space before and found any cool workarounds? Leave a comment below!

Quickly Enable Migrated Power Apps Portal & Power Pages Configuration

Microsoft’s documentation goes to great lengths to explain how we can migrate Power Apps Portal data from one environment to another using the Configuration Migration Tool, but it doesn’t quite go as far as explaining how to re-point the already-provisioned portal to your newly migrated data upon first deployment.

Follow the below steps once you’ve moved your data in order to see your changes come to life!

1a. Locate via Dataverse

Navigate to Apps and find your Portal app from the list. Click on the three dots, and choose ‘Settings’.

A screenshot of make.powerapps.com highlighting Apps and Administration.

Select the ‘Administration’ option, which will open a new tab.

1b. Locate via Power Platform Admin Centre

Navigate to the Resources tab which will expand to show a Portals option, and find your Portal app from the list.

A screenshot of the Power Platform admin centre, highlighting the Portal and Manage options.

Click on the three dots, and choose ‘Manage’.

2. Update Portal Binding

A screenshot of the Power Apps portals admin centre, showing the Update Portal Binding option.

Stay on the ‘Portal Details’ tab, scroll down to ‘Update Portal Binding’, and choose the newly migrated Website Record from the list.

Translating Unknowns into Tangible Requirements

For me, the most exciting part of a project is the challenge of figuring out exactly what a client is asking for based on a very short brief provided in an introductory call.

This challenge is increased in my industry when you move from Dynamics 365 based projects to pure Power Platform projects, because you move away from a functionally built system to a set of tools that enable the capability. Not only do we now have to qualify the tool, but we also need to qualify the business process at an earlier stage than we typically used to, as well as the full data model.

For example, a “helpdesk replacement tool” screams Dynamics 365 Customer Service, and consultants in the industry typically understand the core operational processes before they speak to a customer. On the Power Platform, however, no two ‘self-serve chatbot’ projects would ever be the same, and there’s no functional process that you can align to this.

So how do we quantify projects with so many unknowns when we need to fully design the data model, the user interface, and the functional process? One way to start is to look for three themes:

  • Trends
  • Assumptions
  • Caveats

Trends

The first consideration I make is whether there are any repeatable components for any given high-level requirement.

Whilst this doesn’t necessarily give us the full requirement ready to build, it does give us an idea of the size of the scope in contrast to a solution that is easier to estimate. Let’s take the idea of implementing a chatbot for a client on their website.

As a website user, I want to be able to engage with a chatbot, so that I can easily find out store opening times and current stock levels.

Within the industry I work in, we know that a configurable Power Virtual Agents for Teams solution that only uses Entities is relatively straightforward and doesn’t require code. The interface used to build the solution is entirely controlled by Microsoft, so we also have confidence that it works! Let’s now put our original requirement into context by using known unknowns:

  • We know that the client cannot deploy this through Teams, but we don’t necessarily know exactly how to deploy it through a website that we don’t control just yet.
  • We are not being asked to build their website and we don’t know what their data source is, but we do know that we can take advantage of data and automation services that we can control to make this easier, perhaps Microsoft Dataverse with some sort of movement of data via Power Automate?

We have now broken down the requirement into tangible considerations, and we can justify risk and complexity based on what we do know and what we can control, so we should factor this into our estimate right from the beginning.

As a website user, I want to be able to engage with a chatbot, so that I can easily find out store opening times and current stock levels.

Trends:

1. Power Virtual Agents for intelligent chatbot functionality.

2. Power Automate to drive dynamic data interactions between end user and data source.

3. Dataverse to assist with controlling data where necessary.

Assumptions

Next up, assumptions. We are often taught that making assumptions is a bad thing, and in most cases that is correct, but assumptions can be extremely powerful when defining a requirement if used correctly.

Taking our earlier example of a chatbot being deployed via a client’s website, we really don’t want to be developing the website in unfamiliar territory, nor do we want to run into any bumps if their data source isn’t fit for purpose. For now, we can set assumptions against our requirement to portray what we would typically expect within the client’s landscape, and if any of these are found not to be true, then we can justify a change in direction for a requirement through a change of scope, estimate, and change request!

As a website user, I want to be able to engage with a chatbot, so that I can easily find out store opening times and current stock levels.


Assumptions:
1. Assumes that the client’s existing data model is fit for purpose, and if any changes should be made, the client will take responsibility for these.

2. Assumes that the solution can be deployed using an embedded HTML code snippet, as per Microsoft’s standard approach.

Caveats

And last but not least, we have caveats. Clients may see these as the supplier creating ‘get out of jail free’ cards, but in reality, they ensure that everyone involved understands what should happen in the event that one of these factors occurs. Caveats are usually based on assumptions, but can extend further to cover typical project factors too.

As a website user, I want to be able to engage with a chatbot, so that I can easily find out store opening times and current stock levels.


Caveats:
1. If the data source should change after delivery, the client will be responsible for a change request for any errors that may occur with this solution if they wish to continue using the functionality.

2. If the client’s website cannot support HTML snippets for any given reason, the project may need to be delivered via a Power Apps Portal, which would incur extra cost to ensure the delivery is built to the correct standard.

Summary

When I describe this way of working to my team, I reference a phrase that may be familiar to some: it’s about the journey, not the destination. Imagine you have a 100-mile journey to make with no map, digital or print. What would be your first move?

Success isn’t just the destination, or the solution in this case, it’s the route to it and the service provided along the way that counts. This continues to be a significant theme throughout the whole lifecycle of the project, and it can make or break the final engagement with the software.

Expect Dataverse Deployments To Fail First Time

Whilst the process of deployment hasn’t changed too much since the days of Dynamics CRM, one thing that has changed significantly is the volume of possible components that can be included in a solution file.

Not only is this due to an increase in readily available functionality from Microsoft, but also to the ability of end users to install their own components, which in turn creates more dependencies on (what we think is) our small solution of configuration changes to be deployed from one environment to another. This increases the number of failures that can occur during delivery, and often the error shown to the end user isn’t very helpful.

A generic error provided by the Power Platform when trying to deploy a solution.

Solution deployment failures don’t have to be a problem; in fact, we should expect them.

In this blog post I will help you understand how to troubleshoot a failed deployment so that you can solve the issue in an informed way.

Step 1: Download A Code Editor

We want to ensure that the output from the failure is in a readable format, and for this we need a code editor that recognises XML formatted files. My preference as a functional consultant who needs to open the occasional file is Notepad++. It’s free, and it has an XML Tools plugin which allows you to ‘pretty format’ any XML files. You can also use Visual Studio or Visual Studio Code – I suspect some of you reading this will already have one of these installed!

Step 2: Download The Solution’s Log File

Whenever someone approaches me with a failed deployment, the first thing I ask them for is the log file. When you open this file in Notepad++, press Ctrl+Alt+Shift+B to ‘pretty format’ the XML. It’ll look something like this:

A screenshot of Notepad++ with XML Tools plugin installed. The file shown here is using 'pretty format' to make the code readable.

It looks difficult to decipher to the untrained eye, but we can quickly start to understand why the solution is failing with a few tips when we break down the file.

Step 3: Understand The Dependency

Let’s take a look at the first dependency, defined by the <MissingDependency> XML tags.

A snippet of code showing a missing dependency.

You’ll notice a <Required> line and a <Dependent> line which both include a Type. This, alongside the schema name, is the most important part of the dependency, as the two combined tell us what we’re looking for.
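
As a rough sketch, a missing dependency takes the following shape (the values here are illustrative and trimmed; your log file will contain additional attributes such as keys and solution names):

<MissingDependency>
  <Required type="1" schemaName="new_offering" displayName="Offering" />
  <Dependent type="60" displayName="Service" />
</MissingDependency>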

Fortunately we don’t need to remember all of the types as Microsoft provide a handy reference guide here.

We simply need to cross-reference the numbers in our dependency, and we now know that to complete the deployment we need to include the “Offering” entity (table) for the “Service” System Form.

Step 4: Modify Your Solution

We have two choices here:

  1. Remove the Service System Form from the solution, or,
  2. Add the Offering entity (table) into the solution.

In this particular instance it would make more sense to add the Offering into the solution, but sometimes you may challenge whether the component is really needed within your deployable solution, in which case, you’d remove the System Form.

Step 5: Rinse & Repeat

Not all dependencies will be resolved within one solution modification, but that’s ok, and you may need to repeat steps 3 & 4 multiple times before you have a solution file that can be successfully deployed. The key is to remember that failures can be expected, and that they don’t always have to be a problem!

The Power Of A Great User Story

Anyone who knows me personally will know that I am absolutely obsessed with getting User Stories right!

On the face of it, it may seem impossible to deliver part of a business process from one single sentence, but user stories are one of the best tools you can use, right from your initial engagement through to post-go-live support, as long as each part adds value.

What is a User Story?

User stories help us to define a requirement for a business user or process that needs to be included within a project or product to ensure success. Generally speaking, they follow this format:

As a [persona], I want to [achieve], so that I can [action].

That is all: one seemingly simple statement. Some may feel that writing user stories is a waste of time in an agile project, particularly as an agile project is supposed to deliver technical outputs quickly; however, user stories can actually speed up the turnaround time of the solution, provided that we pay particular and collaborative attention to their construction.

The Construction

Let’s take a look at what we want to achieve from each part of the User Story and how we can add value. As I am a Power Platform and Dynamics 365 Customer Engagement consulting manager, I’ll be using an example from Dynamics 365 Customer Service.

As a [persona],…

It would be easy to write “As a user…” here and be done, but this tells us nothing except that the process isn’t automated.

Particularly in the Power Platform and Dynamics 365 space, the functionality and security model can span multiple applications, so perhaps we can describe the user and the way that they’re accessing the product or feature.

As a Customer Service Representative accessing the standard Customer Service Hub application,

From the above, we can understand that:

  • The user definitely needs a Dynamics 365 Customer Service license if existing licenses do not allow access.
  • The user is likely to use the standard Security Role provided due to the persona’s role.
  • The user is not expecting a tailored sitemap experience, as they will access the application through existing means.
  • The experience is triggered by end user behaviour rather than automated processes, until we discover more about the rest of the story.

I want to [achieve],…

We now want to ensure that we describe the Customer Service Representative’s objective in this part of the user story, so that we can start to understand our scope and design our solution.

As a Customer Service Representative accessing the standard Customer Service Hub application, I want to see all of my priority ‘1 – Blocker’ Cases in a separate list sorted by oldest to newest creation date,…

To add to our previous understanding, we now know:

  • The user has a focus on Cases that need the highest amount of attention, and these should be categorised by priority. Right now we don’t know the full list of priorities, but we can add that as a known unknown in our design.
  • The user needs a new view for just the Cases categorised by this priority, and the out-of-the-box priorities are High, Normal, and Low. It seems like we need to carry out Column and List configuration in Dataverse here.
  • The List that we configure needs to include the Created On date, and needs a sorting on this Column too.

So that I can [action].

The primary purpose of this part of the user story is to justify the action through behaviour. Some consider this part optional; however, I like to ensure it’s included in every user story, as it can significantly change the estimate required to deliver.

This part of the user story doesn’t just have design benefits, but it can also help us prioritise the user story against other user stories in the backlog when we are working in iterations or sprints.

As a Customer Service Representative accessing the standard Customer Service Hub application, I want to see all of my priority ‘1 – Blocker’ Cases in a separate list sorted by oldest to newest creation date, so that I can ensure that we do our best to meet our 1 day ‘solution or workaround’ Service Level Agreement (SLA) for blocked customers.

We now know why this design is so important, and we can deduce the following:

  • The organisation makes promises within their agreements with customers to ensure that business processes aren’t blocked for more than one day, and this needs to be a core emphasis within the design.
  • We can ask whether we can further improve the design by introducing system-triggered Service Level Agreement functionality.
  • Most importantly, another business initialism has been cleared up, clarifying why the customer keeps writing SLA all over their documents!

Isn’t This Too Much Detail?

Not at all. It’s unlikely that we will ever get to this level of detail within one round of workshops; however, through refinement during iterations or sprints, this can tell us almost exactly how we need to build a feature. It will also help us estimate our delivery more accurately and provide a higher chance of passing tests after deployment.

One phrase I frequently hear is ‘we don’t need to worry, it’s just out-of-the-box functionality’, but from one sentence regarding standard functionality, we have been able to arrive at ten conclusions with definitive design implications that carry an associated effort.

I am sure others reading this will find additional conclusions too, and this is the great thing about user stories – multiple perspectives can help to narrow down exactly what the client is asking for, and ultimately lead to a higher quality delivery.