Level of Difficulty: Beginner – Senior.

Power Automate is a powerful tool for building integrations and RPA solutions through Power Automate Desktop and cloud flows. A disadvantage of Power Automate is that it can struggle under large volumes of data: lengthy flows take a long time to save, load and execute, and large amounts of data can increase run times dramatically if flows aren't developed optimally. There are a few things you can try to optimise a flow and reduce its runtime while maintaining the functionality of the solution.
Power Automate Desktop Flows
1. Remove Commented (or Disabled) Code
A large part of the RPA development process is debugging a solution while building its functionality. Developers often comment out or disable actions instead of removing them. These may act as test harnesses (actions used purely for testing) or safety nets (kept by developers who fear deleting code that previously worked). Either way, it becomes very easy to pollute a solution with disabled actions.
When a Power Automate desktop flow runs, disabled actions are still read, even though they aren't executed. Removing a few disabled actions might not make a massive dent in process duration, but it does make a difference.
2. Build Business Exception Checks
When building UI automation flows, 'Send hotkey', 'Click' and 'Get details' actions are commonly used with 'On Error' behaviour set up. Hotkeys are an efficient way of executing actions and of working around frequently changing user interfaces, but 'Send hotkey' actions can get lost between screen changes, which can cause a range of failures downstream.
A good practice for UI automations is to execute an action and then check that the screen is in the correct state before submitting the next one. Where the screen did not change as expected, or the executed action did not render the expected result, these checks can raise Business or Application Exceptions and end the sub-flow immediately, before proceeding to the environment clean-up phase (closing any applications that may have been opened by the flow). These checks should be built into sub-flows so they can be reused, which also makes debugging easier.
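Power Automate Desktop expresses this with actions such as 'Get details of window' and 'If' rather than code, but the logic of such a check sub-flow can be sketched as follows (a minimal illustration; the function and exception names are hypothetical, not PAD APIs):

```python
class BusinessException(Exception):
    """The application responded, but not with the expected result."""

class ApplicationException(Exception):
    """The application itself is in an unusable state."""

def check_screen_state(get_active_window_title, expected_title):
    """Verify the UI reached the expected screen before the next action."""
    actual = get_active_window_title()
    if actual is None:
        # No window found at all: treat it as an application problem.
        raise ApplicationException("No active window detected")
    if actual != expected_title:
        # The application works but the screen is wrong: raise a business
        # exception so the sub-flow ends and environment clean-up can run.
        raise BusinessException(
            f"Expected screen '{expected_title}' but found '{actual}'"
        )
```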
3. Introduce Sub-flows to Group Reusable Logic
It is good practice to clean up the environment before and after every process run. Clean-up refers to terminating any applications that are, or could be, open and used within the process; the aim is to reduce the risk of stray open applications jeopardising the execution of the process. This is just one example of logic that can be grouped into a single clean-up sub-flow and invoked (or 'run') at every point where it applies. Deduplicating code like this declutters a solution and is a large part of the value of modularisation in solution development.
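As a rough analogue in code, here is a minimal sketch of a reusable clean-up routine, assuming the third-party psutil package and placeholder process names (your process will open different applications):

```python
import psutil

def clean_environment(process_names=("EXCEL.EXE", "msedge.exe")):
    """Terminate any leftover applications before/after a process run."""
    targets = {name.lower() for name in process_names}
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() in targets:
            try:
                proc.kill()  # force-close so a stale window cannot block the run
            except psutil.NoSuchProcess:
                pass  # the process exited on its own; nothing to clean up
```

Invoking this single routine at the start of the run and in every exception path mirrors running one clean-up sub-flow from all applicable points in the flow.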
4. Ensure That 'On Error' Actions Are Set Up Correctly
As previously mentioned, 'Click' and 'Get details' actions are common in UI automations. These actions can be configured with 'On Error' behaviour, which allows for retries, delays, setting variables, executing specific actions and running sub-flows. Ensure that the waiting time between retries is sensible: decreasing these wait times and adding checks instead can reduce the execution time of a process. By grouping exception handling logic in a sub-flow (see above), the 'On Error' option can be configured to run the exception handling sub-flow rather than waiting for a failure and duplicating code.
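The pattern that 'On Error' retries implement can be sketched like this (a conceptual illustration, not PAD code; the function names and parameters are hypothetical):

```python
import time

def run_with_retries(action, verify, retries=3, delay_seconds=1.0):
    """Run an action, confirm it worked, and retry briefly on failure."""
    for attempt in range(1, retries + 1):
        action()
        if verify():
            return True                # verified success: stop waiting early
        if attempt < retries:
            time.sleep(delay_seconds)  # short, sensible wait between retries
    return False                       # retries exhausted: hand off to exception handling
```

Because each attempt is verified, the delay between retries can stay short instead of being padded 'just in case'.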
5. Reduce the Number of Waits Being Used
When chaining hotkeys and clicks, waits (and 'Wait until' actions) are commonly used to ensure that an application is ready for input. Unbounded waits can turn into endless loops and prolong process execution far more than necessary. Building in checks and limits reduces execution time and gives a much clearer indication of where the process encounters errors.
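Conceptually, a safe 'Wait until' is a poll with a hard timeout, as in this hedged sketch (the condition and timings are illustrative):

```python
import time

def wait_until(condition, timeout_seconds=30, poll_seconds=0.5):
    """Return True once condition() holds, or False when the timeout expires."""
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poll_seconds)
    # Timed out: the caller can now report exactly where the process stalled
    # instead of spinning forever.
    return False
```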
6. Use Clicks Rather Than Send Hotkeys
As far as possible, prefer clicks over hotkeys when building UI automations. Although hotkeys execute faster, clicks provide more stable execution and also allow 'On Error' behaviour to be configured.
7. Write Input Data to a Text File
Where large amounts of data are passed to the desktop flow from the cloud flow, consider writing the input variable contents to a text file. The aim is to enable unit testing of Power Automate solutions across desktop and cloud flows: the data from the cloud flow is written to a text file before the process executes, and if the input variables are blank, the contents are read back from the text file so the process can run with a previous run's data. This might not have a direct impact on execution time, but it does reduce development and testing time when executing the flows as individual units.
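The fallback logic can be sketched as follows (a minimal illustration, assuming a JSON payload and a hypothetical snapshot file name):

```python
import json
from pathlib import Path

SNAPSHOT = Path("input_snapshot.json")

def resolve_input(cloud_input: str) -> dict:
    """Use the cloud flow's input when present; otherwise replay the last run."""
    if cloud_input and cloud_input.strip():
        # Normal run: save the input so the desktop flow can be tested alone later.
        SNAPSHOT.write_text(cloud_input, encoding="utf-8")
        return json.loads(cloud_input)
    # Blank input: we are unit-testing the desktop flow, so replay the saved data.
    return json.loads(SNAPSHOT.read_text(encoding="utf-8"))
```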
It is important to note that cloud flows execute desktop flows through the gateway at a different pace than desktop flows run directly on a machine.
Power Automate Cloud Flows
1. Remove Loops That Process Items Individually
Some connectors, like the Excel connector, only allow items to be processed on a row-by-row basis, so the more items there are, the longer the process takes to execute. Bulk processing these items mitigates the risk of rapidly growing process duration to some degree. This can be done by introducing other components, like Azure Functions (which come with their own challenges). A good example of this can be found in this post.
Although process duration may still increase as data volumes grow, the growth will be far gentler than with individual connector calls. Many of the connectors available in Power Automate make use of API calls, some of which have 'x-calls-per-minute' or 'x-calls-per-hour' throttling limits that can also increase process duration. Depending on your licensing, this can have an impact too: per-user plans have a daily API request limit, so by reducing the number of calls you make, you're potentially reducing the cost of your flow.
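To make the idea concrete, here is a minimal sketch of an Azure Function that accepts the whole dataset in one HTTP call, assuming the Azure Functions Python v2 programming model; the route name and payload shape are illustrative:

```python
import json
import azure.functions as func

app = func.FunctionApp()

@app.route(route="process-rows", auth_level=func.AuthLevel.FUNCTION)
def process_rows(req: func.HttpRequest) -> func.HttpResponse:
    rows = req.get_json()  # one HTTP call carries every row at once
    # Example transformation: compute a total per row here, instead of
    # making one connector call per row inside the flow.
    processed = [
        {**row, "Total": row.get("Quantity", 0) * row.get("Price", 0)}
        for row in rows
    ]
    return func.HttpResponse(json.dumps(processed), mimetype="application/json")
```

The cloud flow then makes a single HTTP action call with the full array rather than one connector call per item.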
2. Rework Loop Levels
Power Automate cloud flows allow a maximum nesting depth of 9 levels. Scopes are really useful for grouping actions logically, although they also count as a nesting level, which has an effect on execution times. Where logical validations nest deeper than 6 levels, it is advisable to set variables and process logic based on conditional variables rather than adding loops within loops, especially 'Apply to each' loops inside other 'Apply to each' loops. Big O notation explains why this has such a significant impact on process duration in some cases: two nested loops over n and m items execute n × m iterations. Rework such complex logic as far as possible.
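The arithmetic is easy to see in a sketch (illustrative data; in a cloud flow the reworked version corresponds to data operations such as 'Select' and 'Filter array' instead of nested 'Apply to each' loops):

```python
# Matching 1,000 orders to 100 customers.
orders = [{"customer_id": i % 100, "amount": i} for i in range(1_000)]
customers = [{"id": i, "name": f"Customer {i}"} for i in range(100)]

# Nested-loop style: 1,000 x 100 = 100,000 comparisons.
matched_nested = [
    (o, c) for o in orders for c in customers if c["id"] == o["customer_id"]
]

# Reworked: build a lookup once, then match in a single pass (~1,100 steps).
by_id = {c["id"]: c for c in customers}
matched_lookup = [(o, by_id[o["customer_id"]]) for o in orders]

assert len(matched_nested) == len(matched_lookup)
```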
3. Introduce Sub-flows to Group Reusable Logic
Flows built as automation solutions should have some form of exception handling and logging. These components can be built as a separate flow that groups the logic of handling and logging exceptions, and reused each time an exception or error occurs. Logging exceptions also makes reporting on successes and failures far more feasible. This is another example of logic that can be grouped and reused.
Do you have any other tips for optimising Power Automate solutions? Drop a comment below.
Also, if some of your scheduled flows take a lot of time creating, updating, or deleting items in SharePoint, then batch actions may help:
Batch Create: https://www.youtube.com/watch?v=2dV7fI4GUYU
Batch Update: https://www.youtube.com/watch?v=l0NuYtXdcrQ
Batch Delete: https://www.youtube.com/watch?v=2ImkuGpEeoo
These cut the total action API calls & flow run-times in my organization by a third.
Thank you very much for adding this! This definitely is a very useful tip to remember when optimising flows that integrate with SharePoint.