Scaling Azure Functions with Durable Functions

Serverless compute is the latest advancement in Platform as a Service (PaaS) capabilities in the cloud. Azure Functions is the Serverless compute service available within the Microsoft Azure cloud platform. At its most basic, it lets you deploy a single method of code and run it in the cloud without the need to manage a VM, install updates and patches, or even maintain the application that hosts the code. Within all this abstraction are bindings that hook up function triggers, data inputs, and data outputs to various services declaratively. All of this is powerful on its own, but there is still room to improve the scalability of Azure Functions and Serverless compute when building full software solutions that contain more than just simple, isolated code methods. This article walks through the capabilities offered to Azure Functions by the feature called Durable Functions: the integration of the Durable Task Framework with Azure Functions.

Why Azure Functions?

In a previous article, I outlined when and why you would choose either Azure Functions or Web Jobs to implement background processing tasks. There are a few differences between them, but the recommendation today is to reach for Azure Functions first; only if Azure Functions don't fit the requirements would Web Jobs be the more appropriate implementation.

Don’t forget that Azure Functions are built on top of Azure Web Jobs; basically, an Azure Function is a Web Job at its core. However, Azure Functions offer a ton of features on top of Web Jobs that just don’t inherently exist within Azure App Service directly. Azure Functions are a part of the latest generation of PaaS services known as Serverless.

What exactly is Serverless? To sum it up, Serverless is the ability to host your code for execution where you write, manage, and deploy only the minimum code necessary to provide the custom functionality you need. This means no more Virtual Machines (VMs) to manage and no more software updates or patches to apply. The most important part of Serverless is that there’s no longer an application to manage: you simply manage your custom code as individual code methods, or Functions.

Part of this further abstraction within Azure Functions is the capability to connect your Functions to be Triggered, as well as to send and receive data (via Inputs and Outputs), by configuring these bindings declaratively. The Azure Functions runtime and platform automatically take care of the plumbing code that makes communication work between the supported services and the Azure Functions themselves. Azure Functions enable you to develop and deploy faster, while spending your time on custom logic instead of boilerplate code.
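As a small illustration (the queue names, binding names, and blob path here are hypothetical), a Function's `function.json` file might declaratively wire up a queue trigger and a blob output like this, with no plumbing code in the Function itself:

```json
{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "orderItem",
      "queueName": "incoming-orders",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "blob",
      "direction": "out",
      "name": "receipt",
      "path": "receipts/{rand-guid}.json",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

The Function's code simply receives `orderItem` as a parameter and writes to `receipt`; the runtime handles the queue polling and blob storage calls.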

Another huge benefit of cloud computing that has come along with the PaaS advancement of Serverless is Consumption-based pricing. It’s no longer necessary to provision, reserve, and consequently pay for an entire Virtual Machine (CPU cores, memory, etc.) to run an application 24/7. Instead, the deployed Functions code is billed on a Consumption model where you only pay for what you use. If the code is deployed but never executed, you don’t pay anything. If the code is executed 1 or 1,000 times an hour, you pay for the specific resources you consume and no more.

Function Chaining and Long Running Tasks

Integrating long-running background processing using Azure Web Jobs is pretty straightforward: you write code that does something, deploy it to Web Jobs, and let it run. Web Jobs don’t care how long a process takes to run. With Azure Functions, however, this isn’t the case. Azure Functions have a hard limit on the execution time of an individual Function.

In Azure Functions, a single Function execution has a maximum of 5 minutes by default to execute. If the Function runs longer than the maximum timeout, the Azure Functions runtime can end the process at any point after that timeout has been reached. This inherently limits the amount of processing that can be implemented within a single Azure Function: because of this limit, each Azure Function needs to be relatively short. This may seem arbitrary, but there is a reason.

It’s worth noting that while the maximum runtime timeout of Azure Functions is 5 minutes by default, the “functionTimeout” property within the host.json file can be used to extend it to up to 10 minutes. The same limitation still applies if the Function is still running after that maximum timeout period has elapsed.
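For example, extending the timeout to the 10-minute maximum in host.json is a one-line setting (a minimal sketch showing only the relevant property):

```json
{
  "functionTimeout": "00:10:00"
}
```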

Azure Functions are built to implement background processes that are short in duration. Following Clean Code principles, a single method of code should be short and do only one thing really well. Carrying this over to Azure Functions, it makes sense that each Function needs to be short, in the sense of not executing longer than 5 minutes. But how does this help in implementing long-running background processes using Azure Functions?

In Serverless computing, and Azure Functions specifically, it’s best practice to break up long-running tasks and processes into smaller, discrete units that can be chained together. This is referred to as Function Chaining. With Function Chaining, each task in a background process workflow triggers the next task in the workflow upon its complete and successful execution.

Originally, implementing Function Chaining with Azure Functions required a message queue of some kind for communication between the workflow tasks in the chain. This could be Storage Queues, Service Bus Queues or Topics, or really any message queue with the scalability to meet the demands of the workflow process.
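As a rough sketch of this "standard" queue-based chaining (the Function names and queue names here are made up for illustration), each step finishes by writing a message to a queue that triggers the next step:

```csharp
// Step 1: triggered by a message on "step1-items". On successful
// completion, the return value is written to "step2-items",
// which in turn triggers Step 2.
[FunctionName("ResizeImage")]
[return: Queue("step2-items")]
public static string ResizeImage(
    [QueueTrigger("step1-items")] string imagePath)
{
    // ... first unit of work in the workflow ...
    return imagePath; // becomes the message for the next Function
}

// Step 2: triggered by the message Step 1 wrote.
[FunctionName("UploadThumbnail")]
public static void UploadThumbnail(
    [QueueTrigger("step2-items")] string imagePath)
{
    // ... next unit of work in the workflow ...
}
```

Each queue has to be provisioned and managed explicitly, which is exactly the plumbing Durable Functions abstracts away, as described below.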


Rather than implementing a single background process that is “monolithic” in nature, Function Chaining allows that same background process to be broken up into multiple separate tasks. These tasks, when chained together, would then form the complete workflow. This is in general more work to implement, but the final solution is much more robust. If a single task in the workflow fails, then the system can pick up where it left off to finish. There’s no need to fail in the middle or end of the workflow and then require the entire workflow to be executed again from the start later.

Function Chaining also benefits scalability, since each task in the workflow can be scaled independently. Essentially, with Function Chaining, a background process workflow is turned into a mini-Microservices architecture of sorts: each task in the process can be scaled, versioned, and updated independently of the rest. Additionally, Function Chaining eliminates the maximum 5-minute execution time of a Function as a limiting factor in the implementation of background processes using Azure Functions.

What are Durable Functions?

Durable Functions are a feature of Azure Functions that enables the Azure Functions runtime to abstract out the need to implement message queues when implementing Function Chaining. There are still valid benefits to using “standard” Function Chaining under certain circumstances, but Durable Functions can help ease the pain of implementation when building long running background processes using Azure Functions.

Note: At the time of writing this, Durable Functions are in early preview, but hopefully they’ll be Generally Available soon!

With Durable Functions, async/await and asynchronous programming are used to write long-running processes across multiple Azure Functions as if the process were built as a traditional executable, with all the Functions (aka code methods) located within the same assembly. This allows Function Chaining to be implemented in a manner that is more transparent to the programmer and easier to maintain, and that no longer requires provisioning separate message queues for communication between the Functions, or tasks, in the workflow.


The simplicity of Durable Functions can easily be seen by comparing the above diagram with the previous one showing all the message queues used for communication. Azure Functions and Serverless computing offer great improvements to building and deploying code to run in the cloud. Durable Functions is one of the latest examples of how the new Serverless platforms and architecture can be used to further assist in solution development through higher levels of abstraction.

When writing Azure Functions in C#, the “async” and “await” keywords are used to implement the asynchronous programming that links Functions together in the chain. When execution reaches an “await” keyword, the Function is essentially paused while the other Function is triggered or called. When this happens, the state of the paused Function is saved and persisted, so that when execution returns it can pick up right where it left off, with all the necessary state restored.

#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

public static async Task<long> Run(DurableOrchestrationContext backupContext)
{
    string rootDirectory = backupContext.GetInput<string>();
    if (string.IsNullOrEmpty(rootDirectory))
    {
        rootDirectory = Environment.CurrentDirectory;
    }

    // Call out to a Function that returns the list of files to back up.
    string[] files = await backupContext.CallFunctionAsync<string[]>(
        "E2_GetFileList", rootDirectory);

    // Fan out: start one copy operation per file, all in parallel.
    var tasks = new Task<long>[files.Length];
    for (int i = 0; i < files.Length; i++)
    {
        tasks[i] = backupContext.CallFunctionAsync<long>(
            "E2_CopyFileToBlob", files[i]);
    }

    // Fan in: wait for every copy to finish, then total the bytes copied.
    await Task.WhenAll(tasks);
    long totalBytes = tasks.Sum(t => t.Result);
    return totalBytes;
}

In the above code example, the “for” loop calls out to a Function named “E2_CopyFileToBlob”, which returns a Task. Each Task manages the background call to the Function and completes when the call does. This particular code sets up multiple simultaneous calls to the “E2_CopyFileToBlob” Function and captures the Tasks for those calls within the “tasks” variable. Then, after the “for” loop, the “await” keyword is used to pause the Function until all the asynchronous calls have completed before continuing on to sum their results together.

If you’re familiar with the async/await syntax for implementing asynchronous programming in C#, then this code should make sense. If you’re not, the key immediate takeaway is that writing Functions this way, using Function Chaining and async/await, enables more complex workflows and processes to be built out more easily.
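The example above fans out in parallel, but the same async/await pattern also expresses strictly sequential chaining, where the output of one Function becomes the input of the next. A minimal sketch (the activity Function names here are hypothetical):

```csharp
public static async Task<string> Run(DurableOrchestrationContext context)
{
    string order = context.GetInput<string>();

    // Each awaited call fully completes before the next begins,
    // and its result is passed along as input to the next Function.
    string validated = await context.CallFunctionAsync<string>("ValidateOrder", order);
    string charged   = await context.CallFunctionAsync<string>("ChargePayment", validated);
    string shipped   = await context.CallFunctionAsync<string>("ShipOrder", charged);

    return shipped;
}
```

At each `await`, the orchestration's state is persisted, so a failure mid-chain can resume from the last completed step rather than restarting the whole workflow.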

Function Chaining enables Functions to be integrated together and “call out” to each other in a more natural way, comparable to source code written within a more monolithic executable. The advancement of Function Chaining certainly offers another leap forward in the Serverless computing platform within Microsoft Azure using Azure Functions. It enables more complex code to be written, and more complex requirements to be met, using a highly scalable and cost-effective architecture.

Posted by Chris Pietschmann

Chris is a Microsoft MVP and has nearly 20 years of experience building enterprise systems both in the cloud and on-premises. He is also a Microsoft Certified Azure Solutions Architect and developer, a Microsoft Certified Trainer (MCT), and Cloud Advocate. He has a passion for technology and sharing what he learns with others to help enable them to learn faster and be more productive.


  1. First of all, I want to congratulate you for writing this kind of post, because I think Durable Functions will generate more value for developers that use Azure Functions for background processing and integration. But I would like to add one consideration on this post: the Azure Functions SDK was updated, and now Azure Functions can wait for 10 minutes, no longer 5 minutes as said in this post.


    1. Chris Pietschmann August 15, 2017 at 3:30 pm

      It happens, that the default is still 5 minutes, but you can optionally extend it up to 10 minutes. Good to know, thanks!




  3. 1- The example you mentioned seems like doing parallel function calls in the background instead of a workflow scenario or function chaining, where the output of one becomes the input of another one. Correct me if I am not getting the point.

    2- Thank you for writing about this feature. It’s a good addition to the development toolbox.


    1. Chris Pietschmann August 17, 2017 at 5:40 pm

      I encourage you to read up on async/await and how it works. That will help clarify this for you. 🙂


  4. This is really interesting. However, as you pointed out in the article, this doesn’t completely replace the function chaining method. If you have a critical, restartable process, then function chaining would be easier to facilitate durability within the task. As it looks, durable functions are only durable in the sense of durably handing off state from one function to another. If the task should fail or error out, you need to restart the whole process. Function chaining (via queueing) gives you some capability to restart the processing at certain points.

    Great article. Thanks for writing it.


  5. Can we manage app settings through a file rather than managing them through the portal?


  6. Great Post. Thanks.
    What about if I need to process multiple records and each record’s processing requires some operations? For each operation I can have a function, but how do I handle the scenario where I need to process multiple records (kind of a loop)? In this case, should I consider Web Jobs?


    1. Chris Pietschmann October 4, 2017 at 8:38 am

      That’s something that Durable Functions is built for, but you do need to design your code differently.


  7. interesting approach. another option that can be considered is to use LogicApps


  8. Using your example above to articulate my question: if the length of the files array is large (say 2,000), what governs how many “instances” of CopyFileToBlob are instantiated? Can I control that somehow?


    1. Chris Pietschmann January 28, 2018 at 6:09 pm

      I’m not sure how many instances is the maximum number of instances of the Azure Function that will be spun up in parallel. You’ll have to test it out and see how it performs.


