Azure Functions 2.0 – real world use case for serverless architecture

At the end of September, Microsoft announced the general availability of Azure Functions 2.0. When we read Eduardo Laureano's blog post, we were very excited about the improvements and new features. Eduardo wrote: “Azure Functions 2.0 is production ready and capable of handling your most demanding workloads, backed by our 99.95 percent SLA.”

In ASC LAB, we decided it was a good time to test it more deeply than a “Hello World” example.

Why Serverless Matters

For many years, we software engineers have been taught to design our systems from loosely coupled components with high cohesion. Yet later, all of these components were deployed to one big machine, losing much of their power to scale and their independence, as all of them had to share the resources of the same machine.

With serverless, it’s time to be rewarded for a good design. Now, we can construct our system from small independent components that scale independently and each component can have its own performance and scalability requirements satisfied.

What’s more important – good design is also economically justified. With serverless, we only pay for CPU and RAM that our components actually use.

Another advantage of this approach is that we do not have to manage infrastructure – no need to provision VMs or install and update OSes. The serverless cloud takes care of all of this, and also provides monitoring and auto-scaling features.

A Quick Tour Of Azure Functions Features

Main features of Azure Functions:

    • Choice of language – write functions using C#, F#, Node.js, Java, PHP, batch, bash, or any executable.
    • Pay-per-use pricing model – pay only for the time spent running your code. See the Consumption hosting plan option in the pricing section.
    • Bring your own dependencies – Functions supports NuGet and NPM, so you can use your favorite libraries.
    • Integrated security – protect HTTP-triggered functions with OAuth providers such as Azure Active Directory, Facebook, Google, Twitter, and Microsoft Account.
    • Simplified integration – easily leverage Azure services and software-as-a-service (SaaS) offerings such as SendGrid and Twilio.
    • Flexible development – code your functions right in the portal or set up continuous integration and deploy your code through GitHub, local Git, Visual Studio Team Services, and other supported development tools.
    • Open-source – the function’s runtime is open-source and available on GitHub.
    • Possibility to deploy on-premises.

Business Use Case

We had an idea to implement a simple billing system in a serverless architecture for a customer who sells services in a subscription-based model. The customer sends a list of its employees who can use the offered services. Based on the contract prices for each type of subscription, the system calculates fees for each employee. Then it aggregates them to create an invoice. Finally, a PDF printout is generated and sent to the customer, together with an SMS/email notification.

The diagram below shows in detail how the flow between the functions looks.
The source code and a tutorial on running the solution locally are available on GitHub.

[Diagram: flow between the Azure Functions]

  1. A user uploads a CSV file with beneficiaries to a specific data store – an Azure Blob container.
  2. The upload triggers the GenerateBillingItemsFunc function, which is responsible for:
    1. generating billing items, using prices from an external database – CosmosDB, and saving them in a table – Azure Table Storage;
    2. sending a message to an Azure Queue about the need to create a new invoice.
  3. When a new message appears on the queue, the next function (GenerateInvoiceFunc) is triggered. It creates the domain object Invoice and saves it in the database – CosmosDB. After a successful save, it sends a message to two Azure Queues.
  4. When a new message appears in the first queue, the PrintInvoiceFunc function is triggered. It uses an external engine for PDF generation – JsReport – and saves the PDF file in Azure Blob Storage.
  5. When a new message appears in the second queue, the NotifyInvoiceFunc function is triggered. It uses two external systems: SendGrid for sending email and Twilio for sending SMS.

We tried to create the functions following best practices: they should be small, simple, and work independently.

We tested two approaches to create functions:

  1. one application/project = one function
  2. all functions in one application / project

[Diagram: AllInOneProject vs. separated functions]

The chosen approach influences the way the functions are grouped into a Function App. The Azure docs say the following about a Function App:

A function app provides an execution context in Azure in which your functions run. A function app consists of one or more individual functions that are managed together by Azure App Service. All of the functions in a function app share the same pricing plan, continuous deployment and runtime version. Think of a function app as a way to organize and collectively manage your functions.

If we choose the first approach, all functions will share the same pricing plan, continuous deployment and runtime version.

The second approach allows you to separate all of these.
If you are interested in more details, read Marc Duiker's article.

In the following sections, we describe in more detail how each of these functions is built.

Billing Items Generation

The main responsibility of this function is to parse the uploaded CSV file and generate billing items using prices from an external database.

public static void Run(
    [BlobTrigger("active-lists/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name,
    [Table("billingItems")] ICollector<BillingItem> billingItems,
    [Queue("invoice-generation-request")] out InvoiceGenerationRequest queueRequest,
    ILogger log)
{
    log.LogInformation($"C# Blob trigger function processed blob: {name}");

    var activeList = ActiveListParser.Parse(name, myBlob);
    var generator = new BillingItemGenerator();
    var priceList = GetPriceList(activeList.CustomerCode);
    foreach (var bi in generator.Generate(activeList, priceList))
    {
        billingItems.Add(bi);
    }

    queueRequest = InvoiceGenerationRequest.ForActiveList(activeList);
}

Thanks to the [BlobTrigger] attribute, the function is triggered whenever a user uploads a CSV file to the Blob Storage container named active-lists; the storage account is configured by the Connection parameter.

The name of the uploaded file must match the pattern [CLIENT_CODE]_[YEAR]_[MONTH]_*, for example: ASC_2018_11_activeList.txt.

Example file content:

99050555745;Annaliese Verena;A
29120458762;Josepha Gusti;A
39091666028;Deborah Wenzi;B
77050929111;John Smith;A
76091166752;Bob Martin;A
97031653569;Alice Smith;B
35060205229;Patricia Glide;A
38112669875;Mike Kowalski;B
13102408939;Kali Mali;A

Each line in this file contains: a National Identification Number (PESEL in Poland), a name and surname, and a product code.

Based on the first part of the file name (the client code, ASC in the example), the system knows which prices should be used to generate billing items. In other words, prices for specific products are defined per customer, and the customer code is a unique id.

If you are interested in the details, look at ActiveListParser.cs and later PriceRepository.cs.
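For illustration, a minimal sketch of what such a file-name parser might do. This is an assumption for this article, not the actual ActiveListParser from the repo (which also parses the file contents):

```csharp
using System;

// Illustrative only: splits a "[CLIENT_CODE]_[YEAR]_[MONTH]_*" file name
// into the customer code and the billing period.
public static class ActiveListFileName
{
    public static (string CustomerCode, int Year, int Month) Parse(string name)
    {
        var parts = name.Split('_');
        if (parts.Length < 4)
            throw new ArgumentException($"Unexpected file name format: {name}");

        return (parts[0], int.Parse(parts[1]), int.Parse(parts[2]));
    }
}

// ActiveListFileName.Parse("ASC_2018_11_activeList.txt") → ("ASC", 2018, 11)
```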

Thanks to the [Table] attribute, the function can save information in Azure Table Storage, which stores structured NoSQL data, providing a key/attribute store with a schemaless design.

To add a new record to the table, just use billingItems.Add().
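To give an idea of what such an entity could look like, here is a hedged sketch (an assumed shape, not the repo's actual class). The PartitionKey format matches the filter the invoice-generation step queries by later (customer code plus billing period):

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// Illustrative entity for the [Table("billingItems")] binding.
// Entities written to Table Storage derive from TableEntity; one partition
// holds all items for one customer's billing period.
public class BillingItem : TableEntity
{
    public BillingItem() { } // parameterless ctor required by Table Storage

    public BillingItem(string customerCode, int year, int month, string beneficiaryId)
    {
        PartitionKey = $"{customerCode}-{year}-{month}";
        RowKey = beneficiaryId;
    }

    public string ProductCode { get; set; }
    public double Price { get; set; }
}
```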

Thanks to the [Queue] attribute, the function can send a message to an Azure Queue. Assigning a value to the out parameter queueRequest sends the message to the queue.

Invoice Generation

After generating billing items, we need to generate invoices for the clients. GenerateInvoiceFunc prepares the domain object Invoice, saves it in the database, and sends messages about it to the queues.

public static void Run(
    [QueueTrigger("invoice-generation-request")] InvoiceGenerationRequest request,
    [Table("billingItems")] CloudTable billingItems,
    [CosmosDB("crm", "invoices", ConnectionStringSetting = "cosmosDb")] out dynamic generatedInvoice,
    [Queue("invoice-print-request")] out InvoicePrintRequest printRequest,
    [Queue("invoice-notification-request")] out InvoiceNotificationRequest notificationRequest,
    ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {request.CustomerCode} {request.Year} {request.Month}");

    var generator = new InvoiceGenerator();
    var items = GetBillingItemsFromTable(billingItems, request);
    var invoice = generator.Generate(request, items);

    generatedInvoice = invoice;
    printRequest = new InvoicePrintRequest { InvoiceToPrint = invoice };
    notificationRequest = new InvoiceNotificationRequest { InvoiceForNotification = invoice };
}

Thanks to [QueueTrigger], the function is triggered when a message appears on the invoice-generation-request queue.

Thanks to the [Table] attribute, we have access to the table in which we saved the billing items in the previous step.
We used the CloudTable class to read the table, because the popular IQueryable is not supported in the Functions v2 runtime.
Using our GetBillingItemsFromTable method, based on data from the request, we can fetch the table segment we are interested in:

static List<BillingItem> GetBillingItemsFromTable(CloudTable billingItems, InvoiceGenerationRequest request)
{
    TableQuery<BillingItem> query = new TableQuery<BillingItem>()
        .Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal,
            $"{request.CustomerCode}-{request.Year}-{request.Month}"));

    var querySegment = billingItems.ExecuteQuerySegmentedAsync(query, null);
    var items = new List<BillingItem>();
    foreach (BillingItem item in querySegment.Result)
    {
        items.Add(item);
    }
    return items;
}

Thanks to the [CosmosDB] attribute, we bind the database where the invoice object should be saved.

The last two [Queue] bindings are used to inform the next two functions, which are responsible for creating the invoice PDF and sending notifications (email and SMS) to the user.

Invoice Printing

This function is triggered when we want to create a PDF invoice for the client:

public static async Task Run(
    [QueueTrigger("invoice-print-request")] InvoicePrintRequest printRequest,
    Binder binder,
    ILogger log)
{
    var jsReportUrl = Environment.GetEnvironmentVariable("JsReportUrl");
    var pdf = new InvoicePrinter(jsReportUrl).Print(printRequest.InvoiceToPrint);

    await StoreResultInBlobAsync(binder, printRequest.InvoiceToPrint.InvoiceNumber, pdf);
}

The [QueueTrigger] attribute is already familiar; it starts the function when a message appears on the selected queue (in this example, the queue named invoice-print-request).

The PDF creation process has been delegated to an external system – JsReport. We created our own instance from a Docker image on Azure and connect to it using the URL from an environment variable.
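For context, a hedged sketch of how such an InvoicePrinter could call jsReport over HTTP. The POST /api/report endpoint is jsReport's documented rendering API; the template name, the payload shape and the async method name are assumptions, not the blog's actual implementation:

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

// Illustrative jsReport client: POSTs a render request and returns PDF bytes.
public class InvoicePrinter
{
    private static readonly HttpClient Client = new HttpClient();
    private readonly string _jsReportUrl;

    public InvoicePrinter(string jsReportUrl) => _jsReportUrl = jsReportUrl;

    public async Task<byte[]> PrintAsync(object invoice)
    {
        // "invoice" template name is an assumption for this sketch.
        var payload = new { template = new { name = "invoice" }, data = invoice };
        var body = new StringContent(JsonConvert.SerializeObject(payload),
            Encoding.UTF8, "application/json");

        var response = await Client.PostAsync($"{_jsReportUrl}/api/report", body);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsByteArrayAsync(); // rendered PDF
    }
}
```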

Thanks to Binder method parameter, we can asynchronously save created PDF in blob storage. The following method is used for this:

private static async Task StoreResultInBlobAsync(Binder binder, string title, byte[] doc)
{
    using (var stream = await binder.BindAsync<Stream>(new BlobAttribute($"printouts/{title}.pdf", FileAccess.Write)))
    using (var writer = new BinaryWriter(stream))
    {
        writer.Write(doc);
    }
}

Notifications Sending

In parallel to PDF creation process, the user notification process is started:

public static void Run(
    [QueueTrigger("invoice-notification-request")] InvoiceNotificationRequest notificationRequest,
    [SendGrid(ApiKey = "SendGridApiKey")] out SendGridMessage email,
    [TwilioSms(AccountSidSetting = "TwilioAccountSid", AuthTokenSetting = "TwilioAuthToken", From = "+15005550006")] out CreateMessageOptions sms,
    ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {notificationRequest}");

    email = CreateEmail(notificationRequest);
    sms = CreateSMS(notificationRequest);
}

Thanks to the built-in integration with systems such as Twilio and SendGrid, we can send SMS text messages and emails without any problems or excessive configuration.

To send an email with SendGrid, we need to use the [SendGrid] attribute with the ApiKey property defined and save the created object in the method's out parameter. To create the SendGridMessage object, we used the method below:

private static SendGridMessage CreateEmail(InvoiceNotificationRequest request)
{
    var email = new SendGridMessage();

    email.AddContent("text/html", $"You have new invoice {request.InvoiceForNotification.InvoiceNumber} for {request.InvoiceForNotification.TotalCost}.");
    email.SetFrom(new EmailAddress(""));
    email.SetSubject($"New Invoice - {request.InvoiceForNotification.InvoiceNumber}");

    return email;
}

Sending an SMS is done in the same way. Thanks to the [TwilioSms] attribute, we integrated with our Twilio account. The From property is filled with a magic test number, based on the Twilio test credentials docs.

private static CreateMessageOptions CreateSMS(InvoiceNotificationRequest request)
{
    return new CreateMessageOptions(new PhoneNumber("+15005550006"))
    {
        Body = $"You have new invoice {request.InvoiceForNotification.InvoiceNumber} for {request.InvoiceForNotification.TotalCost}."
    };
}

Pricing

Pricing depends mainly on execution time and memory consumption.

Based on approximately 100 test calls to each function, we calculated the average execution time:

Function Name               Average Execution Time
GenerateBillingItemsFunc    5.61 sec
GenerateInvoiceFunc         3.27 sec
PrintInvoiceFunc            3.00 sec
NotifyInvoiceFunc           2.00 sec

Unfortunately, in Azure Functions there is no way to check the memory consumption per request. Version 2.0 did not bring any changes in this area; the issue on GitHub is still open. Unofficial ways to measure it (comparing to the local environment, checking how much memory the Function App allocates) did not seem precise enough, so we skipped them. For this reason, we assumed 512 MB for each function for the purpose of our estimation (which greatly exceeds the actual memory usage).

Next, based on the documentation, we prepared an Excel file with the pricing calculation:

[Spreadsheet: pricing calculation]

According to these calculations, we would pay 79.3 EUR for a million executions of the whole process. Limiting resource consumption to 256 MB decreases this amount to 28 EUR.

Approximately the first 130,000 calls are free.

It should be noted that storage and network rates have been omitted from these calculations.
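To make the shape of the estimate reproducible, the consumption-plan cost model can be sketched as below. The rates are assumptions for illustration only (check the current Azure pricing page), and the free monthly grants are ignored, so the result will differ somewhat from our spreadsheet:

```csharp
// Illustrative consumption-plan cost model: cost = GB-seconds consumed
// multiplied by the per-GB-s rate, plus a per-execution charge.
public static class CostEstimate
{
    // Assumed EUR rates for illustration – not official pricing.
    const double PricePerGbSecond = 0.000014;
    const double PricePerMillionExecutions = 0.169;

    public static double ForProcesses(double processes)
    {
        // Our measured average durations for the four functions, in seconds.
        double[] avgSeconds = { 5.61, 3.27, 3.00, 2.00 };
        const double memoryGb = 0.5; // the assumed 512 MB

        double gbSecondsPerProcess = 0;
        foreach (var s in avgSeconds)
            gbSecondsPerProcess += s * memoryGb;

        // Each process executes all four functions once.
        double executionCost = processes * avgSeconds.Length / 1_000_000 * PricePerMillionExecutions;
        return processes * gbSecondsPerProcess * PricePerGbSecond + executionCost;
    }
}
```

At 512 MB, the GB-seconds term dominates; halving the memory assumption roughly halves the total, which is why constraining memory matters so much on the consumption plan.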


Monitoring

Thanks to the integration with Azure Application Insights, we can monitor the functions very well and easily observe the results.

In Functions 2.0, the creators expanded the integration with Application Insights to give more visibility into distributed tracing. Thanks to the visualisation in the Application Map, we can see how components interact and drill into individual executions to diagnose issues.

[Application Map: all functions in one application (without separation)]

[Application Map: functions in separate projects]

Thanks to the end-to-end transaction details view, we can monitor each execution and find performance issues.

[Screenshot: end-to-end transaction details]


Summary

This example, in our opinion, shows that Azure Functions 2.0 is a production-ready technology. Below we list the main pros and cons of serverless architecture and this solution.

The developer experience is great. We can easily build the whole solution on our machines, as well as deploy it to Azure.

The platform provides declarative bindings for access to all resources: blobs, tables, databases, queues, HTTP request/response, and external services like SendGrid or Twilio. This removes the burden of managing connections and freeing resources manually, which greatly simplifies the code.

Monitoring capabilities are of high quality and help diagnose problems quickly.


Pros:

  • great developer experience
  • promotes better design practices
  • allows developers to focus on writing small, autonomous components that adhere to SOLID/OOP practices
  • autoscaling and monitoring out of the box
  • pay only for the resources you actually use
  • removes the whole burden of server / vm / container management
  • integration with many technologies (queues, databases, blobs, external systems like Twilio and SendGrid) is very simple thanks to built-in bindings
  • monitoring tools available out of the box


Cons:

  • magical infrastructure increases the risk of integration problems
  • “cold start” problem still exists
  • the integration possibilities are limited to what Azure offers (but you can always try to get around this via HTTP Trigger)
  • lack of control over server apps requires re-thinking around sessions and authorization
  • configuration becomes a lot more complex
  • cost control is not perfect, but it improves from version to version