Exploring .NET 8 (7) Azure Functions in Isolated Mode

Implications for Your Applications

Stas(Stanislav) Lebedenko
6 min read · Aug 31, 2023
Photo by Dominika Roseclay from Pexels

TL;DR: This article is dedicated to the upcoming release of .NET 8 and its impact on Azure Functions. With .NET 8 ahead-of-time (AOT) compilation and the ReadyToRun compilation option, we can get a significant performance boost for serverless apps. That being said, the isolated model puts more responsibility for performance on .NET developers, which is a minor setback IMHO. The isolated model currently delivers slower performance than in-process, and we will look into the details.
Visit azurebacktoschool for more content.

The plan

We have been through a season when the Functions team constantly had to keep up with the .NET team to release optimized middleware for in-process Azure Functions, and the Functions team was solely responsible for cold starts and performance. So, around two years ago, Microsoft decided to make the isolated model the single model going forward and to move in that direction gradually.

But there is still a lack of feature parity between the models, so the final switch to the isolated model might not happen with the release of .NET 8. The plan has shifted towards the release of .NET 10, with new features now prioritized for the isolated model.

  • Isolated model overview, the current .NET 7 worker, developer responsibility
  • AOT compilation in .NET 8
  • Performance comparison: in-process vs. isolated
  • Performance of .NET 8 isolated (coming this month, not available on Consumption yet)
  • Conclusion and implications
A diagram from Matthew Henderson showing the change in release patterns after feature parity

Isolated worker model overview

The isolated worker model currently has higher latency than in-process, and in my tests the ready-to-run/Linux-x64 combo performed twice as slow as the portable option.

Key points

  • No conflicts between your library versions and the ones used by the Functions host (a common pain point of the in-process model)
  • Any framework version (Functions v4 supports .NET 7.x, 6.x and .NET Framework 4.8)
  • A console-style application with its own Program.cs
  • Dependencies built around the Microsoft.Azure.Functions.Worker library
  • Native .NET dependency injection with extended configuration
  • Middleware registration
  • Placeholders - Functions platform configuration for mitigating "cold start"
  • ReadyToRun - a form of ahead-of-time compilation; with .NET 8 this is what should make your function app run faster, in theory :)
  • Instead of Microsoft.Azure.WebJobs.Extensions.Http we use Microsoft.Azure.Functions.Worker.Http; the worker extensions still don't have feature parity with in-process
  • Instead of IActionResult, we use HttpResponseData
A brief look into the Program.cs file
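For the isolated model, a minimal Program.cs sketch of the typical setup looks roughly like this: native dependency injection plus middleware registration. The middleware class here is a placeholder, included only to show where registration happens.

using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Middleware;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

var host = new HostBuilder()
    // Sets up the isolated worker: the channel to the Functions host,
    // trigger/binding support and the default serialization.
    .ConfigureFunctionsWorkerDefaults(worker =>
    {
        // Middleware registration: runs around every function invocation.
        worker.UseMiddleware<InvocationLoggingMiddleware>();
    })
    .ConfigureServices(services =>
    {
        // Plain .NET dependency injection, as in any console or ASP.NET Core app.
        // services.AddSingleton<IMyService, MyService>(); // placeholder registration
    })
    .Build();

host.Run();

// Placeholder middleware, included only to show the hook.
public sealed class InvocationLoggingMiddleware : IFunctionsWorkerMiddleware
{
    public async Task Invoke(FunctionContext context, FunctionExecutionDelegate next)
    {
        context.GetLogger<InvocationLoggingMiddleware>()
               .LogInformation("Invoking {Name}", context.FunctionDefinition.Name);
        await next(context);
    }
}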

The function code itself essentially remains the same, with minor differences.

Isolated mode function vs. in-process mode function:
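As a rough sketch (the class names and return messages are placeholders, and the two classes would live in separate projects), an HTTP-triggered function looks approximately like this in the isolated model:

using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;

public class HelloIsolated
{
    // Isolated model: HttpRequestData in, HttpResponseData out.
    [Function("HelloIsolated")]
    public HttpResponseData Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req)
    {
        var response = req.CreateResponse(HttpStatusCode.OK);
        response.WriteString("Hello from the isolated worker");
        return response;
    }
}

And the in-process equivalent:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class HelloInProcess
{
    // In-process model: HttpRequest in, IActionResult out.
    [FunctionName("HelloInProcess")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req)
    {
        return new OkObjectResult("Hello from the in-process model");
    }
}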

The .NET 8 AOT

We all know how the language-specific compiler converts C# code into Intermediate Language (IL), which is still relatively high level and can look like this:

IL_0010: call void [System.Console]System.Console::WriteLine(string)

Then, in the second stage, the Just-In-Time (JIT) compiler converts the IL into assembly code for the particular CPU:

L001d: push dword ptr [0x8b086e8]
L0023: mov ecx, ebx

So with .NET 8 native AOT there is no JIT compilation left at runtime, and with ReadyToRun most of it is done ahead of time, so far less compilation work happens when the function executes.

Moreover, as you know, many classes and methods in libraries such as System.Net are never used by your application. With .NET 8 AOT compilation they are thrown away, shrinking the bundle by up to eight times! This process, called "app trimming", removes unused parts of the framework and of your application code from the compiled output.

But as I said, the performance gain will depend on the application type and the extra libraries loaded at startup. I expect trimming to work perfectly only with Microsoft libraries, so if you plan to use exotic ones for PDF processing or an ORM for an Oracle database, there may be fewer benefits. You will also be bound to a particular platform, a minor problem that can be solved with separate CI/CD pipelines; a sketch of the relevant project settings follows below.
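For reference, these are the project-file switches involved, shown as a sketch rather than a recommended configuration: ReadyToRun plus a runtime identifier is what the tests below use, while PublishAot and PublishTrimmed are left commented out as an assumption about where .NET 8 is heading, since Functions support for them is still rolling out.

<PropertyGroup>
  <TargetFramework>net8.0</TargetFramework>
  <AzureFunctionsVersion>v4</AzureFunctionsVersion>
  <OutputType>Exe</OutputType>

  <!-- Tie the build to one platform and pre-compile it (ReadyToRun). -->
  <RuntimeIdentifier>linux-x64</RuntimeIdentifier>
  <PublishReadyToRun>true</PublishReadyToRun>

  <!-- Native AOT and trimming ("app trimming"); shown as an assumption,
       not every Functions workload supports these switches yet. -->
  <!-- <PublishAot>true</PublishAot> -->
  <!-- <PublishTrimmed>true</PublishTrimmed> -->
</PropertyGroup>

Publishing with dotnet publish -c Release -r linux-x64 then produces the platform-specific output, which is why a separate CI/CD pipeline per target platform may be needed.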

Performance in-process

I’m using loader.io for testing purposes; it’s super easy to set up and you can start in seconds. Please let me know if you want to see a really massive and heavy test with something like https://nbomber.com/ :) (a small NBomber sketch is included below).
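As a sketch only, assuming NBomber v5's C# API and a placeholder function URL, a roughly equivalent scripted test could look like this (about 10k requests spread over 30 seconds):

using System;
using System.Net.Http;
using NBomber.CSharp;

var httpClient = new HttpClient();

// Placeholder URL for the function under test.
var targetUrl = "https://my-func-app.azurewebsites.net/api/hello";

var scenario = Scenario.Create("http_get_function", async context =>
{
    var response = await httpClient.GetAsync(targetUrl);
    return response.IsSuccessStatusCode ? Response.Ok() : Response.Fail();
})
.WithLoadSimulations(
    // Roughly 10k requests: 333 requests per second for 30 seconds.
    Simulation.Inject(rate: 333,
                      interval: TimeSpan.FromSeconds(1),
                      during: TimeSpan.FromSeconds(30)));

NBomberRunner
    .RegisterScenarios(scenario)
    .Run();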

10k over 30 seconds cold instance

10k over 15 seconds warm instance

10k over 15 seconds cold instance

There are not many surprises in this area; in-process .NET 6 performance is really predictable and stable. The latency spikes correspond to the startup of additional instances.

Performance isolated process

Now, with the isolated process worker, things get much more interesting. This test was done with .NET 7 because, at the moment, .NET 8 functions are not available to me yet, so I will update the testing sections in a few weeks.

The first case uses the ready-to-run option with linux-x64.

10k requests over a period of 30 seconds

Now we can see the functions worker being called as a dependency, along with spikes in latency. But it gets even more interesting.

10k requests over a period of 15 seconds

Let’s look into these errors and function invocation exceptions; I was really surprised.

After running additional tests, I replicated this situation once more (below), so be aware of usage spikes if you use .NET 7 with the isolated process.

With prewarmed functions, things get back to normal.

10k requests over a period of 15 seconds

The conclusion

With the .NET 8 AOT and ReadyToRun options, we will hopefully get the means to alleviate cold starts without extra hacks and gain more control over the application. However, not everyone will appreciate the slight increase in complexity over the in-process model. Still, these changes should further boost the usage of Functions with .NET.

Functions with the isolated worker model currently run with higher latency than in-process, and the ready-to-run/Linux-x64 combo resulted in performance twice as slow as the portable option.

The isolated .NET 7 worker with ready-to-run shows an average latency of around 3000 ms, the portable build around 700 ms, with exceptions on freshly started Consumption plan instances. In-process shows an average latency of 170 ms on both cold and warm Consumption plan instances.

As of September 2nd, .NET 8 functions are available only with the preview SDK dotnet-sdk-8.0.100-preview.7.23376.3, Visual Studio 2022 Preview, and Premium & Dedicated Linux plans. For Consumption, we need to wait until October 2023 to run the initial tests.

That’s it, thanks for reading. Cheers!

