In this video Jeff Hollan, lead product manager for Microsoft's serverless Azure Functions, explains how serverless technology has succeeded to date, the challenges it faces, and its future.
From 1:30 Jeff introduces the core ideal of serverless: "Focus on the code, not the plumbing."
He sets the scene by comparing it with traditional hosting, which requires hardware that costs money even when idle yet struggles when it needs to scale out rapidly. It also requires ongoing maintenance such as patching.
In contrast, serverless is pure on-demand computing: it is invoked and paid for only when used, and by definition all of the required server management is abstracted out of the equation.
Azure Functions 2.0
At 2:30 he begins by defining ideal use-case scenarios for Microsoft Azure Functions, which include microservices powering business applications, real-time event streams, backend APIs, machine learning, big data processing and 'connective glue'.
For the 2.0 release Microsoft rewrote the Azure Functions runtime on .NET Core 2.1, adding new trigger, binding and assembly extensibility, and broke it up so that the component parts are pulled in as needed, rather than all being bundled into the runtime as before.
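One concrete expression of this "pulled in as needed" model is the extension bundle: a 2.x function app can declare in its host.json that trigger and binding extensions should be fetched on demand rather than shipped inside the runtime. A minimal example (the version range shown is illustrative):

```json
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[1.*, 2.0.0)"
  }
}
```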
Behind the scenes
From 9:39, he explains what happens behind the scenes of Azure Functions.
Functions sits atop a pool of virtual machines in the Azure data centres, adding an HTTP front end and the Scale Controller, the intelligence that decides how to run and scale Functions.
He explains that when an HTTP request reaches the Azure Functions front end, a decision process determines whether the Function is already running on a VM and, if not, provisions one to run the code. The VM is kept live for around 15-20 minutes and then decommissioned if no longer needed.
Similarly if high demand is experienced the Scale Controller invokes further VMs to meet the demand.
The process of loading a customer's code onto a VM is known as 'specializing' the server; the resulting delay is what Microsoft calls a 'cold start', and it is a KPI for the team as it determines the perceived service performance. Jeff receives a daily report of this KPI across their global infrastructure, with the average being 2-3 seconds.
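The routing and reaping behaviour described above can be sketched in a few lines of plain Python. This is a toy model of the decision flow, not Azure's real implementation; every name here is illustrative.

```python
# Toy sketch of the Scale Controller decision flow: route a request to a
# warm VM if one exists, otherwise 'specialize' a fresh one (a cold start),
# and decommission VMs that have sat idle past the timeout.
import time

class ScaleController:
    IDLE_TIMEOUT = 20 * 60  # seconds a specialized VM stays warm (~15-20 min)

    def __init__(self):
        self.warm_vms = {}  # function_id -> last-used timestamp

    def route_request(self, function_id):
        now = time.time()
        if function_id in self.warm_vms:
            self.warm_vms[function_id] = now
            return "warm"   # code already loaded, no cold start
        self.warm_vms[function_id] = now
        return "cold"       # provision a VM and specialize it with the code

    def reap_idle(self, now=None):
        """Decommission VMs idle for longer than the timeout."""
        now = now if now is not None else time.time()
        idle = [f for f, t in self.warm_vms.items() if now - t > self.IDLE_TIMEOUT]
        for f in idle:
            del self.warm_vms[f]
        return idle
```

The first request for a function pays the cold-start cost; subsequent requests within the idle window are served warm.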
From 12:55 Jeff demonstrates how to watch the performance in real-time, through Azure Application Insights connected to Functions.
At 17:50 Jeff deep dives into the challenges and approaches to scaling Functions.
This gets to the heart of the serverless performance challenge for all hosting providers: the cold-start process imposes set-up time on the first request.
At 20:45 he introduces the Functions Premium Plan, which offers pre-scaling: some resources are kept constantly available, "warmed up" in advance.
Also, via some Q&A with the audience, he explores tips and tricks for keeping a single instance alive, such as embedding a timer into the Function app that invokes it on a regular basis.
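That keep-alive trick boils down to a timer-triggered function. In a 2.x app the binding is declared in the function's function.json; the five-minute NCRONTAB schedule below is illustrative:

```json
{
  "bindings": [
    {
      "name": "keepAliveTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    }
  ]
}
```

The function body can be a no-op; the periodic invocation alone keeps the instance warm.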
Durable Functions Patterns
At 31:05 Jeff moves into explaining Durable Function Patterns.
This addresses the challenge of writing long-running stateful workflows, by providing a framework extension to Functions that lets you write orchestrations as code.
Some of the patterns mentioned include fan-out/fan-in, external event correlation, and flexible, automated monitoring of long-running processes.
The power of this approach is that developers can easily embed parallel processing of tasks into their Functions.
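The fan-out/fan-in pattern is worth a quick sketch. In Durable Functions an orchestrator function fans work out to activity functions and aggregates the results; the plain-Python version below uses `concurrent.futures` as a stand-in for the activity workers, purely to illustrate the shape of the pattern (it is not the Durable Functions SDK).

```python
# Fan-out/fan-in sketch: split the work, process the pieces in parallel,
# then aggregate the partial results.
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """Stand-in for an activity function; here it just sums its chunk."""
    return sum(chunk)

def fan_out_fan_in(data, chunk_size=3):
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(process_chunk, chunks))  # fan out
    return sum(partials)                                  # fan in
```

In a real orchestrator the framework checkpoints state between steps, so the workflow survives process recycling; that durability is exactly what the plain-Python version lacks.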
At 34:40 Jeff highlights that Functions is open source, leading into a broader conversation about the platform strategy, such as being able to run Functions anywhere.
For example, they have financial-sector clients who want to use serverless but cannot host their data in the public cloud. Because Functions can run in a Docker container, it can run anywhere, such as on-premises, to meet these types of needs.
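A containerized function app of this kind might be built from a Dockerfile along these lines; Microsoft publishes official Functions base images, though the exact tag below is illustrative:

```dockerfile
# Official Azure Functions Python base image (tag is illustrative)
FROM mcr.microsoft.com/azure-functions/python:2.0

# Copy the function app (host.json plus one folder per function) into the host
COPY . /home/site/wwwroot
```

The resulting image can then be run on any Docker host, including an on-premises Kubernetes cluster.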
He adds that prominent platform features of Azure Functions include networking, authentication, API definitions, App Service plans, metrics and log streaming.
At 38:10 he begins a demo of Functions running on Kubernetes, demonstrating actions such as dependency injection, and explores the integration with Key Vault in detail.
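One piece of the Key Vault integration worth noting is that a function app can resolve secrets via Key Vault references in its application settings instead of embedding them in code. The vault and secret names below are placeholders, and some runtime versions require an explicit secret version at the end of the URI:

```json
{
  "MySecret": "@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/)"
}
```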
To conclude the session, from 51:30 Jeff returns to Durable Functions and explores their use and challenges in detail, including audience questions that illustrate the types of real-world use cases developers have in mind.