Warming Up 100 AWS Lambda Hosted ASP.NET Web API Applications
Want to learn more about AWS Lambda and .NET? Check out my A Cloud Guru course on ASP.NET Web API and Lambda.
Download full source code.
This post is a follow-up to a previous post where I showed how to force the Lambda service to warm 100 execution environments for a simple function. In this post, I’ll show how to use the same mechanism to warm up execution environments for a full Web API application running inside a Lambda function.
The animation below shows the basic mechanism:
- Make special requests that the Lambda function can’t respond to quickly.
- While an execution environment is warming up, make another request, and so on.
This way, you can warm up many execution environments ahead of when they are needed.
See that post for background on what a cold start is, and why it might be a problem for your application.
The Lambda function
When you want to deploy a full Web API application to a Lambda function, there are two Lambda project templates to choose from - serverless.AspNetCoreWebAPI and serverless.AspNetCoreMinimalAPI. I’m going to use the former template, but the same principles apply to the latter.
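If you don’t already have the AWS Lambda project templates and the Amazon.Lambda.Tools global tool (used by the deploy command later) installed, you can add them with -
dotnet new install Amazon.Lambda.Templates
dotnet tool install -g Amazon.Lambda.Tools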
Create the project using the following command -
dotnet new serverless.AspNetCoreWebAPI -n WarmupAspNetCoreWebAPI
Change to the WarmupAspNetCoreWebAPI/src/WarmupAspNetCoreWebAPI directory, and open that project in an IDE.
Add a new WarmupController to the project, with the following code -
using Microsoft.AspNetCore.Mvc;

namespace WarmupAspNetCoreWebAPI.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class WarmupController : ControllerBase
    {
        // Static fields are initialized once per execution environment, so they
        // identify the environment handling the request and when it was created.
        private static readonly string FunctionId = Guid.NewGuid().ToString().Substring(0,8);
        private static readonly string FunctionCreated = DateTime.Now.ToString("HH:mm:ss.fff");
        private static bool _isCold = true;

        [HttpGet]
        public async Task<string> Get()
        {
            string requestId = Guid.NewGuid().ToString().Substring(0,8);
            if (_isCold)
            {
                Console.WriteLine($"{FunctionId} {FunctionCreated} - warming up the function");
                // Hold this execution environment busy so that concurrent requests
                // force the Lambda service to create new execution environments.
                await Task.Delay(10000);
                _isCold = false;
                return $"Function id:{FunctionId}, Function created:{FunctionCreated}, Request id:{requestId} - delayed response for 10 seconds";
            }
            return $"Function id:{FunctionId}, Function created:{FunctionCreated}, Request id:{requestId} - was already warm";
        }
    }
}
The controller has a single action method. The first time it is invoked, it delays the response for 10 seconds. This gives you a chance to make more requests to the function, forcing the Lambda service to create more execution environments. The action method returns a different message depending on whether the execution environment is cold or warm.
Keep in mind, I’m showing the basic mechanism. You may want to add some security around this.
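For example, a minimal sketch of one approach (the X-Warmup-Key header and WARMUP_KEY environment variable are placeholder names I’ve chosen, not part of the template) is to require a shared secret before running the warmup logic, something like this modified version of the controller -

using Microsoft.AspNetCore.Mvc;

namespace WarmupAspNetCoreWebAPI.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class WarmupController : ControllerBase
    {
        // Hypothetical shared secret, set as an environment variable on the Lambda function.
        private static readonly string? WarmupKey = Environment.GetEnvironmentVariable("WARMUP_KEY");
        private static bool _isCold = true;

        [HttpGet]
        public async Task<ActionResult<string>> Get([FromHeader(Name = "X-Warmup-Key")] string? suppliedKey)
        {
            // Reject requests that don't present the expected secret.
            if (string.IsNullOrEmpty(WarmupKey) || suppliedKey != WarmupKey)
            {
                return Unauthorized();
            }

            // ... the warmup logic from the controller above goes here ...
            if (_isCold)
            {
                await Task.Delay(10000);
                _isCold = false;
                return "delayed response for 10 seconds";
            }
            return "was already warm";
        }
    }
}

The warming client would then send the matching X-Warmup-Key header with each request.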
Deploy the function
If you have never deployed a serverless function before, you need to create an S3 bucket for the deployment package; see this post for details.
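If you need to create one, the AWS CLI can do it (the bucket name below is a placeholder, yours must be globally unique) -
aws s3 mb s3://your-deployment-bucket --region us-east-1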
Deploy the function using the following command -
dotnet lambda deploy-serverless --stack-name WarmupAspNetCoreWebAPI --s3-bucket cloudformation-templates-2022
This will take a few minutes. At the bottom of the output, you will see the URL of the API Gateway endpoint for the function.
Output Name                    Value
------------------------------ --------------------------------------------------
ApiURL                         https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/Prod/
Don’t invoke it yet.
Invoking the function
To force the creation of multiple execution environments for the function, you need to request the GET action method on the WarmupController multiple times, in quick succession. Remember, you don’t want an existing execution environment to be reused; you want new ones to be created.
You can do this with a tool like Fiddler, by making repeated requests to https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/Prod/api/warmup.
Another option is to write a small console application that uses an HttpClient to make multiple requests to the action method.
Create a new console application in the same solution, and add the following code -
if (args.Length != 2)
{
    Console.WriteLine("Usage: dotnet run url count");
    return;
}

string url = args[0];
int count = int.Parse(args[1]);
Console.WriteLine($"Invoking {url}...");

HttpClient client = new HttpClient() { BaseAddress = new Uri(url) };
List<Task<string>> taskList = new List<Task<string>>();

// Start all the requests without awaiting them, so they arrive while the
// cold execution environments are still busy with their 10 second delay.
for (int i = 1; i <= count; i++)
{
    Console.WriteLine($"Invoking {i}");
    taskList.Add(client.GetStringAsync(""));
}

// Wait for every request to complete, then print the responses.
await Task.WhenAll(taskList);

foreach (var task in taskList)
{
    var response = await task;
    Console.WriteLine(response);
}
Run the application with -
dotnet run https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/Prod/api/warmup 100
This will make 100 requests to the action method, in quick succession. The output will look something like this -
Invoking https://xxxxxxxxxxxx.execute-api.us-east-1.amazonaws.com/Prod/api/warmup...
Invoking 1
Invoking 2
...snip
Invoking 100
Function id:b2a6fb8f, Function created:18:31:41.294, Request id:20f7a1bd - delayed response for 10 seconds
Function id:35df7378, Function created:18:31:40.826, Request id:b19c9aa8 - delayed response for 10 seconds
...snip
Function id:dcb4cebf, Function created:18:31:41.267, Request id:4816d814 - delayed response for 10 seconds
You now have 100 warm execution environments for the function.
Run the console application again (the output below is from a run with a count of 10), and you will see that the execution environments are already warm, and the responses are much faster.
Invoking https://xxxxxxxxxxxx.execute-api.us-east-1.amazonaws.com/Prod/api/warmup...
Invoking 1
Invoking 2
...snip
Invoking 10
Function id:35df7378, Function created:18:31:40.826, Request id:92003d2c - was already warm
Function id:b2a6fb8f, Function created:18:31:41.294, Request id:bb51280d - was already warm
...snip
Function id:dcb4cebf, Function created:18:31:41.267, Request id:b0006f13 - was already warm
Download full source code.