

# Building Lambda functions with C#
Building with C#

You can run your .NET application in Lambda using a managed .NET runtime, a custom runtime, or a container image. After you compile your application code, you can deploy it to Lambda as either a .zip file or a container image. Lambda provides the following runtimes for .NET languages:


| Name | Identifier | Operating system | Deprecation date | Block function create | Block function update | 
| --- | --- | --- | --- | --- | --- | 
|  .NET 10  |  `dotnet10`  |  Amazon Linux 2023  |   Nov 14, 2028   |   Dec 14, 2028   |   Jan 15, 2029   | 
|  .NET 9 (container only)  |  `dotnet9`  |  Amazon Linux 2023  |   Nov 10, 2026   |   Not scheduled   |   Not scheduled   | 
|  .NET 8  |  `dotnet8`  |  Amazon Linux 2023  |   Nov 10, 2026   |   Dec 10, 2026   |   Jan 11, 2027   | 

## Setting up your .NET development environment
Development environment

To develop and build your Lambda functions, you can use any of the commonly available .NET integrated development environments (IDEs), including Microsoft Visual Studio, Visual Studio Code, and JetBrains Rider. To simplify your development experience, AWS provides a set of .NET project templates, as well as the `Amazon.Lambda.Tools` command line interface (CLI).

Run the following .NET CLI commands to install these project templates and command line tools.

### Installing the .NET project templates


To install the project templates, run the following command:

```
dotnet new install Amazon.Lambda.Templates
```

### Installing and updating the CLI tools


Run the following commands to install, update, and uninstall the `Amazon.Lambda.Tools` CLI.

To install the command line tools:

```
dotnet tool install -g Amazon.Lambda.Tools
```

To update the command line tools:

```
dotnet tool update -g Amazon.Lambda.Tools
```

To uninstall the command line tools:

```
dotnet tool uninstall -g Amazon.Lambda.Tools
```
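After installation, you can confirm that the Global CLI is available and list its commands (such as `deploy-function`):

```
dotnet lambda help
```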

# Define Lambda function handler in C#
Handler

The Lambda function *handler* is the method in your function code that processes events. When your function is invoked, Lambda runs the handler method. Your function runs until the handler returns a response, exits, or times out.

This page describes how to work with Lambda function handlers in C# using the .NET managed runtime, including options for project setup, naming conventions, and best practices. This page also includes an example of a C# Lambda function that takes in information about an order, produces a text file receipt, and puts this file in an Amazon Simple Storage Service (Amazon S3) bucket. For information about how to deploy your function after writing it, see [Build and deploy C# Lambda functions with .zip file archives](csharp-package.md) or [Deploy .NET Lambda functions with container images](csharp-image.md).

**Topics**
+ [Setting up your C# handler project](#csharp-handler-setup)
+ [Example C# Lambda function code](#csharp-example-code)
+ [Class library handlers](#csharp-class-library-handlers)
+ [Executable assembly handlers](#csharp-executable-assembly-handlers)
+ [Valid handler signatures for C# functions](#csharp-handler-signatures)
+ [Handler naming conventions](#csharp-handler-naming)
+ [Serialization in C# Lambda functions](#csharp-handler-serializer)
+ [File-based functions](#csharp-file-based-functions)
+ [Accessing and using the Lambda context object](#csharp-example-context)
+ [Using the SDK for .NET v3 in your handler](#csharp-example-sdk-usage)
+ [Accessing environment variables](#csharp-example-envvars)
+ [Using global state](#csharp-handler-state)
+ [Simplify function code with the Lambda Annotations framework](#csharp-handler-annotations)
+ [Code best practices for C# Lambda functions](#csharp-best-practices)

## Setting up your C# handler project


When you work with Lambda functions in C#, you write your code and then deploy it to Lambda. There are two different execution models for deploying Lambda functions in .NET: the class library approach and the executable assembly approach.

In the class library approach, you package your function code as a .NET assembly (`.dll`) and deploy it to Lambda with the .NET managed runtime (`dotnet8`). For the handler name, Lambda expects a string in the format `AssemblyName::Namespace.Classname::Methodname`. During the function's initialization phase, your function's class is initialized, and any code in the constructor is run.

In the executable assembly approach, you use the [top-level statements feature](https://learn.microsoft.com/en-us/dotnet/csharp/tutorials/top-level-statements) that was first introduced in C# 9. This approach generates an executable assembly which Lambda runs whenever it receives an invoke command for your function. In this approach, you also use the .NET managed runtime (`dotnet8`). For the handler name, you provide Lambda with the name of the executable assembly to run.

The main example on this page illustrates the class library approach. You can initialize your C# Lambda project in various ways, but the easiest way is to use the .NET CLI with the `Amazon.Lambda.Tools` CLI. Set up the `Amazon.Lambda.Tools` CLI by following the steps in [Setting up your .NET development environment](lambda-csharp.md#csharp-dev-env). Then, initialize your project with the following command:

```
dotnet new lambda.EmptyFunction --name ExampleCS
```

This command generates the following file structure:

```
/project-root 
    └ src
        └ ExampleCS
            └ Function.cs (contains main handler)
            └ Readme.md
            └ aws-lambda-tools-defaults.json
            └ ExampleCS.csproj          
    └ test
         └ ExampleCS.Tests
         └ FunctionTest.cs (contains unit tests)
            └ ExampleCS.Tests.csproj
```

In this file structure, the main handler logic for your function resides in the `Function.cs` file.

## Example C# Lambda function code


The following example C# Lambda function code takes in information about an order, produces a text file receipt, and puts this file in an Amazon S3 bucket.

**Example `Function.cs` Lambda function**  

```
using System;
using System.Text;
using System.Threading.Tasks;
using Amazon.Lambda.Core;
using Amazon.S3;
using Amazon.S3.Model;

// Assembly attribute to register the Lambda JSON serializer
[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace ExampleLambda;

public class Order
{
    public string OrderId { get; set; } = string.Empty;
    public double Amount { get; set; }
    public string Item { get; set; } = string.Empty;
}

public class OrderHandler
{
    private static readonly AmazonS3Client s3Client = new();

    public async Task<string> HandleRequest(Order order, ILambdaContext context)
    {
        try
        {
            string? bucketName = Environment.GetEnvironmentVariable("RECEIPT_BUCKET");
            if (string.IsNullOrWhiteSpace(bucketName))
            {
                throw new ArgumentException("RECEIPT_BUCKET environment variable is not set");
            }

            string receiptContent = $"OrderID: {order.OrderId}\nAmount: ${order.Amount:F2}\nItem: {order.Item}";
            string key = $"receipts/{order.OrderId}.txt";

            await UploadReceiptToS3(bucketName, key, receiptContent);

            context.Logger.LogInformation($"Successfully processed order {order.OrderId} and stored receipt in S3 bucket {bucketName}");
            return "Success";
        }
        catch (Exception ex)
        {
            context.Logger.LogError($"Failed to process order: {ex.Message}");
            throw;
        }
    }

    private async Task UploadReceiptToS3(string bucketName, string key, string receiptContent)
    {
        try
        {
            var putRequest = new PutObjectRequest
            {
                BucketName = bucketName,
                Key = key,
                ContentBody = receiptContent,
                ContentType = "text/plain"
            };

            await s3Client.PutObjectAsync(putRequest);
        }
        catch (AmazonS3Exception ex)
        {
            throw new Exception($"Failed to upload receipt to S3: {ex.Message}", ex);
        }
    }
}
```

This `Function.cs` file contains the following sections of code:
+ `using` statements: Use these to import C# classes that your Lambda function requires.
+ `[assembly: LambdaSerializer(...)]`: `LambdaSerializer` is an assembly attribute that tells Lambda to automatically convert JSON event payloads into C# objects before passing them to your function.
+ `namespace ExampleLambda`: This defines the namespace. In C#, the namespace name doesn't have to match the filename.
+ `public class Order {...}`: This defines the shape of the expected input event.
+ `public class OrderHandler {...}`: This defines your C# class. Within it, you'll define the main handler method and any other helper methods.
+ `private static readonly AmazonS3Client s3Client = new();`: This initializes an Amazon S3 client with the default credential provider chain, outside of the main handler method. This causes Lambda to run this code during the [initialization phase](lambda-runtime-environment.md#runtimes-lifecycle-ib).
+ `public async ... HandleRequest (Order order, ILambdaContext context)`: This is the **main handler method**, which contains your main application logic.
+ `private async Task UploadReceiptToS3(...) {}`: This is a helper method that's referenced by the main `HandleRequest` handler method.

Because this function requires an Amazon S3 SDK client, you must add it to your project's dependencies. You can do so by navigating to `src/ExampleCS` and running the following command:

```
dotnet add package AWSSDK.S3
```

### Add metadata information to aws-lambda-tools-defaults.json


By default, the generated `aws-lambda-tools-defaults.json` file doesn't contain `profile` or `region` information for your function. You can manually add this metadata to use a specific credentials profile and AWS Region. In addition, update the `function-handler` string to the correct value (`ExampleCS::ExampleLambda.OrderHandler::HandleRequest`). After these updates, your `aws-lambda-tools-defaults.json` file should look similar to this:

```
{
  "Information": [
    "This file provides default values for the deployment wizard inside Visual Studio and the AWS Lambda commands added to the .NET Core CLI.",
    "To learn more about the Lambda commands with the .NET Core CLI execute the following command at the command line in the project root directory.",
    "dotnet lambda help",
    "All the command line options for the Lambda command can be specified in this file."
  ],
  "profile": "default",
  "region": "us-east-1",
  "configuration": "Release",
  "function-architecture": "x86_64",
  "function-runtime": "dotnet8",
  "function-memory-size": 512,
  "function-timeout": 30,
  "function-handler": "ExampleCS::ExampleLambda.OrderHandler::HandleRequest"
}
```

For this function to work properly, its [execution role](lambda-intro-execution-role.md) must allow the `s3:PutObject` action. Also, ensure that you define the `RECEIPT_BUCKET` environment variable. After a successful invocation, the Amazon S3 bucket should contain a receipt file.
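For reference, a minimal policy statement granting this permission might look like the following. The bucket name is a placeholder that you would replace with your own:

```
{
    "Effect": "Allow",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::amzn-s3-demo-bucket/receipts/*"
}
```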

## Class library handlers


The main [example code](#csharp-example-code) on this page illustrates a class library handler. Class library handlers have the following structure:

```
[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace NAMESPACE;

...

public class CLASSNAME
{
    public async Task<string> METHODNAME(...)
    {
        ...
    }
}
```

When you create a Lambda function, you need to provide Lambda with information about your function's handler in the form of a string in the [Handler field](https://docs.aws.amazon.com/lambda/latest/api/API_CreateFunction.html#lambda-CreateFunction-request-Handler). This tells Lambda which method in your code to run when your function is invoked. In C#, for class library handlers, the format of the handler string is `ASSEMBLY::TYPE::METHOD`, where:
+ `ASSEMBLY` is the name of the .NET assembly file for your application. If you're using the `Amazon.Lambda.Tools` CLI to build your application and you don't set the assembly name using the `AssemblyName` property in the `.csproj` file, then `ASSEMBLY` is simply the name of your `.csproj` file.
+ `TYPE` is the full name of the handler type, which is `NAMESPACE.CLASSNAME`.
+ `METHOD` is the name of the main handler method in your code, which is `METHODNAME`.

For the main example code on this page, if the assembly is named `ExampleCS`, then the full handler string is `ExampleCS::ExampleLambda.OrderHandler::HandleRequest`.

## Executable assembly handlers


You can also define Lambda functions in C# as an executable assembly. Executable assembly handlers use C#'s top-level statements feature, in which the compiler generates the `Main()` method and puts your function code within it. When using executable assemblies, the Lambda runtime must be bootstrapped. To do this, use the `LambdaBootstrapBuilder.Create` method in your code. The inputs to this method are the main handler function and the Lambda serializer to use. The following shows an example of an executable assembly handler in C#:

```
using System.Net;
using System.Text.Json;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using Amazon.Lambda.RuntimeSupport;
using Amazon.Lambda.Serialization.SystemTextJson;

// Top-level statements can't appear inside a namespace declaration. The
// assembly name (here, GetProductHandler) comes from the project file.
// IDatabaseRepository and DatabaseRepository are application-defined types (not shown).
IDatabaseRepository repo = new DatabaseRepository();

await LambdaBootstrapBuilder.Create<APIGatewayProxyRequest>(Handler, new DefaultLambdaJsonSerializer())
    .Build()
    .RunAsync();

async Task<APIGatewayProxyResponse> Handler(APIGatewayProxyRequest apigProxyEvent, ILambdaContext context)
{
    var id = apigProxyEvent.PathParameters["id"];
    var databaseRecord = await repo.GetById(id);

    return new APIGatewayProxyResponse
    {
        StatusCode = (int)HttpStatusCode.OK,
        Body = JsonSerializer.Serialize(databaseRecord)
    };
}
```

In the [Handler field](https://docs.aws.amazon.com/lambda/latest/api/API_CreateFunction.html#lambda-CreateFunction-request-Handler) for executable assembly handlers, the handler string that tells Lambda how to run your code is the name of the assembly. In this example, that's `GetProductHandler`.

## Valid handler signatures for C# functions


In C#, valid Lambda handler signatures take between 0 and 2 arguments. Typically, your handler signature has two arguments, as shown in the main example:

```
public async Task<string> HandleRequest(Order order, ILambdaContext context)
```

When providing two arguments, the first argument must be the event input, and the second argument must be the Lambda context object. Both arguments are optional. For example, the following are also valid Lambda handler signatures in C#:
+ `public async Task<string> HandleRequest()`
+ `public async Task<string> HandleRequest(Order order)`
+ `public async Task<string> HandleRequest(ILambdaContext context)`

Apart from the base syntax of the handler signature, there are some additional restrictions:
+ You cannot use the `unsafe` keyword in the handler signature. However, you can use the `unsafe` context inside the handler method and its dependencies. For more information, see [unsafe (C# reference)](https://msdn.microsoft.com/en-us/library/chfa2zb8.aspx) on the Microsoft documentation website.
+ The handler may not use the `params` keyword, or use `ArgIterator` as an input or return parameter. These keywords support a variable number of parameters. The maximum number of arguments your handler can accept is two.
+ The handler may not be a generic method. In other words, it can't use generic type parameters such as `<T>`.
+ Lambda doesn't support async handlers with `async void` in the signature.

## Handler naming conventions


Lambda handlers in C# don't have strict naming restrictions. However, you must ensure that you provide the correct handler string to Lambda when you deploy your function. The right handler string depends on whether you're deploying a [class library handler](#csharp-class-library-handlers) or an [executable assembly handler](#csharp-executable-assembly-handlers).

Although you can use any name for your handler, function names in C# are generally in PascalCase. Also, although the file name doesn't need to match the class name or handler name, it's generally a best practice to use a file name like `OrderHandler.cs` if your class name is `OrderHandler`. For example, you can change the file name in this example from `Function.cs` to `OrderHandler.cs`.

## Serialization in C# Lambda functions


JSON is the most common and standard input format for Lambda functions. In this example, the function expects an input similar to the following:

```
{
    "orderId": "12345",
    "amount": 199.99,
    "item": "Wireless Headphones"
}
```

In C#, you can define the shape of the expected input event in a class. In this example, we define the `Order` class to model this input:

```
public class Order
{
    public string OrderId { get; set; } = string.Empty;
    public double Amount { get; set; }
    public string Item { get; set; } = string.Empty;
}
```

If your Lambda function uses input or output types other than a `Stream` object, you must add a serialization library to your application. This lets you convert the JSON input into an instance of the class that you defined. There are two methods of serialization for C# functions in Lambda: reflection-based serialization and source-generated serialization.

### Reflection-based serialization


AWS provides pre-built libraries that you can quickly add to your application. These libraries implement serialization using [reflection](https://learn.microsoft.com/en-us/dotnet/csharp/advanced-topics/reflection-and-attributes/). Use one of the following packages to implement reflection-based serialization:
+ `Amazon.Lambda.Serialization.SystemTextJson` – Internally, this package uses `System.Text.Json` to perform serialization tasks.
+ `Amazon.Lambda.Serialization.Json` – Internally, this package uses `Newtonsoft.Json` to perform serialization tasks.

You can also create your own serialization library by implementing the `ILambdaSerializer` interface, which is available as part of the `Amazon.Lambda.Core` library. This interface defines two methods:
+ `T Deserialize<T>(Stream requestStream);`

  You implement this method to deserialize the request payload from the `Invoke` API into the object that is passed to your Lambda function handler.
+ `void Serialize<T>(T response, Stream responseStream);`

  You implement this method to serialize the result returned from your Lambda function handler into the response payload that the `Invoke` API operation returns.
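As a sketch, a minimal implementation of these two methods using `System.Text.Json` might look like the following. The class name and option values are illustrative; adjust them to your payload conventions:

```
using System.IO;
using System.Text.Json;
using Amazon.Lambda.Core;

public class CustomLambdaSerializer : ILambdaSerializer
{
    private static readonly JsonSerializerOptions Options = new()
    {
        PropertyNameCaseInsensitive = true
    };

    // Convert the Invoke request payload into the handler's input type.
    public T Deserialize<T>(Stream requestStream)
    {
        return JsonSerializer.Deserialize<T>(requestStream, Options)!;
    }

    // Write the handler's return value as the Invoke response payload.
    public void Serialize<T>(T response, Stream responseStream)
    {
        JsonSerializer.Serialize(responseStream, response, Options);
    }
}
```

You register a custom serializer the same way as the built-in ones, for example with `[assembly: LambdaSerializer(typeof(CustomLambdaSerializer))]`.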

The main example on this page uses reflection-based serialization. Reflection-based serialization works out of the box with AWS Lambda and requires no additional setup, making it a good choice for simplicity. However, it uses more function memory, and you may see higher function latencies due to runtime reflection.

### Source-generated serialization


With source-generated serialization, serialization code is generated at compile time. This removes the need for reflection and can improve the performance of your function. To use source-generated serialization in your function, you must do the following:
+ Create a new partial class that inherits from `JsonSerializerContext`, adding `JsonSerializable` attributes for all types that require serialization or deserialization.
+ Configure the `LambdaSerializer` to use a `SourceGeneratorLambdaJsonSerializer<T>`.
+ Update any manual serialization and deserialization in your application code to use the newly created class.

The following example shows how you can modify the main example on this page, which uses reflection-based serialization, to use source-generated serialization instead.

```
using System.Text.Json;
using System.Text.Json.Serialization;

...

public class Order
{
    public string OrderId { get; set; } = string.Empty;
    public double Amount { get; set; }
    public string Item { get; set; } = string.Empty;
}

[JsonSerializable(typeof(Order))]
public partial class OrderJsonContext : JsonSerializerContext {}

public class OrderHandler
{
    ...

    public async Task<string> HandleRequest(string input, ILambdaContext context)
    {
        var order = JsonSerializer.Deserialize(input, OrderJsonContext.Default.Order);

        ...
    }
}
```

Source-generated serialization requires more setup than reflection-based serialization. However, functions that use source-generated serialization tend to use less memory and have better performance due to compile-time code generation. To help reduce function [cold starts](lambda-runtime-environment.md#cold-start-latency), consider switching to source-generated serialization.

**Note**  
If you want to use native [ahead-of-time compilation (AOT)](dotnet-native-aot.md) with Lambda, you must use source-generated serialization.
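To see the generated context in action outside of Lambda, the following self-contained sketch round-trips an order using only `System.Text.Json`. The class and property names follow the main example on this page:

```
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

public class Order
{
    public string OrderId { get; set; } = string.Empty;
    public double Amount { get; set; }
    public string Item { get; set; } = string.Empty;
}

[JsonSerializable(typeof(Order))]
public partial class OrderJsonContext : JsonSerializerContext {}

public static class Program
{
    public static void Main()
    {
        // Deserialization uses compile-time generated metadata, not runtime reflection.
        var order = JsonSerializer.Deserialize(
            "{\"OrderId\":\"12345\",\"Amount\":199.99,\"Item\":\"Wireless Headphones\"}",
            OrderJsonContext.Default.Order)!;

        string json = JsonSerializer.Serialize(order, OrderJsonContext.Default.Order);
        Console.WriteLine(json);
    }
}
```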

## File-based functions


Introduced in .NET 10, file-based apps enable you to build .NET applications from a single `.cs` file, without a `.csproj` file or directory structure. Lambda supports file-based functions starting with .NET 10. They offer a streamlined, lightweight way to build Lambda functions in C#.

The fastest way to get started creating a C# file-based Lambda function is to use the `Amazon.Lambda.Templates` package. To install the package, run the following command:

```
dotnet new install Amazon.Lambda.Templates
```

Next, create a C# file-based Lambda example function:

```
dotnet new lambda.FileBased -n MyLambdaFunction
```

File-based functions use [executable assembly handlers](#csharp-executable-assembly-handlers). You must therefore include the `Amazon.Lambda.RuntimeSupport` NuGet package and use the `LambdaBootstrapBuilder.Create` method to register the .NET handler function for the event type and start the .NET Lambda runtime client.

File-based functions use .NET Native AOT by default, which requires source-generated serialization. You can disable Native AOT by specifying `#:property PublishAot=false` in your source file. For more information on using Native AOT in Lambda, see [Compile .NET Lambda function code to a native runtime format](dotnet-native-aot.md).

## Accessing and using the Lambda context object


The Lambda [context object](csharp-context.md) contains information about the invocation, function, and execution environment. In this example, the context object is of type `Amazon.Lambda.Core.ILambdaContext`, and is the second argument of the main handler function.

```
public async Task<string> HandleRequest(Order order, ILambdaContext context) {
    ...
}
```

The context object is an optional input. For more information about valid accepted handler signatures, see [Valid handler signatures for C# functions](#csharp-handler-signatures).

The context object is useful for writing function logs to Amazon CloudWatch. You can use the `context.Logger` property to get an `ILambdaLogger` object for logging. In this example, we can use the logger to log an error message if processing fails for any reason:

```
context.Logger.LogError($"Failed to process order: {ex.Message}");
```

Outside of logging, you can also use the context object for function monitoring. For more information about the context object, see [Using the Lambda context object to retrieve C# function information](csharp-context.md).
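For example, the following sketch logs some of the invocation metadata that `ILambdaContext` exposes. The handler shape follows the main example on this page:

```
public async Task<string> HandleRequest(Order order, ILambdaContext context)
{
    // Invocation and function metadata available on the context object
    context.Logger.LogInformation($"Request ID: {context.AwsRequestId}");
    context.Logger.LogInformation($"Function: {context.FunctionName} (version {context.FunctionVersion})");
    context.Logger.LogInformation($"Memory limit: {context.MemoryLimitInMB} MB");
    context.Logger.LogInformation($"Time remaining: {context.RemainingTime}");

    // ... main application logic ...
    return "Success";
}
```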

## Using the SDK for .NET v3 in your handler


Often, you'll use Lambda functions to interact with or make updates to other AWS resources. The simplest way to interface with these resources is to use the SDK for .NET v3.

**Note**  
The SDK for .NET (v2) is deprecated. We recommend that you use only the SDK for .NET v3.

You can add SDK dependencies to your project using the following .NET CLI command:

```
dotnet add package <package_name>
```

For example, in the main example on this page, we need to use the Amazon S3 API to upload a receipt to S3. We can add the Amazon S3 SDK dependency with the following command:

```
dotnet add package AWSSDK.S3
```

This command adds the dependency to your project. You should also see a line similar to the following in your project's `.csproj` file:

```
<PackageReference Include="AWSSDK.S3" Version="3.7.2.18" />
```

Then, import the dependencies directly in your C# code:

```
using Amazon.S3;
using Amazon.S3.Model;
```

The example code then initializes an Amazon S3 client (using the [default credential provider chain](https://docs.aws.amazon.com/sdkref/latest/guide/standardized-credentials.html)) as follows:

```
private static readonly AmazonS3Client s3Client = new();
```

In this example, we initialized our Amazon S3 client outside of the main handler function to avoid having to initialize it every time we invoke our function. After you initialize your SDK client, you can then use it to interact with other AWS services. The example code calls the Amazon S3 `PutObject` API as follows:

```
var putRequest = new PutObjectRequest
{
    BucketName = bucketName,
    Key = key,
    ContentBody = receiptContent,
    ContentType = "text/plain"
};

await s3Client.PutObjectAsync(putRequest);
```

## Accessing environment variables


In your handler code, you can reference any [environment variables](configuration-envvars.md) by using the `System.Environment.GetEnvironmentVariable` method. In this example, we reference the defined `RECEIPT_BUCKET` environment variable using the following lines of code:

```
string? bucketName = Environment.GetEnvironmentVariable("RECEIPT_BUCKET");
if (string.IsNullOrWhiteSpace(bucketName))
{
    throw new ArgumentException("RECEIPT_BUCKET environment variable is not set");
}
```

## Using global state


Lambda runs your static code and the class constructor during the [initialization phase](lambda-runtime-environment.md#runtimes-lifecycle-ib) before invoking your function for the first time. Resources created during initialization stay in memory between invocations, so you can avoid having to create them every time you invoke your function.

In the example code, the S3 client initialization code is outside the main handler method. The runtime initializes the client before the function handles its first event, which can lead to longer processing times for that first event. Subsequent events are much faster because Lambda doesn't need to initialize the client again.

## Simplify function code with the Lambda Annotations framework


[Lambda Annotations](https://www.nuget.org/packages/Amazon.Lambda.Annotations) is a framework for .NET that simplifies writing Lambda functions using C#. The Annotations framework uses [source generators](https://learn.microsoft.com/en-us/dotnet/csharp/roslyn-sdk/source-generators-overview) to generate code that translates from the Lambda programming model to the simplified code. With the Annotations framework, you can replace much of the code in a Lambda function written using the regular programming model. Code written using the framework uses simpler expressions that allow you to focus on your business logic. See [Amazon.Lambda.Annotations](https://www.nuget.org/packages/Amazon.Lambda.Annotations) in the NuGet documentation for examples.

For an example of a full application utilizing Lambda Annotations, see the [PhotoAssetManager](https://github.com/awsdocs/aws-doc-sdk-examples/tree/main/dotnetv3/cross-service/PhotoAssetManager) example in the `awsdocs/aws-doc-sdk-examples` GitHub repository. The main `Function.cs` file in the `PamApiAnnotations` directory uses Lambda Annotations. For comparison, the `PamApi` directory has equivalent files written using the regular Lambda programming model.

### Dependency injection with Lambda Annotations framework


You can also use the Lambda Annotations framework to add dependency injection to your Lambda functions using syntax you're already familiar with. When you add a `[LambdaStartup]` attribute to a `Startup.cs` file, the Lambda Annotations framework generates the required code at compile time.

```
[LambdaStartup]
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddSingleton<IDatabaseRepository, DatabaseRepository>();
    }
}
```

Your Lambda function can inject services using either constructor injection or by injecting into individual methods using the `[FromServices]` attribute.

```
[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace GetProductHandler;

public class Function
{
    private readonly IDatabaseRepository _repo;
    
    public Function(IDatabaseRepository repo)
    {
        this._repo = repo;
    }
    
    [LambdaFunction]
    [HttpApi(LambdaHttpMethod.Get, "/product/{id}")]
    public async Task<Product> FunctionHandler([FromServices] IDatabaseRepository repository, string id)
    {
        return await repository.GetById(id);
    }
}
```

## Code best practices for C# Lambda functions


Adhere to the guidelines in the following list to use best coding practices when building your Lambda functions:
+ **Separate the Lambda handler from your core logic.** This allows you to make a more unit-testable function.
+ **Control the dependencies in your function's deployment package. ** The AWS Lambda execution environment contains a number of libraries. To enable the latest set of features and security updates, Lambda will periodically update these libraries. These updates may introduce subtle changes to the behavior of your Lambda function. To have full control of the dependencies your function uses, package all of your dependencies with your deployment package. 
+ **Minimize the complexity of your dependencies.** Prefer simpler frameworks that load quickly on [execution environment](lambda-runtime-environment.md) startup.
+ **Minimize your deployment package size to its runtime necessities. ** This will reduce the amount of time that it takes for your deployment package to be downloaded and unpacked ahead of invocation. For functions authored in .NET, avoid uploading the entire AWS SDK library as part of your deployment package. Instead, selectively depend on the modules which pick up components of the SDK you need (e.g. DynamoDB, Amazon S3 SDK modules and Lambda core libraries). 

**Take advantage of execution environment reuse to improve the performance of your function.** Initialize SDK clients and database connections outside of the function handler, and cache static assets locally in the `/tmp` directory. Subsequent invocations processed by the same instance of your function can reuse these resources. This saves cost by reducing function run time.

To avoid potential data leaks across invocations, don’t use the execution environment to store user data, events, or other information with security implications. If your function relies on a mutable state that can’t be stored in memory within the handler, consider creating a separate function or separate versions of a function for each user.

**Use a keep-alive directive to maintain persistent connections.** Lambda purges idle connections over time. Attempting to reuse an idle connection when invoking a function will result in a connection error. To maintain your persistent connection, use the keep-alive directive associated with your runtime. For an example, see [Reusing Connections with Keep-Alive in Node.js](https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/node-reusing-connections.html).

**Use [environment variables](configuration-envvars.md) to pass operational parameters to your function.** For example, if you are writing to an Amazon S3 bucket, instead of hard-coding the bucket name you are writing to, configure the bucket name as an environment variable.

**Avoid using recursive invocations** in your Lambda function, where the function invokes itself or initiates a process that may invoke the function again. This could lead to unintended volume of function invocations and escalated costs. If you see an unintended volume of invocations, set the function reserved concurrency to `0` immediately to throttle all invocations to the function, while you update the code.
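For example, you can apply this emergency throttle with the AWS CLI (substitute your own function name):

```
aws lambda put-function-concurrency \
    --function-name my-function \
    --reserved-concurrent-executions 0
```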

**Do not use non-documented, non-public APIs** in your Lambda function code. For AWS Lambda managed runtimes, Lambda periodically applies security and functional updates to Lambda's internal APIs. These internal API updates may be backwards-incompatible, leading to unintended consequences such as invocation failures if your function has a dependency on these non-public APIs. See [the API reference](https://docs.aws.amazon.com/lambda/latest/api/welcome.html) for a list of publicly available APIs.

**Write idempotent code.** Writing idempotent code for your functions ensures that duplicate events are handled the same way. Your code should properly validate events and gracefully handle duplicate events. For more information, see [How do I make my Lambda function idempotent?](https://aws.amazon.com/premiumsupport/knowledge-center/lambda-function-idempotent/).
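A minimal C# sketch of one idempotency technique, deduplicating on an event identifier (the in-memory set is for illustration only; a real function would use a durable store such as DynamoDB, and all names here are illustrative):

```csharp
using System.Collections.Generic;

public class OrderProcessor
{
    // Tracks event IDs that have already been handled. In production this
    // state must live in a durable store, not in process memory.
    private readonly HashSet<string> _processed = new HashSet<string>();

    public int HandledCount { get; private set; }

    public string Handle(string eventId)
    {
        // A duplicate delivery returns the same outcome without re-running
        // the side effect, which is what makes the handler idempotent.
        if (!_processed.Add(eventId))
        {
            return "already-processed";
        }
        HandledCount++; // stand-in for the real side effect
        return "processed";
    }
}
```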

# Build and deploy C# Lambda functions with .zip file archives
Deployment package

A .NET deployment package (.zip file archive) contains your function's compiled assembly along with all of its assembly dependencies. The package also contains a `proj.deps.json` file, which signals to the .NET runtime all of your function's dependencies, and a `proj.runtimeconfig.json` file, which is used to configure the runtime.

To deploy individual Lambda functions, you can use the `Amazon.Lambda.Tools` .NET Lambda Global CLI. Using the `dotnet lambda deploy-function` command automatically creates a .zip deployment package and deploys it to Lambda. However, we recommend that you use frameworks like the AWS Serverless Application Model (AWS SAM) or the AWS Cloud Development Kit (AWS CDK) to deploy your .NET applications to AWS.

Serverless applications usually comprise a combination of Lambda functions and other managed AWS services working together to perform a particular business task. AWS SAM and AWS CDK simplify building and deploying Lambda functions with other AWS services at scale. The [AWS SAM template specification](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-specification.html) provides a simple and clean syntax to describe Lambda functions, APIs, permissions, configurations, and other AWS resources that make up your serverless application. With the [AWS CDK](https://docs.aws.amazon.com/cdk/v2/guide/home.html) you define cloud infrastructure as code to help you build reliable, scalable, cost-effective applications in the cloud using modern programming languages and frameworks like .NET. Both the AWS CDK and the AWS SAM use the .NET Lambda Global CLI to package your functions.

While it's possible to use [Lambda layers](chapter-layers.md) with functions in C# by [using the .NET Core CLI](csharp-package-cli.md#csharp-layers), we recommend against it. Functions in C# that use layers manually load the shared assemblies into memory during the [Init phase](lambda-runtime-environment.md#runtimes-lifecycle-ib), which can increase cold start times. Instead, include all shared code at compile time to avoid the performance impact of loading assemblies at runtime.

You can find instructions for building and deploying .NET Lambda functions using the AWS SAM, the AWS CDK, and the .NET Lambda Global CLI in the following sections.

**Topics**
+ [Using the .NET Lambda Global CLI](csharp-package-cli.md)
+ [Deploy C# Lambda functions using AWS SAM](csharp-package-sam.md)
+ [Deploy C# Lambda functions using AWS CDK](csharp-package-cdk.md)
+ [Deploy ASP.NET applications](csharp-package-asp.md)

# Using the .NET Lambda Global CLI
.NET Lambda Global CLI

The .NET CLI and the .NET Lambda Global Tools extension (`Amazon.Lambda.Tools`) offer a cross-platform way to create .NET-based Lambda applications, package them, and deploy them to Lambda. In this section, you learn how to create new Lambda .NET projects using the .NET CLI and Amazon Lambda templates, and to package and deploy them using `Amazon.Lambda.Tools`.

**Topics**
+ [Prerequisites](#csharp-package-cli-prerequisites)
+ [Creating .NET projects using the .NET CLI](#csharp-package-cli-create)
+ [Deploying .NET projects using the .NET CLI](#csharp-package-cli-deploy)
+ [Using Lambda layers with the .NET CLI](#csharp-layers)

## Prerequisites


**.NET 8 SDK**  
If you haven't already done so, install the [.NET 8](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) SDK and Runtime.

**AWS Amazon.Lambda.Templates .NET project templates**  
To generate your Lambda function code, use the [Amazon.Lambda.Templates](https://www.nuget.org/packages/Amazon.Lambda.Templates) NuGet package. To install this template package, run the following command:  

```
dotnet new install Amazon.Lambda.Templates
```

**AWS Amazon.Lambda.Tools .NET Global CLI tools**  
To create your Lambda functions, you use the [Amazon.Lambda.Tools](https://www.nuget.org/packages/Amazon.Lambda.Tools) [.NET Global Tools extension](https://aws.amazon.com/blogs/developer/net-core-global-tools-for-aws/). To install Amazon.Lambda.Tools, run the following command:  

```
dotnet tool install -g Amazon.Lambda.Tools
```
For more information about the Amazon.Lambda.Tools .NET CLI extension, see the [AWS Extensions for .NET CLI](https://github.com/aws/aws-extensions-for-dotnet-cli) repository on GitHub.

## Creating .NET projects using the .NET CLI


In the .NET CLI, you use the `dotnet new` command to create .NET projects from the command line. Lambda offers additional templates using the [Amazon.Lambda.Templates](https://www.nuget.org/packages/Amazon.Lambda.Templates) NuGet package.

After installing this package, run the following command to see a list of the available templates.

```
dotnet new list
```

To examine details about a template, use the `help` option. For example, to see details about the `lambda.EmptyFunction` template, run the following command.

```
dotnet new lambda.EmptyFunction --help
```

To create a basic template for a .NET Lambda function, use the `lambda.EmptyFunction` template. This creates a simple function that takes a string as input and converts it to upper case using the `ToUpper` method. This template supports the following options: 
+ `--name` – The name of the function.
+ `--region` – The AWS Region to create the function in.
+ `--profile` – The name of a profile in your AWS SDK for .NET credentials file. To learn more about credential profiles in .NET, see [Configure AWS credentials](https://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/net-dg-config-creds.html) in the *AWS SDK for .NET Developer Guide*.

In this example, we create a new empty function named `myDotnetFunction` using the default profile and AWS Region settings:

```
dotnet new lambda.EmptyFunction --name myDotnetFunction
```

This command creates the following files and directories in your project directory.

```
└── myDotnetFunction
    ├── src
    │   └── myDotnetFunction
    │       ├── Function.cs
    │       ├── Readme.md
    │       ├── aws-lambda-tools-defaults.json
    │       └── myDotnetFunction.csproj
    └── test
        └── myDotnetFunction.Tests
            ├── FunctionTest.cs
            └── myDotnetFunction.Tests.csproj
```

Under the `src/myDotnetFunction` directory, examine the following files:
+ **aws-lambda-tools-defaults.json**: This is where you specify the command line options when deploying your Lambda function. For example:

  ```
    "profile" : "default",
    "region" : "us-east-2",
    "configuration" : "Release",
    "function-architecture": "x86_64",
    "function-runtime":"dotnet8",
    "function-memory-size" : 256,
    "function-timeout" : 30,
    "function-handler" : "myDotnetFunction::myDotnetFunction.Function::FunctionHandler"
  ```
+ **Function.cs**: Your Lambda handler function code. It's a C# template that includes the default `Amazon.Lambda.Core` library and a default `LambdaSerializer` attribute. For more information on serialization requirements and options, see [Serialization in C# Lambda functions](csharp-handler.md#csharp-handler-serializer). It also includes a sample function that you can edit to apply your Lambda function code.

  ```
  using Amazon.Lambda.Core;
  
  // Assembly attribute to enable the Lambda function's JSON input to be converted into a .NET class.
  [assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]
  
  namespace myDotnetFunction;
  
  public class Function
  {
  
      /// <summary>
      /// A simple function that takes a string and does a ToUpper
      /// </summary>
      /// <param name="input"></param>
      /// <param name="context"></param>
      /// <returns></returns>
      public string FunctionHandler(string input, ILambdaContext context)
      {
          return input.ToUpper();
      }
  }
  ```
+ **myDotnetFunction.csproj**: An [MSBuild](https://msdn.microsoft.com/en-us/library/dd393574.aspx) file that lists the files and assemblies that comprise your application.

  ```
  <Project Sdk="Microsoft.NET.Sdk">
    <PropertyGroup>
      <TargetFramework>net8.0</TargetFramework>
      <ImplicitUsings>enable</ImplicitUsings>
      <Nullable>enable</Nullable>
      <GenerateRuntimeConfigurationFiles>true</GenerateRuntimeConfigurationFiles>
      <AWSProjectType>Lambda</AWSProjectType>
      <!-- This property makes the build directory similar to a publish directory and helps the AWS .NET Lambda Mock Test Tool find project dependencies. -->
      <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
      <!-- Generate ready to run images during publishing to improve cold start time. -->
      <PublishReadyToRun>true</PublishReadyToRun>
    </PropertyGroup>
    <ItemGroup>
      <PackageReference Include="Amazon.Lambda.Core" Version="2.2.0" />
      <PackageReference Include="Amazon.Lambda.Serialization.SystemTextJson" Version="2.4.0" />
    </ItemGroup>
  </Project>
  ```
+ **Readme**: Use this file to document your Lambda function.

Under the `myDotnetFunction/test` directory, examine the following files:
+ **myDotnetFunction.Tests.csproj**: As noted previously, this is an [MSBuild](https://msdn.microsoft.com/en-us/library/dd393574.aspx) file that lists the files and assemblies that comprise your test project. Note also that it includes the `Amazon.Lambda.Core` library, so you can seamlessly integrate any Lambda templates required to test your function.

  ```
  <Project Sdk="Microsoft.NET.Sdk">
     ... 
  
      <PackageReference Include="Amazon.Lambda.Core" Version="2.2.0" />
     ...
  ```
+ **FunctionTest.cs**: The same C# code template file that is included in the `src` directory. Edit this file to mirror your function's production code and test it before uploading your Lambda function to a production environment.

  ```
  using Xunit;
  using Amazon.Lambda.Core;
  using Amazon.Lambda.TestUtilities;
  
  using MyFunction;
  
  namespace MyFunction.Tests
  {
      public class FunctionTest
      {
          [Fact]
          public void TestToUpperFunction()
          {
  
              // Invoke the lambda function and confirm the string was upper cased.
              var function = new Function();
              var context = new TestLambdaContext();
              var upperCase = function.FunctionHandler("hello world", context);
  
              Assert.Equal("HELLO WORLD", upperCase);
          }
      }
  }
  ```

## Deploying .NET projects using the .NET CLI


To build your deployment package and deploy it to Lambda, you use the `Amazon.Lambda.Tools` CLI tools. To deploy your function from the files you created in the previous steps, first navigate into the folder containing your function's `.csproj` file.

```
cd myDotnetFunction/src/myDotnetFunction
```

To deploy your code to Lambda as a .zip deployment package, run the following command. Choose your own function name.

```
dotnet lambda deploy-function myDotnetFunction
```

During the deployment, the wizard asks you to select an execution role. For more information, see [Defining Lambda function permissions with an execution role](lambda-intro-execution-role.md). For this example, select the `lambda_basic_role`.

After you have deployed your function, you can test it in the cloud using the `dotnet lambda invoke-function` command. For the example code in the `lambda.EmptyFunction` template, you can test your function by passing in a string using the `--payload` option.

```
dotnet lambda invoke-function myDotnetFunction --payload "Just checking if everything is OK"
```

If your function has been successfully deployed, you should see output similar to the following.

```
dotnet lambda invoke-function myDotnetFunction --payload "Just checking if everything is OK"
Amazon Lambda Tools for .NET Core applications (5.8.0)
Project Home: https://github.com/aws/aws-extensions-for-dotnet-cli, https://github.com/aws/aws-lambda-dotnet

Payload:
"JUST CHECKING IF EVERYTHING IS OK"

Log Tail:
START RequestId: id Version: $LATEST
END RequestId: id
REPORT RequestId: id  Duration: 0.99 ms       Billed Duration: 1 ms         Memory Size: 256 MB     Max Memory Used: 12 MB
```

## Using Lambda layers with the .NET CLI


**Note**  
While it's possible to use [layers](chapter-layers.md) with functions in .NET, we recommend against it. Functions in .NET that use layers manually load the shared assemblies into memory during the `Init` phase, which can increase cold start times. Instead, include all shared code at compile time to take advantage of the built-in optimizations of the .NET compiler.

The .NET CLI supports commands to help you publish layers and deploy C# functions that consume layers. To publish a layer to a specified Amazon S3 bucket, run the following command in the same directory as your `.csproj` file:

```
dotnet lambda publish-layer <layer_name> --layer-type runtime-package-store --s3-bucket <s3_bucket_name>
```

Then, when you deploy your function using the .NET CLI, specify the ARN of the layer to consume in the following command:

```
dotnet lambda deploy-function <function_name> --function-layers arn:aws:lambda:us-east-1:123456789012:layer:layer-name:1
```

For a complete example of a Hello World function, see the [blank-csharp-with-layer](https://github.com/awsdocs/aws-lambda-developer-guide/tree/main/sample-apps/blank-csharp-with-layer) sample.

# Deploy C# Lambda functions using AWS SAM
AWS SAM

The AWS Serverless Application Model (AWS SAM) is a toolkit that helps streamline the process of building and running serverless applications on AWS. You define the resources for your application in a YAML or JSON template and use the AWS SAM command line interface (AWS SAM CLI) to build, package, and deploy your applications. When you build a Lambda function from an AWS SAM template, AWS SAM automatically creates a .zip deployment package or container image with your function code and any dependencies you specify. AWS SAM then deploys your function using a [CloudFormation stack](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/stacks.html). To learn more about using AWS SAM to build and deploy Lambda functions, see [Getting started with AWS SAM](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-getting-started.html) in the *AWS Serverless Application Model Developer Guide*.

The following steps show you how to download, build, and deploy a sample .NET Hello World application using AWS SAM. This sample application uses a Lambda function and an Amazon API Gateway endpoint to implement a basic API backend. When you send an HTTP GET request to your API Gateway endpoint, API Gateway invokes your Lambda function. The function returns a "hello world" message, along with the IP address of the Lambda function instance that processes your request.

When you build and deploy your application using AWS SAM, behind the scenes the AWS SAM CLI uses the `dotnet lambda package` command to package the individual Lambda function code bundles.

## Prerequisites


**.NET 8 SDK**  
Install the [.NET 8](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) SDK and Runtime.

**AWS SAM CLI version 1.39 or later**  
To learn how to install the latest version of the AWS SAM CLI, see [Installing the AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/install-sam-cli.html).

## Deploy a sample AWS SAM application


1. Initialize the application with the Hello World .NET template by running the following command.

   ```
   sam init --app-template hello-world --name sam-app \
   --package-type Zip --runtime dotnet8
   ```

   This command creates the following files and directories in your project directory.

   ```
   └── sam-app
       ├── README.md
       ├── events
       │   └── event.json
       ├── omnisharp.json
       ├── samconfig.toml
       ├── src
       │   └── HelloWorld
       │       ├── Function.cs
       │       ├── HelloWorld.csproj
       │       └── aws-lambda-tools-defaults.json
       ├── template.yaml
       └── test
           └── HelloWorld.Test
               ├── FunctionTest.cs
               └── HelloWorld.Tests.csproj
   ```

1. Navigate into the directory containing the `template.yaml` file. This file is a template that defines the AWS resources for your application, including your Lambda function and an API Gateway API.

   ```
   cd sam-app
   ```

1. To build the source of your application, run the following command.

   ```
   sam build
   ```

1. To deploy your application to AWS, run the following command.

   ```
   sam deploy --guided
   ```

   This command packages and deploys your application with the following series of prompts. To accept the default options, press Enter.
**Note**  
For **HelloWorldFunction may not have authorization defined, is this okay?**, be sure to enter `y`.
   + **Stack Name**: The name of the stack to deploy to CloudFormation. This name must be unique to your AWS account and AWS Region.
   + **AWS Region**: The AWS Region you want to deploy your app to.
   + **Confirm changes before deploy**: Select yes to manually review any change sets before AWS SAM deploys application changes. If you select no, the AWS SAM CLI automatically deploys application changes.
   + **Allow SAM CLI IAM role creation**: Many AWS SAM templates, including the Hello world one in this example, create AWS Identity and Access Management (IAM) roles to give your Lambda functions permission to access other AWS services. Select Yes to provide permission to deploy a CloudFormation stack that creates or modifies IAM roles.
   + **Disable rollback**: By default, if AWS SAM encounters an error during creation or deployment of your stack, it rolls the stack back to the previous version. Select No to accept this default.
   + **HelloWorldFunction may not have authorization defined, is this okay**: Enter `y`.
   + **Save arguments to samconfig.toml**: Select yes to save your configuration choices. In the future, you can re-run `sam deploy` without parameters to deploy changes to your application.

1. When the deployment of your application is complete, the CLI returns the Amazon Resource Name (ARN) of the Hello World Lambda function and the IAM role created for it. It also displays the endpoint of your API Gateway API. To test your application, open the endpoint in a browser. You should see a response similar to the following.

   ```
   {"message":"hello world","location":"34.244.135.203"}
   ```

1. To delete your resources, run the following command. Note that the API endpoint you created is a public endpoint accessible over the internet. We recommend that you delete this endpoint after testing.

   ```
   sam delete
   ```
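If you chose to save your configuration choices during the guided deployment, AWS SAM writes them to `samconfig.toml`. The result looks similar to the following (values shown are illustrative):

```
version = 0.1
[default.deploy.parameters]
stack_name = "sam-app"
resolve_s3 = true
s3_prefix = "sam-app"
region = "us-east-1"
capabilities = "CAPABILITY_IAM"
```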

## Next steps


To learn more about using AWS SAM to build and deploy Lambda functions using .NET, see the following resources:
+ The [AWS Serverless Application Model Developer Guide](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/what-is-sam.html)
+ [Building Serverless .NET Applications with AWS Lambda and the SAM CLI](https://aws.amazon.com/blogs/dotnet/building-serverless-net-applications-with-aws-lambda-and-the-sam-cli/)

# Deploy C# Lambda functions using AWS CDK
AWS CDK

The AWS Cloud Development Kit (AWS CDK) is an open-source software development framework for defining cloud infrastructure as code with modern programming languages and frameworks like .NET. AWS CDK projects are executed to generate CloudFormation templates which are then used to deploy your code.

To build and deploy an example Hello world .NET application using the AWS CDK, follow the instructions in the following sections. The sample application implements a basic API backend consisting of an API Gateway endpoint and a Lambda function. API Gateway invokes the Lambda function when you send an HTTP GET request to the endpoint. The function returns a Hello world message, along with the IP address of the Lambda instance that processes your request.

## Prerequisites


**.NET 8 SDK**  
Install the [.NET 8](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) SDK and Runtime.

**AWS CDK version 2**  
To learn how to install the latest version of the AWS CDK see [Getting started with the AWS CDK](https://docs.aws.amazon.com/cdk/v2/guide/getting_started.html) in the *AWS Cloud Development Kit (AWS CDK) v2 Developer Guide*.

## Deploy a sample AWS CDK application


1. Create a project directory for the sample application and navigate into it.

   ```
   mkdir hello-world
   cd hello-world
   ```

1. Initialize a new AWS CDK application by running the following command.

   ```
   cdk init app --language csharp
   ```

   The command creates the following files and directories in your project directory.

   ```
   ├── README.md
   ├── cdk.json
   └── src
       ├── HelloWorld
       │   ├── GlobalSuppressions.cs
       │   ├── HelloWorld.csproj
       │   ├── HelloWorldStack.cs
       │   └── Program.cs
       └── HelloWorld.sln
   ```

1. Open the `src` directory and create a new Lambda function using the .NET CLI. This is the function you will deploy using the AWS CDK. In this example, you create a Hello world function named `HelloWorldLambda` using the `lambda.EmptyFunction` template.

   ```
   cd src
   dotnet new lambda.EmptyFunction -n HelloWorldLambda
   ```

   After this step, your directory structure inside your project directory should look like the following.

   ```
   ├── README.md
   ├── cdk.json
   └── src
       ├── HelloWorld
       │   ├── GlobalSuppressions.cs
       │   ├── HelloWorld.csproj
       │   ├── HelloWorldStack.cs
       │   └── Program.cs
       ├── HelloWorld.sln
       └── HelloWorldLambda
           ├── src
           │   └── HelloWorldLambda
           │       ├── Function.cs
           │       ├── HelloWorldLambda.csproj
           │       ├── Readme.md
           │       └── aws-lambda-tools-defaults.json
           └── test
               └── HelloWorldLambda.Tests
                   ├── FunctionTest.cs
                   └── HelloWorldLambda.Tests.csproj
   ```

1. Open the `HelloWorldStack.cs` file from the `src/HelloWorld` directory. Replace the contents of the file with the following code.

   ```
   using Amazon.CDK;
   using Amazon.CDK.AWS.Lambda;
   using Amazon.CDK.AWS.Logs;
   using Constructs;
   
   namespace CdkTest
   {
       public class HelloWorldStack : Stack
       {
           internal HelloWorldStack(Construct scope, string id, IStackProps props = null) : base(scope, id, props)
           {
               var buildOption = new BundlingOptions()
               {
                   Image = Runtime.DOTNET_8.BundlingImage,
                   User = "root",
                   OutputType = BundlingOutput.ARCHIVED,
                    Command = new string[]{
                        "/bin/sh",
                        "-c",
                        " dotnet tool install -g Amazon.Lambda.Tools" +
                        " && dotnet build" +
                        " && dotnet lambda package --output-package /asset-output/function.zip"
                    }
                };
   
                var helloWorldLambdaFunction = new Function(this, "HelloWorldFunction", new FunctionProps
               {
                   Runtime = Runtime.DOTNET_8,
                   MemorySize = 1024,
                   LogRetention = RetentionDays.ONE_DAY,
                   Handler = "HelloWorldLambda::HelloWorldLambda.Function::FunctionHandler",
                   Code = Code.FromAsset("./src/HelloWorldLambda/src/HelloWorldLambda", new Amazon.CDK.AWS.S3.Assets.AssetOptions
                   {
                       Bundling = buildOption
                   }),
               });
           }
       }
   }
   ```

   This is the code to compile and bundle the application code, as well as the definition of the Lambda function itself. The `BundlingOptions` object allows a .zip file to be created, along with a set of commands that are used to generate the contents of the .zip file. In this instance, the `dotnet lambda package` command is used to compile the function and generate the .zip file.

1. To deploy your application, run the following command.

   ```
   cdk deploy
   ```

1. Invoke your deployed Lambda function using the .NET Lambda CLI.

   ```
   dotnet lambda invoke-function HelloWorldFunction -p "hello world"
   ```

1. After you've finished testing, you can delete the resources you created, unless you want to retain them. Run the following command to delete your resources.

   ```
   cdk destroy
   ```

## Next steps


To learn more about using AWS CDK to build and deploy Lambda functions using .NET, see the following resources:
+ [Working with the AWS CDK in C#](https://docs.aws.amazon.com/cdk/v2/guide/work-with-cdk-csharp.html)
+ [Build, package, and publish .NET C# Lambda functions with the AWS CDK](https://aws.amazon.com/blogs/modernizing-with-aws/build-package-publish-dotnet-csharp-lambda-functions-aws-cdk/)

# Deploy ASP.NET applications
ASP.NET

As well as hosting event-driven functions, you can also use .NET with Lambda to host lightweight ASP.NET applications. You can build and deploy ASP.NET applications using the `Amazon.Lambda.AspNetCoreServer` NuGet package. In this section, you learn how to deploy an ASP.NET web API to Lambda using the .NET Lambda CLI tooling.

**Topics**
+ [Prerequisites](#csharp-package-asp-prerequisites)
+ [Deploying an ASP.NET Web API to Lambda](#csharp-package-asp-deploy-api)
+ [Deploying ASP.NET minimal APIs to Lambda](#csharp-package-asp-deploy-minimal)

## Prerequisites


**.NET 8 SDK**  
Install the [.NET 8](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) SDK and ASP.NET Core Runtime.

**Amazon.Lambda.Tools**  
To create your Lambda functions, you use the [Amazon.Lambda.Tools](https://www.nuget.org/packages/Amazon.Lambda.Tools) [.NET Global Tools extension](https://aws.amazon.com/blogs/developer/net-core-global-tools-for-aws/). To install Amazon.Lambda.Tools, run the following command:  

```
dotnet tool install -g Amazon.Lambda.Tools
```
For more information about the Amazon.Lambda.Tools .NET CLI extension, see the [AWS Extensions for .NET CLI](https://github.com/aws/aws-extensions-for-dotnet-cli) repository on GitHub.

**Amazon.Lambda.Templates**  
To generate your Lambda function code, use the [Amazon.Lambda.Templates](https://www.nuget.org/packages/Amazon.Lambda.Templates) NuGet package. To install this template package, run the following command:  

```
dotnet new install Amazon.Lambda.Templates
```

## Deploying an ASP.NET Web API to Lambda


To deploy a web API using ASP.NET, you can use the .NET Lambda templates to create a new web API project. Use the following command to initialize a new ASP.NET web API project. In the example command, we name the project `AspNetOnLambda`.

```
dotnet new serverless.AspNetCoreWebAPI -n AspNetOnLambda
```

This command creates the following files and directories in your project directory.

```
.
└── AspNetOnLambda
    ├── src
    │   └── AspNetOnLambda
    │       ├── AspNetOnLambda.csproj
    │       ├── Controllers
    │       │   └── ValuesController.cs
    │       ├── LambdaEntryPoint.cs
    │       ├── LocalEntryPoint.cs
    │       ├── Readme.md
    │       ├── Startup.cs
    │       ├── appsettings.Development.json
    │       ├── appsettings.json
    │       ├── aws-lambda-tools-defaults.json
    │       └── serverless.template
    └── test
        └── AspNetOnLambda.Tests
            ├── AspNetOnLambda.Tests.csproj
            ├── SampleRequests
            │   └── ValuesController-Get.json
            ├── ValuesControllerTests.cs
            └── appsettings.json
```

When Lambda invokes your function, the entry point it uses is the `LambdaEntryPoint.cs` file. The file created by the .NET Lambda template contains the following code.

```
namespace AspNetOnLambda;

public class LambdaEntryPoint : Amazon.Lambda.AspNetCoreServer.APIGatewayProxyFunction
{
    protected override void Init(IWebHostBuilder builder)
    {
        builder
            .UseStartup<Startup>();
    }

    protected override void Init(IHostBuilder builder)
    {
    }
}
```

The entry point used by Lambda must inherit from one of the three base classes in the `Amazon.Lambda.AspNetCoreServer` package. These three base classes are:
+ `APIGatewayProxyFunction`
+ `APIGatewayHttpApiV2ProxyFunction`
+ `ApplicationLoadBalancerFunction`

The default class used when you create your `LambdaEntryPoint.cs` file using the provided .NET Lambda template is `APIGatewayProxyFunction`. The base class you use in your function depends on which API layer sits in front of your Lambda function.

Each of the three base classes contains a public method named `FunctionHandlerAsync`. The name of this method will form part of the [handler string](csharp-handler.md#csharp-class-library-handlers) Lambda uses to invoke your function. The `FunctionHandlerAsync` method transforms the inbound event payload into the correct ASP.NET format and the ASP.NET response back to a Lambda response payload. For the example `AspNetOnLambda` project shown, the handler string would be as follows.

```
AspNetOnLambda::AspNetOnLambda.LambdaEntryPoint::FunctionHandlerAsync
```
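For comparison, if an API Gateway HTTP API (payload format version 2.0) fronts your function instead, the entry point inherits from `APIGatewayHttpApiV2ProxyFunction`. The following is a sketch based on the generated template:

```
namespace AspNetOnLambda;

// Only the base class changes; the handler string keeps the same
// Assembly::Namespace.Class::Method shape, ending in FunctionHandlerAsync.
public class LambdaEntryPoint : Amazon.Lambda.AspNetCoreServer.APIGatewayHttpApiV2ProxyFunction
{
    protected override void Init(IWebHostBuilder builder)
    {
        builder.UseStartup<Startup>();
    }
}
```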

To deploy the API to Lambda, run the following commands to navigate into the directory containing your source code file and deploy your function using CloudFormation.

```
cd AspNetOnLambda/src/AspNetOnLambda
dotnet lambda deploy-serverless
```

**Tip**  
When you deploy an API using the `dotnet lambda deploy-serverless` command, CloudFormation gives your Lambda function a name based on the stack name you specify during the deployment. To give your Lambda function a custom name, edit the `serverless.template` file to add a `FunctionName` property to the `AWS::Serverless::Function` resource. See [Name type](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-name.html) in the *CloudFormation User Guide* to learn more.
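For example, the relevant part of `serverless.template` might look like the following after the edit (the logical resource ID and function name are illustrative; other properties in your template stay as generated):

```
"Resources": {
  "AspNetCoreFunction": {
    "Type": "AWS::Serverless::Function",
    "Properties": {
      "FunctionName": "my-aspnet-api",
      "Handler": "AspNetOnLambda::AspNetOnLambda.LambdaEntryPoint::FunctionHandlerAsync",
      "Runtime": "dotnet8"
    }
  }
}
```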

## Deploying ASP.NET minimal APIs to Lambda


To deploy an ASP.NET minimal API to Lambda, you can use the .NET Lambda templates to create a new minimal API project. Use the following command to initialize a new minimal API project. In this example, we name the project `MinimalApiOnLambda`.

```
dotnet new serverless.AspNetCoreMinimalAPI -n MinimalApiOnLambda
```

The command creates the following files and directories in your project directory.

```
└── MinimalApiOnLambda
    └── src
        └── MinimalApiOnLambda
            ├── Controllers
            │   └── CalculatorController.cs
            ├── MinimalApiOnLambda.csproj
            ├── Program.cs
            ├── Readme.md
            ├── appsettings.Development.json
            ├── appsettings.json
            ├── aws-lambda-tools-defaults.json
            └── serverless.template
```

The `Program.cs` file contains the following code.

```
var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddControllers();

// Add AWS Lambda support. When application is run in Lambda Kestrel is swapped out as the web server with Amazon.Lambda.AspNetCoreServer. This
// package will act as the webserver translating request and responses between the Lambda event source and ASP.NET Core.
builder.Services.AddAWSLambdaHosting(LambdaEventSource.RestApi);

var app = builder.Build();


app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();

app.MapGet("/", () => "Welcome to running ASP.NET Core Minimal API on AWS Lambda");

app.Run();
```

To configure your minimal API to run on Lambda, you may need to edit this code so that requests and responses between Lambda and ASP.NET Core are properly translated. By default, the function is configured for a REST API event source. For an HTTP API or application load balancer, replace `(LambdaEventSource.RestApi)` with one of the following options:
+ `(LambdaEventSource.HttpApi)`
+ `(LambdaEventSource.ApplicationLoadBalancer)`
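
For example, if your function sits behind an HTTP API, the hosting registration in `Program.cs` becomes:

```
// Translate HTTP API events instead of REST API events.
builder.Services.AddAWSLambdaHosting(LambdaEventSource.HttpApi);
```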

To deploy your minimal API to Lambda, run the following commands to navigate into the directory containing your source code file and deploy your function using CloudFormation.

```
cd MinimalApiOnLambda/src/MinimalApiOnLambda
dotnet lambda deploy-serverless
```

# Working with layers for .NET Lambda functions
Layers

We don't recommend using [layers](chapter-layers.md) to manage dependencies for Lambda functions written in .NET. This is because .NET is a compiled language, and your function still has to load any shared assemblies into memory during the [Init](lambda-runtime-environment.md#runtimes-lifecycle-ib) phase, which can increase cold start times. Using layers not only complicates the deployment process, but also prevents you from taking advantage of built-in compiler optimizations.

To use external dependencies with your .NET handlers, include them directly in your deployment package at compile time. By doing so, you simplify the deployment process and also take advantage of built-in .NET compiler optimizations. For an example of how to import and use dependencies like NuGet packages in your function, see [Define Lambda function handler in C#](csharp-handler.md).
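
For example, dependencies declared as `PackageReference` items in your `.csproj` file are compiled into the deployment package automatically. The package names and versions below are illustrative:

```
<ItemGroup>
  <!-- NuGet dependencies are bundled into the deployment package at build time. -->
  <PackageReference Include="Amazon.Lambda.Core" Version="2.2.0" />
  <PackageReference Include="AWSSDK.S3" Version="3.7.305" />
</ItemGroup>
```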

# Deploy .NET Lambda functions with container images
Deploy container images

There are three ways to build a container image for a .NET Lambda function:
+ [Using an AWS base image for .NET](#csharp-image-instructions)

  The [AWS base images](images-create.md#runtimes-images-lp) are preloaded with a language runtime, a runtime interface client to manage the interaction between Lambda and your function code, and a runtime interface emulator for local testing.
+ [Using an AWS OS-only base image](images-create.md#runtimes-images-provided)

  [AWS OS-only base images](https://gallery.ecr.aws/lambda/provided) contain an Amazon Linux distribution and the [runtime interface emulator](https://github.com/aws/aws-lambda-runtime-interface-emulator/). These images are commonly used to create container images for compiled languages, such as [Go](go-image.md#go-image-provided) and [Rust](lambda-rust.md), and for a language or language version that Lambda doesn't provide a base image for, such as Node.js 19. You can also use OS-only base images to implement a [custom runtime](runtimes-custom.md). To make the image compatible with Lambda, you must include the [runtime interface client for .NET](#csharp-image-clients) in the image.
+ [Using a non-AWS base image](#csharp-image-clients)

  You can use an alternative base image from another container registry, such as Alpine Linux or Debian. You can also use a custom image created by your organization. To make the image compatible with Lambda, you must include the [runtime interface client for .NET](#csharp-image-clients) in the image.

**Tip**  
To reduce the time it takes for Lambda container functions to become active, see [Use multi-stage builds](https://docs.docker.com/build/building/multi-stage/) in the Docker documentation. To build efficient container images, follow the [Best practices for writing Dockerfiles](https://docs.docker.com/develop/develop-images/dockerfile_best-practices/).

This page explains how to build, test, and deploy container images for Lambda.

**Topics**
+ [AWS base images for .NET](#csharp-image-base)
+ [Using an AWS base image for .NET](#csharp-image-instructions)
+ [Using an alternative base image with the runtime interface client](#csharp-image-clients)

## AWS base images for .NET


AWS provides the following base images for .NET:


| Tags | Runtime | Operating system | Dockerfile | Deprecation | 
| --- | --- | --- | --- | --- | 
| 10 | .NET 10 | Amazon Linux 2023 | [Dockerfile for .NET 10 on GitHub](https://github.com/aws/aws-lambda-base-images/blob/dotnet10/Dockerfile.dotnet10) |   Nov 14, 2028   | 
| 9 | .NET 9 | Amazon Linux 2023 | [Dockerfile for .NET 9 on GitHub](https://github.com/aws/aws-lambda-base-images/blob/dotnet9/Dockerfile.dotnet9) |   Nov 10, 2026   | 
| 8 | .NET 8 | Amazon Linux 2023 | [Dockerfile for .NET 8 on GitHub](https://github.com/aws/aws-lambda-base-images/blob/dotnet8/Dockerfile.dotnet8) |   Nov 10, 2026   | 

Amazon ECR repository: [gallery.ecr.aws/lambda/dotnet](https://gallery.ecr.aws/lambda/dotnet)

## Using an AWS base image for .NET
Using an AWS base image

### Prerequisites


To complete the steps in this section, you must have the following:
+ [.NET SDK](https://dotnet.microsoft.com/download) – The following steps use the .NET 8 base image. Make sure that your .NET version matches the version of the [base image](https://gallery.ecr.aws/lambda/dotnet) that you specify in your Dockerfile.
+ [Docker](https://docs.docker.com/get-docker) (minimum version 25.0.0)
+ The Docker [buildx plugin](https://github.com/docker/buildx/blob/master/README.md).

### Creating and deploying an image using a base image


In the following steps, you use [Amazon.Lambda.Templates](https://github.com/aws/aws-lambda-dotnet#dotnet-cli-templates) and [Amazon.Lambda.Tools](https://github.com/aws/aws-extensions-for-dotnet-cli#aws-lambda-amazonlambdatools) to create a .NET project. Then, you build a Docker image, upload the image to Amazon ECR, and deploy it to a Lambda function.

1. Install the [Amazon.Lambda.Templates](https://github.com/aws/aws-lambda-dotnet#dotnet-cli-templates) NuGet package.

   ```
   dotnet new install Amazon.Lambda.Templates
   ```

1. Create a .NET project using the `lambda.image.EmptyFunction` template.

   ```
   dotnet new lambda.image.EmptyFunction --name MyFunction --region us-east-1
   ```

   The project files are stored in the `MyFunction/src/MyFunction` directory:
   + **aws-lambda-tools-defaults.json**: Specifies the command line options for deploying your Lambda function.
   + **Function.cs**: Your Lambda handler function code. This is a C# template that includes the default `Amazon.Lambda.Core` library and a default `LambdaSerializer` attribute. For more information about serialization requirements and options, see [Serialization in C# Lambda functions](csharp-handler.md#csharp-handler-serializer). You can use the provided code for testing, or replace it with your own.
   + **MyFunction.csproj**: A .NET [project file](https://learn.microsoft.com/en-us/dotnet/core/project-sdk/overview#project-files), which lists the files and assemblies that comprise your application.
   + **Dockerfile**: You can use the provided Dockerfile for testing, or replace it with your own. If you use your own, make sure to:
     + Set the `FROM` property to the [URI of the base image](https://gallery.ecr.aws/lambda/dotnet). The base image and the `TargetFramework` in the `MyFunction.csproj` file must both use the same .NET version. For example, to use .NET 9:
       + Dockerfile: `FROM public.ecr.aws/lambda/dotnet:9`
       + MyFunction.csproj: `<TargetFramework>net9.0</TargetFramework>`
     + Set the `CMD` argument to the Lambda function handler. This should match the `image-command` in `aws-lambda-tools-defaults.json`.
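
   As a sketch, a Dockerfile for the .NET 8 base image might look like the following. The copy path assumes the default Release publish output; adjust it to match your project:

   ```
   FROM public.ecr.aws/lambda/dotnet:8

   # Copy the build output into the Lambda task directory.
   WORKDIR /var/task
   COPY "bin/Release/net8.0/linux-x64" .

   # Handler string: assembly::namespace.class::method
   CMD ["MyFunction::MyFunction.Function::FunctionHandler"]
   ```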

1. Install the Amazon.Lambda.Tools [.NET Global Tool](https://aws.amazon.com/blogs/developer/net-core-global-tools-for-aws/).

   ```
   dotnet tool install -g Amazon.Lambda.Tools
   ```

   If Amazon.Lambda.Tools is already installed, make sure that you have the latest version.

   ```
   dotnet tool update -g Amazon.Lambda.Tools
   ```

1. Change the directory to `MyFunction/src/MyFunction`, if you're not there already.

   ```
   cd src/MyFunction
   ```

1. Use Amazon.Lambda.Tools to build the Docker image, push it to a new Amazon ECR repository, and deploy the Lambda function.

   For `--function-role`, specify the role name—not the Amazon Resource Name (ARN)—of the [execution role](lambda-intro-execution-role.md) for the function. For example, `lambda-role`.

   ```
   dotnet lambda deploy-function MyFunction --function-role lambda-role
   ```

   For more information about the Amazon.Lambda.Tools .NET Global Tool, see the [AWS Extensions for .NET CLI](https://github.com/aws/aws-extensions-for-dotnet-cli) repository on GitHub.

1. Invoke the function.

   ```
   dotnet lambda invoke-function MyFunction --payload "Testing the function"
   ```

   If everything is successful, you see a response similar to the following:

   ```
   Payload:
   {"Lower":"testing the function","Upper":"TESTING THE FUNCTION"}
   
   Log Tail:
   INIT_REPORT Init Duration: 9999.81 ms   Phase: init     Status: timeout
   START RequestId: 12378346-f302-419b-b1f2-deaa1e8423ed Version: $LATEST
   END RequestId: 12378346-f302-419b-b1f2-deaa1e8423ed
   REPORT RequestId: 12378346-f302-419b-b1f2-deaa1e8423ed  Duration: 3173.06 ms    Billed Duration: 3174 ms        Memory Size: 512 MB     Max Memory Used: 24 MB
   ```

1. Delete the Lambda function.

   ```
   dotnet lambda delete-function MyFunction
   ```

## Using an alternative base image with the runtime interface client
Using a non-AWS base image

If you use an [OS-only base image](images-create.md#runtimes-images-provided) or an alternative base image, you must include the runtime interface client in your image. The runtime interface client extends the [Runtime API](runtimes-api.md), which manages the interaction between Lambda and your function code.

The following example demonstrates how to build a container image for .NET using a non-AWS base image, and how to add the [Amazon.Lambda.RuntimeSupport package](https://github.com/aws/aws-lambda-dotnet/blob/master/Libraries/src/Amazon.Lambda.RuntimeSupport/README.md#using-amazonlambdaruntimesupport-as-a-class-library), which is the Lambda runtime interface client for .NET. The example Dockerfile uses the Microsoft .NET 9 base image.

### Prerequisites


To complete the steps in this section, you must have the following:
+ [.NET SDK](https://dotnet.microsoft.com/download) – The following steps use a .NET 9 base image. Make sure that your .NET version matches the version of the base image that you specify in your Dockerfile.
+ [Docker](https://docs.docker.com/get-docker) (minimum version 25.0.0)
+ The Docker [buildx plugin](https://github.com/docker/buildx/blob/master/README.md).

### Creating and deploying an image using an alternative base image


1. Install the [Amazon.Lambda.Templates](https://github.com/aws/aws-lambda-dotnet#dotnet-cli-templates) NuGet package.

   ```
   dotnet new install Amazon.Lambda.Templates
   ```

1. Create a .NET project using the `lambda.CustomRuntimeFunction` template. This template includes the [Amazon.Lambda.RuntimeSupport](https://github.com/aws/aws-lambda-dotnet/blob/master/Libraries/src/Amazon.Lambda.RuntimeSupport/README.md#using-amazonlambdaruntimesupport-as-a-class-library) package.

   ```
   dotnet new lambda.CustomRuntimeFunction --name MyFunction --region us-east-1
   ```

1. Navigate to the `MyFunction/src/MyFunction` directory. This is where the project files are stored. Examine the following files:
   + **aws-lambda-tools-defaults.json** – This file is where you specify the command line options when deploying your Lambda function.
   + **Function.cs** – The code contains a class with a `Main` method that initializes the `Amazon.Lambda.RuntimeSupport` library as the bootstrap. The `Main` method is the entry point for the function's process. The `Main` method wraps the function handler in a wrapper that the bootstrap can work with. For more information, see [Using Amazon.Lambda.RuntimeSupport as a class library](https://github.com/aws/aws-lambda-dotnet/blob/master/Libraries/src/Amazon.Lambda.RuntimeSupport/README.md#using-amazonlambdaruntimesupport-as-a-class-library) in the GitHub repository.
   + **MyFunction.csproj** – A .NET [project file](https://learn.microsoft.com/en-us/dotnet/core/project-sdk/overview#project-files), which lists the files and assemblies that comprise your application.
   + **Readme.md** – This file contains more information about the sample Lambda function.

1. Open the `aws-lambda-tools-defaults.json` file and add the following lines:

   ```
     "package-type": "image",
     "docker-host-build-output-dir": "./bin/Release/lambda-publish"
   ```
   + **package-type**: Defines the deployment package as a container image.
   + **docker-host-build-output-dir**: Sets the output directory for the build process.  
**Example aws-lambda-tools-defaults.json**  

   ```
   {
     "Information": [
       "This file provides default values for the deployment wizard inside Visual Studio and the AWS Lambda commands added to the .NET Core CLI.",
       "To learn more about the Lambda commands with the .NET Core CLI execute the following command at the command line in the project root directory.",
       "dotnet lambda help",
       "All the command line options for the Lambda command can be specified in this file."
     ],
     "profile": "",
     "region": "us-east-1",
     "configuration": "Release",
     "function-runtime": "provided.al2023",
     "function-memory-size": 256,
     "function-timeout": 30,
     "function-handler": "bootstrap",
     "msbuild-parameters": "--self-contained true",
     "package-type": "image",
     "docker-host-build-output-dir": "./bin/Release/lambda-publish"
   }
   ```

1. Create a Dockerfile in the `MyFunction/src/MyFunction` directory. The following example Dockerfile uses a Microsoft .NET base image instead of an [AWS base image](#csharp-image-base).
   + Set the `FROM` property to the base image identifier. The base image and the `TargetFramework` in the `MyFunction.csproj` file must both use the same .NET version.
   + Use the `COPY` command to copy the function into the `/var/task` directory.
   + Set the `ENTRYPOINT` to the module that you want the Docker container to run when it starts. In this case, the module is the bootstrap, which initializes the `Amazon.Lambda.RuntimeSupport` library.

   Note that the example Dockerfile does not include a [USER instruction](https://docs.docker.com/reference/dockerfile/#user). When you deploy a container image to Lambda, Lambda automatically defines a default Linux user with least-privileged permissions. This is different from standard Docker behavior, which defaults to the `root` user when no `USER` instruction is provided.  
**Example Dockerfile**  

   ```
   # This example uses a Microsoft .NET runtime image as a non-AWS base image
   FROM mcr.microsoft.com/dotnet/runtime:9.0
   
   # Set the image's internal work directory
   WORKDIR /var/task
     
   # Copy function code to Lambda-defined environment variable
   COPY "bin/Release/net9.0/linux-x64"  .
     
   # Set the entrypoint to the bootstrap
   ENTRYPOINT ["/usr/bin/dotnet", "exec", "/var/task/bootstrap.dll"]
   ```

1. Install the Amazon.Lambda.Tools [.NET Global Tools extension](https://aws.amazon.com/blogs/developer/net-core-global-tools-for-aws/).

   ```
   dotnet tool install -g Amazon.Lambda.Tools
   ```

   If Amazon.Lambda.Tools is already installed, make sure that you have the latest version.

   ```
   dotnet tool update -g Amazon.Lambda.Tools
   ```

1. Use Amazon.Lambda.Tools to build the Docker image, push it to a new Amazon ECR repository, and deploy the Lambda function.

   For `--function-role`, specify the role name—not the Amazon Resource Name (ARN)—of the [execution role](lambda-intro-execution-role.md) for the function. For example, `lambda-role`.

   ```
   dotnet lambda deploy-function MyFunction --function-role lambda-role
   ```

   For more information about the Amazon.Lambda.Tools .NET CLI extension, see the [AWS Extensions for .NET CLI](https://github.com/aws/aws-extensions-for-dotnet-cli) repository on GitHub.

1. Invoke the function.

   ```
   dotnet lambda invoke-function MyFunction --payload "Testing the function"
   ```

   If everything is successful, you see the following:

   ```
   Payload:
   "TESTING THE FUNCTION"
   
   Log Tail:
   START RequestId: id Version: $LATEST
   END RequestId: id
   REPORT RequestId: id  Duration: 0.99 ms       Billed Duration: 1 ms         Memory Size: 256 MB     Max Memory Used: 12 MB
   ```

1. Delete the Lambda function.

   ```
   dotnet lambda delete-function MyFunction
   ```

# Compile .NET Lambda function code to a native runtime format
Native AOT compilation

.NET 8 supports native ahead-of-time (AOT) compilation. With native AOT, you can compile your Lambda function code to a native runtime format, which removes the need to compile .NET code at runtime. Native AOT compilation can reduce the cold start time for Lambda functions that you write in .NET. For more information, see [Introducing the .NET 8 runtime for AWS Lambda](https://aws.amazon.com/blogs/compute/introducing-the-net-8-runtime-for-aws-lambda/) on the AWS Compute Blog.

**Topics**
+ [Lambda runtime](#dotnet-native-aot-runtime)
+ [Prerequisites](#dotnet-native-aot-prerequisites)
+ [Getting started](#dotnet-native-aot-getting-started)
+ [Serialization](#dotnet-native-aot-serialization)
+ [Trimming](#dotnet-native-aot-trimming)
+ [Troubleshooting](#dotnet-native-aot-troubleshooting)

## Lambda runtime


To deploy a Lambda function built with native AOT compilation, use the managed .NET 8 Lambda runtime. This runtime supports both x86_64 and arm64 architectures.

When you deploy a .NET Lambda function without using AOT, your application is first compiled into Intermediate Language (IL) code. At runtime, the just-in-time (JIT) compiler in the Lambda runtime takes the IL code and compiles it into machine code as needed. With a Lambda function that is compiled ahead of time with native AOT, you compile your code into machine code when you deploy your function, so you're not dependent on the .NET runtime or SDK in the Lambda runtime to compile your code before it runs.

One limitation of AOT is that your application code must be compiled in an environment with the same Amazon Linux 2023 (AL2023) operating system that the .NET 8 runtime uses. The .NET Lambda CLI provides functionality to compile your application in a Docker container using an AL2023 image.

To avoid potential issues with cross-architecture compatibility, we strongly recommend that you compile your code in an environment with the same processor architecture that you configure for your function. To learn more about the limitations of cross-architecture compilation, see [Cross-compilation](https://learn.microsoft.com/en-us/dotnet/core/deploying/native-aot/cross-compile) in the Microsoft .NET documentation.

## Prerequisites


**Docker**  
To use native AOT, your function code must be compiled in an environment with the same AL2023 operating system as the .NET 8 runtime. The .NET CLI commands in the following sections use Docker to develop and build Lambda functions in an AL2023 environment.

**.NET 8 SDK**  
Native AOT compilation is a feature of .NET 8. You must install the [.NET 8 SDK](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) on your build machine, not only the runtime.

**Amazon.Lambda.Tools**  
To create your Lambda functions, you use the [Amazon.Lambda.Tools](https://www.nuget.org/packages/Amazon.Lambda.Tools) [.NET Global Tools extension](https://aws.amazon.com/blogs/developer/net-core-global-tools-for-aws/). To install Amazon.Lambda.Tools, run the following command:  

```
dotnet tool install -g Amazon.Lambda.Tools
```
For more information about the Amazon.Lambda.Tools .NET CLI extension, see the [AWS Extensions for .NET CLI](https://github.com/aws/aws-extensions-for-dotnet-cli) repository on GitHub.

**Amazon.Lambda.Templates**  
To generate your Lambda function code, use the [Amazon.Lambda.Templates](https://www.nuget.org/packages/Amazon.Lambda.Templates) NuGet package. To install this template package, run the following command:  

```
dotnet new install Amazon.Lambda.Templates
```

## Getting started


Both the .NET Global CLI and the AWS Serverless Application Model (AWS SAM) provide getting started templates for building applications using native AOT. To build your first native AOT Lambda function, carry out the steps in the following instructions.

**To initialize and deploy a native AOT compiled Lambda function**

1. Initialize a new project using the native AOT template and then navigate into the directory containing the created `.cs` and `.csproj` files. In this example, we name our function `NativeAotSample`.

   ```
   dotnet new lambda.NativeAOT -n NativeAotSample
   cd ./NativeAotSample/src/NativeAotSample
   ```

   The `Function.cs` file created by the native AOT template contains the following function code.

   ```
   using Amazon.Lambda.Core;
   using Amazon.Lambda.RuntimeSupport;
   using Amazon.Lambda.Serialization.SystemTextJson;
   using System.Text.Json.Serialization;
   
   namespace NativeAotSample;
   
   public class Function
   {
       /// <summary>
       /// The main entry point for the Lambda function. The main function is called once during the Lambda init phase. It
       /// initializes the .NET Lambda runtime client passing in the function handler to invoke for each Lambda event and
       /// the JSON serializer to use for converting Lambda JSON format to the .NET types.
       /// </summary>
       private static async Task Main()
       {
           Func<string, ILambdaContext, string> handler = FunctionHandler;
           await LambdaBootstrapBuilder.Create(handler, new SourceGeneratorLambdaJsonSerializer<LambdaFunctionJsonSerializerContext>())
               .Build()
               .RunAsync();
       }
   
       /// <summary>
       /// A simple function that takes a string and does a ToUpper.
       ///
       /// To use this handler to respond to an AWS event, reference the appropriate package from
       /// https://github.com/aws/aws-lambda-dotnet#events
       /// and change the string input parameter to the desired event type. When the event type
       /// is changed, the handler type registered in the main method needs to be updated and the LambdaFunctionJsonSerializerContext
       /// defined below will need the JsonSerializable updated. If the return type and event type are different then the
       /// LambdaFunctionJsonSerializerContext must have two JsonSerializable attributes, one for each type.
       ///
       // When using Native AOT extra testing with the deployed Lambda functions is required to ensure
       // the libraries used in the Lambda function work correctly with Native AOT. If a runtime
       // error occurs about missing types or methods the most likely solution will be to remove references to trim-unsafe
       // code or configure trimming options. This sample defaults to partial TrimMode because currently the AWS
       // SDK for .NET does not support trimming. This will result in a larger executable size, and still does not
       // guarantee runtime trimming errors won't be hit.
       /// </summary>
       /// <param name="input"></param>
       /// <param name="context"></param>
       /// <returns></returns>
       public static string FunctionHandler(string input, ILambdaContext context)
       {
           return input.ToUpper();
       }
   }
   
   /// <summary>
   /// This class is used to register the input event and return type for the FunctionHandler method with the System.Text.Json source generator.
   /// There must be a JsonSerializable attribute for each type used as the input and return type or a runtime error will occur
   /// from the JSON serializer unable to find the serialization information for unknown types.
   /// </summary>
   [JsonSerializable(typeof(string))]
   public partial class LambdaFunctionJsonSerializerContext : JsonSerializerContext
   {
       // By using this partial class derived from JsonSerializerContext, we can generate reflection free JSON Serializer code at compile time
       // which can deserialize our class and properties. However, we must attribute this class to tell it what types to generate serialization code for.
       // See https://docs.microsoft.com/en-us/dotnet/standard/serialization/system-text-json-source-generation
   }
   ```

   Native AOT compiles your application into a single, native binary. The entry point of that binary is the `static Main` method. Within `static Main`, the Lambda runtime is bootstrapped and the `FunctionHandler` method is set up. As part of the runtime bootstrap, a source generated serializer is configured using `new SourceGeneratorLambdaJsonSerializer<LambdaFunctionJsonSerializerContext>()`.

1. To deploy your application to Lambda, ensure that Docker is running in your local environment and run the following command.

   ```
   dotnet lambda deploy-function
   ```

   Behind the scenes, the .NET global CLI downloads an AL2023 Docker image and compiles your application code inside a running container. The compiled binary is output back to your local filesystem before being deployed to Lambda.

1. Test your function by running the following command. Replace `<FUNCTION_NAME>` with the name you chose for your function in the deployment wizard.

   ```
   dotnet lambda invoke-function <FUNCTION_NAME> --payload "hello world"
   ```

   The response from the CLI includes performance details for the cold start (initialization duration) and total run time for your function invocation.

1. To delete the AWS resources you created by following the preceding steps, run the following command. Replace `<FUNCTION_NAME>` with the name you chose for your function in the deployment wizard. By deleting AWS resources that you're no longer using, you prevent unnecessary charges being billed to your AWS account.

   ```
   dotnet lambda delete-function <FUNCTION_NAME>
   ```

## Serialization


To deploy functions to Lambda using native AOT, your function code must use [source generated serialization](https://learn.microsoft.com/en-us/dotnet/standard/serialization/system-text-json/source-generation-modes?pivots=dotnet-8-0). Instead of using run-time reflection to gather the metadata needed to access object properties for serialization, source generators generate C# source files that are compiled when you build your application. To configure your source generated serializer correctly, ensure that you include any input and output objects your function uses, as well as any custom types. For example, a Lambda function that receives events from an API Gateway REST API and returns a custom `Product` type would include a serializer defined as follows.

```
[JsonSerializable(typeof(APIGatewayProxyRequest))]
[JsonSerializable(typeof(APIGatewayProxyResponse))]
[JsonSerializable(typeof(Product))]
public partial class CustomSerializer : JsonSerializerContext
{
}
```
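
You then pass the context to the serializer when bootstrapping the runtime. The following sketch assumes a `FunctionHandler` method that takes an `APIGatewayProxyRequest` and returns an `APIGatewayProxyResponse`:

```
// Register the source generated serializer with the runtime bootstrap.
Func<APIGatewayProxyRequest, ILambdaContext, Task<APIGatewayProxyResponse>> handler = FunctionHandler;
await LambdaBootstrapBuilder.Create(handler,
        new SourceGeneratorLambdaJsonSerializer<CustomSerializer>())
    .Build()
    .RunAsync();
```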

## Trimming


Native AOT trims your application code as part of the compilation to ensure that the binary is as small as possible. .NET 8 for Lambda provides improved trimming support compared to previous versions of .NET. Support has been added to the [Lambda runtime libraries](https://github.com/aws/aws-lambda-dotnet/pull/1596), [AWS .NET SDK](https://github.com/aws/aws-sdk-net/pulls?q=is%3Apr+trimming), [.NET Lambda Annotations](https://github.com/aws/aws-lambda-dotnet/pull/1610), and .NET 8 itself.

These improvements offer the potential to eliminate build-time trimming warnings, but .NET will never be completely trim safe. This means that parts of libraries that your function relies on may be trimmed out as part of the compilation step. You can manage this by defining `TrimmerRootAssemblies` as part of your `.csproj` file as shown in the following example. 

```
<ItemGroup>
    <TrimmerRootAssembly Include="AWSSDK.Core" />
    <TrimmerRootAssembly Include="AWSXRayRecorder.Core" />
    <TrimmerRootAssembly Include="AWSXRayRecorder.Handlers.AwsSdk" />
    <TrimmerRootAssembly Include="Amazon.Lambda.APIGatewayEvents" />
    <TrimmerRootAssembly Include="bootstrap" />
    <TrimmerRootAssembly Include="Shared" />
</ItemGroup>
```

Note that when you receive a trim warning, adding the class that generates the warning to `TrimmerRootAssembly` might not resolve the issue. A trim warning indicates that the class is trying to access some other class that can't be determined until runtime. To avoid runtime errors, add this second class to `TrimmerRootAssembly`.

To learn more about managing trim warnings, see [Introduction to trim warnings](https://learn.microsoft.com/en-us/dotnet/core/deploying/trimming/fixing-warnings) in the Microsoft .NET documentation.
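
For reference, native AOT and trimming are controlled by properties in your `.csproj` file. The following is a sketch; the exact values set by the `lambda.NativeAOT` template depend on the template version you used:

```
<PropertyGroup>
  <TargetFramework>net8.0</TargetFramework>
  <!-- Compile to a native binary when the project is published. -->
  <PublishAot>true</PublishAot>
  <!-- Partial trim mode: only assemblies that opt in are trimmed. -->
  <TrimMode>partial</TrimMode>
</PropertyGroup>
```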

## Troubleshooting


**Error: Cross-OS native compilation is not supported.**  
Your version of the Amazon.Lambda.Tools .NET Core global tool is out of date. Update to the latest version and try again.

**Docker: image operating system "linux" cannot be used on this platform.**  
Docker on your system is configured to use Windows containers. Swap to Linux containers to run the native AOT build environment.

For more information about common errors, see the [AWS NativeAOT for .NET](https://github.com/awslabs/dotnet-nativeaot-labs#common-errors) repository on GitHub.

# Using the Lambda context object to retrieve C# function information
Context

When Lambda runs your function, it passes a context object to the [handler](csharp-handler.md). This object provides properties with information about the invocation, function, and execution environment.

**Context properties**
+ `FunctionName` – The name of the Lambda function.
+ `FunctionVersion` – The [version](configuration-versions.md) of the function.
+ `InvokedFunctionArn` – The Amazon Resource Name (ARN) that's used to invoke the function. Indicates if the invoker specified a version number or alias.
+ `MemoryLimitInMB` – The amount of memory that's allocated for the function.
+ `AwsRequestId` – The identifier of the invocation request.
+ `LogGroupName` – The log group for the function.
+ `LogStreamName` – The log stream for the function instance.
+ `RemainingTime` (`TimeSpan`) – The amount of time left before the invocation times out.
+ `Identity` – (mobile apps) Information about the Amazon Cognito identity that authorized the request.
+ `ClientContext` – (mobile apps) Client context that's provided to Lambda by the client application.
+ `Logger` – The [logger object](csharp-logging.md) for the function.

You can use information in the `ILambdaContext` object to output information about your function's invocation for monitoring purposes. The following code provides an example of how to add context information to a structured logging framework. In this example, the function adds `AwsRequestId` to the log outputs. The function also uses the `RemainingTime` property to cancel an inflight task if the Lambda function timeout is about to be reached.

```
using System.Net;
using System.Text.Json;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using AWS.Lambda.Powertools.Logging;

[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace GetProductHandler;

public class Function
{
    // IDatabaseRepository and DatabaseRepository are application-defined data access types.
    private readonly IDatabaseRepository _repo;
    
    public Function()
    {
        this._repo = new DatabaseRepository();
    }
    
    public async Task<APIGatewayProxyResponse> FunctionHandler(APIGatewayProxyRequest request, ILambdaContext context)
    {
        Logger.AppendKey("AwsRequestId", context.AwsRequestId);
        
        var id = request.PathParameters["id"];

        using var cts = new CancellationTokenSource();
        
        try
        {
            cts.CancelAfter(context.RemainingTime.Add(TimeSpan.FromSeconds(-1)));
            
            var databaseRecord = await this._repo.GetById(id, cts.Token);
            
            return new APIGatewayProxyResponse 
            {
                StatusCode = (int)HttpStatusCode.OK,
                Body = JsonSerializer.Serialize(databaseRecord)
            };
        }
        catch (Exception ex)
        {
            return new APIGatewayProxyResponse 
            {
                StatusCode = (int)HttpStatusCode.InternalServerError,
                Body = JsonSerializer.Serialize(new { error = ex.Message })
            };
        }
        finally
        {
            cts.Cancel();
        }
    }
}
```

# Log and monitor C# Lambda functions
Logging

AWS Lambda automatically monitors Lambda functions and sends log entries to Amazon CloudWatch. Your Lambda function comes with a CloudWatch Logs log group and a log stream for each instance of your function. The Lambda runtime environment sends details about each invocation and other output from your function's code to the log stream. For more information about CloudWatch Logs, see [Sending Lambda function logs to CloudWatch Logs](monitoring-cloudwatchlogs.md).

**Topics**
+ [Creating a function that returns logs](#csharp-logging-output)
+ [Using Lambda advanced logging controls with .NET](#csharp-logging-advanced)
+ [Additional logging tools and libraries](#csharp-tools-libraries)
+ [Using Powertools for AWS Lambda (.NET) and AWS SAM for structured logging](#dotnet-logging-sam)
+ [Viewing logs in the Lambda console](#csharp-logging-console)
+ [Viewing logs in the CloudWatch console](#csharp-logging-cwconsole)
+ [Viewing logs using the AWS Command Line Interface (AWS CLI)](#csharp-logging-cli)
+ [Deleting logs](#csharp-logging-delete)

## Creating a function that returns logs
Creating a function that returns logs

To output logs from your function code, you can use the [ILambdaLogger](https://github.com/aws/aws-lambda-dotnet/blob/master/Libraries/src/Amazon.Lambda.Core/ILambdaLogger.cs) on the context object, the methods on the [Console class](https://docs.microsoft.com/en-us/dotnet/api/system.console), or any logging library that writes to `stdout` or `stderr`.

The .NET runtime logs the `START`, `END`, and `REPORT` lines for each invocation. The report line provides the following details.

**REPORT line data fields**
+ **RequestId** – The unique request ID for the invocation.
+ **Duration** – The amount of time that your function's handler method spent processing the event.
+ **Billed Duration** – The amount of time billed for the invocation.
+ **Memory Size** – The amount of memory allocated to the function.
+ **Max Memory Used** – The amount of memory used by the function. When invocations share an execution environment, Lambda reports the maximum memory used across all invocations. This behavior might result in a higher than expected reported value.
+ **Init Duration** – For the first request served, the amount of time it took the runtime to load the function and run code outside of the handler method.
+ **XRAY TraceId** – For traced requests, the [AWS X-Ray trace ID](services-xray.md).
+ **SegmentId** – For traced requests, the X-Ray segment ID.
+ **Sampled** – For traced requests, the sampling result.
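
Taken together, a REPORT line for a traced cold-start invocation might look like the following (all values are illustrative):

```
REPORT RequestId: 8f711428-7e55-46f9-ae88-2a65d4f85fc5	Duration: 12.34 ms	Billed Duration: 13 ms	Memory Size: 256 MB	Max Memory Used: 74 MB	Init Duration: 240.05 ms	
XRAY TraceId: 1-6408af34-50f56f5b5677a7d763973804	SegmentId: 16b362cd5f52cba0	Sampled: true
```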

## Using Lambda advanced logging controls with .NET


To give you more control over how your functions’ logs are captured, processed, and consumed, you can configure the following logging options for supported .NET runtimes:
+ **Log format** - select between plain text and structured JSON format for your function’s logs
+ **Log level** - for logs in JSON format, choose the detail level of the logs Lambda sends to CloudWatch, such as ERROR, DEBUG, or INFO
+ **Log group** - choose the CloudWatch log group your function sends logs to

For more information about these logging options, and instructions on how to configure your function to use them, see [Configuring advanced logging controls for Lambda functions](monitoring-logs.md#monitoring-cloudwatchlogs-advanced).

To use the log format and log level options with your .NET Lambda functions, see the guidance in the following sections.

### Using structured JSON log format with .NET


If you select JSON for your function's log format, Lambda sends log output written using [ILambdaLogger](https://github.com/aws/aws-lambda-dotnet/blob/master/Libraries/src/Amazon.Lambda.Core/ILambdaLogger.cs) to CloudWatch as structured JSON. Each JSON log object contains at least five key-value pairs with the following keys:
+ `"timestamp"` - the time the log message was generated
+ `"level"` - the log level assigned to the message
+ `"requestId"` - the unique request ID for the function invocation
+ `"traceId"` - the `_X_AMZN_TRACE_ID` environment variable
+ `"message"` - the contents of the log message

The `ILambdaLogger` instance can add additional key-value pairs, for example when logging exceptions. You can also supply your own additional parameters as described in the section [Customer-provided log parameters](#csharp-logging-advanced-JSON-user-supplied).

**Note**  
If your code already uses another logging library to produce JSON-formatted logs, ensure that your function's log format is set to plain text. Setting the log format to JSON will result in your log outputs being double-encoded.

The following example logging command shows how to write a log message with the level `INFO`.

**Example .NET logging code**  

```
context.Logger.LogInformation("Fetching cart from database");
```

You can also use a generic log method that takes the log level as an argument as shown in the following example.

```
context.Logger.Log(LogLevel.Information, "Fetching cart from database");
```

The log output by these example code snippets would be captured in CloudWatch Logs as follows:

**Example JSON log record**  

```
{
    "timestamp": "2025-09-07T01:30:06.977Z",
    "level": "Information",
    "requestId": "8f711428-7e55-46f9-ae88-2a65d4f85fc5",
    "traceId": "1-6408af34-50f56f5b5677a7d763973804",
    "message": "Fetching cart from database"
}
```

**Note**  
If you configure your function's log format to use plain text rather than JSON, then the log level captured in the message follows the Microsoft convention of using a four-character label. For example, a log level of `Debug` is represented in the message as `dbug`.  
When you configure your function to use JSON formatted logs, the log level captured in the log uses the full label as shown in the example JSON log record.
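
For example, a `LogDebug` call in a function configured for plain text logging might be captured as follows (the timestamp and request ID are illustrative):

```
2025-09-07T01:30:06.977Z	8f711428-7e55-46f9-ae88-2a65d4f85fc5	dbug	Fetching cart from database
```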

If you don't assign a level to your log output, Lambda will automatically assign it the level INFO.

#### Logging exceptions in JSON


When using structured JSON logging with `ILambdaLogger`, you can log exceptions in your code as shown in the following example.

**Example usage of exception logging**  

```
try
{
    connection.ExecuteQuery(query);
}
catch(Exception e)
{
    context.Logger.LogWarning(e, "Error executing query");
}
```

The log format output by this code is shown in the following example JSON. Note that the `message` property in the JSON is populated using the message argument provided in the `LogWarning` call, while the `errorMessage` property comes from the `Message` property of the exception itself.

**Example JSON log record**  

```
{
    "timestamp": "2025-09-07T01:30:06.977Z",
    "level": "Warning",
    "requestId": "8f711428-7e55-46f9-ae88-2a65d4f85fc5",
    "traceId": "1-6408af34-50f56f5b5677a7d763973804",
    "message": "Error executing query",
    "errorType": "System.Data.SqlClient.SqlException",
    "errorMessage": "Connection closed",
    "stackTrace": ["<call exception.StackTrace>"]
}
```

If your function's logging format is set to JSON, Lambda also outputs JSON-formatted log messages when your code throws an uncaught exception. The following example code snippet and log message show how uncaught exceptions are logged.

**Example exception code**  

```
throw new ApplicationException("Invalid data");
```

**Example JSON log record**  

```
{
    "timestamp": "2025-09-07T01:30:06.977Z",
    "level": "Error",
    "requestId": "8f711428-7e55-46f9-ae88-2a65d4f85fc5",
    "traceId": "1-6408af34-50f56f5b5677a7d763973804",
    "message": "Invalid data",
    "errorType": "System.ApplicationException",
    "errorMessage": "Invalid data",
    "stackTrace": ["<call exception.StackTrace>"]
}
```

#### Customer-provided log parameters


With JSON-formatted log messages, you can supply additional log parameters and include these in the log `message`. The following example code snippet shows a command that adds two user-supplied parameters labeled `retryAttempt` and `uri`. In the example, the values of these parameters come from the `retryAttempt` and `uriDestination` arguments passed to the logging command.

**Example JSON logging command with additional parameters**  

```
context.Logger.LogInformation("Starting retry {retryAttempt} to make GET request to {uri}", retryAttempt, uriDestination);
```

The log message output by this command is shown in the following example JSON.

**Example JSON log record**  

```
{
    "timestamp": "2025-09-07T01:30:06.977Z",
    "level": "Information",
    "requestId": "8f711428-7e55-46f9-ae88-2a65d4f85fc5",
    "traceId": "1-6408af34-50f56f5b5677a7d763973804",
    "message": "Starting retry 1 to make GET request to http://example.com/",
    "retryAttempt": 1,
    "uri": "http://example.com/"
}
```

**Tip**  
You can also use positional properties instead of names when specifying additional parameters. For example, the logging command in the previous example could also be written as follows:  

```
context.Logger.LogInformation("Starting retry {0} to make GET request to {1}", retryAttempt, uriDestination);
```

Note that when you supply additional logging parameters, Lambda captures them as top-level properties in the JSON log record. This approach differs from some popular .NET logging libraries such as `Serilog`, which captures additional parameters in a separate child object.

If the argument you supply for an additional parameter is a complex object, by default Lambda uses the `ToString()` method to supply the value. To indicate that an argument should be JSON serialized, use the `@` prefix as shown in the following code snippet. In this example, `User` is an object with `FirstName` and `LastName` properties.

**Example JSON logging command with JSON serialized object**  

```
context.Logger.LogInformation("User {@user} logged in", User);
```

The log message output by this command is shown in the following example JSON.

**Example JSON log record**  

```
{
    "timestamp": "2025-09-07T01:30:06.977Z",
    "level": "Information",
    "requestId": "8f711428-7e55-46f9-ae88-2a65d4f85fc5",
    "traceId": "1-6408af34-50f56f5b5677a7d763973804",
    "message": "User {@user} logged in",
    "user": 
    {
        "FirstName": "John",
        "LastName": "Doe"
    }
}
```

If the argument for an additional parameter is an array or implements `IList` or `IDictionary`, then Lambda adds the argument to the JSON log message as an array, as shown in the following example JSON log record. In this example, `{users}` takes an `IList` argument containing instances of the `User` object with the same format as the previous example. Lambda converts this `IList` into an array, with each element converted using its `ToString` method.

**Example JSON log record with an `IList` argument**  

```
{
    "timestamp": "2025-09-07T01:30:06.977Z",
    "level": "Information",
    "requestId": "8f711428-7e55-46f9-ae88-2a65d4f85fc5",
    "traceId": "1-6408af34-50f56f5b5677a7d763973804",
    "message": "{users} have joined the group",
    "users": 
    [
        "Rosalez, Alejandro",
        "Stiles, John"       
    ] 
}
```

You can also JSON serialize the list by using the `@` prefix in your logging command. In the following example JSON log record, the `users` property is JSON serialized.

**Example JSON log record with a JSON serialized `IList` argument**  

```
{
    "timestamp": "2025-09-07T01:30:06.977Z",
    "level": "Information",
    "requestId": "8f711428-7e55-46f9-ae88-2a65d4f85fc5",
    "traceId": "1-6408af34-50f56f5b5677a7d763973804",
    "message": "{@users} have joined the group",
    "users": 
    [
        {
            "FirstName": "Alejandro",
            "LastName": "Rosalez"
        },
        {
            "FirstName": "John",
            "LastName": "Stiles"
        }        
    ] 
}
```

### Using log-level filtering with .NET


By configuring log-level filtering, you can choose to send only logs of a certain detail level or lower to CloudWatch Logs. To learn how to configure log-level filtering for your function, see [Log-level filtering](monitoring-cloudwatchlogs-log-level.md).

For AWS Lambda to filter your log messages by log level, you can either use JSON formatted logs or use the .NET `Console` methods to output log messages. To create JSON formatted logs, [configure your function's log type to JSON](monitoring-cloudwatchlogs-logformat.md#monitoring-cloudwatchlogs-set-format) and use the `ILambdaLogger` instance.

With JSON-formatted logs, Lambda filters your log outputs using the “level” key value pair in the JSON object described in [Using structured JSON log format with .NET](#csharp-logging-advanced-JSON).

If you use the .NET `Console` methods to write messages to CloudWatch Logs, Lambda applies log levels to your messages as follows:
+ **Console.WriteLine method** - Lambda applies a log level of `INFO`
+ **Console.Error.WriteLine method** - Lambda applies a log level of `ERROR`
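
As a sketch, the following standard `Console` calls illustrate the levels Lambda assigns:

```
Console.WriteLine("Processing order 1234");           // captured with level INFO
Console.Error.WriteLine("Failed to process order");   // captured with level ERROR
```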

When you configure your function to use log-level filtering, you must select from the following options for the level of logs you want Lambda to send to CloudWatch Logs. Note the mapping between the log levels used by Lambda and the standard Microsoft levels used by the .NET `ILambdaLogger`.


| Lambda log level | Equivalent Microsoft level | Standard usage | 
| --- | --- | --- | 
| TRACE (most detail) | Trace | The most fine-grained information used to trace the path of your code's execution | 
| DEBUG | Debug | Detailed information for system debugging | 
| INFO | Information | Messages that record the normal operation of your function | 
| WARN | Warning | Messages about potential errors that may lead to unexpected behavior if unaddressed | 
| ERROR | Error | Messages about problems that prevent the code from performing as expected | 
| FATAL (least detail) | Critical | Messages about serious errors that cause the application to stop functioning | 

Lambda sends logs of the selected detail level and lower to CloudWatch. For example, if you configure a log level of WARN, Lambda will send logs corresponding to the WARN, ERROR, and FATAL levels.
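
For example, with the function's log level configured as WARN, only the last two of the following `ILambdaLogger` calls would reach CloudWatch Logs:

```
context.Logger.LogDebug("Cache miss for product 1234");        // not sent (DEBUG is below WARN)
context.Logger.LogInformation("Fetching cart from database");  // not sent (INFO is below WARN)
context.Logger.LogWarning("Query took longer than expected");  // sent
context.Logger.LogError("Unable to reach the database");       // sent
```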

## Additional logging tools and libraries
Tools and libraries

[Powertools for AWS Lambda (.NET)](https://docs.aws.amazon.com/powertools/dotnet/) is a developer toolkit to implement Serverless best practices and increase developer velocity. The [Logging utility](https://docs.aws.amazon.com/powertools/dotnet/core/logging/) provides a Lambda optimized logger which includes additional information about function context across all your functions with output structured as JSON. Use this utility to do the following:
+ Capture key fields from the Lambda context and cold start, and structure logging output as JSON
+ Log Lambda invocation events when instructed (disabled by default)
+ Print all the logs only for a percentage of invocations via log sampling (disabled by default)
+ Append additional keys to structured log at any point in time
+ Use a custom log formatter (Bring Your Own Formatter) to output logs in a structure compatible with your organization’s Logging RFC

## Using Powertools for AWS Lambda (.NET) and AWS SAM for structured logging


Follow the steps below to download, build, and deploy a sample Hello World C# application with integrated [Powertools for AWS Lambda (.NET)](https://docs.powertools.aws.dev/lambda-dotnet) modules using AWS SAM. This application implements a basic API backend and uses Powertools for emitting logs, metrics, and traces. It consists of an Amazon API Gateway endpoint and a Lambda function. When you send a GET request to the API Gateway endpoint, the Lambda function invokes, sends logs and metrics using Embedded Metric Format to CloudWatch, and sends traces to AWS X-Ray. The function returns a `hello world` message.

**Prerequisites**

To complete the steps in this section, you must have the following:
+ .NET 8
+ [AWS CLI version 2](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html)
+ [AWS SAM CLI version 1.75 or later](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html). If you have an older version of the AWS SAM CLI, see [Upgrading the AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/manage-sam-cli-versions.html#manage-sam-cli-versions-upgrade).

**Deploy a sample AWS SAM application**

1. Initialize the application using the Hello World .NET template.

   ```
   sam init --app-template hello-world-powertools-dotnet --name sam-app --package-type Zip --runtime dotnet6 --no-tracing
   ```

1. Build the app.

   ```
   cd sam-app && sam build
   ```

1. Deploy the app.

   ```
   sam deploy --guided
   ```

1. Follow the on-screen prompts. To accept the default options provided in the interactive experience, press `Enter`.
**Note**  
For **HelloWorldFunction may not have authorization defined, Is this okay?**, make sure to enter `y`.

1. Get the URL of the deployed application:

   ```
   aws cloudformation describe-stacks --stack-name sam-app --query 'Stacks[0].Outputs[?OutputKey==`HelloWorldApi`].OutputValue' --output text
   ```

1. Invoke the API endpoint:

   ```
   curl -X GET <URL_FROM_PREVIOUS_STEP>
   ```

   If successful, you'll see this response:

   ```
   {"message":"hello world"}
   ```

1. To get the logs for the function, run [sam logs](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-logs.html). For more information, see [Working with logs](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-logging.html) in the *AWS Serverless Application Model Developer Guide*.

   ```
   sam logs --stack-name sam-app
   ```

   The log output looks like this:

   ```
   2025/02/20/[$LATEST]4eaf8445ba7a4a93b999cb17fbfbecd8 2025-09-20T14:15:27.988000 INIT_START Runtime Version: dotnet:6.v13        Runtime Version ARN: arn:aws:lambda:ap-southeast-2::runtime:699f346a05dae24c58c45790bc4089f252bf17dae3997e79b17d939a288aa1ec
   2025/02/20/[$LATEST]4eaf8445ba7a4a93b999cb17fbfbecd8 2025-09-20T14:15:28.229000 START RequestId: bed25b38-d012-42e7-ba28-f272535fb80e Version: $LATEST
   2025/02/20/[$LATEST]4eaf8445ba7a4a93b999cb17fbfbecd8 2025-09-20T14:15:29.259000 2025-09-20T14:15:29.201Z        bed25b38-d012-42e7-ba28-f272535fb80e    info   {"_aws":{"Timestamp":1676902528962,"CloudWatchMetrics":[{"Namespace":"sam-app-logging","Metrics":[{"Name":"ColdStart","Unit":"Count"}],"Dimensions":[["FunctionName"],["Service"]]}]},"FunctionName":"sam-app-HelloWorldFunction-haKIoVeose2p","Service":"PowertoolsHelloWorld","ColdStart":1}
   2025/02/20/[$LATEST]4eaf8445ba7a4a93b999cb17fbfbecd8 2025-09-20T14:15:30.479000 2025-09-20T14:15:30.479Z        bed25b38-d012-42e7-ba28-f272535fb80e    info   {"ColdStart":true,"XrayTraceId":"1-63f3807f-5dbcb9910c96f50742707542","CorrelationId":"d3d4de7f-4ccc-411a-a549-4d67b2fdc015","FunctionName":"sam-app-HelloWorldFunction-haKIoVeose2p","FunctionVersion":"$LATEST","FunctionMemorySize":256,"FunctionArn":"arn:aws:lambda:ap-southeast-2:123456789012:function:sam-app-HelloWorldFunction-haKIoVeose2p","FunctionRequestId":"bed25b38-d012-42e7-ba28-f272535fb80e","Timestamp":"2025-09-20T14:15:30.4602970Z","Level":"Information","Service":"PowertoolsHelloWorld","Name":"AWS.Lambda.Powertools.Logging.Logger","Message":"Hello world API - HTTP 200"}
   2025/02/20/[$LATEST]4eaf8445ba7a4a93b999cb17fbfbecd8 2025-09-20T14:15:30.599000 2025-09-20T14:15:30.599Z        bed25b38-d012-42e7-ba28-f272535fb80e    info   {"_aws":{"Timestamp":1676902528922,"CloudWatchMetrics":[{"Namespace":"sam-app-logging","Metrics":[{"Name":"ApiRequestCount","Unit":"Count"}],"Dimensions":[["Service"]]}]},"Service":"PowertoolsHelloWorld","ApiRequestCount":1}
   2025/02/20/[$LATEST]4eaf8445ba7a4a93b999cb17fbfbecd8 2025-09-20T14:15:30.680000 END RequestId: bed25b38-d012-42e7-ba28-f272535fb80e
   2025/02/20/[$LATEST]4eaf8445ba7a4a93b999cb17fbfbecd8 2025-09-20T14:15:30.680000 REPORT RequestId: bed25b38-d012-42e7-ba28-f272535fb80e  Duration: 2450.99 ms   Billed Duration: 2692 ms Memory Size: 256 MB     Max Memory Used: 74 MB  Init Duration: 240.05 ms
   XRAY TraceId: 1-63f3807f-5dbcb9910c96f50742707542       SegmentId: 16b362cd5f52cba0
   ```

1. This is a public API endpoint that is accessible over the internet. We recommend that you delete the endpoint after testing.

   ```
   sam delete
   ```

### Managing log retention


Log groups aren't deleted automatically when you delete a function. To avoid storing logs indefinitely, delete the log group, or configure a retention period after which CloudWatch automatically deletes the logs. To set up log retention, add the following to your AWS SAM template:

```
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      # Omitting other properties

  LogGroup:
    Type: AWS::Logs::LogGroup
    Properties:
      LogGroupName: !Sub "/aws/lambda/${HelloWorldFunction}"
      RetentionInDays: 7
```

## Viewing logs in the Lambda console


You can use the Lambda console to view log output after you invoke a Lambda function.

If your code can be tested from the embedded **Code** editor, you will find logs in the **execution results**. When you use the console test feature to invoke a function, you'll find **Log output** in the **Details** section.

## Viewing logs in the CloudWatch console


You can use the Amazon CloudWatch console to view logs for all Lambda function invocations.

**To view logs on the CloudWatch console**

1. Open the [Log groups page](https://console.aws.amazon.com/cloudwatch/home?#logs:) on the CloudWatch console.

1. Choose the log group for your function (**/aws/lambda/*your-function-name***).

1. Choose a log stream.

Each log stream corresponds to an [instance of your function](lambda-runtime-environment.md). A log stream appears when you update your Lambda function, and when additional instances are created to handle concurrent invocations. To find logs for a specific invocation, we recommend instrumenting your function with AWS X-Ray. X-Ray records details about the request and the log stream in the trace.

## Viewing logs using the AWS Command Line Interface (AWS CLI)


The AWS CLI is an open-source tool that enables you to interact with AWS services using commands in your command line shell. To complete the steps in this section, you must have the [AWS CLI version 2](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html).

You can use the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html) to retrieve logs for an invocation using the `--log-type` command option. The response contains a `LogResult` field that contains up to 4 KB of base64-encoded logs from the invocation.

**Example retrieve a log ID**  
The following example shows how to retrieve a *log ID* from the `LogResult` field for a function named `my-function`.  

```
aws lambda invoke --function-name my-function out --log-type Tail
```
You should see the following output:  

```
{
    "StatusCode": 200,
    "LogResult": "U1RBUlQgUmVxdWVzdElkOiA4N2QwNDRiOC1mMTU0LTExZTgtOGNkYS0yOTc0YzVlNGZiMjEgVmVyc2lvb...",
    "ExecutedVersion": "$LATEST"
}
```

**Example decode the logs**  
In the same command prompt, use the `base64` utility to decode the logs. The following example shows how to retrieve base64-encoded logs for `my-function`.  

```
aws lambda invoke --function-name my-function out --log-type Tail \
--query 'LogResult' --output text --cli-binary-format raw-in-base64-out | base64 --decode
```
The **cli-binary-format** option is required if you're using AWS CLI version 2. To make this the default setting, run `aws configure set cli-binary-format raw-in-base64-out`. For more information, see [AWS CLI supported global command line options](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-options.html#cli-configure-options-list) in the *AWS Command Line Interface User Guide for Version 2*.  
You should see the following output:  

```
START RequestId: 57f231fb-1730-4395-85cb-4f71bd2b87b8 Version: $LATEST
"AWS_SESSION_TOKEN": "AgoJb3JpZ2luX2VjELj...", "_X_AMZN_TRACE_ID": "Root=1-5d02e5ca-f5792818b6fe8368e5b51d50;Parent=191db58857df8395;Sampled=0"",ask/lib:/opt/lib",
END RequestId: 57f231fb-1730-4395-85cb-4f71bd2b87b8
REPORT RequestId: 57f231fb-1730-4395-85cb-4f71bd2b87b8  Duration: 79.67 ms      Billed Duration: 80 ms         Memory Size: 128 MB     Max Memory Used: 73 MB
```
The `base64` utility is available on Linux, macOS, and [Ubuntu on Windows](https://docs.microsoft.com/en-us/windows/wsl/install-win10). macOS users may need to use `base64 -D`.

**Example get-logs.sh script**  
In the same command prompt, use the following script to download the last five log events. The script uses `sed` to remove quotes from the output file, and sleeps for 15 seconds to allow time for the logs to become available. The output includes the response from Lambda and the output from the `get-log-events` command.   
Copy the contents of the following code sample and save in your Lambda project directory as `get-logs.sh`.  
The **cli-binary-format** option is required if you're using AWS CLI version 2. To make this the default setting, run `aws configure set cli-binary-format raw-in-base64-out`. For more information, see [AWS CLI supported global command line options](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-options.html#cli-configure-options-list) in the *AWS Command Line Interface User Guide for Version 2*.  

```
#!/bin/bash
aws lambda invoke --function-name my-function --cli-binary-format raw-in-base64-out --payload '{"key": "value"}' out
sed -i'' -e 's/"//g' out
sleep 15
aws logs get-log-events --log-group-name /aws/lambda/my-function --log-stream-name stream1 --limit 5
```
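
The script above assumes a log stream named `stream1`. You can look up a function's most recently active log stream names with the `describe-log-events` family of AWS CLI commands; for example, the following sketch lists the five most recent streams for `my-function` (replace the function name with your own):

```
aws logs describe-log-streams \
    --log-group-name /aws/lambda/my-function \
    --order-by LastEventTime \
    --descending \
    --max-items 5
```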

**Example macOS and Linux (only)**  
In the same command prompt, macOS and Linux users may need to run the following command to ensure the script is executable.  

```
chmod -R 755 get-logs.sh
```

**Example retrieve the last five log events**  
In the same command prompt, run the following script to get the last five log events.  

```
./get-logs.sh
```
You should see the following output:  

```
{
    "StatusCode": 200,
    "ExecutedVersion": "$LATEST"
}
{
    "events": [
        {
            "timestamp": 1559763003171,
            "message": "START RequestId: 4ce9340a-b765-490f-ad8a-02ab3415e2bf Version: $LATEST\n",
            "ingestionTime": 1559763003309
        },
        {
            "timestamp": 1559763003173,
            "message": "2019-06-05T19:30:03.173Z\t4ce9340a-b765-490f-ad8a-02ab3415e2bf\tINFO\tENVIRONMENT VARIABLES\r{\r  \"AWS_LAMBDA_FUNCTION_VERSION\": \"$LATEST\",\r ...",
            "ingestionTime": 1559763018353
        },
        {
            "timestamp": 1559763003173,
            "message": "2019-06-05T19:30:03.173Z\t4ce9340a-b765-490f-ad8a-02ab3415e2bf\tINFO\tEVENT\r{\r  \"key\": \"value\"\r}\n",
            "ingestionTime": 1559763018353
        },
        {
            "timestamp": 1559763003218,
            "message": "END RequestId: 4ce9340a-b765-490f-ad8a-02ab3415e2bf\n",
            "ingestionTime": 1559763018353
        },
        {
            "timestamp": 1559763003218,
            "message": "REPORT RequestId: 4ce9340a-b765-490f-ad8a-02ab3415e2bf\tDuration: 26.73 ms\tBilled Duration: 27 ms \tMemory Size: 128 MB\tMax Memory Used: 75 MB\t\n",
            "ingestionTime": 1559763018353
        }
    ],
    "nextForwardToken": "f/34783877304859518393868359594929986069206639495374241795",
    "nextBackwardToken": "b/34783877303811383369537420289090800615709599058929582080"
}
```

## Deleting logs


Log groups aren't deleted automatically when you delete a function. To avoid storing logs indefinitely, delete the log group, or [configure a retention period](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/Working-with-log-groups-and-streams.html#SettingLogRetention) after which logs are deleted automatically.
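
For example, you can delete a function's log group with the AWS CLI (replace `my-function` with your function name):

```
aws logs delete-log-group --log-group-name /aws/lambda/my-function
```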

# Instrumenting C# code in AWS Lambda
Tracing

Lambda integrates with AWS X-Ray to help you trace, debug, and optimize Lambda applications. You can use X-Ray to trace a request as it traverses resources in your application, which may include Lambda functions and other AWS services.

To send tracing data to X-Ray, you can use one of three SDK libraries:
+ [AWS Distro for OpenTelemetry (ADOT)](https://aws.amazon.com/otel) – A secure, production-ready, AWS-supported distribution of the OpenTelemetry (OTel) SDK.
+ [AWS X-Ray SDK for .NET](https://docs.aws.amazon.com/xray/latest/devguide/xray-sdk-dotnet.html) – An SDK for generating and sending trace data to X-Ray.
+ [Powertools for AWS Lambda (.NET)](https://docs.aws.amazon.com/powertools/dotnet/) – A developer toolkit to implement Serverless best practices and increase developer velocity.

Each of these SDKs offers ways to send your telemetry data to the X-Ray service. You can then use X-Ray to view, filter, and gain insights into your application's performance metrics to identify issues and opportunities for optimization.

**Important**  
The X-Ray and Powertools for AWS Lambda SDKs are part of a tightly integrated instrumentation solution offered by AWS. The ADOT Lambda Layers are part of an industry-wide standard for tracing instrumentation that collects more data in general, but may not be suited for all use cases. You can implement end-to-end tracing in X-Ray using either solution. To learn more about choosing between them, see [Choosing between the AWS Distro for OpenTelemetry and X-Ray SDKs](https://docs.aws.amazon.com/xray/latest/devguide/xray-instrumenting-your-app.html#xray-instrumenting-choosing).

**Topics**
+ [Using Powertools for AWS Lambda (.NET) and AWS SAM for tracing](#dotnet-tracing-sam)
+ [Using the X-Ray SDK to instrument your .NET functions](#dotnet-xray-sdk)
+ [Activating tracing with the Lambda console](#dotnet-tracing-console)
+ [Activating tracing with the Lambda API](#dotnet-tracing-api)
+ [Activating tracing with CloudFormation](#dotnet-tracing-cloudformation)
+ [Interpreting an X-Ray trace](#dotnet-tracing-interpretation)

## Using Powertools for AWS Lambda (.NET) and AWS SAM for tracing


Follow the steps below to download, build, and deploy a sample Hello World C# application with integrated [Powertools for AWS Lambda (.NET)](https://docs.powertools.aws.dev/lambda-dotnet) modules using the AWS SAM CLI. This application implements a basic API backend and uses Powertools for emitting logs, metrics, and traces. It consists of an Amazon API Gateway endpoint and a Lambda function. When you send a GET request to the API Gateway endpoint, Lambda invokes your function, which sends logs and metrics using Embedded Metric Format to CloudWatch and sends traces to AWS X-Ray. The function returns a hello world message.

**Prerequisites**

To complete the steps in this section, you must have the following:
+ .NET 8
+ [AWS CLI version 2](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html)
+ [AWS SAM CLI version 1.75 or later](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html). If you have an older version of the AWS SAM CLI, see [Upgrading the AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/manage-sam-cli-versions.html#manage-sam-cli-versions-upgrade).

**Deploy a sample AWS SAM application**

1. Initialize the application using the Hello World .NET template.

   ```
   sam init --app-template hello-world-powertools-dotnet --name sam-app --package-type Zip --runtime dotnet8 --no-tracing
   ```

1. Build the app.

   ```
   cd sam-app && sam build
   ```

1. Deploy the app.

   ```
   sam deploy --guided
   ```

1. Follow the on-screen prompts. To accept the default options provided in the interactive experience, press `Enter`.
**Note**  
For **HelloWorldFunction may not have authorization defined, Is this okay?**, make sure to enter `y`.

1. Get the URL of the deployed application:

   ```
   aws cloudformation describe-stacks --stack-name sam-app --query 'Stacks[0].Outputs[?OutputKey==`HelloWorldApi`].OutputValue' --output text
   ```

1. Invoke the API endpoint:

   ```
   curl <URL_FROM_PREVIOUS_STEP>
   ```

   If successful, you'll see this response:

   ```
   {"message":"hello world"}
   ```

1. To get the traces for the function, run [sam traces](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-traces.html).

   ```
   sam traces
   ```

   The trace output looks like this:

   ```
   New XRay Service Graph
     Start time: 2023-02-20 23:05:16+08:00
     End time: 2023-02-20 23:05:16+08:00
     Reference Id: 0 - AWS::Lambda - sam-app-HelloWorldFunction-pNjujb7mEoew - Edges: [1]
      Summary_statistics:
        - total requests: 1
        - ok count(2XX): 1
        - error count(4XX): 0
        - fault count(5XX): 0
        - total response time: 2.814
     Reference Id: 1 - AWS::Lambda::Function - sam-app-HelloWorldFunction-pNjujb7mEoew - Edges: []
      Summary_statistics:
        - total requests: 1
        - ok count(2XX): 1
        - error count(4XX): 0
        - fault count(5XX): 0
        - total response time: 2.429
     Reference Id: 2 - (Root) AWS::ApiGateway::Stage - sam-app/Prod - Edges: [0]
      Summary_statistics:
        - total requests: 1
        - ok count(2XX): 1
        - error count(4XX): 0
        - fault count(5XX): 0
        - total response time: 2.839
     Reference Id: 3 - client - sam-app/Prod - Edges: [2]
      Summary_statistics:
        - total requests: 0
        - ok count(2XX): 0
        - error count(4XX): 0
        - fault count(5XX): 0
        - total response time: 0
   
   XRay Event [revision 3] at (2023-02-20T23:05:16.521000) with id (1-63f38c2c-270200bf1d292a442c8e8a00) and duration (2.877s)
    - 2.839s - sam-app/Prod [HTTP: 200]
      - 2.836s - Lambda [HTTP: 200]
    - 2.814s - sam-app-HelloWorldFunction-pNjujb7mEoew [HTTP: 200]
    - 2.429s - sam-app-HelloWorldFunction-pNjujb7mEoew
      - 0.230s - Initialization
      - 2.389s - Invocation
        - 0.600s - ## FunctionHandler
          - 0.517s - Get Calling IP
      - 0.039s - Overhead
   ```

1. This is a public API endpoint that is accessible over the internet. We recommend that you delete the endpoint after testing.

   ```
   sam delete
   ```

X-Ray doesn't trace all requests to your application. X-Ray applies a sampling algorithm to ensure that tracing is efficient, while still providing a representative sample of all requests. The sampling rate is 1 request per second and 5 percent of additional requests. You can't configure the X-Ray sampling rate for your functions.

## Using the X-Ray SDK to instrument your .NET functions


You can instrument your function code to record metadata and trace downstream calls. To record detail about calls that your function makes to other resources and services, use the AWS X-Ray SDK for .NET. To get the SDK, add the `AWSXRayRecorder` packages to your project file.

```
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
    <GenerateRuntimeConfigurationFiles>true</GenerateRuntimeConfigurationFiles>
    <AWSProjectType>Lambda</AWSProjectType>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Amazon.Lambda.Core" Version="2.1.0" />
    <PackageReference Include="Amazon.Lambda.APIGatewayEvents" Version="2.7.0" />
    <PackageReference Include="Amazon.Lambda.Serialization.SystemTextJson" Version="2.4.0" />
    <PackageReference Include="AWSSDK.Core" Version="3.7.103.24" />
    <PackageReference Include="AWSSDK.Lambda" Version="3.7.104.3" />
    <PackageReference Include="AWSXRayRecorder.Core" Version="2.13.0" />
    <PackageReference Include="AWSXRayRecorder.Handlers.AwsSdk" Version="2.11.0" />
  </ItemGroup>
</Project>
```

There is a range of NuGet packages that provide auto-instrumentation for AWS SDKs, Entity Framework, and HTTP requests. For the complete set of configuration options, see [AWS X-Ray SDK for .NET](https://docs.aws.amazon.com/xray/latest/devguide/xray-sdk-dotnet.html) in the *AWS X-Ray Developer Guide*.
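For example, outbound HTTP calls can be traced by wrapping `HttpClient` in the X-Ray tracing handler. This is a minimal sketch, assuming the `AWSXRayRecorder.Handlers.System.Net` NuGet package is referenced in your project file:

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using Amazon.XRay.Recorder.Handlers.System.Net;

public class ApiClient
{
    // Wrap the standard HttpClientHandler in the X-Ray tracing handler so that
    // every outbound request is recorded as a subsegment of the current trace.
    private static readonly HttpClient Client =
        new HttpClient(new HttpClientXRayTracingHandler(new HttpClientHandler()));

    public async Task<string> GetAsync(string url)
    {
        var response = await Client.GetAsync(url);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```

Because the handler is a `DelegatingHandler`, existing code that uses the wrapped `HttpClient` needs no further changes to appear in traces.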

Once you have added the desired NuGet packages, configure auto-instrumentation. As a best practice, perform this configuration outside of your function handler. This allows you to take advantage of execution environment reuse to improve the performance of your function. In the following code example, the `RegisterXRayForAllServices` method is called in the function constructor to add instrumentation for all AWS SDK calls.

```
using System.Net;
using System.Text.Json;
using System.Threading.Tasks;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using Amazon.XRay.Recorder.Handlers.AwsSdk;

[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace GetProductHandler;

public class Function
{
    private readonly IDatabaseRepository _repo;
    
    public Function()
    {
        // Add auto instrumentation for all AWS SDK calls
        // It is important to call this method before initializing any SDK clients
        AWSSDKHandler.RegisterXRayForAllServices();
        this._repo = new DatabaseRepository();
    }
    
    public async Task<APIGatewayProxyResponse> FunctionHandler(APIGatewayProxyRequest request)
    {
        var id = request.PathParameters["id"];
        
        var databaseRecord = await this._repo.GetById(id);
        
        return new APIGatewayProxyResponse 
        {
            StatusCode = (int)HttpStatusCode.OK,
            Body = JsonSerializer.Serialize(databaseRecord)
        };
    }
}
```

## Activating tracing with the Lambda console


To turn on active tracing for your Lambda function with the console, follow these steps:

**To turn on active tracing**

1. Open the [Functions page](https://console.aws.amazon.com/lambda/home#/functions) of the Lambda console.

1. Choose a function.

1. Choose **Configuration** and then choose **Monitoring and operations tools**.

1. Under **Additional monitoring tools**, choose **Edit**.

1. Under **CloudWatch Application Signals and AWS X-Ray**, choose **Enable** for **Lambda service traces**.

1. Choose **Save**.

## Activating tracing with the Lambda API


To configure tracing on your Lambda function with the AWS CLI or AWS SDK, use the following API operations:
+ [UpdateFunctionConfiguration](https://docs.aws.amazon.com/lambda/latest/api/API_UpdateFunctionConfiguration.html)
+ [GetFunctionConfiguration](https://docs.aws.amazon.com/lambda/latest/api/API_GetFunctionConfiguration.html)
+ [CreateFunction](https://docs.aws.amazon.com/lambda/latest/api/API_CreateFunction.html)

The following example AWS CLI command enables active tracing on a function named **my-function**.

```
aws lambda update-function-configuration --function-name my-function \
--tracing-config Mode=Active
```

Tracing mode is part of the version-specific configuration when you publish a version of your function. You can't change the tracing mode on a published version.
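If you manage configuration from .NET code, the same change can be made with the `AWSSDK.Lambda` package. This is a minimal sketch; it assumes AWS credentials and a Region are configured in the environment, and that a function named `my-function` exists:

```csharp
using System;
using Amazon.Lambda;
using Amazon.Lambda.Model;

var client = new AmazonLambdaClient();

// Enable active tracing (the equivalent of --tracing-config Mode=Active).
await client.UpdateFunctionConfigurationAsync(new UpdateFunctionConfigurationRequest
{
    FunctionName = "my-function",
    TracingConfig = new TracingConfig { Mode = TracingMode.Active }
});

// Read the configuration back to confirm the change.
var config = await client.GetFunctionConfigurationAsync("my-function");
Console.WriteLine(config.TracingConfig.Mode);
```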

## Activating tracing with CloudFormation


To activate tracing on an `AWS::Lambda::Function` resource in a CloudFormation template, use the `TracingConfig` property.

**Example [function-inline.yml](https://github.com/awsdocs/aws-lambda-developer-guide/blob/master/templates/function-inline.yml) – Tracing configuration**  

```
Resources:
  function:
    Type: [AWS::Lambda::Function](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-lambda-function.html)
    Properties:
      TracingConfig:
        Mode: Active
      ...
```

For an AWS Serverless Application Model (AWS SAM) `AWS::Serverless::Function` resource, use the `Tracing` property.

**Example [template.yml](https://github.com/awsdocs/aws-lambda-developer-guide/tree/main/sample-apps/blank-nodejs/template.yml) – Tracing configuration**  

```
Resources:
  function:
    Type: [AWS::Serverless::Function](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-resource-function.html)
    Properties:
      Tracing: Active
      ...
```

## Interpreting an X-Ray trace


Your function needs permission to upload trace data to X-Ray. When you activate tracing in the Lambda console, Lambda adds the required permissions to your function's [execution role](lambda-intro-execution-role.md). Otherwise, add the [AWSXRayDaemonWriteAccess](https://console.aws.amazon.com/iam/home#/policies/arn:aws:iam::aws:policy/AWSXRayDaemonWriteAccess) policy to the execution role.

After you've configured active tracing, you can observe specific requests through your application. The [X-Ray service graph](https://docs.aws.amazon.com/xray/latest/devguide/aws-xray.html#xray-concepts-servicegraph) shows information about your application and all its components. The following example shows an application with two functions. The primary function processes events and sometimes returns errors. The second function, at the top, processes errors that appear in the first function's log group and uses the AWS SDK to call X-Ray, Amazon Simple Storage Service (Amazon S3), and Amazon CloudWatch Logs.

![\[\]](http://docs.aws.amazon.com/lambda/latest/dg/images/sample-errorprocessor-servicemap.png)



In X-Ray, a *trace* records information about a request that is processed by one or more *services*. Lambda records two segments per trace, which creates two nodes on the service graph. The following image highlights these two nodes:

![\[\]](http://docs.aws.amazon.com/lambda/latest/dg/images/xray-servicemap-function.png)


The first node on the left represents the Lambda service, which receives the invocation request. The second node represents your specific Lambda function. The following example shows a trace with these two segments. Both are named **my-function**, but one has an origin of `AWS::Lambda` and the other has an origin of `AWS::Lambda::Function`. If the `AWS::Lambda` segment shows an error, the Lambda service had an issue. If the `AWS::Lambda::Function` segment shows an error, your function had an issue.

![\[\]](http://docs.aws.amazon.com/lambda/latest/dg/images/V2_sandbox_images/my-function-2-v1.png)


This example expands the `AWS::Lambda::Function` segment to show its three subsegments.

**Note**  
AWS is currently implementing changes to the Lambda service. Due to these changes, you may see minor differences between the structure and content of system log messages and trace segments emitted by different Lambda functions in your AWS account.  
The example trace shown here illustrates the old-style function segment. The differences between the old- and new-style segments are described in the following paragraphs.  
These changes will be implemented during the coming weeks, and all functions in all AWS Regions except the China and GovCloud regions will transition to use the new-format log messages and trace segments.

The old-style function segment contains the following subsegments:
+ **Initialization** – Represents time spent loading your function and running [initialization code](foundation-progmodel.md). This subsegment only appears for the first event that each instance of your function processes.
+ **Invocation** – Represents the time spent running your handler code.
+ **Overhead** – Represents the time the Lambda runtime spends preparing to handle the next event.

The new-style function segment doesn't contain an `Invocation` subsegment. Instead, customer subsegments are attached directly to the function segment. For more information about the structure of the old- and new-style function segments, see [Understanding X-Ray traces](services-xray.md#services-xray-traces).

You can also instrument HTTP clients, record SQL queries, and create custom subsegments with annotations and metadata. For more information, see the [AWS X-Ray SDK for .NET](https://docs.aws.amazon.com/xray/latest/devguide/xray-sdk-dotnet.html) in the *AWS X-Ray Developer Guide*.
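For example, a custom subsegment with an annotation can be opened around a block of work using the recorder from `AWSXRayRecorder.Core`. This is a sketch; the subsegment name, annotation key, and `OrderProcessor` class are illustrative:

```csharp
using System;
using Amazon.XRay.Recorder.Core;

public class OrderProcessor
{
    public void Process(string orderId)
    {
        // Open a custom subsegment under the current function segment.
        AWSXRayRecorder.Instance.BeginSubsegment("process-order");
        try
        {
            // Annotations are indexed by X-Ray and can be used to filter traces.
            AWSXRayRecorder.Instance.AddAnnotation("order_id", orderId);

            // ... do the actual work here ...
        }
        catch (Exception e)
        {
            // Record the exception on the subsegment before rethrowing.
            AWSXRayRecorder.Instance.AddException(e);
            throw;
        }
        finally
        {
            // Always close the subsegment, even on failure.
            AWSXRayRecorder.Instance.EndSubsegment();
        }
    }
}
```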

**Pricing**  
You can use X-Ray tracing for free each month up to a certain limit as part of the AWS Free Tier. Beyond that threshold, X-Ray charges for trace storage and retrieval. For more information, see [AWS X-Ray pricing](https://aws.amazon.com/xray/pricing/).

# AWS Lambda function testing in C#
Testing

**Note**  
See the [Testing functions](testing-guide.md) chapter for a complete introduction to techniques and best practices for testing serverless solutions. 

Testing serverless functions uses traditional test types and techniques, but you must also consider testing serverless applications as a whole. Cloud-based tests will provide the **most accurate** measure of quality of both your functions and serverless applications.

A serverless application architecture includes managed services that provide critical application functionality through API calls. For this reason, your development cycle should include automated tests that verify functionality when your function and services interact.

If you do not create cloud-based tests, you could encounter issues due to differences between your local environment and the deployed environment. Your continuous integration process should run tests against a suite of resources provisioned in the cloud before promoting your code to the next deployment environment, such as QA, Staging, or Production.

Continue reading this short guide to learn about testing strategies for serverless applications, or visit the [Serverless Test Samples repository](https://github.com/aws-samples/serverless-test-samples) to dive in with practical examples, specific to your chosen language and runtime.

![\[illustration showing the relationship between types of tests\]](http://docs.aws.amazon.com/lambda/latest/dg/images/test-type-illustration2.png)

For serverless testing, you will still write *unit*, *integration*, and *end-to-end* tests.
+ **Unit tests** - Tests that run against an isolated block of code. For example, verifying the business logic to calculate the delivery charge given a particular item and destination.
+ **Integration tests** - Tests involving two or more components or services that interact, typically in a cloud environment. For example, verifying a function processes events from a queue.
+ **End-to-end tests** - Tests that verify behavior across an entire application. For example, ensuring infrastructure is set up correctly and that events flow between services as expected to record a customer's order.

## Testing your serverless applications


You will generally use a mix of approaches to test your serverless application code, including testing in the cloud, testing with mocks, and occasionally testing with emulators.

### Testing in the cloud


Testing in the cloud is valuable for all phases of testing, including unit tests, integration tests, and end-to-end tests. You run tests against code deployed in the cloud and interacting with cloud-based services. This approach provides the **most accurate** measure of quality of your code.

A convenient way to debug your Lambda function in the cloud is through the console with a test event. A *test event* is a JSON input to your function. If your function does not require input, the event can be an empty JSON document (`{}`). The console provides sample events for a variety of service integrations. After creating an event in the console, you can share it with your team to make testing easier and consistent.
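For example, a console test event for the `GetProductHandler` function shown later on this page only needs to supply the path parameter that the handler reads (the `id` value here is illustrative):

```json
{
  "pathParameters": {
    "id": "prod-123"
  }
}
```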

**Note**  
[Testing a function in the console](testing-functions.md) is a quick way to get started, but automating your test cycles ensures application quality and development speed. 

### Testing tools


To accelerate your development cycle, there are a number of tools and techniques you can use when testing your functions. For example, [AWS SAM Accelerate](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/using-sam-cli-sync.html) and [AWS CDK watch mode](https://docs.aws.amazon.com/cdk/v2/guide/cli.html#cli-deploy-watch) both decrease the time required to update cloud environments.

The way you define your Lambda function code makes it simple to add unit tests. Lambda requires a public, parameterless constructor to initialize your class. Introducing a second, internal constructor gives you control of the dependencies your application uses.

```
using System.Net;
using System.Text.Json;
using System.Threading.Tasks;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;

[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace GetProductHandler;

public class Function
{
    private readonly IDatabaseRepository _repo;
    
    public Function(): this(null)
    {
    }
    
    internal Function(IDatabaseRepository repo)
    {
        this._repo = repo ?? new DatabaseRepository();
    }
    
    public async Task<APIGatewayProxyResponse> FunctionHandler(APIGatewayProxyRequest request)
    {
        var id = request.PathParameters["id"];
        
        var databaseRecord = await this._repo.GetById(id);
        
        return new APIGatewayProxyResponse 
        {
            StatusCode = (int)HttpStatusCode.OK,
            Body = JsonSerializer.Serialize(databaseRecord)
        };
    }
}
```

To write a test for this function, you can initialize a new instance of your `Function` class and pass in a mocked implementation of the `IDatabaseRepository`. The example below uses `xUnit`, `Moq`, and `FluentAssertions` to write a simple test ensuring that `FunctionHandler` returns a 200 status code.

```
using System.Threading.Tasks;
using Amazon.Lambda.APIGatewayEvents;
using Xunit;
using Moq;
using FluentAssertions;

public class FunctionTests
{
    [Fact]
    public async Task TestLambdaHandler_WhenInputIsValid_ShouldReturn200StatusCode()
    {
        // Arrange
        var mockDatabaseRepository = new Mock<IDatabaseRepository>();
        
        var functionUnderTest = new Function(mockDatabaseRepository.Object);
        
        // Act
        var response = await functionUnderTest.FunctionHandler(new APIGatewayProxyRequest());
        
        // Assert
        response.StatusCode.Should().Be(200);
    }
}
```
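Building on the example above, you can also use the mock to verify interactions. The following sketch assumes the same hypothetical `IDatabaseRepository` interface, and checks that the handler passes the path parameter through to the repository:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon.Lambda.APIGatewayEvents;
using Xunit;
using Moq;

public class FunctionInteractionTests
{
    [Fact]
    public async Task TestLambdaHandler_ShouldQueryRepositoryWithPathParameterId()
    {
        // Arrange: build a request whose path contains an illustrative id.
        var mockDatabaseRepository = new Mock<IDatabaseRepository>();
        var functionUnderTest = new Function(mockDatabaseRepository.Object);
        var request = new APIGatewayProxyRequest
        {
            PathParameters = new Dictionary<string, string> { ["id"] = "prod-123" }
        };

        // Act
        await functionUnderTest.FunctionHandler(request);

        // Assert: the repository was queried exactly once, with the id from the path.
        mockDatabaseRepository.Verify(repo => repo.GetById("prod-123"), Times.Once);
    }
}
```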

For more detailed examples, including examples of asynchronous tests, see the [.NET testing samples repository](https://github.com/aws-samples/serverless-test-samples/tree/main/dotnet-test-samples) on GitHub.