

# Use action types, custom actions, and approval actions

In AWS CodePipeline, an action is part of the sequence in a stage of a pipeline. It is a task performed on the artifact in that stage. Pipeline actions occur in a specified order, in sequence or in parallel, as determined in the configuration of the stage.

CodePipeline provides support for six types of actions:
+ Source 
+ Build 
+ Test 
+ Deploy 
+ Approval 
+ Invoke 

For information about the AWS service and partner products and services you can integrate into your pipeline based on action type, see [Integrations with CodePipeline action types](integrations-action-type.md).

**Topics**
+ [Working with action types](action-types.md)
+ [Create a custom action for a pipeline](actions-create-custom-action.md)
+ [Tag a custom action in CodePipeline](customactions-tag.md)
+ [Invoke a Lambda function in a pipeline](actions-invoke-lambda-function.md)
+ [Add a manual approval action to a stage](approvals.md)
+ [Add a cross-Region action to a pipeline](actions-create-cross-region.md)
+ [Working with variables](actions-variables.md)

# Working with action types

Action types are preconfigured actions that you, as a provider, create for customers by using one of the supported integration models in AWS CodePipeline. 

You can request, view, and update action types. If an action type was created for your account as the owner, you can use the AWS CLI to view or update its properties and structure. After the action type is available in CodePipeline, your customers can choose the action and add it to their pipelines.

**Note**  
You create actions with `custom` in the `owner` field to run with a job worker. You do not create them with an integration model. For information about custom actions, see [Create and add a custom action in CodePipeline](actions-create-custom-action.md).

**Action type components**

The following components make up an action type.
+ **Action type ID** – The *ID* consists of the category, owner, provider, and version. The following example shows an action type ID with an owner of `ThirdParty`, a category of `Test`, a provider named `TestProvider`, and a version of `1`.

  ```
  {
      "Category": "Test",
      "Owner": "ThirdParty",
      "Provider": "TestProvider",
      "Version": "1"
  }
  ```
+ **Executor configuration** – The integration model, or action engine, specified when the action is created. When you specify the executor for an action type, you choose one of two types:
  + *Lambda:* The action type owner writes the integration as a Lambda function, which is invoked by CodePipeline whenever there is a job available for the action.
  + *JobWorker:* The action type owner writes the integration as a job worker that polls for available jobs on customer pipelines. The job worker then runs the job and submits the job result back to CodePipeline by using CodePipeline APIs.
**Note**  
The job worker integration model is not the preferred integration model; the Lambda integration model is preferred.
+ **Input and output artifacts:** Limits for the artifacts that the action type owner designates for customers of the action.
+ **Permissions:** The permissions strategy that designates customers who can access the third-party action type. The permissions strategies available depend on the chosen integration model for the action type.
+ **URLs:** Deep links to resources that the customer can interact with, such as the action type owner's configuration page.

**Topics**
+ [Request an action type](#action-types-request)
+ [Add an available action type to a pipeline (console)](#action-types-in-pipelines)
+ [View an action type](#action-types-view-cli)
+ [Update an action type](#action-types-update-cli)

## Request an action type


When a third-party provider requests a new CodePipeline action type, the action type is created for the owner in CodePipeline, and the owner can then view and manage the action type.

An action type can be either a private or public action. When your action type is created, it is private. To request that an action type be changed to a public action, contact the CodePipeline service team.

Before you create your action definition file, executor resources, and action type request for the CodePipeline team, you must choose an integration model.



### Step 1: Choose your integration model


Choose your integration model, and then configure the integration resources for that model.
+ For the Lambda integration model, you create a Lambda function and grant the CodePipeline service permission to invoke it by using the CodePipeline service principal, `codepipeline.amazonaws.com`. You can add the permissions by using CloudFormation or the command line.
  + Example for adding permissions using CloudFormation:

    ```
      CodePipelineLambdaBasedActionPermission:
        Type: 'AWS::Lambda::Permission'
        Properties:
          Action: 'lambda:InvokeFunction'
          FunctionName: {"Fn::Sub": "arn:${AWS::Partition}:lambda:${AWS::Region}:${AWS::AccountId}:function:function-name"}
          Principal: codepipeline.amazonaws.com
    ```
  + [Documentation for command line](https://docs.aws.amazon.com/cli/latest/reference/lambda/add-permission.html)
+ For the job worker integration model, you create an integration with a list of allowed accounts where the job worker polls for jobs with the CodePipeline APIs.
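For the Lambda integration model described above, the permission can also be granted with a single AWS CLI command. The following sketch assumes a function named `function-name`; replace it with your own function name or ARN:

```shell
# Allow the CodePipeline service principal to invoke the integrator function.
aws lambda add-permission \
  --function-name function-name \
  --statement-id codepipeline-invoke \
  --action lambda:InvokeFunction \
  --principal codepipeline.amazonaws.com
```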

### Step 2: Create an action type definition file


You define an action type in an action type definition file using JSON. In the file, you include the action category, the integration model used to manage the action type, and configuration properties. 

**Note**  
After you create a public action, you can't change the action type property under `properties` from `optional` to `required`. You also can't change the `owner`.

For more information about the action type definition file parameters, see [ActionTypeDeclaration](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ActionTypeDeclaration.html) and [UpdateActionType](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_UpdateActionType.html) in the [CodePipeline API Reference](https://docs.aws.amazon.com/codepipeline/latest/APIReference/). 

There are eight sections in the action type definition file:
+ `description`: The description for the action type to be updated.
+ `executor`: Information about the executor for an action type that was created with a supported integration model, either `Lambda` or `JobWorker`. Provide only the configuration that matches your executor type, either `lambdaExecutorConfiguration` or `jobWorkerExecutorConfiguration`.
  + `configuration`: Resources for the configuration of the action type, based on the chosen integration model. For the Lambda integration model, use the Lambda function ARN. For the job worker integration model, use the account or list of accounts from where the job worker runs.
  + `jobTimeout`: The timeout in seconds for the job. An action execution can consist of multiple jobs. This is the timeout for a single job, and not for the entire action execution.
**Note**  
For the Lambda integration model, the maximum timeout is 15 minutes.
  + `policyStatementsTemplate`: The policy statement that specifies the permissions in the CodePipeline customer’s account that are needed to successfully run an action execution.
  + `type`: The integration model used to create and update the action type, either `Lambda` or `JobWorker`.
+ `id`: The category, owner, provider, and version ID for the action type: 
  + `category`: The kind of action that can be taken in the stage: Source, Build, Deploy, Test, Invoke, or Approval. 
  + `provider`: The provider of the action type being called, such as the provider company or product name. The provider name is supplied when the action type is created.
  + `owner`: The creator of the action type being called: `AWS` or `ThirdParty`.
  + `version`: A string used to version the action type. For the first version, set the version number to 1.
+ `inputArtifactDetails`: The number of artifacts to expect from the previous stage in the pipeline.
+ `outputArtifactDetails`: The number of artifacts to expect as the result of the action type stage. 
+ `permissions`: Details identifying the accounts with permissions to use the action type.
+ `properties`: The parameters required for your project tasks to complete.
  + `description`: The description of the property that is displayed to users.
  + `optional`: Whether the configuration property is optional.
  + `noEcho`: Whether the field value entered by the customer is omitted from the log. If `true`, then the value is redacted when returned with a GetPipeline API request.
  + `key`: Whether the configuration property is a key.
  + `queryable`: Whether the property is used with polling. An action type can have up to one queryable property. If it has one, that property must be both required and not secret.
  + `name`: The property name that is displayed to users.
+ `urls`: A list of the URLs CodePipeline displays to your users.
  + `entityUrlTemplate`: URL to the external resources for the action type, such as a configuration page.
  + `executionUrlTemplate`: URL to the details for the latest run of the action.
  + `revisionUrlTemplate`: URL displayed in the CodePipeline console to the page where customers can update or change the configuration of the external action.
  + `thirdPartyConfigurationUrl`: URL of a page where users can sign up for an external service and perform initial configuration of the action provided by that service.

The following code shows an example action type definition file.

```
{
   "actionType": { 
      "description": "string",
      "executor": { 
         "configuration": { 
            "jobWorkerExecutorConfiguration": { 
               "pollingAccounts": [ "string" ],
               "pollingServicePrincipals": [ "string" ]
            },
            "lambdaExecutorConfiguration": { 
               "lambdaFunctionArn": "string"
            }
         },
         "jobTimeout": number,
         "policyStatementsTemplate": "string",
         "type": "string"
      },
      "id": { 
         "category": "string",
         "owner": "string",
         "provider": "string",
         "version": "string"
      },
      "inputArtifactDetails": { 
         "maximumCount": number,
         "minimumCount": number
      },
      "outputArtifactDetails": { 
         "maximumCount": number,
         "minimumCount": number
      },
      "permissions": { 
         "allowedAccounts": [ "string" ]
      },
      "properties": [ 
         { 
            "description": "string",
            "key": boolean,
            "name": "string",
            "noEcho": boolean,
            "optional": boolean,
            "queryable": boolean
         }
      ],
      "urls": { 
         "configurationUrl": "string",
         "entityUrlTemplate": "string",
         "executionUrlTemplate": "string",
         "revisionUrlTemplate": "string"
      }
   }
}
```
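Before submitting your request, you might want to sanity-check the definition file locally. The following Python sketch is an illustration, not an official validator: the section names come from the list above, and it verifies that the required top-level sections are present and that the `executor` type matches its configuration block.

```python
import json

# Illustrative only: top-level sections from the action type definition file.
SECTIONS = {
    "description", "executor", "id", "inputArtifactDetails",
    "outputArtifactDetails", "permissions", "properties", "urls",
}

# Each executor type must come with its matching configuration block.
EXECUTOR_CONFIG = {
    "Lambda": "lambdaExecutorConfiguration",
    "JobWorker": "jobWorkerExecutorConfiguration",
}

def lint_definition(text: str) -> list:
    """Return a list of problems found in an action type definition file."""
    problems = []
    action_type = json.loads(text).get("actionType", {})
    for section in sorted(SECTIONS - set(action_type)):
        problems.append(f"missing section: {section}")
    executor = action_type.get("executor", {})
    expected = EXECUTOR_CONFIG.get(executor.get("type"))
    if expected and expected not in executor.get("configuration", {}):
        problems.append(f"executor type {executor['type']} needs {expected}")
    return problems
```

Running this against your draft file catches the most common mismatch, a `type` of `Lambda` paired with a `jobWorkerExecutorConfiguration` block (or the reverse), before the file reaches the CodePipeline team.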



### Step 3: Register your integration with CodePipeline

To register your action type with CodePipeline, you contact the CodePipeline service team with your request.

The CodePipeline service team registers the new action type integration by making changes in the service codebase. CodePipeline registers two new actions: a *public action* and a *private action*. You use the private action for testing, and then when ready, you activate the public action to serve customer traffic.

**To register a request for a Lambda integration**
+ Send a request to the CodePipeline service team using the following form.

  ```
  This issue will track the onboarding of [Name] in CodePipeline.
  
  
  [Contact engineer] will be the primary point of contact for this integration.
  
  Name of the action type as you want it to appear to customers: Example.com Testing
  
  Initial onboard checklist:
  
  1. Attach an action type definition file in JSON format. This includes the schema for the action type
  
  2. A list of test accounts for the allowlist which can access the new action type [{account, account_name}]
  
  3. The Lambda function ARN
  
  4. List of AWS Regions where your action will be available
  
  5. Will this be available as a public action?
  ```

**To register a request for a job worker integration**
+ Send a request to the CodePipeline service team using the following form.

  ```
  This issue will track the onboarding of [Name] in CodePipeline.
  
  [Contact engineer] will be the primary point of contact for this integration.
  
  
  Name of the action type as you want it to appear to customers: Example.com Testing
  
  Initial onboard checklist:
  
  1. Attach an action type definition file in JSON format. This includes the schema for the action type.
  
  2. A list of test accounts for the allowlist which can access the new action type [{account, account_name}]
  
  3. URL information:
  Website URL: https://www.example.com/%TestThirdPartyName%/%TestVersionNumber%
  
  Example URL pattern where customers will be able to review their configuration information for the action: https://www.example.com/%TestThirdPartyName%/%customer-ID%/%CustomerActionConfiguration%
  
  Example runtime URL pattern: https://www.example.com/%TestThirdPartyName%/%customer-ID%/%TestRunId%
  
  4. List of AWS Regions where your action will be available
  
  5. Will this be available as a public action?
  ```

### Step 4: Activate your new integration


Contact the CodePipeline service team when you are ready to use the new integration publicly.

## Add an available action type to a pipeline (console)


You add your action type to a pipeline so that you can test it. You can do this by creating a new pipeline or editing an existing one. 

**Note**  
If your action type is a source, build, or deploy category action, you can add it by creating a pipeline. If your action type is in the test category, you must add it by editing an existing pipeline.

**To add your action type to an existing pipeline from the CodePipeline console**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. In the list of pipelines, choose the pipeline where you want to add the action type.

1. On the summary view page of the pipeline, choose **Edit**.

1. Choose the stage you want to edit. In the stage where you want to add your action type, choose **Add action group**. The **Edit action** page displays.

1. On the **Edit action** page, in **Action name**, enter a name for the action. This is the name that displays for the stage in your pipeline.

1. In **Action provider**, choose your action type from the list. 

   Note that the value in the list is based on the `provider` specified in the action type definition file.

1. In **Input artifacts**, enter the artifact name in this format:

   `Artifactname::FileName`

   Note that the minimum and maximum quantities allowed are defined based on the `inputArtifactDetails` specified in the action type definition file.

1. Choose **Connect to <action name\>**.

   A browser window opens and connects to the website you have created for your action type.

1. Log in to your website as a customer and complete the steps a customer takes to use your action type. Your steps will vary depending on your action category, website, and configuration, but they usually include a completion action that returns the customer to the **Edit action** page.

1. On the CodePipeline **Edit action** page, the additional configuration fields for the action display. The fields that display are the configuration properties that you specified in the action definition file. Enter the information in the fields that are customized for your action type.

   For example, if the action definition file specified a property named `Host`, then a field with the label **Host** is shown on the **Edit action** page for your action.

1. In **Output artifacts**, enter the artifact name in this format:

   `Artifactname::FileName`

   Note that the minimum and maximum quantities allowed are defined based on the `outputArtifactDetails` specified in the action type definition file.

1. Choose **Done** to return to the pipeline details page.
**Note**  
Your customers can optionally use the CLI to add the action type to their pipeline. 

1. To test your action, commit a change to the source specified in the source stage of the pipeline or follow the steps in [Manually Start a Pipeline](https://docs.aws.amazon.com/codepipeline/latest/userguide/how-to-manually-start.html).

To create a pipeline with your action type, follow the steps in [Create a pipeline, stages, and actions](pipelines-create.md) and choose your action type in each stage where you want to test it.

## View an action type


You can use the CLI to view your action type. Use the **get-action-type** command to view action types that have been created using an integration model.

**To view an action type**

1. Create an input JSON file and name the file `file.json`. Add your action type ID in JSON format as shown in the following example.

   ```
   {
       "category": "Test",
       "owner": "ThirdParty",
       "provider": "TestProvider",
       "version": "1"
   }
   ```

1. In a terminal window or at the command line, run the **get-action-type** command.

   ```
   aws codepipeline get-action-type --cli-input-json file://file.json
   ```

   This command returns the action definition output for an action type. This example shows an action type that was created with the Lambda integration model.

   ```
   {
       "actionType": {
           "executor": {
               "configuration": {
                   "lambdaExecutorConfiguration": {
                       "lambdaFunctionArn": "arn:aws:lambda:us-west-2:<account-id>:function:my-function"
                   }
               },
               "type": "Lambda"
           },
           "id": {
               "category": "Test",
               "owner": "ThirdParty",
               "provider": "TestProvider",
               "version": "1"
           },
           "inputArtifactDetails": {
               "minimumCount": 0,
               "maximumCount": 1
           },
           "outputArtifactDetails": {
               "minimumCount": 0,
               "maximumCount": 1
           },
           "permissions": {
               "allowedAccounts": [
                   "<account-id>"
               ]
           },
           "properties": []
       }
   }
   ```

## Update an action type


You can use the CLI to edit action types that are created with an integration model.

For a public action type, you can't update the owner, you can't change optional properties to required, and you can only add new optional properties.

1. Use the `get-action-type` command to get the structure for your action type. Copy the structure.

1. Create an input JSON file and name it `action.json`. Paste the action type structure you copied in the previous step into it. Update any parameters you want to change. You can also add optional parameters. 

   For more information about the parameters for the input file, see the action definition file description in [Step 2: Create an action type definition file](#action-type-definition-file).

   The following example shows how to update an example action type created with the Lambda integration model. This example makes the following changes:
   + Changes the `provider` name to `TestProvider1`.
   + Adds a job timeout limit of 900 seconds.
   + Adds an action configuration property named `Host` that is displayed to the customer using the action.

     ```
     {
         "actionType": {
             "executor": {
                 "configuration": {
                     "lambdaExecutorConfiguration": {
                         "lambdaFunctionArn": "arn:aws:lambda:us-west-2:<account-id>:function:my-function"
                     }
                 },
                 "type": "Lambda",
                 "jobTimeout": 900 
             },
             "id": {
                 "category": "Test",
                 "owner": "ThirdParty",
                 "provider": "TestProvider1",
                 "version": "1"
             },
             "inputArtifactDetails": {
                 "minimumCount": 0,
                 "maximumCount": 1
             },
             "outputArtifactDetails": {
                 "minimumCount": 0,
                 "maximumCount": 1
             },
             "permissions": {
                 "allowedAccounts": [
                     "account-id"
                 ]
             },
             "properties": [
                 {
                     "description": "Owned build action parameter description",
                     "optional": true,
                     "noEcho": false,
                     "key": true,
                     "queryable": false,
                     "name": "Host"
                 }
             ]
         }
     }
     ```

1. At the terminal or command line, run the **update-action-type** command.

   ```
   aws codepipeline update-action-type --cli-input-json file://action.json
   ```

   This command returns the action type output to match your updated parameters.

# Create and add a custom action in CodePipeline

AWS CodePipeline includes a number of actions that help you configure build, test, and deploy resources for your automated release process. If your release process includes activities that are not included in the default actions, such as an internally developed build process or a test suite, you can create a custom action for that purpose and include it in your pipeline. You can use the AWS CLI to create custom actions in pipelines associated with your AWS account.

You can create custom actions for the following AWS CodePipeline action categories:
+ A custom build action that builds or transforms the items
+ A custom deploy action that deploys items to one or more servers, websites, or repositories
+ A custom test action that configures and runs automated tests
+ A custom invoke action that runs functions

When you create a custom action, you must also create a job worker that will poll CodePipeline for job requests for this custom action, execute the job, and return the status result to CodePipeline. This job worker can be located on any computer or resource as long as it has access to the public endpoint for CodePipeline. To easily manage access and security, consider hosting your job worker on an Amazon EC2 instance. 

The following diagram shows a high-level view of a pipeline that includes a custom build action:

![\[A high-level view of a pipeline that includes a custom build action.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/PipelineCustomActionCS.png)


When a pipeline includes a custom action as part of a stage, the pipeline will create a job request. A custom job worker detects that request and performs that job (in this example, a custom process using third-party build software). When the action is complete, the job worker returns either a success result or a failure result. If a success result is received, the pipeline will provide the revision and its artifacts to the next action. If a failure is returned, the pipeline will not provide the revision to the next action in the pipeline.
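The request-acknowledge-report cycle above can be sketched in a few lines. The following Python example assumes a boto3 CodePipeline client (for example, `boto3.client("codepipeline")`) passed in as a parameter; the provider name and the work performed are placeholders, and error handling is omitted.

```python
# Action type ID for the custom action this job worker serves.
# "My-Build-Provider-Name" is a placeholder provider name.
ACTION_TYPE_ID = {
    "category": "Build",
    "owner": "Custom",
    "provider": "My-Build-Provider-Name",
    "version": "1",
}

def run_custom_work(job):
    """Placeholder for your build, test, or deploy logic."""
    return True

def poll_once(client):
    """Poll for jobs, acknowledge them, run the work, and report the result."""
    response = client.poll_for_jobs(actionTypeId=ACTION_TYPE_ID, maxBatchSize=1)
    for job in response.get("jobs", []):
        # Acknowledging a job tells CodePipeline this worker has claimed it.
        client.acknowledge_job(jobId=job["id"], nonce=job["nonce"])
        if run_custom_work(job):
            client.put_job_success_result(
                jobId=job["id"],
                executionDetails={"summary": "Build succeeded", "percentComplete": 100},
            )
        else:
            client.put_job_failure_result(
                jobId=job["id"],
                failureDetails={"type": "JobFailed", "message": "Build failed"},
            )
    return len(response.get("jobs", []))
```

A production worker would call `poll_once` in a loop with backoff, but the success path is the same: a success result lets the pipeline pass the revision to the next action, and a failure result stops it.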

**Note**  
These instructions assume that you have already completed the steps in [Getting started with CodePipeline](getting-started-codepipeline.md).

**Topics**
+ [Create a custom action](#actions-create-custom-action-cli)
+ [Create a job worker for your custom action](#actions-create-custom-action-job-worker)
+ [Add a custom action to a pipeline](#actions-create-custom-action-add)

## Create a custom action


**To create a custom action with the AWS CLI**

1. Open a text editor and create a JSON file for your custom action that includes the action category, the action provider, and any settings required by your custom action. For example, to create a custom build action that requires only one property, your JSON file might look like this:

   ```
   {
       "category": "Build",
       "provider": "My-Build-Provider-Name",
       "version": "1",
       "settings": {
           "entityUrlTemplate": "https://my-build-instance/job/{Config:ProjectName}/",
           "executionUrlTemplate": "https://my-build-instance/job/{Config:ProjectName}/lastSuccessfulBuild/{ExternalExecutionId}/"
       },
       "configurationProperties": [{
           "name": "ProjectName",
           "required": true,
           "key": true,
           "secret": false,
           "queryable": false,
           "description": "The name of the build project must be provided when this action is added to the pipeline.",
           "type": "String"
       }],
       "inputArtifactDetails": {
           "maximumCount": integer,
           "minimumCount": integer
       },
       "outputArtifactDetails": {
           "maximumCount": integer,
           "minimumCount": integer
       },
       "tags": [{
         "key": "Project",
         "value": "ProjectA"
       }]
   }
   ```

   This example adds tagging to the custom action by including the `Project` tag key and `ProjectA` value on the custom action. For more information about tagging resources in CodePipeline, see [Tagging resources](tag-resources.md).

   There are two properties included in the JSON file, `entityUrlTemplate` and `executionUrlTemplate`. You can refer to a name in the configuration properties of the custom action within the URL templates by following the format of `{Config:name}`, as long as the configuration property is both required and not secret. For example, in the sample above, the `entityUrlTemplate` value refers to the configuration property *ProjectName*.
   + `entityUrlTemplate`: the static link that provides information about the service provider for the action. In the example, the build system includes a static link to each build project. The link format will vary, depending on your build provider (or, if you are creating a different action type, such as test, other service provider). You must provide this link format so that when the custom action is added, the user can choose this link to open a browser to a page on your website that provides the specifics for the build project (or test environment).
   + `executionUrlTemplate`: the dynamic link that will be updated with information about the current or most recent run of the action. When your custom job worker updates the status of a job (for example, success, failure, or in progress), it will also provide an `externalExecutionId` that will be used to complete the link. This link can be used to provide details about the run of an action. 

   For example, when you view the action in the pipeline, you see the following two links:  
![\[Links in the CodePipeline console lead to more information about the run of a pipeline.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-calinksexplained.png)

   ![\[1\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/number-1.png) This static link appears after you add your custom action and points to the address in `entityUrlTemplate`, which you specify when you create your custom action.

   ![\[2\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/number-2.png) This dynamic link is updated after every run of the action and points to the address in `executionUrlTemplate`, which you specify when you create your custom action.

   For more information about these link types, as well as `RevisionURLTemplate` and `ThirdPartyURL`, see [ActionTypeSettings](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ActionTypeSettings.html) and [CreateCustomActionType](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_CreateCustomActionType.html) in the [CodePipeline API Reference](https://docs.aws.amazon.com/codepipeline/latest/APIReference/). For more information about action structure requirements and how to create an action, see [CodePipeline pipeline structure reference](reference-pipeline-structure.md).
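To illustrate how the `{Config:name}` and `{ExternalExecutionId}` placeholders resolve, the following Python sketch performs the substitution locally. CodePipeline does this for you at display time; this is only to show how the configuration values and the execution ID line up with the templates above.

```python
import re

def expand_template(template: str, config: dict, external_execution_id: str = "") -> str:
    """Replace {Config:<name>} and {ExternalExecutionId} placeholders in a URL template."""
    def sub(match):
        key = match.group(1)
        if key == "ExternalExecutionId":
            return external_execution_id
        if key.startswith("Config:"):
            # Look up the named configuration property, e.g. ProjectName.
            return str(config.get(key.split(":", 1)[1], ""))
        return match.group(0)  # leave unknown placeholders untouched
    return re.sub(r"\{([^{}]+)\}", sub, template)
```

For example, with the sample configuration above, `expand_template("https://my-build-instance/job/{Config:ProjectName}/", {"ProjectName": "mybuildjob"})` yields the static entity link, while passing an `external_execution_id` fills in the dynamic execution link.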

1. Save the JSON file and give it a name you can easily remember (for example, *MyCustomAction*.json).

1. Open a terminal session (Linux, OS X, Unix) or command prompt (Windows) on a computer where you have installed the AWS CLI.

1. Use the AWS CLI to run the **aws codepipeline create-custom-action-type** command, specifying the name of the JSON file you just created.

   For example, to create a build custom action:
**Important**  
Be sure to include `file://` before the file name. It is required in this command.

   ```
   aws codepipeline create-custom-action-type --cli-input-json file://MyCustomAction.json
   ```

1. This command returns the entire structure of the custom action you created, as well as the `JobList` action configuration property, which is added for you. When you add the custom action to a pipeline, you can use `JobList` to specify which projects from the provider you can poll for jobs. If you do not configure this, all available jobs will be returned when your custom job worker polls for jobs. 

   For example, the preceding command might return a structure similar to the following:

   ```
   {
       "actionType": {
           "inputArtifactDetails": {               
               "maximumCount": 1,                
               "minimumCount": 1
          },
          "actionConfigurationProperties": [
               {
                   "secret": false,
                   "required": true,
                   "name": "ProjectName",
                   "key": true,
                   "description": "The name of the build project must be provided when this action is added to the pipeline."
               }
           ],
           "outputArtifactDetails": {               
               "maximumCount": 0,                
               "minimumCount": 0
           },
           "id": {
               "category": "Build",
               "owner": "Custom",
               "version": "1",
               "provider": "My-Build-Provider-Name"
           },
           "settings": {
               "entityUrlTemplate": "https://my-build-instance/job/{Config:ProjectName}/",
               "executionUrlTemplate": "https://my-build-instance/job/mybuildjob/lastSuccessfulBuild/{ExternalExecutionId}/"
           }
       }
   }
   ```
**Note**  
As part of the output of the **create-custom-action-type** command, the `id` section includes `"owner": "Custom"`. CodePipeline automatically assigns `Custom` as the owner of custom action types. This value can't be assigned or changed when you use the **create-custom-action-type** command or the **update-pipeline** command.

## Create a job worker for your custom action


Custom actions require a job worker that will poll CodePipeline for job requests for the custom action, execute the job, and return the status result to CodePipeline. The job worker can be located on any computer or resource as long as it has access to the public endpoint for CodePipeline. 

There are many ways to design your job worker. The following sections provide some practical guidance for developing your custom job worker for CodePipeline.

**Topics**
+ [Choose and configure a permissions management strategy for your job worker](#actions-create-custom-action-permissions)
+ [Develop a job worker for your custom action](#actions-create-custom-action-job-worker-workflow)
+ [Custom job worker architecture and examples](#actions-create-custom-action-job-worker-common)

### Choose and configure a permissions management strategy for your job worker


To develop a custom job worker for your custom action in CodePipeline, you will need a strategy for the integration of user and permission management. 

The simplest strategy is to add the infrastructure you need for your custom job worker by creating Amazon EC2 instances with an IAM instance role, which allows you to easily scale up the resources you need for your integration. You can use the built-in integration with AWS to simplify the interaction between your custom job worker and CodePipeline. 

**To set up Amazon EC2 instances**

1. Learn more about Amazon EC2 and determine whether it is the right choice for your integration. For information, see [Amazon EC2 - Virtual Server Hosting](http://aws.amazon.com/ec2).

1. Get started creating your Amazon EC2 instances. For information, see [Getting Started with Amazon EC2 Linux Instances](https://docs.aws.amazon.com/AWSEC2/latest/GettingStartedGuide/).

Another strategy to consider is using identity federation with IAM to integrate your existing identity provider system and resources. This strategy is particularly useful if you already have a corporate identity provider or are already configured to support users using web identity providers. Identity federation allows you to grant secure access to AWS resources, including CodePipeline, without having to create or manage IAM users. You can use features and policies for password security requirements and credential rotation. You can use sample applications as templates for your own design. 

**To set up identity federation**

1. Learn more about IAM identity federation. For information, see [Manage Federation](http://aws.amazon.com/iam/details/manage-federation/).

1.  Review the examples in [Scenarios for Granting Temporary Access](https://docs.aws.amazon.com/STS/latest/UsingSTS/STSUseCases.html) to identify the scenario for temporary access that best fits the needs of your custom action.

1. Review code examples of identity federation relevant to your infrastructure, such as:
   + [Identity Federation Sample Application for an Active Directory Use Case](http://aws.amazon.com/code/1288653099190193)

1. Get started configuring identity federation. For information, see [Identity Providers and Federation](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_providers.html) in the *IAM User Guide*.

Create one of the following to use under your AWS account when running your custom action and job worker.



Users need programmatic access if they want to interact with AWS outside of the AWS Management Console. The way to grant programmatic access depends on the type of user that's accessing AWS.

To grant users programmatic access, choose one of the following options.


| Which user needs programmatic access? | To | By | 
| --- | --- | --- | 
| IAM | (Recommended) Use console credentials as temporary credentials to sign programmatic requests to the AWS CLI, AWS SDKs, or AWS APIs. |  Following the instructions for the interface that you want to use. [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/actions-create-custom-action.html)  | 
|  Workforce identity (Users managed in IAM Identity Center)  | Use temporary credentials to sign programmatic requests to the AWS CLI, AWS SDKs, or AWS APIs. |  Following the instructions for the interface that you want to use. [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/actions-create-custom-action.html)  | 
| IAM | Use temporary credentials to sign programmatic requests to the AWS CLI, AWS SDKs, or AWS APIs. | Following the instructions in [Using temporary credentials with AWS resources](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp_use-resources.html) in the IAM User Guide. | 
| IAM | (Not recommended) Use long-term credentials to sign programmatic requests to the AWS CLI, AWS SDKs, or AWS APIs. |  Following the instructions for the interface that you want to use. [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/actions-create-custom-action.html)  | 

The following is an example policy you might create for use with your custom job worker. This policy is meant as an example only and is provided as-is.

```
{
  "Version":"2012-10-17",		 	 	 
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "codepipeline:PollForJobs",
        "codepipeline:AcknowledgeJob",
        "codepipeline:GetJobDetails",
        "codepipeline:PutJobSuccessResult",
        "codepipeline:PutJobFailureResult"
      ],
      "Resource": [
        "arn:aws:codepipeline:us-east-2::actionType:custom/Build/MyBuildProject/1/"  
      ]              
    }
  ]
}
```


**Note**  
Consider using the `AWSCodePipelineCustomActionAccess` managed policy.

### Develop a job worker for your custom action


After you've chosen your permissions management strategy, you should consider how your job worker will interact with CodePipeline. The following high-level diagram shows the workflow of a custom action and job worker for a build process.

![\[The workflow of a custom action and job worker for a build process.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/PipelineCustomAgent.png)


1. Your job worker polls CodePipeline for jobs using `PollForJobs`.

1. When a pipeline is triggered by a change in its source stage (for example, when a developer commits a change), the automated release process begins. The process continues until the stage at which your custom action has been configured. When it reaches your action in this stage, CodePipeline queues a job. This job appears if your job worker calls `PollForJobs` again to get the job status. Your polling code takes the job details returned by `PollForJobs` and passes them to your job worker. 

1. The job worker calls `AcknowledgeJob` to send CodePipeline a job acknowledgment. CodePipeline returns an acknowledgment that indicates the job worker should continue the job (`InProgress`). If more than one job worker is polling for jobs and another job worker has already claimed the job, an `InvalidNonceException` error response is returned. After the `InProgress` acknowledgment, CodePipeline waits for results to be returned.

1. The job worker initiates your custom action on the revision, and then your action runs. Along with any other actions, your custom action returns a result to the job worker. In the example of a build custom action, the action pulls artifacts from the Amazon S3 bucket, builds them, and pushes successfully built artifacts back to the Amazon S3 bucket. 

1. While the action is running, the job worker can call `PutJobSuccessResult` with a continuation token (the serialization of the state of the job generated by the job worker, for example a build identifier in JSON format, or an Amazon S3 object key), as well as the `ExternalExecutionId` information that will be used to populate the link in `executionUrlTemplate`. This will update the console view of the pipeline with a working link to specific action details while it is in progress. Although not required, it is a best practice because it enables users to view the status of your custom action while it runs. 

   Once `PutJobSuccessResult` is called, the job is considered complete. A new job is created in CodePipeline that includes the continuation token. This job will appear if your job worker calls `PollForJobs` again. This new job can be used to check on the state of the action, and either returns with a continuation token, or returns without a continuation token once the action is complete.
**Note**  
If your job worker performs all the work for a custom action, you should consider breaking your job worker processing into at least two steps. The first step establishes the details page for your action. Once you have created the details page, you can serialize the state of the job worker and return it as a continuation token, subject to size limits (see [Quotas in AWS CodePipeline](limits.md)). For example, you could write the state of the action into the string you use as the continuation token. The second step (and subsequent steps) of your job worker processing perform the actual work of the action. The final step returns success or failure to CodePipeline, with no continuation token on the final step.

   For more information about using the continuation token, see the specifications for `PutJobSuccessResult` in the [CodePipeline API Reference](http://docs.aws.amazon.com/codepipeline/latest/APIReference).

1. Once the custom action completes, the job worker returns the result of the custom action to CodePipeline by calling one of two APIs: 
   + `PutJobSuccessResult` without a continuation token, which indicates the custom action ran successfully
   + `PutJobFailureResult`, which indicates the custom action did not run successfully

   Depending on the result, the pipeline will either continue on to the next action (success) or stop (failure).
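The numbered workflow above can be condensed into a single polling pass. The following Python sketch shows the shape of that pass, including the continuation-token path; it is an illustration only. In production, `client` would be a boto3 CodePipeline client, but here it is passed in so the control flow can be shown without AWS credentials, and the action type values and `run_build` callback are placeholders for your own custom action.

```python
# Placeholder action type for this sketch; substitute your own values.
ACTION_TYPE_ID = {
    "category": "Build",
    "owner": "Custom",
    "provider": "My-Build-Provider-Name",
    "version": "1",
}

def poll_once(client, run_build):
    """Poll for one job, acknowledge it, run the action, and report back.

    run_build(job_data) returns None when the action has finished, or a
    continuation token (a string serializing the worker's state, such as a
    build ID in JSON) when CodePipeline should create a follow-up job.
    """
    response = client.poll_for_jobs(actionTypeId=ACTION_TYPE_ID, maxBatchSize=1)
    jobs = response.get("jobs", [])
    if not jobs:
        return "NoJobs"
    job = jobs[0]
    # Claim the job. With several workers polling, this call raises
    # InvalidNonceException if another worker already claimed it.
    client.acknowledge_job(jobId=job["id"], nonce=job["nonce"])
    try:
        token = run_build(job["data"])
        if token is not None:
            # Not finished yet: hand the state back as a continuation token.
            client.put_job_success_result(jobId=job["id"], continuationToken=token)
            return "InProgress"
        client.put_job_success_result(jobId=job["id"])
        return "Succeeded"
    except Exception as err:
        client.put_job_failure_result(
            jobId=job["id"],
            failureDetails={"type": "JobFailed", "message": str(err)},
        )
        return "Failed"
```

A real worker would construct the client with `boto3.client("codepipeline")` and call `poll_once` in a loop with a delay between polls.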

### Custom job worker architecture and examples


After you have mapped out your high-level workflow, you can create your job worker. Although the specifics of your custom action will ultimately determine what is needed for your job worker, most job workers for custom actions include the following functionality:
+ Polling for jobs from CodePipeline using `PollForJobs`. 
+ Acknowledging jobs and returning results to CodePipeline using `AcknowledgeJob`, `PutJobSuccessResult`, and `PutJobFailureResult`.
+ Retrieving artifacts from and/or putting artifacts into the Amazon S3 bucket for the pipeline. To download artifacts from the Amazon S3 bucket, you must create an Amazon S3 client that uses Signature Version 4 signing (Sig V4). Sig V4 is required for AWS KMS.

  To upload artifacts to the Amazon S3 bucket, you must additionally configure the Amazon S3 [PutObject](https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutObject.html) request to use encryption. Currently, only AWS Key Management Service (AWS KMS) is supported for encryption. AWS KMS uses AWS KMS keys. To determine whether to use an AWS managed key or a customer managed key to upload artifacts, your custom job worker must look at the [job data](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_JobData.html) and check the [encryption key](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_EncryptionKey.html) property. If the property is set, use that customer managed key ID when configuring AWS KMS. If the key property is null, use the AWS managed key. CodePipeline uses the AWS managed key unless otherwise configured.

  For an example that shows how to create the AWS KMS parameters in Java or .NET, see [Specifying the AWS Key Management Service in Amazon S3 Using the AWS SDKs](https://docs.aws.amazon.com/AmazonS3/latest/userguide/kms-using-sdks.html). For more information about the Amazon S3 bucket for CodePipeline, see [CodePipeline concepts](concepts.md).
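The key-selection logic described above can be sketched as a small helper. This is an illustration, not part of any SDK: the `job_data` shape mirrors the `JobData`/`EncryptionKey` structures from the CodePipeline API, and the returned dict is in the `ExtraArgs` form you might pass to a boto3 S3 upload call.

```python
def s3_encryption_args(job_data):
    """Choose server-side-encryption parameters for uploading an artifact."""
    key = job_data.get("encryptionKey")
    if key and key.get("type") == "KMS":
        # The pipeline is configured with a customer managed key: use its ID.
        return {"ServerSideEncryption": "aws:kms", "SSEKMSKeyId": key["id"]}
    # No key in the job data: fall back to the AWS managed key.
    return {"ServerSideEncryption": "aws:kms"}
```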

A more complex example of a custom job worker is available on GitHub. This sample is open source and provided as-is.
+ [Sample Job Worker for CodePipeline](https://github.com/awslabs/aws-codepipeline-custom-job-worker): Download the sample from the GitHub repository.

## Add a custom action to a pipeline


After you have a job worker, you can add your custom action to a pipeline in several ways: choose it in the Create Pipeline wizard when you create a new pipeline, edit an existing pipeline and add the custom action, or use the AWS CLI, the SDKs, or the APIs.

**Note**  
You can create a pipeline in the Create Pipeline wizard that includes a custom action if it is a build or deploy action. If your custom action is in the test category, you must add it by editing an existing pipeline.

**Topics**
+ [Add a custom action to an existing pipeline (CLI)](#actions-create-custom-action-add-cli)

### Add a custom action to an existing pipeline (CLI)


You can use the AWS CLI to add a custom action to an existing pipeline.

1. Open a terminal session (Linux, macOS, or Unix) or command prompt (Windows) and run the **get-pipeline** command to copy the pipeline structure you want to edit into a JSON file. For example, for a pipeline named **MyFirstPipeline**, you would type the following command: 

   ```
   aws codepipeline get-pipeline --name MyFirstPipeline >pipeline.json
   ```

   This command returns nothing, but the file you created should appear in the directory where you ran the command.

1. Open the JSON file in any text editor and modify the structure of the file to add your custom action to an existing stage. 
**Note**  
If you want your action to run in parallel with another action in that stage, make sure you assign it the same `runOrder` value as that action.

   For example, to modify the structure of a pipeline to add a stage named Build and to add a build custom action to that stage, you might modify the JSON to add the Build stage before a deployment stage as follows:

   ```
    ,
        {
            "name": "MyBuildStage",
            "actions": [
                    {
                        "inputArtifacts": [
                            {
                                "name": "MyApp"
                            }
                        ],
                        "name": "MyBuildCustomAction",
                        "actionTypeId": {
                            "category": "Build",
                            "owner": "Custom",
                            "version": "1",
                            "provider": "My-Build-Provider-Name"
                        },
                        "outputArtifacts": [
                            {
                                "name": "MyBuiltApp"
                            }
                        ],
                        "configuration": {
                            "ProjectName": "MyBuildProject"
                        },
                        "runOrder": 1
                    }
                ]
        },
           {
               "name": "Staging",
               "actions": [
                       {
                           "inputArtifacts": [
                               {
                                   "name": "MyBuiltApp"
                               }
                           ],
                           "name": "Deploy-CodeDeploy-Application",
                           "actionTypeId": {
                               "category": "Deploy",
                               "owner": "AWS",
                               "version": "1",
                               "provider": "CodeDeploy"
                           },
                           "outputArtifacts": [],
                           "configuration": {
                               "ApplicationName": "CodePipelineDemoApplication",
                               "DeploymentGroupName": "CodePipelineDemoFleet"
                           },
                           "runOrder": 1
                       }
                   ]
                }      
       ]
   }
   ```

1. To apply your changes, run the **update-pipeline** command, specifying the pipeline JSON file, similar to the following:
**Important**  
Be sure to include `file://` before the file name. It is required in this command.

   ```
   aws codepipeline update-pipeline --cli-input-json file://pipeline.json
   ```

   This command returns the entire structure of the edited pipeline.

1. Open the CodePipeline console and choose the name of the pipeline you just edited.

   The pipeline shows your changes. The next time you make a change to the source location, the pipeline will run that revision through the revised structure of the pipeline.

# Tag a custom action in CodePipeline
Tag a custom action in CodePipeline

Tags are key-value pairs associated with AWS resources. You can use the console or the CLI to apply tags to your custom actions in CodePipeline. For information about CodePipeline resource tagging, use cases, tag key and value constraints, and supported resource types, see [Tagging resources](tag-resources.md).

You can add, remove, and update the values of tags in a custom action. You can add up to 50 tags to each custom action. 

**Topics**
+ [Add tags to a custom action](#customactions-tag-add)
+ [View tags for a custom action](#customactions-tag-list)
+ [Edit tags for a custom action](#customactions-tag-update)
+ [Remove tags from a custom action](#customactions-tag-delete)

## Add tags to a custom action


Follow these steps to use the AWS CLI to add a tag to a custom action. To add a tag to a custom action when you create it, see [Create and add a custom action in CodePipeline](actions-create-custom-action.md).

In these steps, we assume that you have already installed a recent version of the AWS CLI or updated to the current version. For more information, see [Installing the AWS Command Line Interface](https://docs.aws.amazon.com/cli/latest/userguide/installing.html).

At the terminal or command line, run the **tag-resource** command, specifying the Amazon Resource Name (ARN) of the custom action where you want to add tags and the key and value of the tag you want to add. You can add more than one tag to a custom action. For example, to tag a custom action with two tags, a tag key named *TestActionType* with the tag value of *UnitTest*, and a tag key named *ApplicationName* with the tag value of *MyApplication*:

```
aws codepipeline tag-resource --resource-arn arn:aws:codepipeline:us-west-2:account-id:actiontype:Owner/Category/Provider/Version --tags key=TestActionType,value=UnitTest key=ApplicationName,value=MyApplication
```

If successful, this command returns nothing.

## View tags for a custom action


Follow these steps to use the AWS CLI to view the AWS tags for a custom action. If no tags have been added, the returned list is empty.

At the terminal or command line, run the **list-tags-for-resource** command. For example, to view a list of tag keys and tag values for a custom action with the ARN `arn:aws:codepipeline:us-west-2:account-id:actiontype:Owner/Category/Provider/Version`:

```
aws codepipeline list-tags-for-resource --resource-arn arn:aws:codepipeline:us-west-2:account-id:actiontype:Owner/Category/Provider/Version
```

If successful, this command returns information similar to the following:

```
{
    "tags": {
        "TestActionType": "UnitTest",
        "ApplicationName": "MyApplication"
    }
}
```

## Edit tags for a custom action


Follow these steps to use the AWS CLI to edit a tag for a custom action. You can change the value for an existing key or add another key. You can also remove tags from a custom action, as shown in the next section.

At the terminal or command line, run the **tag-resource** command, specifying the Amazon Resource Name (ARN) of the custom action where you want to update a tag and specify the tag key and tag value:

```
aws codepipeline tag-resource --resource-arn arn:aws:codepipeline:us-west-2:account-id:actiontype:Owner/Category/Provider/Version --tags key=TestActionType,value=IntegrationTest
```

## Remove tags from a custom action


Follow these steps to use the AWS CLI to remove a tag from a custom action. When you remove tags from the associated resource, the tags are deleted.

**Note**  
If you delete a custom action, all tag associations are removed from the deleted custom action. You do not have to remove tags before deleting a custom action.

At the terminal or command line, run the **untag-resource** command, specifying the ARN of the custom action where you want to remove tags and the tag key of the tag you want to remove. For example, to remove a tag on a custom action with the tag key *TestActionType*:

```
aws codepipeline untag-resource --resource-arn arn:aws:codepipeline:us-west-2:account-id:actiontype:Owner/Category/Provider/Version --tag-keys TestActionType
```

If successful, this command returns nothing. To verify the tags associated with the custom action, run the **list-tags-for-resource** command.

# Invoke an AWS Lambda function in a pipeline in CodePipeline
Invoke a Lambda function in a pipeline

[AWS Lambda](https://docs.aws.amazon.com/lambda/latest/dg/) is a compute service that lets you run code without provisioning or managing servers. You can create Lambda functions and add them as actions in your pipelines. Because Lambda allows you to write functions to perform almost any task, you can customize the way your pipeline works. 

**Important**  
Do not log the JSON event that CodePipeline sends to Lambda because this can result in user credentials being logged in CloudWatch Logs. The CodePipeline role uses a JSON event to pass temporary credentials to Lambda in the `artifactCredentials` field. For an example event, see [Example JSON event](#actions-invoke-lambda-function-json-event-example).

Here are some ways Lambda functions can be used in pipelines:
+ To create resources on demand in one stage of a pipeline using CloudFormation and delete them in another stage.
+ To deploy application versions with zero downtime in AWS Elastic Beanstalk with a Lambda function that swaps CNAME values.
+ To deploy to Amazon ECS Docker instances.
+ To back up resources before building or deploying by creating an AMI snapshot.
+ To add integration with third-party products to your pipeline, such as posting messages to an IRC client.

**Note**  
Creating and running Lambda functions might result in charges to your AWS account. For more information, see [Pricing](http://aws.amazon.com/lambda/pricing/).

This topic assumes you are familiar with AWS CodePipeline and AWS Lambda and know how to create pipelines, functions, and the IAM policies and roles on which they depend. This topic shows you how to:
+ Create a Lambda function that tests whether a webpage was deployed successfully.
+ Configure the CodePipeline and Lambda execution roles and the permissions required to run the function as part of the pipeline.
+ Edit a pipeline to add the Lambda function as an action. 
+ Test the action by manually releasing a change.

**Note**  
When you use a cross-Region Lambda invoke action in CodePipeline, the status of the Lambda execution reported through [PutJobSuccessResult](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_PutJobSuccessResult.html) and [PutJobFailureResult](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_PutJobFailureResult.html) should be sent to the AWS Region where the Lambda function is located, not to the Region where CodePipeline exists.
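One way to honor the note above is to create the CodePipeline client in the function's own Region before reporting the job result. The following Python sketch parses the Region from the function ARN available on the Lambda context object (ARN format `arn:aws:lambda:region:account-id:function:name`); the handler snippet in the comment is illustrative only.

```python
def region_from_function_arn(invoked_function_arn):
    """Return the AWS Region segment of a Lambda function ARN."""
    return invoked_function_arn.split(":")[3]

# Inside a handler you might then create the client like this
# (boto3 is available in the Lambda runtime):
#
#   region = region_from_function_arn(context.invoked_function_arn)
#   codepipeline = boto3.client("codepipeline", region_name=region)
#   codepipeline.put_job_success_result(jobId=job_id)
```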

This topic includes sample functions to demonstrate the flexibility of working with Lambda functions in CodePipeline: 
+ [Basic Lambda function](#LambdaSample1)
  + Creating a basic Lambda function to use with CodePipeline.
  + Returning success or failure results to CodePipeline in the **Details** link for the action.
+ [Sample Python function that uses an AWS CloudFormation template](#actions-invoke-lambda-function-samples-python-cloudformation)
  + Using JSON-encoded user parameters to pass multiple configuration values to the function (`get_user_params`).
  + Interacting with .zip artifacts in an artifact bucket (`get_template`).
  + Using a continuation token to monitor a long-running asynchronous process (`continue_job_later`). This allows the action to continue and the function to succeed even if it exceeds a fifteen-minute runtime (a limit in Lambda).

Each sample function includes information about the permissions you must add to the role. For information about limits in AWS Lambda, see [Limits](https://docs.aws.amazon.com/lambda/latest/dg/limits.html) in the *AWS Lambda Developer Guide*.

**Important**  
The sample code, roles, and policies included in this topic are examples only, and are provided as-is.

**Topics**
+ [Step 1: Create a pipeline](#actions-invoke-lambda-function-create-test-pipeline)
+ [Step 2: Create the Lambda function](#actions-invoke-lambda-function-create-function)
+ [Step 3: Add the Lambda function to a pipeline in the CodePipeline console](#actions-invoke-lambda-function-add-action)
+ [Step 4: Test the pipeline with the Lambda function](#actions-invoke-lambda-function-test-function)
+ [Step 5: Next steps](#actions-invoke-lambda-function-next-steps)
+ [Example JSON event](#actions-invoke-lambda-function-json-event-example)
+ [Additional sample functions](#actions-invoke-lambda-function-samples)

## Step 1: Create a pipeline


In this step, you create a pipeline to which you later add the Lambda function. This is the same pipeline you created in [CodePipeline tutorials](tutorials.md). If that pipeline is still configured for your account and is in the same Region where you plan to create the Lambda function, you can skip this step.

**To create the pipeline**

1. Follow the first three steps in [Tutorial: Create a simple pipeline (S3 bucket)](tutorials-simple-s3.md) to create an Amazon S3 bucket, CodeDeploy resources, and a two-stage pipeline. Choose the Amazon Linux option for your instance types. You can use any name you want for the pipeline, but the steps in this topic use MyLambdaTestPipeline. 

1. On the status page for your pipeline, in the CodeDeploy action, choose **Details**. On the deployment details page for the deployment group, choose an instance ID from the list. 

1. In the Amazon EC2 console, on the **Details** tab for the instance, copy the IP address in **Public IPv4 address** (for example, **192.0.2.4**). You use this address as the target of the function in AWS Lambda.

**Note**  
The default service role policy for CodePipeline includes the Lambda permissions required to invoke the function. However, if you have modified the default service role or selected a different one, make sure the policy for the role allows the `lambda:InvokeFunction` and `lambda:ListFunctions` permissions. Otherwise, pipelines that include Lambda actions fail.

## Step 2: Create the Lambda function


In this step, you create a Lambda function that makes an HTTP request and checks for a line of text on a webpage. As part of this step, you must also create an IAM policy and Lambda execution role. For more information, see [Permissions Model](https://docs.aws.amazon.com/lambda/latest/dg/intro-permission-model.html#lambda-intro-execution-role) in the *AWS Lambda Developer Guide*. 

**To create the execution role**

1. Sign in to the AWS Management Console and open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. Choose **Policies**, and then choose **Create Policy**. Choose the **JSON** tab, and then paste the following policy into the field.

   ```
   {
     "Version":"2012-10-17",		 	 	  
     "Statement": [
       {
         "Action": [ 
           "logs:*"
         ],
         "Effect": "Allow", 
         "Resource": "arn:aws:logs:*:*:*"
       },
       {
         "Action": [
           "codepipeline:PutJobSuccessResult",
           "codepipeline:PutJobFailureResult"
         ],
         "Effect": "Allow",
         "Resource": "*"
       }
     ]
   }
   ```


1. Choose **Review policy**.

1. On the **Review policy** page, in **Name**, type a name for the policy (for example, **CodePipelineLambdaExecPolicy**). In **Description**, enter **Enables Lambda to execute code**. 

   Choose **Create Policy**.
**Note**  
These are the minimum permissions required for a Lambda function to interoperate with CodePipeline and Amazon CloudWatch. If you want to expand this policy to allow functions that interact with other AWS resources, you should modify this policy to allow the actions required by those Lambda functions.

1. On the policy dashboard page, choose **Roles**, and then choose **Create role**.

1. On the **Create role** page, choose **AWS service**. Choose **Lambda**, and then choose **Next: Permissions**.

1. On the **Attach permissions policies** page, select the check box next to **CodePipelineLambdaExecPolicy**, and then choose **Next: Tags**. Choose **Next: Review**.

1. On the **Review** page, in **Role name**, enter the name, and then choose **Create role**.<a name="LambdaSample1"></a>

**To create the sample Lambda function to use with CodePipeline**

1. Sign in to the AWS Management Console and open the AWS Lambda console at [https://console.aws.amazon.com/lambda/](https://console.aws.amazon.com/lambda/).

1. On the **Functions** page, choose **Create function**.
**Note**  
If you see a **Welcome** page instead of the **Lambda** page, choose **Get Started Now**.

1. On the **Create function** page, choose **Author from scratch**. In **Function name**, enter a name for your Lambda function (for example, **MyLambdaFunctionForAWSCodePipeline**). In **Runtime**, choose **Node.js 20.x**.

1. Under **Role**, select **Choose an existing role**. In **Existing role**, choose your role, and then choose **Create function**.

   The detail page for your created function opens.

1. Copy the following code into the **Function code** box:
**Note**  
The event object, under the CodePipeline.job key, contains the [job details](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_JobDetails.html). For a full example of the JSON event CodePipeline returns to Lambda, see [Example JSON event](#actions-invoke-lambda-function-json-event-example).

   ```
   import { CodePipelineClient, PutJobSuccessResultCommand, PutJobFailureResultCommand } from "@aws-sdk/client-codepipeline";
   import http from 'http';
   import assert from 'assert';
    
   export const handler = (event, context) => {
    
       const codepipeline = new CodePipelineClient();
    
       // Retrieve the Job ID from the Lambda action
       const jobId = event["CodePipeline.job"].id;
    
       // Retrieve the value of UserParameters from the Lambda action configuration in CodePipeline, in this case a URL which will be
       // health checked by this function.
       const url = event["CodePipeline.job"].data.actionConfiguration.configuration.UserParameters;
    
       // Notify CodePipeline of a successful job
       const putJobSuccess = async function(message) {
           const command = new PutJobSuccessResultCommand({
               jobId: jobId
           });
           try {
               await codepipeline.send(command);
               context.succeed(message);
           } catch (err) {
               context.fail(err);
           }
       };
    
       // Notify CodePipeline of a failed job
       const putJobFailure = async function(message) {
           const command = new PutJobFailureResultCommand({
               jobId: jobId,
               failureDetails: {
                   message: JSON.stringify(message),
                   type: 'JobFailed',
                   externalExecutionId: context.awsRequestId
               }
           });
           await codepipeline.send(command);
           context.fail(message);
       };
    
       // Validate the URL passed in UserParameters. This sample uses the
       // 'http' module, so only http:// URLs are supported.
       if(!url || url.indexOf('http://') !== 0) {
           putJobFailure('The UserParameters field must contain a valid URL address to test, including http://');
           return;
       }
    
       // Helper function to make a HTTP GET request to the page.
       // The helper will test the response and succeed or fail the job accordingly
       const getPage = function(url, callback) {
           var pageObject = {
               body: '',
               statusCode: 0,
               contains: function(search) {
                   return this.body.indexOf(search) > -1;
               }
           };
           http.get(url, function(response) {
               pageObject.body = '';
               pageObject.statusCode = response.statusCode;
    
               response.on('data', function (chunk) {
                   pageObject.body += chunk;
               });
    
               response.on('end', function () {
                   callback(pageObject);
               });
    
               response.resume();
           }).on('error', function(error) {
               // Fail the job if our request failed
               putJobFailure(error);
           });
       };
    
       getPage(url, function(returnedPage) {
           try {
               // Check if the HTTP response has a 200 status
               assert(returnedPage.statusCode === 200);
               // Check if the page contains the text "Congratulations"
               // You can change this to check for different text, or add other tests as required
               assert(returnedPage.contains('Congratulations'));
    
               // Succeed the job
               putJobSuccess("Tests passed.");
           } catch (ex) {
               // If any of the assertions failed then fail the job
               putJobFailure(ex);
           }
       });
   };
   ```

1. Leave **Handler** at the default value, and leave **Role** at the name you selected or created earlier, **CodePipelineLambdaExecRole**. 

1. In **Basic settings**, for **Timeout**, enter **20** seconds.

1. Choose **Save**.

## Step 3: Add the Lambda function to a pipeline in the CodePipeline console


In this step, you add a new stage to your pipeline, and then add a Lambda action that calls your function to that stage.

**To add a stage**

1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codesuite/codepipeline/home](https://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, choose the pipeline you created.

1. On the pipeline view page, choose **Edit**.

1. On the **Edit** page, choose **+ Add stage** to add a stage after the deployment stage with the CodeDeploy action. Enter a name for the stage (for example, **LambdaStage**), and choose **Add stage**.
**Note**  
You can also choose to add your Lambda action to an existing stage. For demonstration purposes, we are adding the Lambda function as the only action in a stage to allow you to easily view its progress as artifacts progress through a pipeline.

1. Choose **+ Add action group**. In **Edit action**, in **Action name**, enter a name for your Lambda action (for example, **MyLambdaAction**). In **Provider**, choose **AWS Lambda**. In **Function name**, choose or enter the name of your Lambda function (for example, **MyLambdaFunctionForAWSCodePipeline**). In **User parameters**, specify the IP address for the Amazon EC2 instance you copied earlier (for example, **http://*192.0.2.4***), and then choose **Done**. 
**Note**  
This topic uses an IP address, but in a real-world scenario, you could provide your registered website name instead (for example, **http://*www.example.com***). For more information about event data and handlers in AWS Lambda, see [Programming Model](https://docs.aws.amazon.com/lambda/latest/dg/programming-model-v2.html) in the *AWS Lambda Developer Guide*.

1. On the **Edit action** page, choose **Save**.

## Step 4: Test the pipeline with the Lambda function


To test the function, release the most recent change through the pipeline. 

**To use the console to run the most recent version of an artifact through a pipeline**

1. On the pipeline details page, choose **Release change**. This runs the most recent revision available in each source location specified in a source action through the pipeline.

1. When the Lambda action is complete, choose the **Details** link to view the log stream for the function in Amazon CloudWatch, including the billed duration of the event. If the function failed, the CloudWatch log provides information about the cause.

## Step 5: Next steps


Now that you've successfully created a Lambda function and added it as an action in a pipeline, you can try the following:
+ Add more Lambda actions to your stage to check other webpages.
+ Modify the Lambda function to check for a different text string.
+ [Explore Lambda functions](https://docs.aws.amazon.com/lambda/latest/dg/use-cases.html) and create and add your own Lambda functions to pipelines.

![\[An AWS Lambda action running through a pipeline.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-lambda-action-add-2-inprog-pol.png)


After you have finished experimenting with the Lambda function, consider removing it from your pipeline, deleting it from AWS Lambda, and deleting the role from IAM to avoid possible charges. For more information, see [Edit a pipeline in CodePipeline](pipelines-edit.md), [Delete a pipeline in CodePipeline](pipelines-delete.md), and [Deleting Roles or Instance Profiles](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_manage_delete.html).

## Example JSON event


The following example shows a sample JSON event sent to Lambda by CodePipeline. The structure of this event is similar to the response to the [GetJobDetails API](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_GetJobDetails.html), but without the `actionTypeId` and `pipelineContext` data types. Two action configuration details, `FunctionName` and `UserParameters`, are included in both the JSON event and the response to the `GetJobDetails` API. Some values in the example are placeholders or explanations, not real values. 

```
{
    "CodePipeline.job": {
        "id": "11111111-abcd-1111-abcd-111111abcdef",
        "accountId": "111111111111",
        "data": {
            "actionConfiguration": {
                "configuration": {
                    "FunctionName": "MyLambdaFunctionForAWSCodePipeline",
                    "UserParameters": "some-input-such-as-a-URL"
                }
            },
            "inputArtifacts": [
                {
                    "location": {
                        "s3Location": {
                            "bucketName": "the name of the bucket configured as the pipeline artifact store in Amazon S3, for example codepipeline-us-east-2-1234567890",
                            "objectKey": "the name of the application, for example CodePipelineDemoApplication.zip"
                        },
                        "type": "S3"
                    },
                    "revision": null,
                    "name": "ArtifactName"
                }
            ],
            "outputArtifacts": [],
            "artifactCredentials": {
                "secretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
                "sessionToken": "MIICiTCCAfICCQD6m7oRw0uXOjANBgkqhkiG9w
 0BAQUFADCBiDELMAkGA1UEBhMCVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZ
 WF0dGxlMQ8wDQYDVQQKEwZBbWF6b24xFDASBgNVBAsTC0lBTSBDb25zb2xlMRIw
 EAYDVQQDEwlUZXN0Q2lsYWMxHzAdBgkqhkiG9w0BCQEWEG5vb25lQGFtYXpvbi5
 jb20wHhcNMTEwNDI1MjA0NTIxWhcNMTIwNDI0MjA0NTIxWjCBiDELMAkGA1UEBh
 MCVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMQ8wDQYDVQQKEwZBb
 WF6b24xFDASBgNVBAsTC0lBTSBDb25zb2xlMRIwEAYDVQQDEwlUZXN0Q2lsYWMx
 HzAdBgkqhkiG9w0BCQEWEG5vb25lQGFtYXpvbi5jb20wgZ8wDQYJKoZIhvcNAQE
 BBQADgY0AMIGJAoGBAMaK0dn+a4GmWIWJ21uUSfwfEvySWtC2XADZ4nB+BLYgVI
 k60CpiwsZ3G93vUEIO3IyNoH/f0wYK8m9TrDHudUZg3qX4waLG5M43q7Wgc/MbQ
 ITxOUSQv7c7ugFFDzQGBzZswY6786m86gpEIbb3OhjZnzcvQAaRHhdlQWIMm2nr
 AgMBAAEwDQYJKoZIhvcNAQEFBQADgYEAtCu4nUhVVxYUntneD9+h8Mg9q6q+auN
 KyExzyLwaxlAoo7TJHidbtS4J5iNmZgXL0FkbFFBjvSfpJIlJ00zbhNYS5f6Guo
 EDmFJl0ZxBHjJnyp378OD8uTs7fLvjx79LjSTbNYiytVbZPQUQ5Yaxu2jXnimvw
 3rrszlaEXAMPLE=",
                "accessKeyId": "AKIAIOSFODNN7EXAMPLE"
            },
            "continuationToken": "A continuation token if continuing job",
            "encryptionKey": { 
              "id": "arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab",
              "type": "KMS"
            }
        }
    }
}
```
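A Lambda handler typically needs only a few fields from this event. The following sketch uses an abbreviated stand-in dictionary (not the full event above) to show where the job ID, user parameters, and artifact location live in the structure:

```python
# Abbreviated stand-in for the JSON event CodePipeline sends to Lambda.
event = {
    "CodePipeline.job": {
        "id": "11111111-abcd-1111-abcd-111111abcdef",
        "data": {
            "actionConfiguration": {
                "configuration": {
                    "FunctionName": "MyLambdaFunctionForAWSCodePipeline",
                    "UserParameters": "some-input-such-as-a-URL",
                }
            },
            "inputArtifacts": [
                {
                    "location": {
                        "s3Location": {
                            "bucketName": "codepipeline-us-east-2-1234567890",
                            "objectKey": "CodePipelineDemoApplication.zip",
                        },
                        "type": "S3",
                    },
                    "name": "ArtifactName",
                }
            ],
        },
    }
}

# Extract the fields a handler typically uses.
job = event["CodePipeline.job"]
job_id = job["id"]
user_parameters = job["data"]["actionConfiguration"]["configuration"]["UserParameters"]
s3_location = job["data"]["inputArtifacts"][0]["location"]["s3Location"]

print(job_id)
print(user_parameters)
print(s3_location["bucketName"], s3_location["objectKey"])
```

The job ID is what you pass back to `PutJobSuccessResult` or `PutJobFailureResult`, and the `s3Location` values, together with the `artifactCredentials`, let the function download the input artifact.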

## Additional sample functions


The following sample Lambda functions demonstrate additional functionality you can use for your pipelines in CodePipeline. To use these functions, you might have to modify the policy for the Lambda execution role, as noted in the introduction for each sample.

**Topics**
+ [Sample Python function that uses an AWS CloudFormation template](#actions-invoke-lambda-function-samples-python-cloudformation)

### Sample Python function that uses an AWS CloudFormation template


The following sample shows a function that creates or updates a stack based on a supplied CloudFormation template. The template creates an Amazon S3 bucket. It is for demonstration purposes only, to minimize costs. Ideally, you should delete the stack before you upload anything to the bucket. If you upload files to the bucket, you cannot delete the bucket when you delete the stack. You must manually delete everything in the bucket before you can delete the bucket itself. 

This Python sample assumes you have a pipeline that uses an Amazon S3 bucket as a source action, or that you have access to a versioned Amazon S3 bucket you can use with the pipeline. You create the CloudFormation template, compress it, and upload it to that bucket as a .zip file. You must then add a source action to your pipeline that retrieves this .zip file from the bucket.

**Note**  
When Amazon S3 is the source provider for your pipeline, you may zip your source file or files into a single .zip and upload the .zip to your source bucket. You may also upload a single unzipped file; however, downstream actions that expect a .zip file will fail.

This sample demonstrates:
+ The use of JSON-encoded user parameters to pass multiple configuration values to the function (`get_user_params`).
+ The interaction with .zip artifacts in an artifact bucket (`get_template`).
+ The use of a continuation token to monitor a long-running asynchronous process (`continue_job_later`). This allows the action to continue and the function to succeed even if it exceeds a fifteen-minute runtime (a limit in Lambda).
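The continuation token itself is an opaque string that the function defines; the sample encodes a small JSON state object. A minimal sketch of that pattern follows (the actual `put_job_success_result` call is shown as a comment because it requires AWS credentials):

```python
import json

def build_continuation_token(job_id):
    # Encode any state the next invocation needs as a JSON string.
    # CodePipeline passes the token back verbatim in job_data['continuationToken'].
    return json.dumps({'previous_job_id': job_id})

def is_continuation(job_data):
    # A continuing job carries the token the previous invocation stored.
    return 'continuationToken' in job_data

token = build_continuation_token('11111111-abcd-1111-abcd-111111abcdef')

# In the real function the token is returned with:
#   code_pipeline.put_job_success_result(jobId=job_id, continuationToken=token)

state = json.loads(token)
print(state['previous_job_id'])
print(is_continuation({'continuationToken': token}))
```

CodePipeline then invokes the function again with the token in the job data, which is how the sample below decides whether to start a stack operation or just check on one already in progress.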

To use this sample Lambda function, the policy for the Lambda execution role must have `Allow` permissions in CloudFormation, Amazon S3, and CodePipeline, as shown in this sample policy:

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Action": [
                "logs:*"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:logs:*:*:*"
        },
        {
            "Action": [
                "codepipeline:PutJobSuccessResult",
                "codepipeline:PutJobFailureResult"
            ],
            "Effect": "Allow",
            "Resource": "*"
        },
        {
            "Action": [
                "cloudformation:DescribeStacks",
                "cloudformation:CreateStack",
                "cloudformation:UpdateStack"
            ],
            "Effect": "Allow",
            "Resource": "*"
        },
        {
            "Action": [
                "s3:*"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}
```

------

To create the CloudFormation template, open any plain-text editor and copy and paste the following code:

```
{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Description" : "CloudFormation template which creates an S3 bucket",
  "Resources" : {
    "MySampleBucket" : {
      "Type" : "AWS::S3::Bucket",
      "Properties" : {
      }
    }
  },
  "Outputs" : {
    "BucketName" : {
      "Value" : { "Ref" : "MySampleBucket" },
      "Description" : "The name of the S3 bucket"
    }
  } 
}
```

Save this as a JSON file with the name **template.json** in a directory named **template-package**. Create a compressed (.zip) file of this directory and file named **template-package.zip**, and upload the compressed file to a versioned Amazon S3 bucket. If you already have a bucket configured for your pipeline, you can use it. Next, edit your pipeline to add a source action that retrieves the .zip file. Name the output for this action *MyTemplate*. For more information, see [Edit a pipeline in CodePipeline](pipelines-edit.md).
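The packaging steps above can also be scripted. This sketch (standard library only) writes **template.json** under a `template-package/` folder inside **template-package.zip**, the layout the sample function's **UserParameters** file path expects; the upload to your versioned bucket is left as a comment because it requires boto3, credentials, and your bucket name:

```python
import json
import zipfile

# The CloudFormation template from this section.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "CloudFormation template which creates an S3 bucket",
    "Resources": {"MySampleBucket": {"Type": "AWS::S3::Bucket", "Properties": {}}},
    "Outputs": {
        "BucketName": {
            "Value": {"Ref": "MySampleBucket"},
            "Description": "The name of the S3 bucket",
        }
    },
}

# Write template-package/template.json into template-package.zip, the
# structure the sample Lambda function expects.
with zipfile.ZipFile("template-package.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("template-package/template.json", json.dumps(template, indent=2))

with zipfile.ZipFile("template-package.zip") as zf:
    print(zf.namelist())

# Upload to your versioned source bucket (requires boto3 and credentials):
#   boto3.client("s3").upload_file("template-package.zip", "your-bucket-name",
#                                  "template-package.zip")
```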

**Note**  
The sample Lambda function expects these file names and compressed structure. However, you can substitute your own CloudFormation template for this sample. If you use your own template, make sure you modify the policy for the Lambda execution role to allow any additional functionality required by your CloudFormation template.

**To add the following code as a function in Lambda**

1. Open the Lambda console and choose **Create function**.

1. On the **Create function** page, choose **Author from scratch**. In **Function name**, enter a name for your Lambda function.

1. In **Runtime**, choose **Python 2.7**.

1. Under **Choose or create an execution role**, select **Use an existing role**. In **Existing role**, choose your role, and then choose **Create function**.

   The detail page for your created function opens.

1. Copy the following code into the **Function code** box:

   ```
   from __future__ import print_function
   from boto3.session import Session
   
   import json
   import urllib
   import boto3
   import zipfile
   import tempfile
   import botocore
   import traceback
   
   print('Loading function')
   
   cf = boto3.client('cloudformation')
   code_pipeline = boto3.client('codepipeline')
   
   def find_artifact(artifacts, name):
       """Finds the artifact 'name' among the 'artifacts'
       
       Args:
           artifacts: The list of artifacts available to the function
           name: The artifact we wish to use
       Returns:
           The artifact dictionary found
       Raises:
           Exception: If no matching artifact is found
       
       """
       for artifact in artifacts:
           if artifact['name'] == name:
               return artifact
               
       raise Exception('Input artifact named "{0}" not found in event'.format(name))
   
   def get_template(s3, artifact, file_in_zip):
       """Gets the template artifact
       
       Downloads the artifact from the S3 artifact store to a temporary file
       then extracts the zip and returns the file containing the CloudFormation
       template.
       
       Args:
           artifact: The artifact to download
           file_in_zip: The path to the file within the zip containing the template
           
       Returns:
           The CloudFormation template as a string
           
       Raises:
           Exception: Any exception thrown while downloading the artifact or unzipping it
       
       """
       bucket = artifact['location']['s3Location']['bucketName']
       key = artifact['location']['s3Location']['objectKey']
       
       with tempfile.NamedTemporaryFile() as tmp_file:
           s3.download_file(bucket, key, tmp_file.name)
           with zipfile.ZipFile(tmp_file.name, 'r') as zip:
               return zip.read(file_in_zip)   
      
   def update_stack(stack, template):
       """Start a CloudFormation stack update
       
       Args:
           stack: The stack to update
           template: The template to apply
           
       Returns:
           True if an update was started, false if there were no changes
           to the template since the last update.
           
       Raises:
           Exception: Any exception besides "No updates are to be performed."
       
       """
       try:
           cf.update_stack(StackName=stack, TemplateBody=template)
           return True
           
       except botocore.exceptions.ClientError as e:
           if e.response['Error']['Message'] == 'No updates are to be performed.':
               return False
           else:
               raise Exception('Error updating CloudFormation stack "{0}"'.format(stack), e)
   
   def stack_exists(stack):
       """Check if a stack exists or not
       
       Args:
           stack: The stack to check
           
       Returns:
           True or False depending on whether the stack exists
           
       Raises:
            Any exception raised by .describe_stacks() other than the one
            indicating that the stack doesn't exist.
           
       """
       try:
           cf.describe_stacks(StackName=stack)
           return True
       except botocore.exceptions.ClientError as e:
           if "does not exist" in e.response['Error']['Message']:
               return False
           else:
               raise e
   
   def create_stack(stack, template):
       """Starts a new CloudFormation stack creation
       
       Args:
           stack: The stack to be created
           template: The template for the stack to be created with
           
       Throws:
           Exception: Any exception thrown by .create_stack()
       """
       cf.create_stack(StackName=stack, TemplateBody=template)
    
   def get_stack_status(stack):
       """Get the status of an existing CloudFormation stack
       
       Args:
           stack: The name of the stack to check
           
       Returns:
           The CloudFormation status string of the stack such as CREATE_COMPLETE
           
       Raises:
           Exception: Any exception thrown by .describe_stacks()
           
       """
       stack_description = cf.describe_stacks(StackName=stack)
       return stack_description['Stacks'][0]['StackStatus']
     
   def put_job_success(job, message):
       """Notify CodePipeline of a successful job
       
       Args:
           job: The CodePipeline job ID
           message: A message to be logged relating to the job status
           
       Raises:
           Exception: Any exception thrown by .put_job_success_result()
       
       """
       print('Putting job success')
       print(message)
       code_pipeline.put_job_success_result(jobId=job)
     
   def put_job_failure(job, message):
       """Notify CodePipeline of a failed job
       
       Args:
           job: The CodePipeline job ID
           message: A message to be logged relating to the job status
           
       Raises:
           Exception: Any exception thrown by .put_job_failure_result()
       
       """
       print('Putting job failure')
       print(message)
       code_pipeline.put_job_failure_result(jobId=job, failureDetails={'message': message, 'type': 'JobFailed'})
    
   def continue_job_later(job, message):
       """Notify CodePipeline of a continuing job
       
       This will cause CodePipeline to invoke the function again with the
       supplied continuation token.
       
       Args:
           job: The JobID
           message: A message to be logged relating to the job status
           
       Raises:
           Exception: Any exception thrown by .put_job_success_result()
       
       """
       
       # Use the continuation token to keep track of any job execution state
       # This data will be available when a new job is scheduled to continue the current execution
       continuation_token = json.dumps({'previous_job_id': job})
       
       print('Putting job continuation')
       print(message)
       code_pipeline.put_job_success_result(jobId=job, continuationToken=continuation_token)
   
   def start_update_or_create(job_id, stack, template):
       """Starts the stack update or create process
       
       If the stack exists then update, otherwise create.
       
       Args:
           job_id: The ID of the CodePipeline job
           stack: The stack to create or update
           template: The template to create/update the stack with
       
       """
       if stack_exists(stack):
           status = get_stack_status(stack)
           if status not in ['CREATE_COMPLETE', 'ROLLBACK_COMPLETE', 'UPDATE_COMPLETE']:
               # If the CloudFormation stack is not in a state where
               # it can be updated again then fail the job right away.
               put_job_failure(job_id, 'Stack cannot be updated when status is: ' + status)
               return
           
           were_updates = update_stack(stack, template)
           
           if were_updates:
               # If there were updates then continue the job so it can monitor
               # the progress of the update.
               continue_job_later(job_id, 'Stack update started')  
               
           else:
               # If there were no updates then succeed the job immediately 
               put_job_success(job_id, 'There were no stack updates')    
       else:
           # If the stack doesn't already exist then create it instead
           # of updating it.
           create_stack(stack, template)
           # Continue the job so the pipeline will wait for the CloudFormation
           # stack to be created.
           continue_job_later(job_id, 'Stack create started') 
   
   def check_stack_update_status(job_id, stack):
       """Monitor an already-running CloudFormation update/create
       
       Succeeds, fails or continues the job depending on the stack status.
       
       Args:
           job_id: The CodePipeline job ID
           stack: The stack to monitor
       
       """
       status = get_stack_status(stack)
       if status in ['UPDATE_COMPLETE', 'CREATE_COMPLETE']:
           # If the update/create finished successfully then
           # succeed the job and don't continue.
           put_job_success(job_id, 'Stack update complete')
           
       elif status in ['UPDATE_IN_PROGRESS', 'UPDATE_ROLLBACK_IN_PROGRESS', 
       'UPDATE_ROLLBACK_COMPLETE_CLEANUP_IN_PROGRESS', 'CREATE_IN_PROGRESS', 
       'ROLLBACK_IN_PROGRESS', 'UPDATE_COMPLETE_CLEANUP_IN_PROGRESS']:
           # If the job isn't finished yet then continue it
           continue_job_later(job_id, 'Stack update still in progress') 
          
       else:
           # If the Stack is a state which isn't "in progress" or "complete"
           # then the stack update/create has failed so end the job with
           # a failed result.
           put_job_failure(job_id, 'Update failed: ' + status)
   
   def get_user_params(job_data):
       """Decodes the JSON user parameters and validates the required properties.
       
       Args:
           job_data: The job data structure containing the UserParameters string which should be a valid JSON structure
           
       Returns:
           The JSON parameters decoded as a dictionary.
           
       Raises:
           Exception: The JSON can't be decoded or a property is missing.
           
       """
       try:
           # Get the user parameters which contain the stack, artifact and file settings
           user_parameters = job_data['actionConfiguration']['configuration']['UserParameters']
           decoded_parameters = json.loads(user_parameters)
               
       except Exception as e:
           # We're expecting the user parameters to be encoded as JSON
           # so we can pass multiple values. If the JSON can't be decoded
           # then fail the job with a helpful message.
           raise Exception('UserParameters could not be decoded as JSON')
       
       if 'stack' not in decoded_parameters:
           # Validate that the stack is provided, otherwise fail the job
           # with a helpful message.
           raise Exception('Your UserParameters JSON must include the stack name')
       
       if 'artifact' not in decoded_parameters:
           # Validate that the artifact name is provided, otherwise fail the job
           # with a helpful message.
           raise Exception('Your UserParameters JSON must include the artifact name')
       
       if 'file' not in decoded_parameters:
           # Validate that the template file is provided, otherwise fail the job
           # with a helpful message.
           raise Exception('Your UserParameters JSON must include the template file name')
       
       return decoded_parameters
       
   def setup_s3_client(job_data):
       """Creates an S3 client
       
       Uses the credentials passed in the event by CodePipeline. These
       credentials can be used to access the artifact bucket.
       
       Args:
           job_data: The job data structure
           
       Returns:
           An S3 client with the appropriate credentials
           
       """
       key_id = job_data['artifactCredentials']['accessKeyId']
       key_secret = job_data['artifactCredentials']['secretAccessKey']
       session_token = job_data['artifactCredentials']['sessionToken']
       
       session = Session(aws_access_key_id=key_id,
           aws_secret_access_key=key_secret,
           aws_session_token=session_token)
       return session.client('s3', config=botocore.client.Config(signature_version='s3v4'))
   
   def lambda_handler(event, context):
       """The Lambda function handler
       
       If a continuing job then checks the CloudFormation stack status
       and updates the job accordingly.
       
        If a new job then kicks off an update or creation of the target
       CloudFormation stack.
       
       Args:
           event: The event passed by Lambda
           context: The context passed by Lambda
           
       """
       try:
           # Extract the Job ID
           job_id = event['CodePipeline.job']['id']
           
           # Extract the Job Data 
           job_data = event['CodePipeline.job']['data']
           
           # Extract the params
           params = get_user_params(job_data)
           
           # Get the list of artifacts passed to the function
           artifacts = job_data['inputArtifacts']
           
           stack = params['stack']
           artifact = params['artifact']
           template_file = params['file']
           
           if 'continuationToken' in job_data:
               # If we're continuing then the create/update has already been triggered
               # we just need to check if it has finished.
               check_stack_update_status(job_id, stack)
           else:
               # Get the artifact details
               artifact_data = find_artifact(artifacts, artifact)
               # Get S3 client to access artifact with
               s3 = setup_s3_client(job_data)
               # Get the JSON template file out of the artifact
               template = get_template(s3, artifact_data, template_file)
               # Kick off a stack update or create
               start_update_or_create(job_id, stack, template)  
   
       except Exception as e:
           # If any other exceptions which we didn't expect are raised
           # then fail the job and log the exception message.
           print('Function failed due to exception.') 
           print(e)
           traceback.print_exc()
           put_job_failure(job_id, 'Function exception: ' + str(e))
         
       print('Function complete.')   
       return "Complete."
   ```

1. Leave **Handler** at the default value, and leave **Role** at the name you selected or created earlier, **CodePipelineLambdaExecRole**. 

1. In **Basic settings**, for **Timeout**, replace the default of 3 seconds with **20**.

1. Choose **Save**.

1. From the CodePipeline console, edit the pipeline to add the function as an action in a stage in your pipeline. Choose **Edit** for the pipeline stage you want to change, and choose **Add action group**. On the **Edit action** page, in **Action name**, enter a name for your action. In **Action provider**, choose **Lambda**. 

   Under **Input artifacts**, choose `MyTemplate`. In **UserParameters**, you must provide a JSON string with three parameters:
   + Stack name
   + CloudFormation template name and path to the file
   + Input artifact

   Use curly brackets ({ }) and separate the parameters with commas. For example, to create a stack named *MyTestStack*, for a pipeline with the input artifact *MyTemplate*, in **UserParameters**, enter: {"stack":"*MyTestStack*","file":"template-package/template.json","artifact":"*MyTemplate*"}.
**Note**  
Even though you have specified the input artifact in **UserParameters**, you must also specify this input artifact for the action in **Input artifacts**.

1. Save your changes to the pipeline, and then manually release a change to test the action and Lambda function.
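The **UserParameters** string from the configuration above is what the sample's `get_user_params` function decodes. This sketch shows the round trip and the validation the sample performs, using the example values from this section:

```python
import json

# The JSON string entered in the UserParameters field of the action.
user_parameters = ('{"stack":"MyTestStack",'
                   '"file":"template-package/template.json",'
                   '"artifact":"MyTemplate"}')

params = json.loads(user_parameters)

# The sample function requires all three keys and fails the job otherwise.
for key in ('stack', 'artifact', 'file'):
    if key not in params:
        raise Exception('Your UserParameters JSON must include the ' + key)

print(params['stack'], params['artifact'], params['file'])
```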

# Add a manual approval action to a stage
Add a manual approval action to a stage

In AWS CodePipeline, you can add an approval action to a stage in a pipeline at the point where you want the pipeline execution to stop so that someone with the required AWS Identity and Access Management permissions can approve or reject the action. 

If the action is approved, the pipeline execution resumes. If the action is rejected—or if no one approves or rejects the action within seven days of the pipeline reaching the action and stopping—the result is the same as an action failing, and the pipeline execution does not continue.

You might use manual approvals for these reasons:
+ You want someone to perform a code review or change management review before a revision is allowed into the next stage of a pipeline.
+ You want someone to perform manual quality assurance testing on the latest version of an application, or to confirm the integrity of a build artifact, before it is released.
+ You want someone to review new or updated text before it is published to a company website.

## Configuration options for manual approval actions in CodePipeline
Configuration options for manual approval actions

CodePipeline provides three configuration options you can use to tell approvers about the approval action. 

**Publish Approval Notifications** You can configure an approval action to publish a message to an Amazon Simple Notification Service (Amazon SNS) topic when the pipeline stops at the action. Amazon SNS delivers the message to every endpoint subscribed to the topic. You must use a topic created in the same AWS Region as the pipeline that will include the approval action. When you create a topic, we recommend you give it a name that identifies its purpose, such as `MyFirstPipeline-us-east-2-approval`. 

When you publish approval notifications to Amazon SNS topics, you can deliver them to endpoints such as email or SMS recipients, Amazon SQS queues, HTTP/HTTPS endpoints, or AWS Lambda functions invoked through Amazon SNS. For information about Amazon SNS topic notifications, see the following topics:
+ [What Is Amazon Simple Notification Service?](https://docs.aws.amazon.com/sns/latest/dg/welcome.html)
+ [Create a Topic in Amazon SNS](https://docs.aws.amazon.com/sns/latest/dg/CreateTopic.html)
+ [Sending Amazon SNS Messages to Amazon SQS Queues](https://docs.aws.amazon.com/sns/latest/dg/SendMessageToSQS.html)
+ [Subscribing a Queue to an Amazon SNS Topic](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqssubscribe.html)
+ [Sending Amazon SNS Messages to HTTP/HTTPS Endpoints](https://docs.aws.amazon.com/sns/latest/dg/SendMessageToHttp.html)
+ [Invoking Lambda Functions Using Amazon SNS Notifications](https://docs.aws.amazon.com/sns/latest/dg/sns-lambda.html)

For the structure of the JSON data generated for an approval action notification, see [JSON data format for manual approval notifications in CodePipeline](approvals-json-format.md).

**Specify a URL for Review** As part of the configuration of the approval action, you can specify a URL to be reviewed. The URL might be a link to a web application you want approvers to test or a page with more information about your approval request. The URL is included in the notification that is published to the Amazon SNS topic. Approvers can use the console or CLI to view it. 

**Enter Comments for Approvers** When you create an approval action, you can also add comments that are displayed to those who receive the notifications or those who view the action in the console or CLI response.

**No Configuration Options** You can also choose not to configure any of these three options. You might not need them if, for example, you can notify someone directly that the action is ready for their review, or you simply want the pipeline to stop until you decide to approve the action yourself. 

## Setup and workflow overview for approval actions in CodePipeline
Setup and workflow overview for approval actions

The following is an overview for setting up and using manual approvals. 

1. You grant the IAM permissions required for approving or rejecting approval actions to one or more IAM roles in your organization. 

1. (Optional) If you are using Amazon SNS notifications, you ensure that the service role you use in your CodePipeline operations has permission to access Amazon SNS resources. 

1. (Optional) If you are using Amazon SNS notifications, you create an Amazon SNS topic and add one or more subscribers or endpoints to it. 

1. When you use the AWS CLI to create the pipeline or after you have used the CLI or console to create the pipeline, you add an approval action to a stage in the pipeline. 

   If you are using notifications, you include the Amazon Resource Name (ARN) of the Amazon SNS topic in the configuration of the action. (An ARN is a unique identifier for an Amazon resource. ARNs for Amazon SNS topics are structured like *arn:aws:sns:us-east-2:80398EXAMPLE:MyApprovalTopic*. For more information, see [Amazon Resource Names (ARNs) and AWS service namespaces](https://docs.aws.amazon.com/general/latest/gr/aws-arns-and-namespaces.html) in the *Amazon Web Services General Reference*.)

1. The pipeline stops when it reaches the approval action. If an Amazon SNS topic ARN was included in the configuration of the action, a notification is published to the Amazon SNS topic, and a message is delivered to any subscribers to the topic or subscribed endpoints, with a link to review the approval action in the console.

1. An approver examines the target URL and reviews comments, if any.

1. Using the console, CLI, or SDK, the approver provides a summary comment and submits a response:
   + Approved: The pipeline execution resumes.
   + Rejected: The stage status is changed to "Failed" and the pipeline execution does not resume. 

   If no response is submitted within seven days, the action is marked as "Failed."

# Grant approval permissions to an IAM user in CodePipeline


Before IAM users in your organization can approve or reject approval actions, they must be granted permissions to access pipelines and to update the status of approval actions. You can grant permission to access all pipelines and approval actions in your account by attaching the `AWSCodePipelineApproverAccess` managed policy to an IAM user, role, or group, or you can grant limited permissions by specifying the individual resources that can be accessed by an IAM user, role, or group.

**Note**  
The permissions described in this topic grant very limited access. To enable a user, role, or group to do more than approve or reject approval actions, you can attach other managed policies. For information about the managed policies available for CodePipeline, see [AWS managed policies for AWS CodePipeline](managed-policies.md).

## Grant approval permission to all pipelines and approval actions


For users who need to perform approval actions in CodePipeline, use the `AWSCodePipelineApproverAccess` managed policy.

To provide access, add permissions to your users, groups, or roles:
+ Users and groups in AWS IAM Identity Center:

  Create a permission set. Follow the instructions in [Create a permission set](https://docs.aws.amazon.com//singlesignon/latest/userguide/howtocreatepermissionset.html) in the *AWS IAM Identity Center User Guide*.
+ Users managed in IAM through an identity provider:

  Create a role for identity federation. Follow the instructions in [Create a role for a third-party identity provider (federation)](https://docs.aws.amazon.com//IAM/latest/UserGuide/id_roles_create_for-idp.html) in the *IAM User Guide*.
+ IAM users:
  + Create a role that your user can assume. Follow the instructions in [Create a role for an IAM user](https://docs.aws.amazon.com//IAM/latest/UserGuide/id_roles_create_for-user.html) in the *IAM User Guide*.
  + (Not recommended) Attach a policy directly to a user or add a user to a user group. Follow the instructions in [Adding permissions to a user (console)](https://docs.aws.amazon.com//IAM/latest/UserGuide/id_users_change-permissions.html#users_change_permissions-add-console) in the *IAM User Guide*.

## Specify approval permission for specific pipelines and approval actions


For users who need to perform approval actions in CodePipeline, use the following custom policy, specifying the individual resources a user can access. For example, the following policy grants users the authority to approve or reject only the action named `MyApprovalAction` in the `MyFirstPipeline` pipeline in the US East (Ohio) Region (us-east-2):

**Note**  
The `codepipeline:ListPipelines` permission is required only if IAM users need to access the CodePipeline dashboard to view the list of pipelines. If console access is not required, you can omit `codepipeline:ListPipelines`.

**To use the JSON policy editor to create a policy**

1. Sign in to the AWS Management Console and open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. In the navigation pane on the left, choose **Policies**. 

   If this is your first time choosing **Policies**, the **Welcome to Managed Policies** page appears. Choose **Get Started**.

1. At the top of the page, choose **Create policy**.

1. In the **Policy editor** section, choose the **JSON** option.

1. Enter the following JSON policy document:

   ```
    {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "codepipeline:ListPipelines"
               ],
               "Resource": [
                   "*"
               ]
           },
           {
               "Effect": "Allow",
               "Action": [
                   "codepipeline:GetPipeline",
                   "codepipeline:GetPipelineState",
                   "codepipeline:GetPipelineExecution"
               ],
               "Resource": "arn:aws:codepipeline:us-east-2:80398EXAMPLE:MyFirstPipeline"
           },
           {
               "Effect": "Allow",
               "Action": [
                   "codepipeline:PutApprovalResult"
               ],
               "Resource": "arn:aws:codepipeline:us-east-2:80398EXAMPLE:MyFirstPipeline/MyApprovalStage/MyApprovalAction"
           }
       ]
   }
   ```

1. Choose **Next**.
**Note**  
You can switch between the **Visual** and **JSON** editor options anytime. However, if you make changes or choose **Next** in the **Visual** editor, IAM might restructure your policy to optimize it for the visual editor. For more information, see [Policy restructuring](https://docs.aws.amazon.com/IAM/latest/UserGuide/troubleshoot_policies.html#troubleshoot_viseditor-restructure) in the *IAM User Guide*.

1. On the **Review and create** page, enter a **Policy name** and a **Description** (optional) for the policy that you are creating. Review **Permissions defined in this policy** to see the permissions that are granted by your policy.

1. Choose **Create policy** to save your new policy.

# Grant Amazon SNS permissions to a CodePipeline service role
Grant Amazon SNS permissions to a service role

If you plan to use Amazon SNS to publish notifications to topics when approval actions require review, the service role you use in your CodePipeline operations must be granted permission to access the Amazon SNS resources. You can use the IAM console to add this permission to your service role.

The following policy allows publishing to Amazon SNS topics. You can name the policy `SNSPublish`. Attach it to your CodePipeline service role.

**Important**  
Make sure you are signed in to the AWS Management Console with the same account information you used in [Getting started with CodePipeline](getting-started-codepipeline.md).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "sns:Publish",
            "Resource": "*"
        }
    ]
}
```

------
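If you prefer a least-privilege policy, you can scope the `Resource` element to the specific approval topic instead of using `*`. A sketch, using the example topic ARN from this guide:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "sns:Publish",
            "Resource": "arn:aws:sns:us-east-2:80398EXAMPLE:MyApprovalTopic"
        }
    ]
}
```

With this variant, the service role can publish only to that one topic; pipelines using other notification topics would need their ARNs added to the `Resource` list.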

**To use the JSON policy editor to create a policy**

1. Sign in to the AWS Management Console and open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. In the navigation pane on the left, choose **Policies**. 

   If this is your first time choosing **Policies**, the **Welcome to Managed Policies** page appears. Choose **Get Started**.

1. At the top of the page, choose **Create policy**.

1. In the **Policy editor** section, choose the **JSON** option.

1. Enter or paste a JSON policy document. For details about the IAM policy language, see [IAM JSON policy reference](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies.html).

1. Resolve any security warnings, errors, or general warnings generated during [policy validation](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_policy-validator.html), and then choose **Next**. 
**Note**  
You can switch between the **Visual** and **JSON** editor options anytime. However, if you make changes or choose **Next** in the **Visual** editor, IAM might restructure your policy to optimize it for the visual editor. For more information, see [Policy restructuring](https://docs.aws.amazon.com/IAM/latest/UserGuide/troubleshoot_policies.html#troubleshoot_viseditor-restructure) in the *IAM User Guide*.

1. (Optional) When you create or edit a policy in the AWS Management Console, you can generate a JSON or YAML policy template that you can use in CloudFormation templates.

   To do this, in the **Policy editor** choose **Actions**, and then choose **Generate CloudFormation template**. To learn more about CloudFormation, see [AWS Identity and Access Management resource type reference](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/AWS_IAM.html) in the *AWS CloudFormation User Guide*.

1. When you are finished adding permissions to the policy, choose **Next**.

1. On the **Review and create** page, enter a **Policy name** and a **Description** (optional) for the policy that you are creating. Review **Permissions defined in this policy** to see the permissions that are granted by your policy.

1. (Optional) Add metadata to the policy by attaching tags as key-value pairs. For more information about using tags in IAM, see [Tags for AWS Identity and Access Management resources](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_tags.html) in the *IAM User Guide*.

1. Choose **Create policy** to save your new policy.

# Add a manual approval action to a pipeline in CodePipeline
Add a manual approval action

You can add an approval action to a stage in a CodePipeline pipeline at the point where you want the pipeline to stop so someone can manually approve or reject the action. 

**Note**  
Approval actions can't be added to Source stages. Source stages can contain only source actions. 

If you want to use Amazon SNS to send notifications when an approval action is ready for review, you must first complete the following prerequisites: 
+ Grant permission to your CodePipeline service role to access Amazon SNS resources. For information, see [Grant Amazon SNS permissions to a CodePipeline service role](approvals-service-role-permissions.md).
+ Grant permission to one or more IAM identities in your organization to update the status of an approval action. For information, see [Grant approval permissions to an IAM user in CodePipeline](approvals-iam-permissions.md).

In this example, you create a new approval stage and add a manual approval action to the stage. You can also add a manual approval action to an existing stage that contains other actions.

## Add a manual approval action to a CodePipeline pipeline (console)


You can use the CodePipeline console to add an approval action to an existing CodePipeline pipeline. You must use the AWS CLI if you want to add approval actions when you create a new pipeline. 

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. In **Name**, choose the pipeline.

1. On the pipeline details page, choose **Edit**.

1. If you want to add an approval action to a new stage, choose **+ Add stage** at the point in the pipeline where you want to add an approval request. On the **Add stage** page, in **Stage name**, enter a name for the stage. For example, add a new stage and name it `Manual_Approval`.

   If you want to add an approval action to an existing stage, choose **Edit stage**. 

1. In the stage where you want to add the approval action, choose **+ Add action group**.

1. On the **Edit action** page, do the following:

   1. In **Action name**, enter a name to identify the action.

   1. In **Action provider**, under **Approval**, choose **Manual approval**.

   1. (Optional) In **SNS topic ARN**, choose the name of the topic to be used to send notifications for the approval action.

   1. (Optional) In **URL for review**, enter the URL of the page or application you want the approver to examine. Approvers can access this URL through a link included in the console view of the pipeline. 

   1. (Optional) In **Comments**, enter any other information you want to share with the reviewer.

   1. Choose **Save**.

## Add a manual approval action to a CodePipeline pipeline (CLI)


You can use the CLI to add an approval action to an existing pipeline or when you create a pipeline. You do this by including an approval action, with the Manual approval type, in a stage you are creating or editing. 

For more information about creating and editing pipelines, see [Create a pipeline, stages, and actions](pipelines-create.md) and [Edit a pipeline in CodePipeline](pipelines-edit.md).

To add a stage to a pipeline that includes only an approval action, you would include something similar to the following example when you create or update the pipeline. 

**Note**  
The `configuration` section is optional. This is just a portion, not the entire structure, of the file. For more information, see [CodePipeline pipeline structure reference](reference-pipeline-structure.md).

```
{
    "name": "MyApprovalStage",
    "actions": [
        {
            "name": "MyApprovalAction",
            "actionTypeId": {
                "category": "Approval",
                "owner": "AWS",
                "version": "1",
                "provider": "Manual"
            },
            "inputArtifacts": [],
            "outputArtifacts": [],
            "configuration": {
                "NotificationArn": "arn:aws:sns:us-east-2:80398EXAMPLE:MyApprovalTopic",
                "ExternalEntityLink": "http://example.com",
                "CustomData": "The latest changes include feedback from Bob."},
            "runOrder": 1
        }
    ]
}
```

If the approval action is in a stage with other actions, the section of your JSON file that contains the stage might instead look similar to the following example.

**Note**  
The `configuration` section is optional. This is just a portion, not the entire structure, of the file. For more information, see [CodePipeline pipeline structure reference](reference-pipeline-structure.md).

```
,
{
    "name": "Production",
    "actions": [
        {
            "inputArtifacts": [],
            "name": "MyApprovalAction",
            "actionTypeId": {
                "category": "Approval",
                "owner": "AWS",
                "version": "1",
                "provider": "Manual"
            },
            "outputArtifacts": [],
            "configuration": {
                "NotificationArn": "arn:aws:sns:us-east-2:80398EXAMPLE:MyApprovalTopic",
                "ExternalEntityLink": "http://example.com",
                "CustomData": "The latest changes include feedback from Bob."
            },
            "runOrder": 1
        },
        {
            "inputArtifacts": [
                {
                    "name": "MyApp"
                }
            ],
            "name": "MyDeploymentAction",
            "actionTypeId": {
                "category": "Deploy",
                "owner": "AWS",
                "version": "1",
                "provider": "CodeDeploy"
            },
            "outputArtifacts": [],
            "configuration": {
                "ApplicationName": "MyDemoApplication",
                "DeploymentGroupName": "MyProductionFleet"
            },
            "runOrder": 2
        }
    ]
}
```
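Before you merge a stage fragment like the ones above into your pipeline structure, it can help to load the fragment and check the action type locally. A minimal sketch in Python using only the standard library (the fragment is the approval-only stage from the first example):

```
import json

# The approval-only stage from the example above, as a JSON string.
stage_json = """
{
    "name": "MyApprovalStage",
    "actions": [
        {
            "name": "MyApprovalAction",
            "actionTypeId": {
                "category": "Approval",
                "owner": "AWS",
                "version": "1",
                "provider": "Manual"
            },
            "inputArtifacts": [],
            "outputArtifacts": [],
            "configuration": {
                "NotificationArn": "arn:aws:sns:us-east-2:80398EXAMPLE:MyApprovalTopic",
                "ExternalEntityLink": "http://example.com",
                "CustomData": "The latest changes include feedback from Bob."
            },
            "runOrder": 1
        }
    ]
}
"""

stage = json.loads(stage_json)  # raises ValueError if the JSON is malformed
action = stage["actions"][0]
assert action["actionTypeId"]["category"] == "Approval"
assert action["actionTypeId"]["provider"] == "Manual"
print("stage fragment is valid:", stage["name"])
```

If `json.loads` raises an error or an assertion fails, fix the fragment before updating the pipeline.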

# Approve or reject an approval action in CodePipeline
Approve or reject an approval action

When a pipeline includes an approval action, the pipeline execution stops at the point where the action has been added. The pipeline won't resume unless someone manually approves the action. If an approver rejects the action, or if no approval response is received within seven days of the pipeline stopping for the approval action, the pipeline status becomes "Failed."

If the person who added the approval action to the pipeline configured notifications, you might receive an email with the pipeline information and status for approval.

## Approve or reject an approval action (console)


If you receive a notification that includes a direct link to an approval action, choose the **Approve or reject** link, sign in to the console, and then continue with step 7 here. Otherwise, follow all of these steps.

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **All Pipelines** page, choose the name of the pipeline.

1. Locate the stage with the approval action. Choose **Review**.

   The **Review** dialog box appears. The **Details** tab shows the review content and comments.  
![\[The Details tab shows the review content and comments.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/manual-approval-review-details.png)

   The **Revisions** tab shows the source revisions for the execution.  
![\[The Revisions tab shows the source revisions for the execution.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/manual-approval-review-revisions.png)

1. On the **Details** tab, view the comments and URL, if any. The message also displays the URL of content for you to review, if one was included. 

1. If a URL was provided, choose the **URL for review** link in the action to open the target webpage, and then review the content.

1. In the **Review** window, enter review comments, such as why you are approving or rejecting the action, and then choose **Approve** or **Reject**.

1. Choose **Submit**.

## Approve or reject an approval request (CLI)


To use the CLI to respond to an approval action, you must first use the **get-pipeline-state** command to retrieve the token associated with the latest execution of the approval action. 

1. At a terminal (Linux, macOS, or Unix) or command prompt (Windows), run the [get-pipeline-state](https://docs.aws.amazon.com/cli/latest/reference/codepipeline/get-pipeline-state.html) command on the pipeline that contains the approval action. For example, for a pipeline named *MyFirstPipeline*, enter the following:

   ```
   aws codepipeline get-pipeline-state --name MyFirstPipeline
   ```

1. In the response to the command, locate the `token` value, which appears in `latestExecution` in the `actionStates` section for the approval action, as shown here:

   ```
   {
       "created": 1467929497.204,
       "pipelineName": "MyFirstPipeline",
       "pipelineVersion": 1,
       "stageStates": [
           {
               "actionStates": [
                   {
                       "actionName": "MyApprovalAction",
                       "currentRevision": {
                           "created": 1467929497.204,
                           "revisionChangeId": "CEM7d6Tp7zfelUSLCPPwo234xEXAMPLE",
                           "revisionId": "HYGp7zmwbCPPwo23xCMdTeqIlEXAMPLE"
                       },
                       "latestExecution": {
                           "lastUpdatedBy": "identity",
                           "summary": "The new design needs to be reviewed before release.",
                           "token": "1a2b3c4d-573f-4ea7-a67E-XAMPLETOKEN"
                       }
                   }
   //More content might appear here
   }
   ```

1. In a plain-text editor, create a file where you add the following, in JSON format:
   + The name of the pipeline that contains the approval action.
   + The name of the stage that contains the approval action.
   + The name of the approval action.
   + The token value you collected in the previous step.
   + Your response to the action (Approved or Rejected). The response must be capitalized.
   + Your summary comments.

   For the preceding *MyFirstPipeline* example, your file should look like this:

   ```
   {
     "pipelineName": "MyFirstPipeline",
     "stageName": "MyApprovalStage",
     "actionName": "MyApprovalAction",
     "token": "1a2b3c4d-573f-4ea7-a67E-XAMPLETOKEN",
     "result": {
       "status": "Approved",
       "summary": "The new design looks good. Ready to release to customers."
     }
   }
   ```

1. Save the file with a name like **approvalstage-approved.json**.

1. Run the [put-approval-result](https://docs.aws.amazon.com/cli/latest/reference/codepipeline/put-approval-result.html) command, specifying the name of the approval JSON file, similar to the following:
**Important**  
Be sure to include `file://` before the file name. It is required in this command.

   ```
   aws codepipeline put-approval-result --cli-input-json file://approvalstage-approved.json
   ```
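The token lookup and file creation in the steps above can also be scripted. A sketch in Python with only the standard library; the `state` dictionary stands in for the output of **get-pipeline-state**, and the names and token are the example values from this procedure:

```
import json

# A trimmed get-pipeline-state response (example values from this procedure).
state = {
    "pipelineName": "MyFirstPipeline",
    "stageStates": [
        {
            "stageName": "MyApprovalStage",
            "actionStates": [
                {
                    "actionName": "MyApprovalAction",
                    "latestExecution": {
                        "token": "1a2b3c4d-573f-4ea7-a67E-XAMPLETOKEN"
                    }
                }
            ]
        }
    ]
}

# Locate the approval action's token in the nested response.
token = next(
    action["latestExecution"]["token"]
    for stage in state["stageStates"]
    for action in stage.get("actionStates", [])
    if action["actionName"] == "MyApprovalAction"
)

# Build the put-approval-result input file.
result = {
    "pipelineName": "MyFirstPipeline",
    "stageName": "MyApprovalStage",
    "actionName": "MyApprovalAction",
    "token": token,
    "result": {
        "status": "Approved",
        "summary": "The new design looks good. Ready to release to customers."
    },
}
with open("approvalstage-approved.json", "w") as f:
    json.dump(result, f, indent=2)
print("wrote response with token", token)
```

You would then submit the file with `aws codepipeline put-approval-result --cli-input-json file://approvalstage-approved.json`, as in the final step above.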

# JSON data format for manual approval notifications in CodePipeline
JSON data format for manual approval notifications

For approval actions that use Amazon SNS notifications, JSON data about the action is created and published to Amazon SNS when the pipeline stops. You can use the JSON output to send messages to Amazon SQS queues or invoke functions in AWS Lambda. 

**Note**  
This guide does not address how to configure notifications using JSON. For information, see [Sending Amazon SNS Messages to Amazon SQS Queues](https://docs.aws.amazon.com/sns/latest/dg/SendMessageToSQS.html) and [Invoking Lambda Functions Using Amazon SNS Notifications](https://docs.aws.amazon.com/sns/latest/dg/sns-lambda.html) in the *Amazon SNS Developer Guide*.

The following example shows the structure of the JSON output available with CodePipeline approvals.

```
{
    "region": "us-east-2",
    "consoleLink": "https://console.aws.amazon.com/codepipeline/home?region=us-east-2#/view/MyFirstPipeline",
    "approval": {
        "pipelineName": "MyFirstPipeline",
        "stageName": "MyApprovalStage",
        "actionName": "MyApprovalAction",
        "token": "1a2b3c4d-573f-4ea7-a67E-XAMPLETOKEN",
        "expires": "2016-07-07T20:22Z",
        "externalEntityLink": "http://example.com",
        "approvalReviewLink": "https://console.aws.amazon.com/codepipeline/home?region=us-east-2#/view/MyFirstPipeline/MyApprovalStage/MyApprovalAction/approve/1a2b3c4d-573f-4ea7-a67E-XAMPLETOKEN",
        "customData": "Review the latest changes and approve or reject within seven days."
    }
}
```
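For example, a Lambda function subscribed to the topic receives this payload as a JSON string in the `Message` field of each SNS record. A minimal parsing sketch in Python (the handler name and return shape are illustrative, not part of the notification format):

```
import json

def handler(event, context=None):
    # Each SNS record carries the approval payload as a JSON string in Message.
    message = json.loads(event["Records"][0]["Sns"]["Message"])
    approval = message["approval"]
    return {
        "pipeline": approval["pipelineName"],
        "action": approval["actionName"],
        "review_url": message["consoleLink"],
        "expires": approval["expires"],
    }

# Example invocation with a payload like the one shown above, wrapped in the
# standard SNS-to-Lambda event envelope:
sample = {"Records": [{"Sns": {"Message": json.dumps({
    "region": "us-east-2",
    "consoleLink": "https://console.aws.amazon.com/codepipeline/home?region=us-east-2#/view/MyFirstPipeline",
    "approval": {
        "pipelineName": "MyFirstPipeline",
        "stageName": "MyApprovalStage",
        "actionName": "MyApprovalAction",
        "token": "1a2b3c4d-573f-4ea7-a67E-XAMPLETOKEN",
        "expires": "2016-07-07T20:22Z",
    },
})}}]}
print(handler(sample)["pipeline"])  # prints "MyFirstPipeline"
```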

# Add a cross-Region action in CodePipeline
Add a cross-Region action to a pipeline

AWS CodePipeline includes a number of actions that help you configure build, test, and deploy resources for your automated release process. You can add actions to your pipeline that are in an AWS Region different from your pipeline. When an AWS service is the provider for an action, and that action is in a different AWS Region from your pipeline, it is a cross-Region action. 

**Note**  
Cross-Region actions can only be created in AWS Regions where CodePipeline is supported. For a list of the supported AWS Regions for CodePipeline, see [Quotas in AWS CodePipeline](limits.md).

You can use the console, AWS CLI, or CloudFormation to add cross-Region actions in pipelines.

**Note**  
Certain action types in CodePipeline might be available only in certain AWS Regions. There might also be AWS Regions where an action type is available, but a specific AWS provider for that action type is not.

When you create or edit a pipeline, you must have an artifact bucket in the pipeline Region, and one artifact bucket in each Region where you plan to run an action. For more information about the `ArtifactStores` parameter, see [CodePipeline pipeline structure reference](reference-pipeline-structure.md).

**Note**  
CodePipeline handles the copying of artifacts from one AWS Region to the other Regions when performing cross-Region actions.

If you use the console to create a pipeline or cross-Region actions, default artifact buckets are configured by CodePipeline in the Regions where you have actions. When you use the AWS CLI, CloudFormation, or an SDK to create a pipeline or cross-Region actions, you provide the artifact bucket for each Region where you have actions. 

**Note**  
You must create the artifact bucket and encryption key in the same AWS Region as the cross-Region action and in the same account as your pipeline.

You cannot create cross-Region actions for the following action types:
+ Source actions
+ Third-party actions
+ Custom actions

**Note**  
When you use a cross-Region Lambda invoke action in CodePipeline, the Lambda function should report its execution status with [PutJobSuccessResult](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_PutJobSuccessResult.html) and [PutJobFailureResult](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_PutJobFailureResult.html) calls sent to the AWS Region where the Lambda function resides, not to the Region where the pipeline exists.

When a pipeline includes a cross-Region action as part of a stage, CodePipeline replicates only the input artifacts of the cross-Region action from the pipeline Region to the action's Region.

**Note**  
The pipeline Region and the Region where your CloudWatch Events change detection resources are maintained remain the same. The Region where your pipeline is hosted does not change.



## Manage cross-Region actions in a pipeline (console)


You can use the CodePipeline console to add a cross-Region action to an existing pipeline. To create a new pipeline with cross-Region actions using the Create pipeline wizard, see [Create a custom pipeline (console)](pipelines-create.md#pipelines-create-console).

In the console, you create a cross-Region action in a pipeline stage by choosing the action provider and the **Region** field, which lists the resources you have created in that Region for that provider. When you add a cross-Region action, CodePipeline uses a separate artifact bucket in the action's Region. For more information about cross-Region artifact buckets, see [CodePipeline pipeline structure reference](reference-pipeline-structure.md).

### Add a cross-Region action to a pipeline stage (console)


Use the console to add a cross-Region action to a pipeline.

**Note**  
If the pipeline is running when changes are saved, that execution does not complete.

**To add a cross-Region action**

1. Sign in to the console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. Select your pipeline, and then choose **Edit**.

1. At the bottom of the diagram, choose **+ Add stage** if you are adding a new stage, or choose **Edit stage** if you want to add the action to an existing stage.

1. On **Edit: <Stage>**, choose **+ Add action group** to add a serial action. Or choose **+ Add action** to add a parallel action.

1. On the **Edit action** page:

   1. In **Action name**, enter a name for the cross-Region action.

   1. In **Action provider**, choose the action provider.

   1. In **Region**, choose the AWS Region where you have created or plan to create the resource for the action. When the Region is selected, the available resources for that Region are listed for selection. The **Region** field designates where the AWS resources are created for this action type and provider type. This field is displayed only for actions where the action provider is an AWS service. The **Region** field defaults to the same AWS Region as your pipeline.

   1. In **Input artifacts**, choose the appropriate input from the previous stage. For example, if the previous stage is a source stage, choose **SourceArtifact**.

   1. Complete all the required fields for the action provider you are configuring.

   1. In **Output artifacts**, choose the appropriate output for the next stage. For example, if the next stage is a deployment stage, choose **BuildArtifact**.

   1. Choose **Save**.

1. On **Edit: <Stage>**, choose **Done**.

1. Choose **Save**.

### Edit a cross-Region action in a pipeline stage (console)


Use the console to edit an existing cross-Region action in a pipeline.

**Note**  
If the pipeline is running when changes are saved, that execution does not complete.

**To edit a cross-Region action**

1. Sign in to the console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. Select your pipeline, and then choose **Edit**.

1. Choose **Edit stage**.

1. On **Edit: <Stage>**, choose the icon to edit an existing action.

1. On the **Edit action** page, make changes to the fields, as appropriate.

1. On **Edit: <Stage>**, choose **Done**.

1. Choose **Save**.

### Delete a cross-Region action from a pipeline stage (console)


Use the console to delete an existing cross-Region action from a pipeline.

**Note**  
If the pipeline is running when changes are saved, that execution does not complete.

**To delete a cross-Region action**

1. Sign in to the console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. Select your pipeline, and then choose **Edit**.

1. Choose **Edit stage**.

1. On **Edit: <Stage>**, choose the icon to delete an existing action.

1. On **Edit: <Stage>**, choose **Done**.

1. Choose **Save**.

## Add a cross-Region action to a pipeline (CLI)


You can use the AWS CLI to add a cross-Region action to an existing pipeline.

To create a cross-Region action in a pipeline stage with the AWS CLI, you add the action configuration along with an optional `region` field. You must also have already created an artifact bucket in the action's Region. Instead of providing the `artifactStore` parameter of a single-Region pipeline, you use the `artifactStores` parameter to list each Region's artifact bucket.

**Note**  
In this walkthrough and its examples, *RegionA* is the Region where the pipeline is created. It has access to the *RegionA* Amazon S3 bucket used to store pipeline artifacts and the service role used by CodePipeline. *RegionB* is the Region where the CodeDeploy application, deployment group, and service role used by CodeDeploy are created. 

### Prerequisites


You must have created the following:
+ A pipeline in *RegionA*. 
+ An Amazon S3 artifact bucket in *RegionB*. 
+ The resources for your action, such as your CodeDeploy application and deployment group for a cross-Region deploy action, in *RegionB*.

### Add a cross-Region action to a pipeline (CLI)


Use the AWS CLI to add a cross-Region action to a pipeline.

**To add a cross-Region action**

1. For a pipeline in *RegionA*, run the **get-pipeline** command to copy the pipeline structure into a JSON file. For example, for a pipeline named `MyFirstPipeline`, run the following command: 

   ```
   aws codepipeline get-pipeline --name MyFirstPipeline >pipeline.json
   ```

   This command returns nothing, but the file you created should appear in the directory where you ran the command.

1. Add the `region` field to add a new stage with your cross-Region action that includes the Region and resources for your action. The following JSON sample adds a Deploy stage with a cross-Region deploy action where the provider is CodeDeploy, in the new Region *RegionB*.

   ```
   {
       "name": "Deploy",
       "actions": [
           {
               "inputArtifacts": [
                   {
                       "name": "SourceArtifact"
                   }
               ],
               "name": "Deploy",
               "region": "RegionB",
               "actionTypeId": {
                   "category": "Deploy",
                   "owner": "AWS",
                   "version": "1",
                   "provider": "CodeDeploy"
               },
               "outputArtifacts": [],
               "configuration": {
                   "ApplicationName": "name",
                   "DeploymentGroupName": "name"
               },
               "runOrder": 1
           }
       ]
   }
   ```

1. In the pipeline structure, remove the `artifactStore` field and add the `artifactStores` map for your new cross-Region action. The mapping must include an entry for each AWS Region in which you have actions. For each entry in the mapping, the resources must be in the respective AWS Region. In the example below, `ID-A` is the encryption key ID for *RegionA*, and `ID-B` is the encryption key ID for *RegionB*.

   ```
   "artifactStores":{  
      "RegionA":{  
         "encryptionKey":{  
            "id":"ID-A",
            "type":"KMS"
         },
         "location":"Location1",
         "type":"S3"
      },
      "RegionB":{  
         "encryptionKey":{  
            "id":"ID-B",
            "type":"KMS"
         },
         "location":"Location2",
         "type":"S3"
      }
   }
   ```

   The following JSON example shows the us-west-2 bucket as `my-storage-bucket` and adds the new us-east-1 bucket named `my-storage-bucket-us-east-1`.

   ```
           "artifactStores": {
               "us-west-2": {
                   "type": "S3",
                   "location": "my-storage-bucket"
               },
               "us-east-1": {
                   "type": "S3",
                   "location": "my-storage-bucket-us-east-1"
               }
           },
   ```
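   If you script this edit, the conversion from `artifactStore` to the `artifactStores` map is mechanical. The following Python sketch shows one way to do it; the Regions and bucket names are the placeholders from this example:

   ```python
   # Convert the single-Region "artifactStore" field of a pipeline
   # definition into the "artifactStores" map required for
   # cross-Region actions. Regions and bucket names are placeholders.
   def to_artifact_stores(pipeline, home_region, extra_stores):
       """Replace 'artifactStore' with an 'artifactStores' map keyed by Region."""
       stores = {home_region: pipeline.pop("artifactStore")}
       stores.update(extra_stores)
       pipeline["artifactStores"] = stores
       return pipeline

   pipeline = {
       "name": "MyFirstPipeline",
       "artifactStore": {"type": "S3", "location": "my-storage-bucket"},
   }
   to_artifact_stores(
       pipeline,
       "us-west-2",
       {"us-east-1": {"type": "S3", "location": "my-storage-bucket-us-east-1"}},
   )
   print(pipeline["artifactStores"]["us-east-1"]["location"])  # my-storage-bucket-us-east-1
   ```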

1. If you are working with the pipeline structure retrieved using the **get-pipeline** command, remove the `metadata` lines from the JSON file. Otherwise, the **update-pipeline** command cannot use it. Remove the `"metadata": { }` lines and the `"created"`, `"pipelineArn"`, and `"updated"` fields.

   For example, remove the following lines from the structure: 

   ```
   "metadata": {  
     "pipelineArn": "arn:aws:codepipeline:region:account-ID:pipeline-name",
     "created": "date",
     "updated": "date"
     }
   ```

   Save the file.
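   Rather than deleting these lines by hand, you can strip them programmatically. The following Python sketch removes the `metadata` block from a structure like the one **get-pipeline** returns; the sample values are illustrative:

   ```python
   import json

   def strip_metadata(doc):
       """Remove the read-only 'metadata' block that get-pipeline adds,
       so the structure can be passed back to update-pipeline."""
       doc.pop("metadata", None)
       return doc

   # Illustrative structure; a real file comes from the get-pipeline command.
   doc = {
       "pipeline": {"name": "MyFirstPipeline", "version": 4},
       "metadata": {
           "pipelineArn": "arn:aws:codepipeline:region:account-ID:pipeline-name",
           "created": "date",
           "updated": "date",
       },
   }
   strip_metadata(doc)
   print(json.dumps(doc, indent=4))
   ```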

1. To apply your changes, run the **update-pipeline** command, specifying the pipeline JSON file:
**Important**  
Be sure to include `file://` before the file name. It is required in this command.

   ```
   aws codepipeline update-pipeline --cli-input-json file://pipeline.json
   ```

   This command returns the entire structure of the edited pipeline. The output is similar to the following.

   ```
   {
       "pipeline": {
           "version": 4,
           "roleArn": "ARN",
           "stages": [
               {
                   "name": "Source",
                   "actions": [
                       {
                           "inputArtifacts": [],
                           "name": "Source",
                           "actionTypeId": {
                               "category": "Source",
                               "owner": "AWS",
                               "version": "1",
                               "provider": "CodeCommit"
                           },
                           "outputArtifacts": [
                               {
                                   "name": "SourceArtifact"
                               }
                           ],
                           "configuration": {
                               "PollForSourceChanges": "false",
                               "BranchName": "main",
                               "RepositoryName": "MyTestRepo"
                           },
                           "runOrder": 1
                       }
                   ]
               },
               {
                   "name": "Deploy",
                   "actions": [
                       {
                           "inputArtifacts": [
                               {
                                   "name": "SourceArtifact"
                               }
                           ],
                           "name": "Deploy",
                           "region": "us-east-1",
                           "actionTypeId": {
                               "category": "Deploy",
                               "owner": "AWS",
                               "version": "1",
                               "provider": "CodeDeploy"
                           },
                           "outputArtifacts": [],
                           "configuration": {
                               "ApplicationName": "name",
                               "DeploymentGroupName": "name"
                           },
                           "runOrder": 1
                       }
                   ]
               }
           ],
           "name": "AnyCompanyPipeline",
           "artifactStores": {
               "us-west-2": {
                   "type": "S3",
                   "location": "my-storage-bucket"
               },
               "us-east-1": {
                   "type": "S3",
                   "location": "my-storage-bucket-us-east-1"
               }
           }
       }
   }
   ```
**Note**  
The **update-pipeline** command stops the pipeline. If a revision is being run through the pipeline when you run the **update-pipeline** command, that run is stopped. You must manually start the pipeline to run that revision through the updated pipeline. Use the **`start-pipeline-execution`** command to manually start your pipeline.

1. After you update your pipeline, the cross-Region action is displayed in the console.  
![\[A high-level view of a pipeline that includes a cross-Region action.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/cross-region-icon.png)

## Add a cross-Region action to a pipeline (CloudFormation)


You can use CloudFormation to add a cross-Region action to an existing pipeline.

**To add a cross-Region action with CloudFormation**

1. Add the `Region` parameter to the `ActionDeclaration` resource in your template, as shown in this example:

   ```
   ActionDeclaration:
                 Type: Object
                 Properties:
                   ActionTypeId:
                     Type: ActionTypeId
                     Required: true
                   Configuration:
                     Type: Map
                   InputArtifacts:
                     Type: Array
                     ItemType:
                       Type: InputArtifact
                   Name:
                     Type: String
                     Required: true
                   OutputArtifacts:
                     Type: Array
                     ItemType:
                       Type: OutputArtifact
                   RoleArn:
                     Type: String
                   RunOrder:
                     Type: Integer
                   Region:
                     Type: String
   ```
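   In a deployable template, the `Region` property appears on the action declaration inside an `AWS::CodePipeline::Pipeline` resource. The following YAML sketch shows a cross-Region deploy action; the Region, application, and deployment group names are placeholders:

   ```yaml
   # Stage fragment for an AWS::CodePipeline::Pipeline resource.
   # Resource names below are illustrative placeholders.
   - Name: Deploy
     Actions:
       - Name: CrossRegionDeploy
         Region: eu-central-1          # Region where the CodeDeploy resources live
         ActionTypeId:
           Category: Deploy
           Owner: AWS
           Provider: CodeDeploy
           Version: "1"
         InputArtifacts:
           - Name: SourceArtifact
         Configuration:
           ApplicationName: my-application
           DeploymentGroupName: my-deployment-group
         RunOrder: 1
   ```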

1. Under `Mappings`, add the region map as shown in this example for a mapping named `SecondRegionMap` that maps values for the keys `RegionA` and `RegionB`. Under the `Pipeline` resource, under the `artifactStore` field, add the `artifactStores` map for your new cross-Region action as follows:

   ```
   Mappings:
     SecondRegionMap:
       RegionA:
         SecondRegion: "RegionB"
       RegionB:
         SecondRegion: "RegionA"
   
   ...
   
             Properties:
               ArtifactStores:
                 -
                   Region: RegionB
                   ArtifactStore:
                     Type: "S3"
                     Location: test-cross-region-artifact-store-bucket-RegionB
                 -
                   Region: RegionA
                   ArtifactStore:
                     Type: "S3"
                     Location: test-cross-region-artifact-store-bucket-RegionA
   ```

   The following YAML example uses `us-west-2` as the *RegionA* bucket and adds a new *RegionB* bucket in `eu-central-1`:

   ```
   Mappings:
     SecondRegionMap:
       us-west-2:
         SecondRegion: "eu-central-1"
       eu-central-1:
         SecondRegion: "us-west-2"
   
   ...
   
             Properties:
               ArtifactStores:
                 -
                   Region: eu-central-1
                   ArtifactStore:
                     Type: "S3"
                     Location: test-cross-region-artifact-store-bucket-eu-central-1
                 -
                   Region: us-west-2
                   ArtifactStore:
                     Type: "S3"
                     Location: test-cross-region-artifact-store-bucket-us-west-2
   ```

1. Save the updated template to your local computer, and then open the CloudFormation console.

1. Choose your stack, and then choose **Create Change Set for Current Stack**. 

1. Upload the template, and then view the changes listed in CloudFormation. These are the changes to be made to the stack. You should see your new resources in the list.

1. Choose **Execute**.

# Working with variables
Working with variables

Some actions in CodePipeline generate output variables. To use these variables:
+ Assign a namespace to an action so that the variables it produces are available to a downstream action configuration.
+ Configure the downstream action to consume the variables generated by the upstream action.

You can view the details for each action execution to see the values of the output variables that the action generated at execution time.

To see step-by-step examples of using variables:
+ For a tutorial with a Lambda action that uses variables from an upstream action (CodeCommit) and generates output variables, see [Tutorial: Using variables with Lambda invoke actions](tutorials-lambda-variables.md).
+ For a tutorial with a CloudFormation action that references stack output variables from an upstream CloudFormation action, see [Tutorial: Create a pipeline that uses variables from AWS CloudFormation deployment actions](tutorials-cloudformation-action.md).
+ For an example manual approval action with message text that references output variables that resolve to the CodeCommit commit ID and commit message, see [Example: Use variables in manual approvals](#actions-variables-examples-approvals).
+ For an example CodeBuild action with an environment variable that resolves to the GitHub branch name, see [Example: Use a BranchName variable with CodeBuild environment variables](#actions-variables-examples-env-branchname).
+ CodeBuild actions produce as variables all environment variables that were exported as part of the build. For more information, see [CodeBuild action output variables](reference-variables.md#reference-variables-list-configured-codebuild). For a list of the environment variables you can use in CodeBuild, see [ Environment variables in build environments](https://docs.aws.amazon.com/codebuild/latest/userguide/build-env-ref-env-vars.html) in the *AWS CodeBuild User Guide*.

**Topics**
+ [Configure actions for variables](#actions-variables-create)
+ [View output variables](#actions-variables-view)
+ [Example: Use variables in manual approvals](#actions-variables-examples-approvals)
+ [Example: Use a BranchName variable with CodeBuild environment variables](#actions-variables-examples-env-branchname)

## Configure actions for variables


When you add an action to your pipeline, you can assign it a namespace and configure it to consume variables from previous actions.

### Configure actions with variables (console)


This example creates a pipeline with a CodeCommit source action and a CodeBuild build action. The CodeBuild action is configured to consume the variables produced by the CodeCommit action.

If the namespace isn’t specified, the variables are not available for reference in the action configuration. When you use the console to create a pipeline, the namespace for each action is generated automatically.
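At execution time, CodePipeline replaces each `#{Namespace.Key}` token in an action configuration with the matching output variable. The following Python sketch approximates that substitution to illustrate the token syntax; the variable values are illustrative:

```python
import re

def resolve_variables(value, variables):
    """Replace #{Namespace.Key} tokens using a {namespace: {key: value}} map.
    This mimics, in outline, how CodePipeline resolves variables."""
    def lookup(match):
        namespace, key = match.groups()
        return variables[namespace][key]
    return re.sub(r"#\{(\w+)\.(\w+)\}", lookup, value)

# Illustrative values for the namespaces used in this section.
variables = {
    "codepipeline": {"PipelineExecutionId": "12345678-EXAMPLE"},
    "SourceVariables": {"CommitId": "d99b0083cc10EXAMPLE"},
}
resolved = resolve_variables("Commit: #{SourceVariables.CommitId}", variables)
print(resolved)  # Commit: d99b0083cc10EXAMPLE
```

CodePipeline performs this resolution for you; the sketch only shows what the syntax expands to.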

**To create a pipeline with variables**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. Choose **Create pipeline**. Enter a name for your pipeline, and then choose **Next**.

1. In **Source**, in **Provider**, choose **CodeCommit**. Choose the CodeCommit repository and branch for the source action, and then choose **Next**.

1. In **Build**, in **Provider**, choose **CodeBuild**. Choose an existing CodeBuild build project name or choose **Create project**. On **Create build project**, create a build project, and then choose **Return to CodePipeline**.

   Under **Environment variables**, choose **Add environment variables**. For example, enter the execution ID with the variable syntax `#{codepipeline.PipelineExecutionId}` and commit ID with the variable syntax `#{SourceVariables.CommitId}`. 
**Note**  
You can enter variable syntax in any action configuration field in the wizard.

1. Choose **Create**.

1. After the pipeline is created, you can view the namespace that was created by the wizard. On the pipeline, choose the icon for the stage you want to view the namespace for. In this example, the source action's auto-generated namespace, `SourceVariables`, is displayed.   
![\[Example: Action Info Screen\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/variables-popup-namespace.png)

**To edit the namespace for an existing action**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. Choose the pipeline you want to edit, and then choose **Edit**. For the source stage, choose **Edit stage**. Choose the edit icon on the CodeCommit action.

1. On **Edit action**, view the **Variable namespace** field. If the existing action was created previously or without using the wizard, you must add a namespace. In **Variable namespace**, enter a namespace name, and then choose **Save**.

**To view output variables**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. After the pipeline is created and runs successfully, you can view the variables on the **Action execution details** page. For information, see [View variables (console)](#actions-variables-view-console).

### Configure actions for variables (CLI)


When you use the **create-pipeline** command to create a pipeline or the **update-pipeline** command to edit a pipeline, you can reference variables in the configuration of an action.

 If the namespace isn't specified, the variables produced by the action are not available to be referenced in any downstream action configuration.

**To configure an action with a namespace**

1. Follow the steps in [Create a pipeline, stages, and actions](pipelines-create.md) to create a pipeline using the CLI. Create an input file to supply to the **create-pipeline** command with the `--cli-input-json` parameter. In the pipeline structure, add the `namespace` parameter and specify a name, such as `SourceVariables`.

   ```
   . . . 
   {
             "inputArtifacts": [],
             "name": "Source",
             "region": "us-west-2",
             "namespace": "SourceVariables",
             "actionTypeId": {
               "category": "Source",
               "owner": "AWS",
               "version": "1",
               "provider": "CodeCommit"
             },
             "outputArtifacts": [
   
   . . .
   ```

1. Save the file with a name like **MyPipeline.json**.

1. At a terminal (Linux, macOS, or Unix) or command prompt (Windows), run the **create-pipeline** command to create the pipeline, specifying the file you created with the `--cli-input-json` parameter. For example:

   ```
   aws codepipeline create-pipeline --cli-input-json file://MyPipeline.json
   ```

**To configure downstream actions to consume variables**

1. Edit an input file to provide the **update-pipeline** command with the `--cli-input-json` parameter. In the downstream action, add the variable to the configuration for that action. A variable is made up of a namespace and key, separated by a period. For example, to add variables for the pipeline execution ID and the source commit ID, specify the namespace `codepipeline` for the variable `#{codepipeline.PipelineExecutionId}`. Specify the namespace `SourceVariables` for the variable `#{SourceVariables.CommitId}`. 

   ```
   {
       "name": "Build",
       "actions": [
           {
               "outputArtifacts": [
                   {
                       "name": "BuildArtifacts"
                   }
               ],
               "name": "Build",
               "configuration": {
                   "EnvironmentVariables": "[{\"name\":\"Execution_ID\",\"value\":\"#{codepipeline.PipelineExecutionId}\",\"type\":\"PLAINTEXT\"},{\"name\":\"Commit_ID\",\"value\":\"#{SourceVariables.CommitId}\",\"type\":\"PLAINTEXT\"}]",
                   "ProjectName": "env-var-test"
               },
               "inputArtifacts": [
                   {
                       "name": "SourceArtifact"
                   }
               ],
               "region": "us-west-2",
               "actionTypeId": {
                   "provider": "CodeBuild",
                   "category": "Build",
                   "version": "1",
                   "owner": "AWS"
               },
               "runOrder": 1
           }
       ]
   },
   ```
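   Because `EnvironmentVariables` is a JSON array serialized into a string, the inner quotation marks must be escaped. The following Python sketch builds the same value with `json.dumps` instead of hand-escaping:

   ```python
   import json

   # Build the EnvironmentVariables value for the CodeBuild action
   # configuration. The action expects the array as a single JSON string,
   # which is why the quotation marks are escaped in the pipeline file.
   env_vars = [
       {"name": "Execution_ID",
        "value": "#{codepipeline.PipelineExecutionId}",
        "type": "PLAINTEXT"},
       {"name": "Commit_ID",
        "value": "#{SourceVariables.CommitId}",
        "type": "PLAINTEXT"},
   ]
   configuration = {
       "EnvironmentVariables": json.dumps(env_vars, separators=(",", ":")),
       "ProjectName": "env-var-test",
   }
   print(configuration["EnvironmentVariables"])
   ```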

1. Save the file with a name like **MyPipeline.json**.

1. At a terminal (Linux, macOS, or Unix) or command prompt (Windows), run the **update-pipeline** command to update the pipeline, specifying the file you created with the `--cli-input-json` parameter. For example:

   ```
   aws codepipeline update-pipeline --cli-input-json file://MyPipeline.json
   ```

## View output variables


You can view the action execution details to see the variables for that action, specific to each execution.

### View variables (console)


You can use the console to view variables for an action.


1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

   The names of all pipelines associated with your AWS account are displayed.

1. In **Name**, choose the name of the pipeline.

1. Choose **View history**.

1. After the pipeline runs successfully, you can view the variables produced by the source action. In the action list for the pipeline execution, choose **Source** to view the action execution details for the CodeCommit action. On the action details page, view the variables under **Output variables**.  
![\[Example: Source output variables\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/variables-output.png)

1. After the pipeline runs successfully, you can view the variables consumed by the build action. In the action list for the pipeline execution, choose **Build** to view the action execution details for the CodeBuild action. On the action details page, view the variables under **Action configuration**. The auto-generated namespace is displayed.  
![\[Example: Action Configuration Variables\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/variables-actionconfig-codebuild.png)

   By default, **Action configuration** displays the variable syntax. You can choose **Show resolved configuration** to toggle the list to display the values that were produced during the action execution.  
![\[Example: Resolved Action Configuration Variables\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/variables-actionconfig-codebuild-resolved.png)

### View variables (CLI)


You can use the **list-action-executions** command to view variables for an action.

1. Use the following command: 

   ```
   aws codepipeline list-action-executions --pipeline-name <pipeline-name>
   ```

   The output shows the `outputVariables` parameter as shown here.

   ```
   "outputVariables": {
                       "BranchName": "main",
                       "CommitMessage": "Updated files for test",
                       "AuthorDate": "2019-11-08T22:24:34Z",
                       "CommitId": "d99b0083cc10EXAMPLE",
                       "CommitterDate": "2019-11-08T22:24:34Z",
                       "RepositoryName": "variables-repo"
                   },
   ```
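   If you script against this output, you can collect the variables per action. The following Python sketch assumes the `actionExecutionDetails` response shape of **list-action-executions**; the values are illustrative:

   ```python
   # Group output variables by action name from a list-action-executions
   # response. The response fragment below is illustrative.
   response = {
       "actionExecutionDetails": [
           {
               "actionName": "Source",
               "output": {
                   "outputVariables": {
                       "BranchName": "main",
                       "CommitId": "d99b0083cc10EXAMPLE",
                   }
               },
           }
       ]
   }

   variables_by_action = {
       detail["actionName"]: detail.get("output", {}).get("outputVariables", {})
       for detail in response["actionExecutionDetails"]
   }
   print(variables_by_action["Source"]["BranchName"])  # main
   ```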

1. Use the following command: 

   ```
   aws codepipeline get-pipeline --name <pipeline-name>
   ```

   In the action configuration for the CodeBuild action, you can view the variables: 

   ```
   {
       "name": "Build",
       "actions": [
           {
               "outputArtifacts": [
                   {
                       "name": "BuildArtifact"
                   }
               ],
               "name": "Build",
               "configuration": {
                   "EnvironmentVariables": "[{\"name\":\"Execution_ID\",\"value\":\"#{codepipeline.PipelineExecutionId}\",\"type\":\"PLAINTEXT\"},{\"name\":\"Commit_ID\",\"value\":\"#{SourceVariables.CommitId}\",\"type\":\"PLAINTEXT\"}]",
                   "ProjectName": "env-var-test"
               },
               "inputArtifacts": [
                   {
                       "name": "SourceArtifact"
                   }
               ],
               "region": "us-west-2",
               "actionTypeId": {
                   "provider": "CodeBuild",
                   "category": "Build",
                   "version": "1",
                   "owner": "AWS"
               },
               "runOrder": 1
           }
       ]
   },
   ```

## Example: Use variables in manual approvals


When you specify a namespace for an action, and that action produces output variables, you can add a manual approval that displays variables in the approval message. This example shows you how to add variable syntax to a manual approval message.


1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

   The names of all pipelines associated with your AWS account are displayed. Choose the pipeline you want to add the approval to.

1. To edit your pipeline, choose **Edit**. Add a manual approval after the source action. In **Action name**, enter the name of the approval action.

1. In **Action provider**, choose **Manual approval**.

1. In **URL for review**, add the variable syntax for `CommitId` to your CodeCommit URL. Make sure you use the namespace assigned to your source action. For example, the variable syntax for a CodeCommit action with the default namespace `SourceVariables` is `#{SourceVariables.CommitId}`.

   In **Comments**, enter a message that references the `CommitMessage` variable: 

   ```
   Please approve this change. Commit message: #{SourceVariables.CommitMessage}
   ```

1. After the pipeline runs successfully, you can view the variable values in the approval message.

## Example: Use a BranchName variable with CodeBuild environment variables


When you add a CodeBuild action to your pipeline, you can use CodeBuild environment variables to reference a `BranchName` output variable from an upstream source action. With an output variable from an action in CodePipeline, you can create your own CodeBuild environment variables for use in your build commands.

This example shows you how to add output variable syntax from a GitHub source action to a CodeBuild environment variable. The output variable syntax in this example represents the GitHub source action output variable for `BranchName`. After the action runs successfully, the variable resolves to show the GitHub branch name.

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

   The names of all pipelines associated with your AWS account are displayed. Choose the pipeline you want to edit.

1. To edit your pipeline, choose **Edit**. On the stage that contains your CodeBuild action, choose **Edit stage**.

1. Choose the icon to edit your CodeBuild action.

1. On the **Edit action** page, under **Environment variables**, enter the following: 
   + In **Name**, enter a name for your environment variable.
   + In **Value**, enter the variable syntax for your pipeline output variable, which includes the namespace assigned to your source action. For example, the output variable syntax for a GitHub action with the default namespace `SourceVariables` is `#{SourceVariables.BranchName}`.
   + In **Type**, choose **Plaintext**.

1. After the pipeline runs successfully, you can verify that the resolved output variable is the value of the environment variable. Choose one of the following:
   + **CodePipeline console:** Choose your pipeline, and then choose **History**. Choose the most recent pipeline execution.
     + Under **Timeline**, choose the selector for **Source**. This is the source action that generates GitHub output variables. Choose **View execution details**. Under **Output variables**, view the list of output variables generated by this action.
     + Under **Timeline**, choose the selector for **Build**. This is the build action that specifies the CodeBuild environment variables for your build project. Choose **View execution details**. Under **Action configuration**, view your CodeBuild environment variables. Choose **Show resolved configuration**. Your environment variable value is the resolved `BranchName` output variable from the GitHub source action. In this example, the resolved value is `main`.

       For more information, see [View variables (console)](#actions-variables-view-console).
   + **CodeBuild console:** Choose your build project and choose the link for your build run. Under **Environment variables**, your resolved output variable is the value for the CodeBuild environment variable. In this example, the environment variable **Name** is `BranchName` and the **Value** is the resolved `BranchName` output variable from the GitHub source action. In this example, the resolved value is `main`.  
![\[Screen shot showing the resolved variable in the console\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/variable-codebuild-resolved.png)