

AWS Mainframe Modernization Service (Managed Runtime Environment experience) is no longer open to new customers. For capabilities similar to AWS Mainframe Modernization Service (Managed Runtime Environment experience) explore AWS Mainframe Modernization Service (Self-Managed Experience). Existing customers can continue to use the service as normal. For more information, see [AWS Mainframe Modernization availability change](https://docs.aws.amazon.com/m2/latest/userguide/mainframe-modernization-availability-change.html).

# Tutorials for Rocket Software (formerly Micro Focus)
<a name="tutorials-mf"></a>

The tutorials in this section help you get started with various tasks in the Rocket Software runtime engine for the AWS Mainframe Modernization service. These tutorials cover setting up the sample application build, using templates with Rocket Enterprise Developer, and setting up Rocket Enterprise Analyzer.

**Topics**
+ [

# Tutorial: Setting up the Rocket Software (formerly Micro Focus) build for the BankDemo sample application
](tutorial-build-mf.md)
+ [

# Tutorial: Set up WorkSpaces Applications for use with Rocket Enterprise Analyzer and Rocket Enterprise Developer
](set-up-appstream-mf.md)
+ [

# Tutorial: Use templates with Rocket Enterprise Developer (formerly Micro Focus Enterprise Developer)
](tutorial-templates-ed.md)
+ [

# Tutorial: Set up Enterprise Analyzer on WorkSpaces Applications
](set-up-ea.md)
+ [

# Tutorial: Set up Rocket Enterprise Developer on WorkSpaces Applications
](set-up-ed.md)

# Tutorial: Setting up the Rocket Software (formerly Micro Focus) build for the BankDemo sample application
<a name="tutorial-build-mf"></a>

AWS Mainframe Modernization provides you with the ability to set up builds and continuous integration/continuous delivery (CI/CD) pipelines for your migrated applications. These builds and pipelines use AWS CodeBuild, AWS CodeCommit, and AWS CodePipeline to provide these capabilities. CodeBuild is a fully managed build service that compiles your source code, runs unit tests, and produces artifacts that are ready to deploy. CodeCommit is a version control service that enables you to privately store and manage Git repositories in the AWS Cloud. CodePipeline is a continuous delivery service that enables you to model, visualize, and automate the steps required to release your software.

This tutorial demonstrates how to use AWS CodeBuild to compile the BankDemo sample application source code from Amazon S3 and then export the compiled code back to Amazon S3.

AWS CodeBuild is a fully managed continuous integration service that compiles source code, runs tests, and produces software packages that are ready to deploy. With CodeBuild, you can use prepackaged build environments, or you can create custom build environments that use your own build tools. This demo scenario uses the second option. It consists of a CodeBuild build environment that uses a pre-packaged Docker image.

**Important**  
Before you start your mainframe modernization project, we recommend that you learn about the [AWS Migration Acceleration Program (MAP) for Mainframe](https://aws.amazon.com/migration-acceleration-program/mainframe/) or contact [AWS mainframe specialists](mailto:mainframe@amazon.com) to learn about the steps required to modernize a mainframe application.

**Topics**
+ [

## Prerequisites
](#tutorial-build-mf-prerequisites)
+ [

## Step 1: Share the build assets with AWS account
](#tutorial-build-mf-assets)
+ [

## Step 2: Create Amazon S3 buckets
](#tutorial-build-mf-s3)
+ [

## Step 3: Create the build spec file
](#tutorial-build-mf-spec)
+ [

## Step 4: Upload the source files
](#tutorial-build-mf-upload)
+ [

## Step 5: Create IAM policies
](#tutorial-build-mf-IAM-policy)
+ [

## Step 6: Create an IAM role
](#tutorial-build-mf-IAM-role)
+ [

## Step 7: Attach the IAM policies to the IAM role
](#tutorial-build-mf-attach)
+ [

## Step 8: Create the CodeBuild project
](#tutorial-build-mf-create-project)
+ [

## Step 9: Start the build
](#tutorial-build-mf-start)
+ [

## Step 10: Download output artifacts
](#tutorial-build-mf-download-output)
+ [

## Clean up resources
](#tutorial-build-mf-clean)

## Prerequisites
<a name="tutorial-build-mf-prerequisites"></a>

Before you start this tutorial, complete the following prerequisites.
+ Download the [BankDemo sample application](https://d3lkpej5ajcpac.cloudfront.net/demo/mf/BANKDEMO-build.zip) and unzip it to a folder. The source folder contains COBOL programs, copybooks, and definitions. It also contains a JCL folder for reference, although you do not need to build the JCL. The folder also contains the meta files required for the build.
+ In the AWS Mainframe Modernization console, choose **Tools** . In **Analysis, development, and build assets**, choose **Share assets with my AWS account**.

## Step 1: Share the build assets with AWS account
<a name="tutorial-build-mf-assets"></a>

In this step, you share the build assets with your AWS account in the AWS Region where you intend to use them.

1. Open the AWS Mainframe Modernization console at [https://console.aws.amazon.com/m2/](https://us-west-2.console.aws.amazon.com/m2/home?region=us-west-2#/).

1. In the left navigation, choose **Tools**.

1. In **Analysis, development, and build assets**, choose **Share assets with my AWS account**.

**Important**  
You need to do this step once in every AWS Region where you intend to do builds.

## Step 2: Create Amazon S3 buckets
<a name="tutorial-build-mf-s3"></a>

In this step, you create two Amazon S3 buckets. The first is an input bucket to hold the source code, and the other is an output bucket to hold the build output. For more information, see [Creating, configuring, and working with Amazon S3 buckets](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-buckets-s3.html) in the *Amazon S3 User Guide*.

1. To create the input bucket, log in to the Amazon S3 console and choose **Create bucket**.

1. In **General configuration**, provide a name for the bucket and specify the AWS Region where you want to create the bucket. An example name is `codebuild-regionId-accountId-input-bucket`, where `regionId` is the AWS Region of the bucket, and `accountId` is your AWS account ID.
**Note**  
If you are creating the bucket in a different AWS Region from US East (N. Virginia), specify the `LocationConstraint` parameter. For more information, see [Create Bucket](https://docs.aws.amazon.com/AmazonS3/latest/API/API_CreateBucket.html) in the *Amazon Simple Storage Service API Reference*.

1. Retain all other settings and choose **Create bucket**.

1. Repeat steps 1-3 to create the output bucket. An example name is `codebuild-regionId-accountId-output-bucket`, where `regionId` is the AWS Region of the bucket and `accountId` is your AWS account ID.

   Whatever names you choose for these buckets, be sure to use them throughout this tutorial.
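
The Region-specific note above can also be expressed in code. The following is an illustrative sketch only (the function name is ours, not part of this tutorial) of how an S3 `CreateBucket` call might be parameterized so that `LocationConstraint` is supplied only outside US East (N. Virginia):

```python
def create_bucket_kwargs(name: str, region: str) -> dict:
    """Build keyword arguments for an S3 CreateBucket call.

    Outside us-east-1, S3 requires a CreateBucketConfiguration with a
    LocationConstraint that matches the target Region.
    """
    kwargs = {"Bucket": name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

# For example, with boto3 (account ID 111122223333 is a placeholder):
# boto3.client("s3", region_name="eu-west-3").create_bucket(
#     **create_bucket_kwargs("codebuild-eu-west-3-111122223333-input-bucket", "eu-west-3"))
```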

## Step 3: Create the build spec file
<a name="tutorial-build-mf-spec"></a>

In this step, you create a build spec file. This file provides build commands and related settings, in YAML format, for CodeBuild to run the build. For more information, see [Build specification reference for CodeBuild](https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html) in the *AWS CodeBuild User Guide*.

1. Create a file named `buildspec.yml` in the directory that you unzipped as a prerequisite.

1. Add the following content to the file and save. No changes are required for this file.

   ```
   version: 0.2
   env:
     exported-variables:
       - CODEBUILD_BUILD_ID
       - CODEBUILD_BUILD_ARN
   phases:
     install:
       runtime-versions:
         python: 3.7
     pre_build:
       commands:
         - echo Installing source dependencies...
         - ls -lR $CODEBUILD_SRC_DIR/source
     build:
       commands:
         - echo Build started on `date`
         - /start-build.sh -Dbasedir=$CODEBUILD_SRC_DIR/source -Dloaddir=$CODEBUILD_SRC_DIR/target 
     post_build:
       commands:
         - ls -lR $CODEBUILD_SRC_DIR/target
         - echo Build completed on `date`
   artifacts:
     files:
       - $CODEBUILD_SRC_DIR/target/**
   ```

   Here `CODEBUILD_BUILD_ID`, `CODEBUILD_BUILD_ARN`, `$CODEBUILD_SRC_DIR/source`, and `$CODEBUILD_SRC_DIR/target` are environment variables available within CodeBuild. For more information, see [Environment variables in build environments](https://docs.aws.amazon.com/codebuild/latest/userguide/build-env-ref-env-vars.html).

   At this point, your directory should look like this.

   ```
   (root directory name)
       |-- build.xml
       |-- buildspec.yml
       |-- LICENSE.txt
       |-- source
            |... etc.
   ```

1. Zip the contents of the folder to a file named `BankDemo.zip`. Don't zip the folder itself; zip the contents of the folder so that `buildspec.yml` is at the root of the archive.
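
Zipping the folder's contents (rather than the folder) keeps `buildspec.yml` at the root of the archive, which is where CodeBuild looks for it. As an illustration only (the function name is ours, not part of this tutorial), the following Python sketch archives a folder's contents without the enclosing folder name:

```python
import zipfile
from pathlib import Path

def zip_folder_contents(folder: str, zip_path: str) -> None:
    """Archive the files inside `folder` so that they sit at the root of
    the zip file, instead of being nested under the folder's own name."""
    root = Path(folder)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(root.rglob("*")):
            if path.is_file():
                # The arcname is relative to the folder, so buildspec.yml
                # ends up at the top level of the archive.
                zf.write(path, path.relative_to(root))

# Example: zip_folder_contents("BANKDEMO-build", "BankDemo.zip")
```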

## Step 4: Upload the source files
<a name="tutorial-build-mf-upload"></a>

In this step, you upload the source code for the BankDemo sample application to your Amazon S3 input bucket.

1. Log in to the Amazon S3 console and choose **Buckets** in the left navigation pane. Then choose the input bucket you created previously.

1. Under **Objects**, choose **Upload**.

1. In the **Files and folders** section, choose **Add Files**.

1. Navigate to and choose your `BankDemo.zip` file.

1. Choose **Upload**.

## Step 5: Create IAM policies
<a name="tutorial-build-mf-IAM-policy"></a>

In this step, you create two [IAM policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html). One policy grants permissions for AWS Mainframe Modernization to access and use the Docker image that contains the Rocket Software build tools. You don't need to customize this policy. The other policy grants permissions for AWS Mainframe Modernization to interact with the input and output buckets, and with the [Amazon CloudWatch Logs](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/WhatIsCloudWatchLogs.html) logs that CodeBuild generates.

To learn about creating an IAM policy, see [Editing IAM policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create.html) in the *IAM User Guide*.

**To create a policy for accessing Docker images**

1. In the IAM console, copy the following policy document and paste it into the policy editor.

------
#### [ JSON ]


   ```
   {
       "Version":"2012-10-17",		 	 	 
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "ecr:GetAuthorizationToken"
               ],
               "Resource": "*"
           },
           {
               "Effect": "Allow",
               "Action": [
                   "ecr:BatchCheckLayerAvailability",
                   "ecr:GetDownloadUrlForLayer",
                   "ecr:BatchGetImage"
               ],
               "Resource": "arn:aws:ecr:*:673918848628:repository/m2-enterprise-build-tools"
           },
           {
               "Effect": "Allow",
               "Action": [
                   "s3:PutObject"
               ],
               "Resource": "arn:aws:s3:::aws-m2-repo-*-<region>-prod"
           }
       ]
   }
   ```

------

1. Provide a name for the policy, for example, `m2CodeBuildPolicy`.

**To create a policy that allows AWS Mainframe Modernization to interact with buckets and logs**

1. In the IAM console, copy the following policy document and paste it into the policy editor. Make sure to update `regionId` to the AWS Region, and `accountId` to your AWS account.
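
   The policy document itself is not reproduced here. The following is a hedged sketch only, not the definitive policy: the bucket names follow the examples from Step 2, and you should adjust `regionId` and `accountId` and verify the actions against your own requirements.

------
#### [ JSON ]

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "s3:GetObject",
                   "s3:GetObjectVersion",
                   "s3:PutObject",
                   "s3:ListBucket"
               ],
               "Resource": [
                   "arn:aws:s3:::codebuild-regionId-accountId-input-bucket",
                   "arn:aws:s3:::codebuild-regionId-accountId-input-bucket/*",
                   "arn:aws:s3:::codebuild-regionId-accountId-output-bucket",
                   "arn:aws:s3:::codebuild-regionId-accountId-output-bucket/*"
               ]
           },
           {
               "Effect": "Allow",
               "Action": [
                   "logs:CreateLogGroup",
                   "logs:CreateLogStream",
                   "logs:PutLogEvents"
               ],
               "Resource": "arn:aws:logs:regionId:accountId:log-group:/aws/codebuild/*"
           }
       ]
   }
   ```

------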

1. Provide a name for the policy, for example, `BankdemoCodeBuildRolePolicy`.

## Step 6: Create an IAM role
<a name="tutorial-build-mf-IAM-role"></a>

In this step, you create a new [IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html) that allows CodeBuild to interact with AWS resources for you, after you associate the IAM policies that you previously created with this new IAM role.

For information about creating a service role, see [Creating a Role to Delegate Permissions to an AWS Service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html) in the *IAM User Guide*.

1. Log in to the IAM console and choose **Roles** in the left navigation pane.

1. Choose **Create role**.

1. Under **Trusted entity type**, choose **AWS service**.

1. Under **Use cases for other AWS services**, choose **CodeBuild**, and then choose **CodeBuild** again.

1. Choose **Next**.

1. On the **Add permissions** page, choose **Next**. You assign a policy to the role later.

1. Under **Role details**, provide a name for the role, for example, `BankdemoCodeBuildServiceRole`.

1. Under **Select trusted entities**, verify that the policy document looks like the following:

------
#### [ JSON ]


   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Principal": {
                   "Service": "codebuild.amazonaws.com"
               },
               "Action": "sts:AssumeRole"
           }
       ]
   }
   ```

------

1. Choose **Create role**.

## Step 7: Attach the IAM policies to the IAM role
<a name="tutorial-build-mf-attach"></a>

In this step, you attach the two IAM policies you previously created to the `BankdemoCodeBuildServiceRole` IAM role.

1. Log in to the IAM console and choose **Roles** in the left navigation pane.

1. In **Roles**, choose the role you created previously, for example, `BankdemoCodeBuildServiceRole`.

1. In **Permissions policies**, choose **Add permissions**, and then **Attach policies**.

1. In **Other permissions policies**, choose the policies that you created previously, for example, `m2CodeBuildPolicy` and `BankdemoCodeBuildRolePolicy`.

1. Choose **Attach policies.**

## Step 8: Create the CodeBuild project
<a name="tutorial-build-mf-create-project"></a>

In this step, you create the CodeBuild project.

1. Log in to the CodeBuild console and choose **Create build project**.

1. In the **Project configuration** section, provide a name for the project, for example, `codebuild-bankdemo-project`.

1. In the **Source** section, for **Source provider**, choose **Amazon S3**, and then choose the input bucket you created previously, for example, `codebuild-regionId-accountId-input-bucket`.

1. In the **S3 object key or S3 folder** field, enter the name of the zip file that you uploaded to the S3 bucket. In this case, the file name is `BankDemo.zip`.

1. In the **Environment** section, choose **Custom image**.

1. In the **Environment type** field, choose **Linux**.

1. Under **Image registry**, choose **Other registry**.

1. In the **External registry URL** field, enter one of the following:
   + For Rocket Software v9: `673918848628.dkr.ecr.us-west-1.amazonaws.com/m2-enterprise-build-tools:9.0.7.R1`. If you're using a different AWS Region with Rocket Software v9, you can also specify `673918848628.dkr.ecr.<m2-region>.amazonaws.com/m2-enterprise-build-tools:9.0.7.R1`, where `<m2-region>` is an AWS Region in which the AWS Mainframe Modernization service is available (for example, `eu-west-3`).
   + For Rocket Software v8: `673918848628.dkr.ecr.us-west-2.amazonaws.com/m2-enterprise-build-tools:8.0.9.R1`
   + For Rocket Software v7: `673918848628.dkr.ecr.us-west-2.amazonaws.com/m2-enterprise-build-tools:7.0.R10`

1. Under **Service role**, choose **Existing service role**, and in the **Role ARN** field, choose the service role you created previously; for example, `BankdemoCodeBuildServiceRole`.

1. In the **Buildspec** section, choose **Use a buildspec file**.

1. In the **Artifacts** section, under **Type**, choose **Amazon S3**, and then choose your output bucket, for example, `codebuild-regionId-accountId-output-bucket`.

1. In the **Name** field, enter a name for the build output artifact, for example, `bankdemo-output.zip`.

1. Under **Artifacts packaging**, choose **Zip**.

1. Choose **Create build project**.

## Step 9: Start the build
<a name="tutorial-build-mf-start"></a>

In this step, you start the build.

1. Log in to the CodeBuild console.

1. In the left navigation pane, choose **Build projects**.

1. Choose the build project that you created previously, for example, `codebuild-bankdemo-project`.

1. Choose **Start build**.

This starts the build, which runs asynchronously. The output is a JSON object that includes an `id` attribute, which is a reference to the CodeBuild build ID of the build that you just started. You can view the status of the build in the CodeBuild console. You can also see detailed logs about the build execution in the console. For more information, see [View detailed build information](https://docs.aws.amazon.com/codebuild/latest/userguide/getting-started-build-log-console.html) in the *AWS CodeBuild User Guide*.
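
For example, if you start the build with the AWS CLI instead (`aws codebuild start-build --project-name codebuild-bankdemo-project`), you can parse the returned JSON for the build ID. This is an illustrative sketch; the response values shown are placeholders, not real output:

```python
import json

# Placeholder response, in the shape returned by the CodeBuild StartBuild API.
sample_response = """
{
  "build": {
    "id": "codebuild-bankdemo-project:example-build-id",
    "buildStatus": "IN_PROGRESS"
  }
}
"""

def extract_build_id(response_json: str) -> str:
    """Return the `id` attribute from a StartBuild JSON response."""
    return json.loads(response_json)["build"]["id"]
```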

When the current phase is `COMPLETED`, your build finished successfully, and your compiled artifacts are ready on Amazon S3.

## Step 10: Download output artifacts
<a name="tutorial-build-mf-download-output"></a>

In this step, you download the output artifacts from Amazon S3. The Rocket Software build tool can create several different executable types. In this tutorial, it generates shared objects.

1. Log in to the Amazon S3 console.

1. In the **Buckets** section, choose the name of your output bucket, for example, `codebuild-regionId-accountId-output-bucket`.

1. Choose **Download**.

1. Unzip the downloaded file. Navigate to the target folder to see the build artifacts. These include the `.so` Linux shared objects.

## Clean up resources
<a name="tutorial-build-mf-clean"></a>

If you no longer need the resources that you created for this tutorial, delete them to avoid additional charges. To do so, complete the following steps:
+ Delete the S3 buckets that you created for this tutorial. For more information, see [Deleting a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/delete-bucket.html) in the *Amazon Simple Storage Service User Guide*.
+ Delete the IAM policies that you created for this tutorial. For more information, see [Deleting IAM policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_manage-delete.html) in the *IAM User Guide*.
+ Delete the IAM role that you created for this tutorial. For more information, see [Deleting roles or instance profiles](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_manage_delete.html) in the *IAM User Guide*.
+ Delete the CodeBuild project that you created for this tutorial. For more information, see [Delete a build project in CodeBuild](https://docs.aws.amazon.com/codebuild/latest/userguide/delete-project.html) in the *AWS CodeBuild User Guide*.

# Tutorial: Set up WorkSpaces Applications for use with Rocket Enterprise Analyzer and Rocket Enterprise Developer
<a name="set-up-appstream-mf"></a>

AWS Mainframe Modernization provides several tools through Amazon WorkSpaces Applications. WorkSpaces Applications is a fully managed, secure application streaming service that lets you stream desktop applications to users without rewriting applications. WorkSpaces Applications provides users with instant access to the applications that they need with a responsive, fluid user experience on the device of their choice. Using WorkSpaces Applications to host runtime engine-specific tools gives customer application teams the ability to use the tools directly from their web browsers, interacting with application files stored in either Amazon S3 buckets or CodeCommit repositories. 

For information about browser support in WorkSpaces Applications, see [System Requirements and Feature Support (Web Browser)](https://docs.aws.amazon.com/appstream2/latest/developerguide/requirements-and-features-web-browser-admin.html) in the *Amazon WorkSpaces Applications Administration Guide*. If you have issues when you are using WorkSpaces Applications, see [Troubleshooting AppStream 2.0 User Issues](https://docs.aws.amazon.com/appstream2/latest/developerguide/troubleshooting-user-issues.html) in the *Amazon WorkSpaces Applications Administration Guide*.

This document is intended for members of the customer operations team. It describes how to set up Amazon WorkSpaces Applications fleets and stacks to host the Rocket Enterprise Analyzer and Rocket Enterprise Developer tools used with AWS Mainframe Modernization. Rocket Enterprise Analyzer is usually used during the Assess phase, and Rocket Enterprise Developer is usually used during the Migrate and Modernize phase of the AWS Mainframe Modernization approach. If you plan to use both Enterprise Analyzer and Enterprise Developer, you must create a separate fleet and stack for each tool, because their licensing terms are different.

**Important**  
The steps in this tutorial are based on the downloadable CloudFormation template [cfn-m2-appstream-fleet-ea-ed.yml](https://drm0z31ua8gi7.cloudfront.net/tutorials/mf/appstream/cfn-m2-appstream-fleet-ea-ed.yml). 

**Topics**
+ [

## Prerequisites
](#tutorial-aas-prerequisites)
+ [

## Step 1: Get the WorkSpaces Applications images
](#tutorial-aas-step1)
+ [

## Step 2: Create the stack using the CloudFormation template
](#tutorial-aas-step2)
+ [

## Step 3: Create a user in WorkSpaces Applications
](#tutorial-aas-step3)
+ [

## Step 4: Log in to WorkSpaces Applications
](#tutorial-aas-step4)
+ [

## Step 5: Verify buckets in Amazon S3 (optional)
](#tutorial-aas-step5)
+ [

## Next steps
](#tutorial-aas-next-steps)
+ [

## Clean up resources
](#tutorial-aas-cleanup)

## Prerequisites
<a name="tutorial-aas-prerequisites"></a>
+ Download the template: [cfn-m2-appstream-fleet-ea-ed.yml](https://drm0z31ua8gi7.cloudfront.net/tutorials/mf/appstream/cfn-m2-appstream-fleet-ea-ed.yml).
+ Get the ID of your default VPC and security group. For more information on the default VPC, see [Default VPCs](https://docs.aws.amazon.com/vpc/latest/userguide/default-vpc.html) in the *Amazon VPC User Guide*. For more information on the default security group, see [Default and custom security groups](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/default-custom-security-groups.html) in the *Amazon EC2 User Guide*. 
+ Make sure you have the following permissions:
  + Create stacks, fleets, and users in WorkSpaces Applications.
  + Create stacks in CloudFormation using a template.
  + Create buckets and upload files to buckets in Amazon S3.
  + Download credentials (`access_key_id` and `secret_access_key`) from IAM.

## Step 1: Get the WorkSpaces Applications images
<a name="tutorial-aas-step1"></a>

In this step, you share the WorkSpaces Applications images for Enterprise Analyzer and Enterprise Developer with your AWS account.

1. Open the AWS Mainframe Modernization console at [https://console.aws.amazon.com/m2/](https://us-west-2.console.aws.amazon.com/m2/home?region=us-west-2#/).

1. In the left navigation, choose **Tools**.

1. In **Analysis, development, and build assets**, choose **Share assets with my AWS account**.

## Step 2: Create the stack using the CloudFormation template
<a name="tutorial-aas-step2"></a>

In this step, you use the downloaded CloudFormation template to create a WorkSpaces Applications stack and fleet for running Rocket Enterprise Analyzer. You can repeat this step later to create another WorkSpaces Applications stack and fleet for running Rocket Enterprise Developer, because each tool requires its own fleet and stack in WorkSpaces Applications. For more information on CloudFormation stacks, see [Working with stacks](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/stacks.html) in the *AWS CloudFormation User Guide*.

**Note**  
AWS Mainframe Modernization adds an additional fee to the standard WorkSpaces Applications pricing for the use of Enterprise Analyzer and Enterprise Developer. For more information, see [AWS Mainframe Modernization Pricing](https://aws.amazon.com/mainframe-modernization/pricing/).

1. Download the [cfn-m2-appstream-fleet-ea-ed.yml](https://drm0z31ua8gi7.cloudfront.net/tutorials/mf/appstream/cfn-m2-appstream-fleet-ea-ed.yml) template, if necessary.

1. Open the CloudFormation console and choose **Create Stack** and **with new resources (standard)**.

1. In **Prerequisite - Prepare template**, choose **Template is ready**.

1. In **Specify Template**, choose **Upload a template file**.

1. In **Upload a template file**, choose **Choose file** and upload the [cfn-m2-appstream-fleet-ea-ed.yml](https://drm0z31ua8gi7.cloudfront.net/tutorials/mf/appstream/cfn-m2-appstream-fleet-ea-ed.yml) template.

1. Choose **Next**.  
![\[The CloudFormation Create stack page with selected cfn-m2-appstream-fleet-ea-ed.yml template.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/cfn-create-stack.png)

1. On **Specify stack details**, enter the following information:
   + In **Stack name**, enter a name of your choice. For example, **m2-ea**.
   + In **AppStreamApplication**, choose **ea**.
   + In **AppStreamFleetSecurityGroup**, choose your default VPC’s default security group.
   + In **AppStreamFleetVpcSubnet**, choose a subnet within your default VPC.
   + In **AppStreamImageName**, choose the image starting with `m2-enterprise-analyzer`. This image contains the currently supported version of the Rocket Enterprise Analyzer tool.
   + Accept the defaults for the other fields, then choose **Next**.  
![\[The CloudFormation specify stack details page with Enterprise Analyzer options filled in.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/cfn-specify-stack-details.png)

1. Accept all defaults, then choose **Next** again.

1. On **Review**, make sure all the parameters are what you intend.

1. Scroll to the bottom, choose **I acknowledge that AWS CloudFormation might create IAM resources with custom names**, and choose **Create Stack**.

It takes between 20 and 30 minutes for the stack and fleet to be created. You can choose **Refresh** to see the CloudFormation events as they occur. 

## Step 3: Create a user in WorkSpaces Applications
<a name="tutorial-aas-step3"></a>

While you are waiting for CloudFormation to finish creating the stack, you can create one or more users in WorkSpaces Applications. These are the users who will use Enterprise Analyzer in WorkSpaces Applications. You must specify an email address for each user and ensure that each user has sufficient permissions to create buckets in Amazon S3, upload files to a bucket, and link to a bucket to map its contents.

1. Open the WorkSpaces Applications console.

1. In the left navigation, choose **User pool**.

1. Choose **Create user**.

1. Provide an email address where the user can receive an email invitation to use WorkSpaces Applications, a first name and last name, and choose **Create user**.

1. Repeat if necessary to create more users. The email address for each user must be unique.

For more information on creating WorkSpaces Applications users, see [WorkSpaces Applications User Pools](https://docs.aws.amazon.com/appstream2/latest/developerguide/user-pool.html) in the *Amazon WorkSpaces Applications Administration Guide*.

When CloudFormation finishes creating the stack, you can assign the user you created to the stack, as follows:

1. Open the WorkSpaces Applications console.

1. Choose the user name.

1. Choose **Action**, then **Assign stack**.

1. In **Assign stack**, choose the stack that begins with `m2-appstream-stack-ea`.

1. Choose **Assign stack**.  
![\[The WorkSpaces Applications Assign stack page showing a user and the Enterprise Analyzer stack to be assigned.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/aas-assign-stack.png)

Assigning a user to a stack causes WorkSpaces Applications to send an email to the user at the address you provided. This email contains a link to the WorkSpaces Applications login page.

## Step 4: Log in to WorkSpaces Applications
<a name="tutorial-aas-step4"></a>

In this step, you log in to WorkSpaces Applications using the link in the email sent by WorkSpaces Applications to the user you created in [Step 3: Create a user in WorkSpaces Applications](#tutorial-aas-step3).

1. Log in to WorkSpaces Applications using the link provided in the email sent by WorkSpaces Applications.

1. Change your password, if prompted. The WorkSpaces Applications screen that you see is similar to the following:  
![\[A sample WorkSpaces Applications login screen showing the desktop icon.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/aas-login-screen.png)

1. Choose **Desktop**.

1. On the task bar, choose **Search** and enter **D:** to navigate to the Home Folder.
**Note**  
If you skip this step, you might get a `Device not ready` error when you try to access the Home Folder.

At any point, if you have trouble signing into WorkSpaces Applications, you can restart your WorkSpaces Applications fleet and try to sign in again, using the following steps.

1. Open the WorkSpaces Applications console.

1. In the left navigation, choose **Fleets**.

1. Choose the fleet you are trying to use.

1. Choose **Action**, then choose **Stop**.

1. Wait for the fleet to stop.

1. Choose **Action**, then choose **Start**.

This process can take around 10 minutes.

## Step 5: Verify buckets in Amazon S3 (optional)
<a name="tutorial-aas-step5"></a>

The CloudFormation template that you used to create the stack also created two buckets in Amazon S3. These buckets save and restore user data and application settings across work sessions:
+ Name starts with `appstream2-`. This bucket maps data to your Home Folder in WorkSpaces Applications (`D:\PhotonUser\My Files\Home Folder`).
**Note**  
The Home Folder is unique for a given email address and is shared across all fleets and stacks in a given AWS account. The name of the Home Folder is a SHA256 hash of the user’s email address, and is stored on a path based on that hash.
+ Name starts with `appstream-app-settings-`. This bucket contains user session information for WorkSpaces Applications, and includes settings such as browser favorites, IDE and application connection profiles, and UI customizations. For more information, see [How Application Settings Persistence Works](https://docs.aws.amazon.com/appstream2/latest/developerguide/how-it-works-app-settings-persistence.html) in the *Amazon WorkSpaces Applications Administration Guide*.
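
The note above about hashed Home Folder names can be illustrated with a short sketch. This assumes the hash is a plain SHA-256 hex digest of the user's email address, as described; treat the exact input normalization as an assumption:

```python
import hashlib

def home_folder_name(email: str) -> str:
    """Derive a Home Folder name as the SHA-256 hex digest of the user's
    email address (the exact input normalization is an assumption here)."""
    return hashlib.sha256(email.encode("utf-8")).hexdigest()

# home_folder_name("user@example.com") yields a 64-character hex string
```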

To verify that the buckets were created, follow these steps:

1. Open the Amazon S3 console.

1. In the left navigation, choose **Buckets**.

1. In **Find buckets by name**, enter **appstream** to filter the list.

If you see the buckets, no further action is necessary. Just be aware that the buckets exist. If you do not see the buckets, then either the CloudFormation template is not finished running, or an error occurred. Go to the CloudFormation console and review the stack creation messages.

## Next steps
<a name="tutorial-aas-next-steps"></a>

Now that the WorkSpaces Applications infrastructure is set up, you can set up and start using Enterprise Analyzer. For more information, see [Tutorial: Set up Enterprise Analyzer on WorkSpaces Applications](set-up-ea.md). You can also set up Enterprise Developer. For more information, see [Tutorial: Set up Rocket Enterprise Developer on WorkSpaces Applications](set-up-ed.md).

## Clean up resources
<a name="tutorial-aas-cleanup"></a>

The procedure to clean up the created stack and fleets is described in [Create a WorkSpaces Applications Fleet and Stack](https://docs.aws.amazon.com/appstream2/latest/developerguide/set-up-stacks-fleets.html).

When the WorkSpaces Applications objects have been deleted, the account administrator can also, if appropriate, clean up the Amazon S3 buckets for Application Settings and Home Folders.

**Note**  
The home folder for a given user is unique across all fleets, so you might need to retain it if other WorkSpaces Applications stacks are active in the same account.

Finally, WorkSpaces Applications does not currently allow you to delete users using the console. Instead, you must use the service API with the CLI. For more information, see [User Pool Administration](https://docs.aws.amazon.com/appstream2/latest/developerguide/user-pool-admin.html) in the *Amazon WorkSpaces Applications Administration Guide*.
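For example, a user could be removed from the user pool with the AWS CLI; a sketch, with a hypothetical email address:

```shell
# Disable, then delete, a user from the WorkSpaces Applications user pool.
aws appstream disable-user \
  --user-name user@example.com \
  --authentication-type USERPOOL

aws appstream delete-user \
  --user-name user@example.com \
  --authentication-type USERPOOL
```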

# Tutorial: Use templates with Rocket Enterprise Developer (formerly Micro Focus Enterprise Developer)
<a name="tutorial-templates-ed"></a>

This tutorial describes how to use templates and predefined projects with Rocket Enterprise Developer. It covers three use cases, all of which use the sample code provided in the BankDemo sample. To download the sample, choose [https://d1vi4vxke6c2hu.cloudfront.net/demo/bankdemo.zip](https://d1vi4vxke6c2hu.cloudfront.net/demo/bankdemo.zip).

**Important**  
If you use the version of Enterprise Developer for Windows, the binaries generated by the compiler can run only on the Enterprise Server provided with Enterprise Developer. You cannot run them under the AWS Mainframe Modernization runtime, which is based on Linux.

**Topics**
+ [

## Use Case 1 - Using the COBOL Project Template containing source components
](#tutorial-templates-ed-step1)
+ [

## Use Case 2 - Using the COBOL Project Template without source components
](#tutorial-templates-ed-step2)
+ [

## Use Case 3 - Using the pre-defined COBOL project linking to the source folders
](#tutorial-templates-ed-step3)
+ [

## Using the Region Definition JSON Template
](#tutorial-templates-ed-step4)

## Use Case 1 - Using the COBOL Project Template containing source components
<a name="tutorial-templates-ed-step1"></a>

This use case requires you to copy the source components into the template directory structure as part of the demo pre-setup steps. In the [bankdemo.zip](https://d1vi4vxke6c2hu.cloudfront.net/demo/bankdemo.zip) download, this has been changed from the original `AWSTemplates.zip` delivery to avoid having two copies of the source.

1. Start Enterprise Developer and specify the chosen workspace.  
![\[The Eclipse launcher with a workspace selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc1-step1.png)

1. Within the **Application Explorer** view, from the **Enterprise Development Project** tree view item, choose **New Project from Template** from the context menu.  
![\[The enterprise development project tree view context menu.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc1-step2.png)

1. Enter the template parameters as shown.
**Note**  
The Template Path will refer to where the ZIP was extracted.  
![\[The Enter template parameters box with the path and project name filled in.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc1-step3.png)

1. Choose **OK** to create a local development Eclipse project based on the provided template, with a complete source and execution environment structure.  
![\[The local development Eclipse project showing its structure.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc1-step4.png)

   The `System` structure contains a complete resource definition file with the required entries for BANKDEMO, the required catalog with entries added, and the corresponding ASCII data files.

   Because the source template structure contains all the source items, these files are copied to the local project and therefore are automatically built in Enterprise Developer.

## Use Case 2 - Using the COBOL Project Template without source components
<a name="tutorial-templates-ed-step2"></a>

Steps 1 to 3 are identical to [Use Case 1 - Using the COBOL Project Template containing source components](#tutorial-templates-ed-step1). 

The `System` structure in this use case also contains a complete resource definition file with the required entries for BankDemo, the required catalog with entries added, and the corresponding ASCII data files.

However, the template source structure does not contain any components. You must import these into the project from whatever source repository you are using.

1. Choose the project name. From the related context menu, choose **Import**.  
![\[The project context menu with import selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc2-step4.png)

1. From the resulting dialog, under the **General** section, choose **File System**, and then choose **Next**.  
![\[The Import box with file system selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc2-step5.png)

1. Populate the **From directory** field by browsing the file system to point to the repository folder. Choose all the folders you wish to import, such as `sources`. The **Into folder** field is pre-populated. Choose **Finish**.   
![\[The File system box with the BankDemo directory expanded.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc2-step6.png)

   Once the source template structure contains all the source items, they are built automatically in Enterprise Developer.

## Use Case 3 - Using the pre-defined COBOL project linking to the source folders
<a name="tutorial-templates-ed-step3"></a>

1. Start Enterprise Developer and specify the chosen workspace.  
![\[The Eclipse launcher with a workspace selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc3-step1.png)

1. From the **File** menu, choose **Import**.  
![\[The File menu with Import selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc3-step2.png)

1. From the resulting dialog, under **General**, choose **Projects from Folder or Archive** and choose **Next**.  
![\[The Import box with projects from folder or archive selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc3-step3.png)

1. Populate **Import source**: choose **Directory** and browse the file system to select the pre-defined project folder. The project contained within has links to the source folders in the same repository.  
![\[The import projects from file system or archive box with the path to the import source entered.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-uc3-step4.png)

   Choose **Finish**.

   Because the project is populated by the links to the source folder, the code is automatically built.

## Using the Region Definition JSON Template
<a name="tutorial-templates-ed-step4"></a>

1. Switch to the Server Explorer view. From the related context menu, choose **Open Administration Page**, which starts the default browser.  
![\[The server explorer context menu with open administration page selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-json-admin-page.png)

1. From the resulting Enterprise Server Common Web Administration (ESCWA) screen, choose **Import**.  
![\[The Enterprise Server Common Web Administration screen with Import selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-json-import.png)

1. Choose the **JSON** import type and choose **Next**.  
![\[The choose import type box with JSON selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-json-import-type.png)

1. Upload the supplied `BANKDEMO.JSON` file.  
![\[The choose file to upload box with the BANKDEMO file selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-json-upload.png)

   Once selected, choose **Next**.  
![\[The select regions box with clear ports from endpoints not selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-json-next.png)

   On the **Select Regions** panel, ensure that the **Clear Ports from Endpoints** option is not selected, and then continue to choose **Next** through the panels until the **Perform Import** panel is shown. Then choose **Import** from the left navigation pane.

   Finally, choose **Finish**. The BANKDEMO region is then added to the server list.  
![\[The Region and server list with BankDemo added.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-json-server-list.png)

1. Navigate to the **General Properties** for the BANKDEMO region.

1. Scroll to the **Configuration** section.

1. Set the ESP environment variable to the `System` folder of the Eclipse project created in the previous steps. This should be `workspacefolder/projectname/System`.  
![\[The configuration section with the ESP variable shown.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-json-ESP.png)

1. Choose **Apply**.

   The region is now fully configured to run in conjunction with the Eclipse COBOL project.

1. Finally, back in Enterprise Developer, associate the imported region with the project.  
![\[The project context menu with Associated with project selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ed-json-associate.png)

   The Enterprise Developer environment is now ready to use, with a complete working version of BankDemo. You can edit, compile, and debug code against the region.
**Important**  
If you use the version of Enterprise Developer for Windows, the binaries generated by the compiler can run only on the Enterprise Server provided with Enterprise Developer. You cannot run them under the AWS Mainframe Modernization runtime, which is based on Linux.

# Tutorial: Set up Enterprise Analyzer on WorkSpaces Applications
<a name="set-up-ea"></a>

This tutorial describes how to set up Rocket Enterprise Analyzer (formerly Micro Focus Enterprise Analyzer) to analyze one or more mainframe applications. The Enterprise Analyzer tool provides several reports based on its analysis of the application source code and system definitions.

This setup is designed to foster team collaboration. The installation uses an Amazon S3 bucket to share the source code through virtual disks, using [Rclone](https://rclone.org/) on the Windows machine. With a common Amazon RDS instance running [PostgreSQL](https://www.postgresql.org/), any member of the team can access all requested reports.

Team members can also mount the virtual Amazon S3 backed disk on their personal machines and update the source bucket from their workstations. They can potentially use scripts or any other form of automation on their machines if they are connected to other on-premises internal systems.

The setup is based on the WorkSpaces Applications Windows images that AWS Mainframe Modernization shares with the customer. Setup is also based on the creation of WorkSpaces Applications fleets and stacks as described in [Tutorial: Set up WorkSpaces Applications for use with Rocket Enterprise Analyzer and Rocket Enterprise Developer](set-up-appstream-mf.md).

**Important**  
The steps in this tutorial assume that you set up WorkSpaces Applications with the downloadable CloudFormation template [cfn-m2-appstream-fleet-ea-ed.yml](https://drm0z31ua8gi7.cloudfront.net/tutorials/mf/appstream/cfn-m2-appstream-fleet-ea-ed.yml). For more information, see [Tutorial: Set up WorkSpaces Applications for use with Rocket Enterprise Analyzer and Rocket Enterprise Developer](set-up-appstream-mf.md).  
To perform the steps in this tutorial, you must have set up your Enterprise Analyzer fleet and stack and they must be running.

For a complete description of Enterprise Analyzer features and deliverables, see the [Enterprise Analyzer Documentation](https://www.microfocus.com/documentation/enterprise-analyzer/) on the Rocket Software (formerly Micro Focus) website.

## Image contents
<a name="set-up-ea-image-contents"></a>

In addition to the Enterprise Analyzer application itself, the image contains the following tools and libraries.

Third-party tools
+ [Python](https://www.python.org/)
+ [Rclone](https://rclone.org/)
+ [pgAdmin](https://www.pgadmin.org/)
+ [git-scm](https://git-scm.com/)
+ [PostgreSQL ODBC driver](https://odbc.postgresql.org/)

Libraries in `C:\Users\Public`
+ BankDemo source code and project definition for Enterprise Developer: `m2-bankdemo-template.zip`.
+ MFA install package for the mainframe: `mfa.zip`. For more information, see [Mainframe Access Overview](https://www.microfocus.com/documentation/enterprise-developer/30pu12/ED-VS2012/BKMMMMINTRS001.html) in the *Micro Focus Enterprise Developer* documentation.
+ Command and config files for Rclone (instructions for their use in the tutorials): `m2-rclone.cmd` and `m2-rclone.conf`.

**Topics**
+ [

## Image contents
](#set-up-ea-image-contents)
+ [

## Prerequisites
](#tutorial-ea-prerequisites)
+ [

## Step 1: Setup
](#tutorial-ea-step1)
+ [

## Step 2: Create the Amazon S3 based virtual folder on Windows
](#tutorial-ea-step2)
+ [

## Step 3: Create an ODBC source for the Amazon RDS instance
](#tutorial-ea-step3)
+ [

## Subsequent sessions
](#tutorial-ea-step4)
+ [

## Troubleshooting workspace connection
](#tutorial-ea-step5)
+ [

## Clean up resources
](#tutorial-ea-clean)

## Prerequisites
<a name="tutorial-ea-prerequisites"></a>
+ Upload the source code and system definitions for the customer application that you want to analyze to an S3 bucket. The system definitions include CICS CSD, DB2 object definitions, and so on. You can create a folder structure within the bucket that makes sense for how you want to organize the application artifacts. For example, when you unzip the BankDemo sample, it has the following structure:

  ```
  demo
       |--> jcl
       |--> RDEF
       |--> transaction
       |--> xa
  ```
+ Create and start an Amazon RDS instance running PostgreSQL. This instance will store the data and results produced by Enterprise Analyzer. You can share this instance with all members of the application team. In addition, create an empty schema called `m2_ea` (or any other suitable name) in the database. Define credentials for authorized users that allow them to create, insert, update, and delete items in this schema. You can obtain the database name, its server endpoint URL, and TCP port from the Amazon RDS console or from the account administrator.
+ Make sure you have set up programmatic access to your AWS account. For more information, see [Programmatic access](https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys) in the *Amazon Web Services General Reference.*
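The empty schema and user credentials described above can be created with `psql`; a minimal sketch, in which the endpoint, database name, user, and password are all assumptions:

```shell
# Connect to the RDS PostgreSQL instance and create the schema
# that Enterprise Analyzer will use to store its results.
psql "host=mydb.example.us-east-1.rds.amazonaws.com port=5432 dbname=m2 user=postgres" <<'SQL'
CREATE SCHEMA m2_ea;
CREATE USER ea_user WITH PASSWORD 'choose-a-password';
GRANT ALL PRIVILEGES ON SCHEMA m2_ea TO ea_user;
SQL
```

Grant only the create, insert, update, and delete privileges that your team's roles actually require.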

## Step 1: Setup
<a name="tutorial-ea-step1"></a>

1. Start a session with WorkSpaces Applications with the URL that you received in the welcome email message from WorkSpaces Applications.

1. Use your email as your user ID, and define your permanent password.

1. Select the Enterprise Analyzer stack.

1. On the WorkSpaces Applications menu page, choose **Desktop** to reach the Windows desktop that the fleet is streaming.

## Step 2: Create the Amazon S3 based virtual folder on Windows
<a name="tutorial-ea-step2"></a>

**Note**  
If you already used Rclone during the AWS Mainframe Modernization preview, you must update `m2-rclone.cmd` to the newer version located in `C:\Users\Public`.

1. Copy the `m2-rclone.conf` and `m2-rclone.cmd` files provided in `C:\Users\Public` to your home folder `C:\Users\PhotonUser\My Files\Home Folder` using File Explorer.

1. Update the `m2-rclone.conf` config parameters with your AWS access key and corresponding secret, as well as your AWS Region.

   ```
   [m2-s3]
   type = s3
   provider = AWS
   access_key_id = YOUR-ACCESS-KEY
   secret_access_key = YOUR-SECRET-KEY
   region = YOUR-REGION
   acl = private
   server_side_encryption = AES256
   ```

1. In `m2-rclone.cmd`, make the following changes:
   + Change `amzn-s3-demo-bucket` to your Amazon S3 bucket name. For example, `m2-s3-mybucket`.
   + Change `your-s3-folder-key` to your Amazon S3 bucket key. For example, `myProject`.
   + Change `your-local-folder-path` to the path of the directory where you want the application files synced from the Amazon S3 bucket that contains them. For example, `D:\PhotonUser\My Files\Home Folder\m2-new`. This synced directory must be a subdirectory of the Home Folder in order for WorkSpaces Applications to properly back up and restore it on session start and end.

   ```
   :loop
   timeout /T 10
   "C:\Program Files\rclone\rclone.exe" sync m2-s3:amzn-s3-demo-bucket/your-s3-folder-key "D:\PhotonUser\My Files\Home Folder\your-local-folder-path" --config "D:\PhotonUser\My Files\Home Folder\m2-rclone.conf"
   goto :loop
   ```

1. Open a Windows command prompt, cd to `C:\Users\PhotonUser\My Files\Home Folder` if needed, and run `m2-rclone.cmd`. This command script runs a continuous loop that syncs your Amazon S3 bucket and key to the local folder every 10 seconds. You can adjust the timeout as needed. You should see the source code of the application located in the Amazon S3 bucket in Windows File Explorer.

To add new files to the set that you are working on or to update existing ones, upload the files to the Amazon S3 bucket and they will be synced to your directory at the next iteration defined in `m2-rclone.cmd`. Similarly, if you want to delete some files, delete them from the Amazon S3 bucket. The next sync operation will delete them from your local directory.
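For example, using the hypothetical bucket and key names from the previous step, you could update the synced set with the AWS CLI:

```shell
# Upload a new or updated file; the rclone loop picks it up
# at the next 10-second iteration.
aws s3 cp BBANK10P.cbl s3://m2-s3-mybucket/myProject/sources/

# Remove a file; the next sync deletes it from the local folder too.
aws s3 rm s3://m2-s3-mybucket/myProject/sources/OLDPROG.cbl
```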

## Step 3: Create an ODBC source for the Amazon RDS instance
<a name="tutorial-ea-step3"></a>

1. To start the EA Admin tool, navigate to the application selector menu in the top left corner of the browser window and choose **MF EA Admin**.

1. From the **Administer** menu, choose **ODBC Data Sources**, and choose **Add** from the **User DSN** tab.

1. In the Create New Data Source dialog box, choose the **PostgreSQL Unicode** driver, and then choose **Finish**.

1. In the **PostgreSQL Unicode ODBC Driver (psqlODBC) Setup** dialog box, define and take note of the data source name that you want. Complete the following parameters with the values from the RDS instance that you previously created:  
**Description**  
Optional description to help you identify this database connection quickly.  
**Database**  
The Amazon RDS database you created previously.  
**Server**  
The Amazon RDS endpoint.  
**Port**  
The Amazon RDS port.  
**User Name**  
As defined in the Amazon RDS instance.  
**Password**  
As defined in the Amazon RDS instance.

1. Choose **Test** to validate that the connection to Amazon RDS is successful, and then choose **Save** to save your new User DSN.

1. Wait until you see the message that confirms creation of the proper workspace, and then choose **OK** to finish with ODBC Data Sources and close the EA Admin tool.

1. Navigate again to the application selector menu, and choose Enterprise Analyzer to start the tool. Choose **Create New**. 

1. In the Workspace configuration window, enter your workspace name and define its location. The workspace can be on the Amazon S3 based disk if you work with this configuration, or in your home folder if you prefer.

1. Choose **Choose Other Database** to connect to your Amazon RDS instance.

1. Choose the **Postgre** icon from the options, and then choose **OK**.

1. For the Windows settings under **Options – Define Connection Parameters**, enter the name of the data source that you created. Also enter the database name, the schema name, the user name, and password. Choose **OK**.

1. Wait for Enterprise Analyzer to create all the tables, indexes, and so on that it needs to store results. This process might take a couple of minutes. Enterprise Analyzer confirms when the database and workspace are ready for use.

1. Navigate again to the application selector menu and choose Enterprise Analyzer to start the tool.

1. The Enterprise Analyzer startup window appears in the new, selected workspace location. Choose **OK**.

1. Navigate to your repository in the left pane, select the repository name, and choose **Add files / folders to your workspace**. Select the folder where your application code is stored to add it to the workspace. You can use the previous BankDemo example code if you want. When Enterprise Analyzer prompts you to verify those files, choose **Verify** to start the initial Enterprise Analyzer verification report. It might take some minutes to complete, depending on the size of your application.

1. Expand your workspace to see the files and folders that you’ve added to the workspace. The object types and cyclomatic complexity reports are also visible in the top quadrant of the **Chart Viewer** pane.

You can now use Enterprise Analyzer for all needed tasks.
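If the **Test** button in the DSN setup reports a failure, you can check connectivity to the instance directly from a command prompt; a sketch, in which the endpoint, database, and user names are assumptions:

```shell
# List schemas to confirm the m2_ea schema is reachable and
# visible to the Enterprise Analyzer database user.
psql "host=mydb.example.us-east-1.rds.amazonaws.com port=5432 dbname=m2 user=ea_user" -c '\dn'
```

A connection failure here usually points to security group rules or credentials rather than the ODBC configuration.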

## Subsequent sessions
<a name="tutorial-ea-step4"></a>

1. Start a session with WorkSpaces Applications with the URL that you received in the welcome email message from WorkSpaces Applications.

1. Log in with your email and permanent password.

1. Select the Enterprise Analyzer stack.

1. Launch `Rclone` to connect to the Amazon S3 backed disk if you use this option to share the workspace files.

1. Launch Enterprise Analyzer to do your tasks.

## Troubleshooting workspace connection
<a name="tutorial-ea-step5"></a>

When you try to reconnect to your Enterprise Analyzer workspace, you might see an error like this:

```
Cannot access the workspace directory D:\PhotonUser\My Files\Home Folder\EA_BankDemo. The workspace has been created on a non-shared disk of the EC2AMAZ-E6LC33H computer. Would you like to correct the workspace directory location?
```

To resolve this issue, choose **OK** to clear the message, and then complete the following steps.

1. In WorkSpaces Applications, choose the **Launch Application** icon on the toolbar, and then choose **EA Admin** to start the Enterprise Analyzer Administration tool.  
![\[The WorkSpaces Applications launch selector menu with the Rocket Enterprise Developer administration tool selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/aas-launch-selector.png)

1. From the **Administer** menu, choose **Refresh Workspace Path...**.  
![\[Administer menu of Rocket Enterprise Analyzer administration tool with Refresh Workspace Path selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ea_admin-administer-refresh.png)

1. Under **Select workspace**, choose the workspace that you want, and then choose **OK**.  
![\[The Select workspace dialog box of Rocket Enterprise Analyzer administration tool with a project selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ea_admin-select-workspace.png)

1. Choose **OK** to confirm the error message.  
![\[The Enterprise Analyzer error message "Cannot access the workspace directory" with OK selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ea_admin-select-workspace-error.png)

1. Under **Workspace directory network path**, enter the correct path to your workspace, for example, `D:\PhotonUser\My Files\Home Folder\EA\MyWorkspace3`.  
![\[The Enterprise Analyzer dialog box Workspace directory network path with an example path.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ea_admin-workspace-directory-network-path.png)

1. Close the Micro Focus Enterprise Analyzer Administration tool.  
![\[The Micro Focus Enterprise Analyzer Administration tool with the Close button selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/ea_admin-close.png)

1. In WorkSpaces Applications, choose the **Launch Application** icon on the toolbar, and then choose **EA** to start Micro Focus Enterprise Analyzer.  
![\[The WorkSpaces Applications launch application icon with EA selected.\]](http://docs.aws.amazon.com/m2/latest/userguide/images/aas-launch-ea.png)

1. Repeat steps 3 - 5.

Micro Focus Enterprise Analyzer should now open with the existing workspace.

## Clean up resources
<a name="tutorial-ea-clean"></a>

If you no longer need the resources that you created for this tutorial, delete them so that you don't incur further charges. Complete the following steps:
+ Use the **EA Admin** tool to delete the workspace.
+ Delete the S3 buckets that you created for this tutorial. For more information, see [Deleting a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/delete-bucket.html) in the *Amazon S3 User Guide*.
+ Delete the database that you created for this tutorial. For more information, see [Deleting a DB instance](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_GettingStarted.CreatingConnecting.PostgreSQL.html#CHAP_GettingStarted.Deleting.PostgreSQL).

# Tutorial: Set up Rocket Enterprise Developer on WorkSpaces Applications
<a name="set-up-ed"></a>

This tutorial describes how to set up Rocket Enterprise Developer (formerly Micro Focus Enterprise Developer) for one or more mainframe applications in order to maintain, compile, and test them using the Enterprise Developer features. The setup is based on the WorkSpaces Applications Windows images that AWS Mainframe Modernization shares with the customer and on the creation of WorkSpaces Applications fleets and stacks as described in [Tutorial: Set up WorkSpaces Applications for use with Rocket Enterprise Analyzer and Rocket Enterprise Developer](set-up-appstream-mf.md).

**Important**  
The steps in this tutorial assume that you set up WorkSpaces Applications using the downloadable CloudFormation template [cfn-m2-appstream-fleet-ea-ed.yaml](https://d1vi4vxke6c2hu.cloudfront.net/tutorial/cfn-m2-appstream-fleet-ea-ed.yaml). For more information, see [Tutorial: Set up WorkSpaces Applications for use with Rocket Enterprise Analyzer and Rocket Enterprise Developer](set-up-appstream-mf.md).  
To perform the steps in this tutorial, your Enterprise Developer fleet and stack must be set up and running.

For a complete description of Enterprise Developer v7 features and deliverables, see the [online documentation (v7.0)](https://www.microfocus.com/documentation/enterprise-developer/ed70/ED-Eclipse/GUID-8D6B7358-AC35-4DAF-A445-607D8D97EBB2.html) on the Rocket Software (formerly Micro Focus) website.

## Image contents
<a name="set-up-ed-image-contents"></a>

In addition to Enterprise Developer itself, the image contains Rumba (a TN3270 emulator). It also contains the following tools and libraries.

Third-party tools
+ [Python](https://www.python.org/)
+ [Rclone](https://rclone.org/)
+ [pgAdmin](https://www.pgadmin.org/)
+ [git-scm](https://git-scm.com/)
+ [PostgreSQL ODBC driver](https://odbc.postgresql.org/)

Libraries in `C:\Users\Public`
+ BankDemo source code and project definition for Enterprise Developer: `m2-bankdemo-template.zip`.
+ MFA install package for the mainframe: `mfa.zip`. For more information, see [Mainframe Access Overview](https://www.microfocus.com/documentation/enterprise-developer/30pu12/ED-VS2012/BKMMMMINTRS001.html) in the *Micro Focus Enterprise Developer* documentation.
+ Command and config files for Rclone (instructions for their use in the tutorials): `m2-rclone.cmd` and `m2-rclone.conf`.

If you need to access source code that is not yet loaded into CodeCommit repositories, but that is available in an Amazon S3 bucket, for example to perform the initial load of the source code into git, follow the procedure to create a virtual Windows disk as described in [Tutorial: Set up Enterprise Analyzer on WorkSpaces Applications](set-up-ea.md).

**Topics**
+ [

## Image contents
](#set-up-ed-image-contents)
+ [

## Prerequisites
](#tutorial-ed-prerequisites)
+ [

## Step 1: Setup by individual Enterprise Developer users
](#tutorial-ed-step1)
+ [

## Step 2: Create the Amazon S3-based virtual folder on Windows (optional)
](#tutorial-ed-step2)
+ [

## Step 3: Clone the repository
](#tutorial-ed-step3)
+ [

## Subsequent sessions
](#tutorial-ed-step4)
+ [

## Clean up resources
](#tutorial-ed-clean)

## Prerequisites
<a name="tutorial-ed-prerequisites"></a>
+ One or more CodeCommit repositories loaded with the source code of the application to be maintained. The repository setup should match the requirements of the CI/CD pipeline described earlier, so that both tools can be combined effectively.
+ Each user must have credentials for the CodeCommit repository or repositories, defined by the account administrator according to the information in [Authentication and access control for AWS CodeCommit](https://docs.aws.amazon.com/codecommit/latest/userguide/auth-and-access-control.html). The complete reference for IAM authorizations for CodeCommit is in the [CodeCommit permissions reference](https://docs.aws.amazon.com/codecommit/latest/userguide/auth-and-access-control-permissions-reference.html). The administrator can define distinct IAM policies for distinct roles, with credentials specific to the role for each repository, limiting the user's authorizations to the specific set of tasks to accomplish on a given repository. For each maintainer of the CodeCommit repository, the account administrator generates a primary user and grants this user permission to access the required repository or repositories by selecting the proper IAM policy or policies for CodeCommit access.
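As a sketch of that last step, an administrator might attach the managed CodeCommit policy to a user with the AWS CLI (the user name is hypothetical; a more restrictive custom policy scoped to specific repositories is usually preferable):

```shell
# Grant an existing IAM user access to CodeCommit repositories.
aws iam attach-user-policy \
  --user-name dev-maintainer \
  --policy-arn arn:aws:iam::aws:policy/AWSCodeCommitPowerUser
```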

## Step 1: Setup by individual Enterprise Developer users
<a name="tutorial-ed-step1"></a>

1. Obtain your IAM credentials:

   1. Connect to the AWS console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

   1. Follow the procedure described in step 3 of [Setup for HTTPS users using Git credentials](https://docs.aws.amazon.com/codecommit/latest/userguide/setting-up-gc.html) in the *AWS CodeCommit User Guide*. 

   1. Copy the CodeCommit-specific sign-in credentials that IAM generated for you, either by showing, copying, and then pasting this information into a secure file on your local computer, or by choosing **Download credentials** to download this information as a .CSV file. You need this information to connect to CodeCommit.

1. Start a session with WorkSpaces Applications with the URL that you received in the welcome email message. Use your email address as your user name, and create your password.

1. Select your Enterprise Developer stack.

1. On the menu page, choose **Desktop** to reach the Windows desktop streamed by the fleet.

## Step 2: Create the Amazon S3-based virtual folder on Windows (optional)
<a name="tutorial-ed-step2"></a>

If you need Rclone (see above), create the Amazon S3-based virtual folder on Windows. This step is optional if all application artifacts come exclusively from CodeCommit.

**Note**  
If you already used Rclone during the AWS Mainframe Modernization preview, you must update `m2-rclone.cmd` to the newer version located in `C:\Users\Public`.

1. Copy the `m2-rclone.conf` and `m2-rclone.cmd` files provided in `C:\Users\Public` to your home folder `C:\Users\PhotonUser\My Files\Home Folder` using File Explorer.

1. Update the `m2-rclone.conf` config parameters with your AWS access key and corresponding secret, as well as your AWS Region.

   ```
   [m2-s3]
   type = s3
   provider = AWS
   access_key_id = YOUR-ACCESS-KEY
   secret_access_key = YOUR-SECRET-KEY
   region = YOUR-REGION
   acl = private
   server_side_encryption = AES256
   ```

1. In `m2-rclone.cmd`, make the following changes:
   + Change `amzn-s3-demo-bucket` to your Amazon S3 bucket name. For example, `m2-s3-mybucket`.
   + Change `your-s3-folder-key` to your Amazon S3 bucket key. For example, `myProject`.
   + Change `your-local-folder-path` to the path of the directory where you want the application files synced from the Amazon S3 bucket that contains them. For example, `D:\PhotonUser\My Files\Home Folder\m2-new`. This synced directory must be a subdirectory of the Home Folder in order for WorkSpaces Applications to properly back up and restore it on session start and end.

   ```
   :loop
   timeout /T 10
   "C:\Program Files\rclone\rclone.exe" sync m2-s3:amzn-s3-demo-bucket/your-s3-folder-key "D:\PhotonUser\My Files\Home Folder\your-local-folder-path" --config "D:\PhotonUser\My Files\Home Folder\m2-rclone.conf"
   goto :loop
   ```

1. Open a Windows command prompt, change directory to `C:\Users\PhotonUser\My Files\Home Folder` if needed, and run `m2-rclone.cmd`. This command script runs a continuous loop, syncing your Amazon S3 bucket and key to the local folder every 10 seconds. You can adjust the timeout as needed. You should see the source code of the application located in the Amazon S3 bucket in Windows File Explorer.

To add new files to the set that you are working on or to update existing ones, upload the files to the Amazon S3 bucket and they will be synced to your directory at the next iteration defined in `m2-rclone.cmd`. Similarly, if you want to delete some files, delete them from the Amazon S3 bucket. The next sync operation will delete them from your local directory.
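Because `rclone sync` makes the local folder match the bucket contents, including deletions, you might want to preview what a sync would change before starting the loop. A hedged sketch, using rclone's `--dry-run` flag with the same placeholder bucket, key, and paths as above:

```
"C:\Program Files\rclone\rclone.exe" sync m2-s3:amzn-s3-demo-bucket/your-s3-folder-key "D:\PhotonUser\My Files\Home Folder\your-local-folder-path" --config "D:\PhotonUser\My Files\Home Folder\m2-rclone.conf" --dry-run
```

With `--dry-run`, rclone reports the copies and deletions it would perform without modifying any files.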

## Step 3: Clone the repository
<a name="tutorial-ed-step3"></a>

1. Navigate to the application selector menu in the top left corner of the browser window and select Enterprise Developer.

1. Complete the workspace creation required by Enterprise Developer in your Home folder by choosing `C:\Users\PhotonUser\My Files\Home Folder` (also `D:\PhotonUser\My Files\Home Folder`) as the location for the workspace.

1. In Enterprise Developer, clone your CodeCommit repository: in the Project Explorer, right-click and choose **Import**, **Import…**, **Git**, **Projects from Git**, **Clone URI**. Then enter your CodeCommit-specific sign-in credentials and complete the Eclipse dialog to import the code.

The CodeCommit Git repository is now cloned in your local workspace.
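If you prefer the Git command line, the same clone can be sketched from a Windows command prompt. The Region and repository name below are placeholders; substitute the HTTPS clone URL of your own CodeCommit repository.

```
rem Clone over HTTPS; you are prompted for the Git credentials generated in step 1
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyDemoRepo
```

You can then import the cloned folder into Enterprise Developer as an existing project.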

Your Enterprise Developer workspace is now ready to start the maintenance work on your application. In particular, you can use the local instance of Enterprise Server (ES) integrated with Enterprise Developer to interactively debug and run your application to validate your changes locally.

**Note**  
The local Enterprise Developer environment, including the local Enterprise Server instance, runs under Windows while AWS Mainframe Modernization runs under Linux. We recommend that you run complementary tests in the Linux environment provided by AWS Mainframe Modernization after you commit the new application to CodeCommit and rebuild it for this target and before you roll out the new application to production.

## Subsequent sessions
<a name="tutorial-ed-step4"></a>

Because the folder you selected for cloning your CodeCommit repository is under WorkSpaces Applications management, like the home folder, it is saved and restored transparently across sessions. Complete the following steps the next time you need to work with the application:

1. Start a session with WorkSpaces Applications using the URL you received in the welcome email.

1. Log in with your email address and permanent password.

1. Select the Enterprise Developer stack.

1. Launch `Rclone` (see above) to connect to the Amazon S3-backed disk if you use this option to share the workspace files.

1. Launch Enterprise Developer to do your work.

## Clean up resources
<a name="tutorial-ed-clean"></a>

If you no longer need the resources you created for this tutorial, delete them so that you won't continue to be charged for them. Complete the following steps:
+ Delete the CodeCommit repository you created for this tutorial. For more information, see [Delete a CodeCommit repository](https://docs.aws.amazon.com/codecommit/latest/userguide/how-to-delete-repository.html) in the *AWS CodeCommit User Guide*.
+ Delete the database you created for this tutorial. For more information, see [Deleting a DB instance](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_GettingStarted.CreatingConnecting.PostgreSQL.html#CHAP_GettingStarted.Deleting.PostgreSQL).