

# Foundational

Foundational Dashboards require CUR [Data Exports](data-exports.md) or Legacy CUR.

This section covers the following dashboards:

Contents
+  [CUDOS Dashboard](cudos-cid-kpi.md#foundational-cudos-dashboard) 
+  [Cost Intelligence Dashboard (CID)](cudos-cid-kpi.md#foundational-cid-dashboard) 
+  [KPI Dashboard](cudos-cid-kpi.md#foundational-kpi-dashboard) 

# CUDOS, CID, KPI

## Feedback & Support


Follow [Feedback & Support](feedback-support.md) guide

## Introduction


This section describes the [CUDOS Dashboard](#foundational-cudos-dashboard), the [Cost Intelligence Dashboard (CID)](#foundational-cid-dashboard), and the [KPI Dashboard](#foundational-kpi-dashboard), which use data exclusively from the [AWS Cost and Usage Report](https://aws.amazon.com/aws-cost-management/aws-cost-and-usage-reporting/).

All these dashboards are based on the AWS Cost & Usage Report (CUR), which contains the most comprehensive set of AWS cost and usage data available, including additional metadata about AWS services, pricing, Reserved Instances, and Savings Plans. The CUR itemizes usage at the account or Organization level by product code, usage type, and operation. These costs can be further organized by enabling Cost Allocation tags and Cost Categories.

![Recommended Deployment Architecture](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/basic_deployment_arch.png)


1.  [AWS Data Exports](https://aws.amazon.com/aws-cost-management/aws-data-exports/) delivers the Cost & Usage Report (CUR 2.0) daily to an [Amazon S3 bucket](https://aws.amazon.com/s3/) in the Management Account.

1.  [Amazon S3](https://aws.amazon.com/s3/) replication rule copies Export data to a dedicated Data Collection Account S3 bucket automatically.

1.  [Amazon Athena](https://aws.amazon.com/athena/) allows querying data directly from the S3 bucket using an [AWS Glue](https://aws.amazon.com/glue/) table schema definition.

1.  [Amazon Quick Sight](https://aws.amazon.com/quicksight/) creates datasets from [Amazon Athena](https://aws.amazon.com/athena/), refreshes them daily, and caches them in [SPICE](https://docs.aws.amazon.com/quicksight/latest/user/spice.html) (Super-fast, Parallel, In-memory Calculation Engine).

1. User Teams (Executives, FinOps, Engineers) can access Cloud Intelligence Dashboards in [Amazon Quick Sight](https://aws.amazon.com/quicksight/). Access is secured through [AWS IAM](https://aws.amazon.com/iam/), [AWS IAM Identity Center](https://aws.amazon.com/iam/identity-center/) (formerly AWS SSO), and optional [Row Level Security](https://catalog.workshops.aws/awscid/en-US/customizations/row-level-security).

If you do not have access to the Management account, you can also deploy CID for a subset of Linked Accounts.

## CUDOS Dashboard


### Authors

+ Yuriy Prykhodko, AWS Principal Technical Account Manager
+ Timur Tulyaganov, Ex-Amazonian

### Contributors

+ Alee Whitman, Principal Solutions Architect
+ Iakov Gan, Ex-Amazonian
+ Judith Lehner, Senior Technical Account Manager
+ Udi Dahan, Senior Technical Account Manager
+ Mylen Rath, Senior Technical Account Manager
+ Christopher Morris, Senior Technical Account Manager
+ Xianshu Zeng, Senior FinOps Commercial Architect
+ Oleksandr Moskalenko, Ex-Amazonian
+ Natalia Cummings, Senior FinOps Commercial Architect
+ Adam Richter, Senior Optimization Solutions Architect
+ Sabith Venkitachalapathy, Senior Storage Specialist SA
+ Brenno Passanha, Senior Technical Account Manager

The CUDOS (Cost and Usage Dashboard Operations Solution) is an in-depth, granular, and recommendation-driven dashboard to help customers dive deep into cost and usage and to fine-tune efficiency. Executives, directors, and other individuals within the CIO or CTO line of business or who manage DevOps and IT organizations will find the CUDOS Dashboard highly detailed and tailored to solve their use cases. Out-of-the-box benefits of the CUDOS dashboard include (but are not limited to):
+ Use the built-in tag explorer to group and filter cost and usage by your tags.
+ View resource-level detail such as your hourly AWS Lambda or individual Amazon S3 bucket costs.
+ Get alerted to service-level areas of focus such as top 3 On-Demand database instances by cost.

### Demo Dashboard


Explore a [sample CUDOS Dashboard](https://cid.workshops.aws.dev/demo?dashboard=cudos) 

### Deploy


Follow [deployment guide](deployment-in-global-regions.md) 

### Learn more

+  [What’s New in CUDOS versions 5.1 to 5.3](https://www.youtube.com/watch?v=3LuKzbFxuz8) 
+  [What’s new in CUDOS Dashboard v 4.77](https://www.youtube.com/watch?v=5IWAoKujOqo) 
+  [CUDOS Insights Learning Series on YouTube](https://www.youtube.com/watch?v=2N24ERSwPE4&list=PLevHThZeBjf85JyGgZGep0ib9eE-53A2T) 

### Changelog

+  [Changelog](https://github.com/aws-solutions-library-samples/cloud-intelligence-dashboards-framework/blob/main/changes/CHANGELOG-cudos.md) 

## Cost Intelligence Dashboard (CID)


### Authors

+ Alee Whitman, Principal Solutions Architect

### Contributors

+ Aaron Edell, Head of Accelerators, AWS
+ Aidin Khosrowshahi, AWS Sr. Technical Account Manager
+ Yuriy Prykhodko, AWS Principal Technical Account Manager
+ Arun Santhosh, Principal Specialist SA (Amazon Quick Sight)
+ Kareem Syed-Mohammed, Senior Product Manager - Technical (Amazon Quick Sight)
+ Timur Tulyaganov, Ex-Amazonian

 [Watch 10min video overview of CID dashboard](https://d3h9zoi3eqyz7s.cloudfront.net/Cost/Videos/DashboardCostIntelligence.mp4) 

The Cost Intelligence Dashboard is a customizable and accessible dashboard to help create the foundation of your own cost management and optimization (FinOps) tool. Executives, directors, and other individuals within the CFO’s line of business or who manage cloud financials for an organization will find the Cloud Intelligence Dashboard easy to use and relevant to their use cases. Little to no technical knowledge or understanding of AWS Services is required. Out-of-the-box benefits of the CID include (but are not limited to):
+ Create chargeback or showback reports for internal business units, accounts, or cost centers.
+ Track how Savings Plans (SP), Reserved Instances (RI), and Spot Instance usage has impacted your unit metrics such as your average hourly cost of Amazon EC2.
+ Keep track of which accounts or internal business units receive savings and when RIs and SPs expire.

### Demo Dashboard


Explore a [sample Cost Intelligence Dashboard](https://cid.workshops.aws.dev/demo?dashboard=cid) 

### Deploy


Follow [deployment guide](deployment-in-global-regions.md) 

## KPI Dashboard


### Authors

+ Alee Whitman, Principal Solutions Architect

### Contributors

+ Aaron Edell, Head of Accelerators, AWS
+ Alex Head, Sr. Manager, AWS OPTICS
+ Georgios Rozakis, AWS Sr. Technical Account Manager
+ Oleksandr Moskalenko, Ex-Amazonian
+ Timur Tulyaganov, Ex-Amazonian
+ Yash Bindlish, AWS Enterprise Support Manager
+ Yuriy Prykhodko, AWS Principal Technical Account Manager
+ Anjali Dhanerwal, AWS Senior Technical Account Manager

The KPI and Modernization Dashboard helps your organization combine DevOps and IT infrastructure with Finance and the C-Suite to grow more efficiently and effectively on AWS. This dashboard lets you set and track modernization and optimization goals such as percent OnDemand, Spot adoption, and Graviton usage. By enabling every line of business to create and track usage goals, and your cloud center of excellence to make recommendations organization-wide, you can grow more efficiently and innovate more quickly on AWS. Out-of-the-box benefits of the KPI dashboard include (but are not limited to):
+ Track percent on-demand across all your teams.
+ See potential cost savings by meeting certain KPIs and goals for your organization.
+ Quickly locate cost-optimization opportunities such as infrequently used S3 buckets, old EBS snapshots, and Graviton eligible instance usage.

### Demo Dashboard


Explore a [sample KPI Dashboard](https://cid.workshops.aws.dev/demo?dashboard=kpi) 

### Deploy


Follow [deployment guide](deployment-in-global-regions.md) 

### Learn more

+  [What’s new in KPI Dashboard](https://www.youtube.com/watch?v=1yDuYqNbcr4) 

## Time to complete


If using the automation steps, setup should take approximately 15-30 minutes to complete. Please note that the first delivery of Cost and Usage Report data may take 24 hours to arrive.

## Steps

+  [Deployment in Global Regions](deployment-in-global-regions.md) 
+  [Column Definitions](column-definitions.md) 
+  [Add Account Names (Optional)](add-account-names.md) 
+  [Migration to CUR 2.0](migration-to-cur.md) 
+  [Deployment In China](deployment-in-china.md) 

**Note**  
These dashboards and their content: (a) are for informational purposes only, (b) represent current AWS product offerings and practices, which are subject to change without notice, and (c) do not create any commitments or assurances from AWS and its affiliates, suppliers, or licensors. AWS content, products, or services are provided “as is” without warranties, representations, or conditions of any kind, whether express or implied. The responsibilities and liabilities of AWS to its customers are controlled by AWS agreements, and this document is not part of, nor does it modify, any agreement between AWS and its customers.

# Deployment in Global Regions

**Note**  
Since November 2024, Cloud Intelligence Dashboards use [AWS Cost And Usage Report 2.0](https://docs.aws.amazon.com/cur/latest/userguide/table-dictionary-cur2.html) (CUR 2.0) as the main source for Foundational Dashboards. If you are deploying in China Regions, please follow the [China deployment instructions](deployment-in-china.md). If you have Legacy CUR setup, you can check [migration process](migration-to-cur.md).

## Architecture


We recommend deploying the Dashboards in a dedicated Data Collection Account, rather than your Management (Payer) Account, in line with AWS best practices [[1](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_best-practices_mgmt-acct.html#bp_mgmt-acct_avoid-deploying), [2](https://docs.aws.amazon.com/whitepapers/latest/organizing-your-aws-environment/design-principles-for-your-multi-account-strategy.html#avoid-deploying-workloads-to-the-organizations-management-account)]. This guide provides a CloudFormation template to copy CUR 2.0 data from your Management Account to a dedicated one. You can use it to aggregate data from multiple Management (Payer) Accounts or multiple Linked Accounts.

If you do not have access to the Management/Payer Account, you can still collect the data across multiple Linked accounts using the [same approach](data-collection-without-org.md).

![Recommended Deployment Architecture](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur2/cid-foundation-cur2-high-level-architecture.png)


The deployment process consists of three main steps:

1. Step 1: Deploy Amazon S3 Bucket and Athena Tables in the **Data Collection Account**.

1. Step 2: Deploy AWS Data Exports, Amazon S3 Bucket and a replication policy in **Source** Accounts (one or many).

1. Step 3: Deploy Cloud Intelligence Dashboards (CID) Stack in the **Data Collection Account**.

## Deployment


## Before you start


1. Choose the **region** for your deployment. Make sure to install all stacks in the same region to avoid cross-region data transfer charges.

1. Define your Data Collection Account. Create one or reuse an existing shared account. We do not recommend using the Management (Payer) Account for data collection.

1. Make sure you have the permissions for deploying CloudFormation Stacks.

### See Required Permissions

+ In the Management/Payer Account you will need permission to access AWS CloudFormation, AWS Cost & Usage Reports, AWS IAM, AWS Lambda and Amazon S3.
+ In the Data Collection Account you will need permission to access Amazon Athena, AWS CloudFormation, AWS Directory Service, Amazon EventBridge, AWS Glue, AWS IAM, AWS Lambda, Amazon Quick Sight, and Amazon S3 via both the console and the Command Line Tool.
+ For a CLI deployment, you will not require CloudFormation permissions.
+ You can use this [CloudFormation template](https://github.com/aws-solutions-library-samples/cloud-intelligence-dashboards-framework/blob/main/cfn-templates/cid-admin-policies.yaml) to provision an IAM role with minimal permissions required for dashboard deployment. It takes an IAM role name as a parameter and adds the required policies to the role.

1. If you use AWS Lake Formation in your Data Collection Account:

### See additional requirements for Lake Formation


Currently only the Foundational, CORA, Sustainability, and FOCUS Dashboards support Lake Formation.
+ You will need to install an additional stack first: [cid-lakeformation-prerequisite.yaml](https://github.com/aws-solutions-library-samples/cloud-intelligence-dashboards-framework/blob/main/cfn-templates/cid-lakeformation-prerequisite.yaml).
+ You will also need to set the `LakeFormationEnabled` parameter to `yes` in Steps 1 and 3.

## Step 1. [Data Collection Account] Create Destination For CUR Aggregation


1. Sign in to your Data Collection Account.

1.  Click the Launch Stack button below to open the **pre-populated stack template** in your CloudFormation console. This stack creates an S3 bucket open for replication and the Athena tables.

    [https://console.aws.amazon.com/cloudformation/home#/stacks/create/review?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/data-exports-aggregation.yaml&stackName=CID-DataExports-Destination&param_ManageCUR2=yes&param_ManageCOH=no&param_DestinationAccountId=REPLACE%20WITH%20DATA%20COLLECTION%20ACCOUNT%20ID&param_SourceAccountIds=PUT%20HERE%20PAYER%20ACCOUNT%20IDS](https://console.aws.amazon.com/cloudformation/home#/stacks/create/review?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/data-exports-aggregation.yaml&stackName=CID-DataExports-Destination&param_ManageCUR2=yes&param_ManageCOH=no&param_DestinationAccountId=REPLACE%20WITH%20DATA%20COLLECTION%20ACCOUNT%20ID&param_SourceAccountIds=PUT%20HERE%20PAYER%20ACCOUNT%20IDS) 

### More info about stack parameters and the process

+ Update `DestinationAccountId` parameter as your **Data Collection** Account ID (Current Account ID).
+ Make sure `Manage CUR 2.0` is set to `yes`. You can optionally enable Cost Optimization Hub (if you have this service activated) and FOCUS exports. This will allow you to use the [CORA](cora-dashboard.md) and [FOCUS](focus-dashboard.md) dashboards.
+ Enter your Source Account ID(s), using commas to separate multiple Account IDs. These are the accounts that will send their Data Exports to the bucket in the current account. If you decided to deploy the dashboards in the Management/Payer Account (not recommended), make sure that `SourceAccountIds` contains the current Account ID as the first element, and skip Step 2.
+ Review the configuration, click **I acknowledge that AWS CloudFormation might create IAM resources**, and click **Create stack**.
+ You will see the stack start in **CREATE\_IN\_PROGRESS**. This step can take 5 - 15 mins. Once complete, the stack will show **CREATE\_COMPLETE**.

You can have only one instance of this stack in your account. If you see errors indicating that one of the exports already exists, update the existing stack, setting the parameter `CUR2` to `yes`.

You can add or delete Source Accounts later by updating this stack, adding or removing Account IDs in the comma-separated Source Accounts parameter.
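If you prefer the command line, the same destination stack can be created with the AWS CLI. This is a sketch only: the account IDs below are placeholders, and the parameters mirror the pre-populated console link above.

```shell
# Sketch: create the destination stack from the CLI (same template and
# parameters as the pre-populated console link above).
# Both account IDs below are placeholders -- replace with your own.
DESTINATION_ACCOUNT_ID="111111111111"   # your Data Collection Account ID
SOURCE_ACCOUNT_IDS="222222222222"       # comma-separated Management/Payer Account IDs

aws cloudformation create-stack \
  --stack-name CID-DataExports-Destination \
  --template-url "https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/data-exports-aggregation.yaml" \
  --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM \
  --parameters \
    ParameterKey=ManageCUR2,ParameterValue=yes \
    ParameterKey=ManageCOH,ParameterValue=no \
    ParameterKey=DestinationAccountId,ParameterValue="$DESTINATION_ACCOUNT_ID" \
    ParameterKey=SourceAccountIds,ParameterValue="$SOURCE_ACCOUNT_IDS"

# Block until the stack reaches CREATE_COMPLETE (typically 5 - 15 minutes).
aws cloudformation wait stack-create-complete \
  --stack-name CID-DataExports-Destination
```

To change Source Accounts later, run `aws cloudformation update-stack` with a revised `SourceAccountIds` value, which is equivalent to updating the stack in the console.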

## Step 2. [In Management/Payer/Source Account] Create CUR 2.0 and Replication


1. Click the **Launch Stack button** below to open the **stack template** in your AWS CloudFormation console.

    [https://console.aws.amazon.com/cloudformation/home#/stacks/create/review?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/data-exports-aggregation.yaml&stackName=CID-DataExports-Source&param_ManageCUR2=yes&param_ManageCOH=no&param_DestinationAccountId=REPLACE%20WITH%20DATA%20COLLECTION%20ACCOUNT%20IDs&param_SourceAccountIds=](https://console.aws.amazon.com/cloudformation/home#/stacks/create/review?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/data-exports-aggregation.yaml&stackName=CID-DataExports-Source&param_ManageCUR2=yes&param_ManageCOH=no&param_DestinationAccountId=REPLACE%20WITH%20DATA%20COLLECTION%20ACCOUNT%20IDs&param_SourceAccountIds=) 

### Click here for the configuration steps


1. Enter a **Stack name** for your template such as **CID-DataExports-Source**.

1. Enter your **Destination Account ID** parameter (Your Data Collection Account, where you will deploy dashboards).

1. Choose the exports to manage. The choice must be consistent with the configuration in the Data Collection Account (as in Step 1).

1. Review the configuration, click **I acknowledge that AWS CloudFormation might create IAM resources**, and click **Create stack**.

1. You will see the stack start in **CREATE\_IN\_PROGRESS**. This step can take 5 - 15 mins. Once completed, the stack will show **CREATE\_COMPLETE**.

1. Repeat for other Source Accounts.

It typically takes about 24 hours for the first AWS Data Exports delivery to replicate to the Destination Account, but it might take up to 72 hours (3 days). You can continue with the dashboard deployment; however, data will appear on the dashboards the day after the first data delivery.
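Once replication is configured, you can check whether export data has started landing in the Data Collection Account. A minimal sketch, assuming you take the actual bucket name from the Step 1 stack outputs (the `cur2/` prefix is deployment-dependent):

```shell
# Sketch: check whether export data has landed in the destination bucket.
# <destination-bucket> is a placeholder -- use the real bucket name from the
# Step 1 stack outputs; the cur2/ prefix may differ in your deployment.
aws s3 ls "s3://<destination-bucket>/cur2/" --recursive --summarize | tail -5
```

An empty listing within the first 24-72 hours is expected; re-check after the first delivery window has passed.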

## Backfill Data Export


You can now [create a Support Case](https://support.console.aws.amazon.com/support/home#/case/create), requesting a [backfill](https://docs.aws.amazon.com/cur/latest/userguide/troubleshooting.html#backfill-data) of your reports (CUR or FOCUS) with up to 36 months of historical data. The case must be created from your Source Account (typically the Management/Payer Account). If you are using multiple Management/Payer Accounts, a support ticket must be created in each.

### Support ticket example



```
Service: Billing
Category: Other Billing Questions
Subject: Backfill Data

Hello Dear Billing Team,
Please can you backfill the data in DataExport named `cid-cur2` for last 12 months.
Thanks in advance,
```

You can also use the following command in AWS CloudShell to create this case via the command line (requires AWS Enterprise or Enterprise On-Ramp Support):

```
aws support create-case \
    --subject "Backfill Data" \
    --service-code "billing" \
    --severity-code "normal" \
    --category-code "other-billing-questions" \
    --communication-body "
        Hello Dear Billing Team,
        Please can you backfill the data in DataExport named 'cid-cur2' for last 12 months.
        Thanks in advance"
```

Make sure you create the case from your Source Accounts (typically Management/Payer Accounts).

## Step 3. [Data Collection Account] Deploy Dashboards


### 3.1 - Prepare Amazon Quick Sight (Quick Suite)


Amazon Quick Sight is the AWS business intelligence tool, part of the Amazon Quick Suite service. You can install the dashboards into your Amazon Quick Sight account and customize them to your needs. If you are already a regular Amazon Quick Sight user, you can skip this step and move on to the next one. If not, complete the steps below.

#### Click here to expand Amazon Quick Suite Sign Up Workflow


1. Log into your Destination Linked Account and search for **Quick Suite** in the list of Services

1. You will be asked to **Sign up** before you will be able to use it
   + Ensure you select the **Region** that is most appropriate based on where you plan to deploy the dashboards.
   + Enter a **name** for your Quick Suite account. This must be unique across all Quick Suite accounts.
   + Enter an **email address** for notifications to be sent to. This email will be linked to your Quick Suite user account, so it can be your own email address.

1. You will then need to fill in a series of options in order to finish creating your account:
   + Please select the appropriate **Authentication** method
**Note**  
Select `Use AWS IAM Identity Center` if you want to use and share the CID dashboards in production with your wider organization using your existing identity provider, such as Azure AD, Okta, or others. Follow the steps [here](publishing-as-sso-application.md). You may select `Use IAM federated identities & Quick Sight-managed users` to get started quickly; however, **NOTE:** you will **NOT** be able to change the Quick Sight authentication method later.

1. Click **Create Account** and wait for the congratulations screen to display. Go to 'Manage Quick Suite'.
   + (optional, not recommended) Downgrade your user to avoid charges for Amazon Q in Quick Suite.
   + Make sure that Pixel Perfect and Amazon Q in Quick Suite are deactivated.
   + Click on the SPICE Capacity option and choose `auto purchase`, or purchase enough SPICE capacity so that the total is roughly 40 GB. If you get SPICE capacity errors later, you can come back here to purchase more. If you’ve purchased too much, you can release it after you’ve deployed the dashboards.

![Quick Sight Sign up Workflow Image](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/images/dashboards/qs-enterprise-activation.gif)


### 3.2 Deploy Dashboards


Make sure you use the same Region as in Step 1 to avoid cross-region data transfer costs. Your AWS account must also have the `quicksight:DescribeTemplate` permission to read from the us-east-1 Region.

In this step we will use a CloudFormation stack to create an Athena workgroup, S3 bucket, Glue table, Glue crawler, Quick Sight datasets, and finally the dashboards. The template uses a custom resource (a Lambda function running [this CLI tool](https://github.com/aws-solutions-library-samples/cloud-intelligence-dashboards-framework/blob/main/CID-CMD.md)) to create, delete, or update assets.

**Example**  

1. Log in to your **Data Collection** Account.

1. Click the Launch Stack button below to open the **pre-populated stack template** in your CloudFormation console.

    [https://console.aws.amazon.com/cloudformation/home#/stacks/create/review?templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cid-cfn.yml&stackName=Cloud-Intelligence-Dashboards&param_DeployCUDOSv5=yes&param_DeployKPIDashboard=yes&param_DeployCostIntelligenceDashboard=yes](https://console.aws.amazon.com/cloudformation/home#/stacks/create/review?templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cid-cfn.yml&stackName=Cloud-Intelligence-Dashboards&param_DeployCUDOSv5=yes&param_DeployKPIDashboard=yes&param_DeployCostIntelligenceDashboard=yes) 

1. Enter a **Stack name** for your template such as **Cloud-Intelligence-Dashboards** 

1. Review **Common Parameters** and confirm prerequisites before specifying the other parameters. You must answer `yes` to both prerequisites questions.

1. Copy and paste your **Quick Sight username** into the parameter text box. To find your Quick Sight username:
   + Open a new tab or window and navigate to the **Quick Sight** console
   + Find your username from the person icon in the top right corner  
![Quick Sight page with username drop down in the top right highlighted](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cf_dash_qs_2.png)

1. Select the Dashboards you want to install. We recommend deploying all three: Cost Intelligence Dashboard, CUDOS, and the KPI Dashboard.

1. Review the configuration, click **I acknowledge that AWS CloudFormation might create IAM resources**, and click **Create stack**.

1. You will see the stack start in **CREATE\_IN\_PROGRESS**. This step can take up to 15 minutes. Once complete, the stack will show **CREATE\_COMPLETE**.

1. You can check the stack outputs for the dashboard URLs. Please note that the dashboards will be empty at this point. We recommend initiating a backfill via a Support Case (see the [Backfill](#deployment-global-backfill-data-export) section).

    **Troubleshooting:** 

    **No export named cid-DataExports-ReadAccessPolicyARN found.** 

   If you see `No export named cid-DataExports-ReadAccessPolicyARN found.`, then you probably did not install CUR 2.0 with the CloudFormation stack as per Step 1. Alternatively, you can use the Legacy CUR, but in this case you need to explicitly specify the parameter `CurVersion=1.0`.
An alternative method to install the dashboards is the [cid-cmd](https://github.com/aws-solutions-library-samples/cloud-intelligence-dashboards-framework/blob/main/CID-CMD.md#command-line-tool-cid-cmd) tool.  

1. Log in to to your **Data Collection** Account.

1. Open [AWS CloudShell](https://console.aws.amazon.com/cloudshell/home) 

1. Install the cid-cmd tool. Run the following command and press Enter:

   ```
    pip3 install --upgrade cid-cmd
   ```

   If using [CloudShell](https://console.aws.amazon.com/cloudshell), use the following instead:

   ```
   sudo yum install python3.11-pip -y
   python3.11 -m pip install -U cid-cmd
   ```

1. Deploy CUDOS Dashboard:

   ```
    cid-cmd deploy --dashboard-id cudos-v5
   ```

   Please follow the instructions from the deployment wizard. More info about the command line options is in the [Readme](https://github.com/aws-solutions-library-samples/cloud-intelligence-dashboards-framework/blob/main/CID-CMD.md#command-line-tool-cid-cmd) or `cid-cmd --help`.

1. Repeat the deployment command for the Cost Intelligence Dashboard and the KPI Dashboard:

   ```
    cid-cmd deploy
   ```

   Please note that Advanced Dashboards will require the advanced [Data Collection](data-collection.md).

**Note**  
After an update, Quick Sight datasets will be refreshed automatically. During the refresh process you may see a "Dataset changed too much" error, which should disappear once the datasets are fully refreshed.

## Update of the stack


**Note**  
We recommend updating both the cid-cmd tool and the CloudFormation stack to version 4.2.3 or later.

Please note that the dashboards themselves are not updated by updating the CloudFormation stack. You need to use the [command line for updates](update-dashboards.md), as it preserves potential customizations.

You can check the latest CloudFormation stack [here](https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cid-cfn.yml) and the source code on [GitHub](https://github.com/aws-solutions-library-samples/cloud-intelligence-dashboards-framework/blob/main/cfn-templates/cid-cfn.yml). Please note the version in the Description.
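As a sketch, a command-line dashboard update from CloudShell in the Data Collection Account could look like the following. This assumes the `update` subcommand accepts the same `--dashboard-id` values used at deployment time (for example `cudos-v5`); check `cid-cmd --help` for the options in your installed version.

```shell
# Make sure the tool itself is current first (4.2.3 or later).
pip3 install --upgrade cid-cmd

# Sketch: update one installed dashboard while preserving customizations.
# The --dashboard-id value is the id used at deployment time; omit it to
# choose a dashboard from the interactive wizard.
cid-cmd update --dashboard-id cudos-v5
```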

### Update of Cloud-Intelligence-Dashboards Stack


1. Open CloudFormation console and identify the stack (default name is `Cloud-Intelligence-Dashboards`).

1. Open the stack and press the **Update** button.

1. Choose to update the template and insert this link: https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cid-cfn.yml

1. Review the parameters. Please make sure to choose the right CUR version in the `CurVersion` parameter. Choose 1.0 to stay on CUR 1.0; choose 2.0 to switch all new dashboards to CUR 2.0. To perform a full migration, please refer to the [CUR2 migration guide](migration-to-cur.md).

## Troubleshooting


### No data in Dashboards after 24-48 hours


Please check the following:

1. In Quick Sight, go to Datasets and click on the Summary View dataset. Check for errors (if you see a `Failed` status, you can click it to see more info).

1. Check whether CUR 2.0 data has arrived in the S3 bucket. If you just created the CUR, you will need to wait 24-48 hours for the first data to arrive.

1. The Quick Sight datasets refresh once per day; if your first CUR was delivered after your latest refresh, you may need to trigger a manual refresh on each dataset to see data in the dashboard.
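A manual refresh can also be triggered from the command line. The sketch below first lists the datasets to find the dataset ID, then starts a SPICE ingestion; `<dataset-id>` is a placeholder you must replace.

```shell
# Find the dataset IDs (e.g. the Summary View dataset).
ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
aws quicksight list-data-sets --aws-account-id "$ACCOUNT_ID" \
  --query "DataSetSummaries[].{Id:DataSetId,Name:Name}" --output table

# Sketch: trigger a SPICE refresh for one dataset.
# <dataset-id> is a placeholder -- use an Id from the listing above.
aws quicksight create-ingestion \
  --aws-account-id "$ACCOUNT_ID" \
  --data-set-id "<dataset-id>" \
  --ingestion-id "manual-refresh-$(date +%s)"
```

Repeat the `create-ingestion` call for each CID dataset that needs refreshing.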

 **Any issues? Visit our [FAQs](faq.md).** 

## Next steps

+ Deploy [CORA](cora-dashboard.md) 
+ Deploy [Compute Optimizer Dashboard](compute-optimizer-dashboard.md) and [Trusted Advisor Organizational (TAO) Dashboard](trusted-advisor-dashboard.md) 

# Column Definitions

## summary\_view


 [summary\_view](https://github.com/aws-solutions-library-samples/cloud-intelligence-dashboards-framework/blob/main/cid/builtin/core/data/queries/cid/summary_view.sql) is a view in Amazon Athena created on top of the Cost and Usage Report ([CUR](https://docs.aws.amazon.com/cur/latest/userguide/what-is-cur.html)). It allows users to get a concise overview of their AWS spend, providing aggregated insights into costs and usage across various services and accounts.
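The view can be queried directly from the command line as well. A minimal sketch, using only columns defined below; the database (`cid_cur`) and workgroup (`CID`) names are assumptions, so use the values from your own deployment:

```shell
# Sketch: run a quick sanity query against summary_view from the CLI.
# The database (cid_cur) and workgroup (CID) names are assumptions --
# substitute the names from your own deployment.
aws athena start-query-execution \
  --query-string "SELECT product_name, region, usage_type FROM summary_view LIMIT 10" \
  --query-execution-context Database=cid_cur \
  --work-group CID
```

The call returns a `QueryExecutionId`; fetch the rows afterwards with `aws athena get-query-results --query-execution-id <id>`.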


| Column | Data Type | Description |
| --- | --- | --- |
| **year** | string | The year of the billing period covered by the report. |
| **month** | string | The month of the billing period covered by the report. |
| **billing\_period** | timestamp | The start date of the billing period covered by this report, in UTC. The format is `YYYY-MM-DDTHH:mm:ssZ`. |
| **usage\_date** | timestamp | The start date for the line item in UTC, inclusive. If the start date is older than 3 months, it is truncated to the first day of the month; otherwise it is the actual date. The format is `YYYY-MM-DDTHH:mm:ssZ`. |
| **payer\_account\_id** | string | The account ID of the paying account. For an organization in AWS Organizations, this is the account ID of the management account. |
| **linked\_account\_id** | string | The account ID of the account that used this line item. For organizations, this can be either the management account or a member account. You can use this field to track costs or usage by account. |
| **invoice\_id** | string | The ID associated with a specific line item. The InvoiceId is blank until the report is final, generally after the 6th or 7th of the month (for example, June data is available after July 6 or 7). |
| **charge\_type** | string | The type of charge covered by this line item. Some possible types are Credit, Discount, Fee, and Refund. For more charge types, refer to this [link](https://docs.aws.amazon.com/cur/latest/userguide/Lineitem-columns.html#Lineitem-details-L-LineItemType). |
| **charge\_category** | varchar(13) | Describes the charge category as "running\_usage" or "non\_usage". The charge types "DiscountedUsage", "SavingsPlanCoveredUsage", and "Usage" map to "running\_usage"; all others map to "non\_usage". |
| **purchase\_option** | varchar(11) | Describes the purchasing model for an AWS service. For example, Amazon EC2 instance purchasing options include On-Demand, Reserved Instances, and Spot Instances. |
| **ri\_sp\_arn** | string | The Savings Plan or RI ARN; blank if the resource is not covered by an SP or RI. |
| **product\_code** | string | The code of the product measured. For example, AmazonEC2 is the product code for Amazon Elastic Compute Cloud. |
| **product\_name** | string | The full name of the AWS service. Use this column to filter AWS usage by AWS service. Sample values: AWS Backup, AWS Config, Amazon Registrar, Amazon Elastic File System, and Amazon Elastic Compute Cloud. |
| **service** | string | Identifies the specific AWS service as a unique short abbreviation, including Marketplace. Sample values: Amazon EC2, AWS KMS, AWS Budgets, AWS Backup, and AWS Certificate Manager. |
| **product\_family** | string | The category for the type of product. Sample values: Alarm, AWS Budgets, Stopped Instance, Storage Snapshot, and Compute. |
| **usage\_type** | string | The usage details of the line item. For example, `USW2-BoxUsage:m2.2xlarge` describes an `m2` High-Memory Double Extra Large instance in the US West (Oregon) Region. |
| **operation** | string | The specific AWS operation covered by this line item. This describes the specific usage of the line item. For example, a value of RunInstances indicates the operation of an Amazon EC2 instance. |
| **item\_description** | string | The description of the line item type. For example, the description of a usage line item summarizes what type of usage you incurred during a specific time period. For size-flexible RIs, the description corresponds to the RI the benefit was applied to. If a line item corresponds to a t2.micro and a t2.small RI was applied to the usage, lineItem/LineItemDescription displays t2.small. |
| **availability\_zone** | string | The Availability Zone that hosts this line item. For example, us-east-1a or us-east-1b. |
| **region** | string | The geographical area that hosts your AWS services. Use this field to analyze spend across a particular Region. Sample values: eu-west-3, us-west-1, us-east-1, ap-northeast-2, and sa-east-1. |
| **instance\_type\_family** | string | The instance family associated with the given usage. Sample values: t2, m4, and m3. |
| **instance\_type** | string | The instance type, size, and family, which define the CPU, networking, and storage capacity of your instance. Sample values: t2.small, m4.xlarge, t2.micro, m4.large, and t2.large. |
|   **platform**   |  string  |  Describes the operating system of your Amazon EC2 instance. Sample values: Amazon Linux, Ubuntu, Windows Server, Oracle Linux & FreeBSD.  | 
|   **tenancy**   |  string  |  Describes the type of tenancy allowed on the Amazon EC2 instance. Sample values: Dedicated, Reserved, Shared, NA & Host.  | 
|   **processor**   |  string  |  Describes the processor on your Amazon EC2 instance. Sample values: High Frequency Intel Xeon E7-8880 v3 (Haswell) & Intel Xeon E5-2670 & AMD EPYC 7571.  | 
|   **processor\$1features**   |  string  |  Describes the processor features of your instances. Sample values: Intel AVX, Intel AVX2, Intel AVX512 & Intel Turbo.  | 
|   **database\$1engine**   |  string  |  Describes which database engine is being used. Sample Values: Aurora MySQL, Aurora PostgreSQL, Oracle & MySQL.  | 
|   **product\$1group**   |  string  |  A construct of several products that are similar by definition, or grouped together. For example: the Amazon EC2 team can categorize their products into shared instances, dedicated host, and dedicated usage.  | 
|   **product\$1from\$1location**   |  string  |  Describes the location where the usage originated from. Sample values: External, US East (N. Virginia) & Global.  | 
|   **product\$1to\$1location**   |  string  |  Describes the location usage destination. Sample values: External & US East (N. Virginia).  | 
|   **current\$1generation**   |  string  |  Describes the instance’s generation is current or not, if it is current generation instance, the record will show "Yes" if not it will show "No".  | 
|   **legal\$1entity**   |  string  |  The Seller of Record of a specific product or service. In most cases, the invoicing entity and legal entity are the same. The values might differ for third-party AWS Marketplace transactions. Possible values include: Amazon Web Services, Inc. -- The entity that sells AWS services Amazon Web Services India Private Limited — The local Indian entity that acts as a reseller for AWS services in India.  | 
|   **billing\$1entity**   |  string  |  Helps you identify whether your invoices or transactions are for AWS Marketplace or for purchases of other AWS services. Possible values include: AWS — Identifies a transaction for AWS services other than in AWS Marketplace. AWS Marketplace — Identifies a purchase in AWS Marketplace.  | 
|   **pricing\$1unit**   |  string  |  The smallest billing unit for an AWS service. For example: 0.01c per API call.  | 
|   **resource\$1id\$1count**   |  bigint  |  Count of Distinct ResourceIDs, whereas a ResourceID is an ID of individual resource that you provisioned. For example: an Amazon S3 storage bucket, an Amazon EC2 compute instance, or an Amazon RDS database can each have a resource ID.  | 
|   **usage\$1quantity**   |  double  |  Sum of the amount of usage that you incurred during the specified time period. It specifically covers usage covered by Savings plan and on-demand usage.  | 
|   **unblended\$1cost**   |  double  |  Sum of the unblended cost, whereas the UnblendedCost is the UnblendedRate multiplied by the UsageAmount.  | 
|   **amortized\$1cost**   |  double  |  Sum of amortized cost, the costs are amortized over the billing period. This means that the costs are broken out into the effective daily rate. AWS estimates your amortized costs by combining your unblended costs with the amortized portion of your upfront and recurring reservation fees.  | 
|   **ri\$1sp\$1trueup**   |  double  |  In case of No Upfront or Partial Upfront Savings Plans, it shows the amount of upfront fee a Savings Plan subscription is costing you for the billing period in negative. The initial upfront payment for All Upfront Savings Plan and Partial Upfront Savings Plan amortized over the current month.  | 
|   **ri\$1sp\$1upfront\$1fees**   |  double  |  Describes upfront payment of Savings plan and Reserved Instances.  | 
|   **public\$1cost**   |  double  |  Sum of the total cost for the line item based on public On-Demand Instance rates. If you have SKUs with multiple On-Demand public costs, the equivalent cost for the highest tier is displayed. For example: services offering free-tiers or tiered pricing.  | 

# Add Account Names (Optional)

## Account Map


The Cost & Usage Report data doesn’t currently contain account names or other business- or organization-specific mappings, so you can create a view that enhances your CUR data. There are a few options for creating your account_map view, letting you leverage existing mapping tables, organization information, or other business mappings for deeper insights. This view will be used to create the **Account Map** for your dashboards.

The steps below are necessary **ONLY** if you deployed your dashboards using legacy CUR. Dashboards created using CUR 2.0 have account names integrated as part of the deployment process.

### Option 1: Leverage your existing AWS Organizations account mapping (Recommended)


This option allows you to bring in your AWS Organizations data, including OU groupings.



You will need to go through an additional Lab for this. It can collect multiple types of data across accounts and your AWS Organization, including Trusted Advisor and Compute Optimizer data. For account names you will need only one module, the **AWS Organization Module**, but we recommend exploring the other modules of this lab as well.

 [Click to navigate to Optimization Data Collection Lab](data-collection-deployment.md) 

After successful deployment, create or update your account_map view by running the following query in the Athena Query Editor.

```
CREATE OR REPLACE VIEW account_map AS
SELECT DISTINCT
    "id" "account_id",
    "name" "account_name",
    ' ' "parent_account_id",
    ' ' "account_email_id"
FROM
    "optimization_data"."organization_data"
```
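
After the view is created, a quick sanity check confirms that account names were collected (a minimal example, assuming the view name from the query above):

```
SELECT account_id, account_name
FROM account_map
LIMIT 10
```

If the rows come back with populated account_name values, the AWS Organizations data was collected successfully.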

You must also update the role that Quick Sight uses to refresh datasets. This can be the standard Quick Sight role that you manage in the Quick Sight Admin space (Security and Permissions section), or a role named CidQuickSightDataSourceRole, which is managed by the Cloud-Intelligence-Dashboards stack in CloudFormation. Make sure you configure the same bucket name there as in the [Data Collection Lab](data-collection.md).

### Option 2: Leverage AWS Cost Categories to add account names


This option allows you to bring in account names using AWS Cost Categories. If you have multiple payer accounts, ensure you use the same name for your cost category in each payer account so that the consolidated cost and usage report in the data collection account is consistent. Recommended cost category name: **accountname** 



Navigate to cost categories by either searching for cost categories in the AWS console search bar

![\[Searching for cost categories in AWS console search highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/search_cc.png)


OR by going to the Billing console and choosing Cost Categories from the navigation menu

![\[Choosing cost categories in billing console highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/billing_console.png)


In the Cost Categories console, select **Create cost category** 

![\[Choosing create Cost Category in CC console\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/cc_create.png)


Name your cost category **accountname** or any other name you’d like. Be consistent with the name across multiple payer accounts if you are consolidating data from other payer accounts.

For the lookback period, select the second option, **Apply cost category rules starting any specified month from the previous 12 months**, and then choose a month at least 3 months prior to the current month. Select **Next** 

![\[Creating Cost Category name\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/cc_name_option.png)


In the category rules under Rule Builder, choose the Rule Type **Inherited value** and the Dimension **Account** 

Specify a default value of **unnamed**. You can use anything you’d like to label accounts that do not have an account name, but be consistent across multiple payer accounts. Select **Next** 

![\[Creating Cost Category rules\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/cc_rule_option.png)


Select **Create cost category** 

![\[Finishing Cost Category creation\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/cc_create_final.png)


The CUR will now have a column called **CostCategory/accountname** with the account names populated. Note that it might take **24-48 hours** for the CUR to be updated. In Athena, the column name in the CUR table will be similar to **cost_category_accountname** 
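
Before updating the view, you can confirm the new column is populated with a quick check (a sketch; replace **(database).(tablename)** with your CUR database and table, and adjust the column name if you chose a different cost category name):

```
SELECT DISTINCT cost_category_accountname
FROM (database).(tablename)
LIMIT 10
```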

Once the cost category is available in your CUR Athena table, update your account_map view with the query below, making the following modifications:

On line 4 and line 9, replace **cost_category_accountname** with the name of the cost category you chose for account names. If you chose accountname as shown in the example above, no change is needed.

On line 8, replace **(database).(tablename)** with your CUR database and table name (e.g. cid_cur.cur)

Run the query after the modifications. Your account_map view will now have account names from the cost category you created.

```
CREATE OR REPLACE VIEW "account_map" AS
SELECT DISTINCT
line_item_usage_account_id "account_id"
, max_by(cost_category_accountname, line_item_usage_start_date) "account_name"
, ' ' parent_account_id
, ' ' account_email_id
FROM
(database).(tablename)
WHERE ((cost_category_accountname <> '') AND (("bill_billing_period_start_date" >= ("date_trunc"('month', current_timestamp) - INTERVAL '2' MONTH)) AND (CAST("concat"("year", '-', "month", '-01') AS date) >= ("date_trunc"('month', current_date) - INTERVAL '2' MONTH))))
GROUP BY line_item_usage_account_id
```

### Option 3: Account Map CSV file using your existing AWS Account mapping data


Many organizations already maintain their account mapping outside of AWS. You can leverage your existing mapping data by creating a CSV file with your account mapping data, including any additional organization attributes.

#### Step 1: Create the table using your own account mapping CSV and Amazon S3


 **Create your account_map CSV file** 

This example shows how to create one using a sample account_map CSV file

1. Create an account_map CSV file locally. You can use the sample here and the requirements below as a starting point: [account_map.csv](samples/account_map.csv.zip) 

1. Update your account_map CSV with your account mapping data

 **Upload your account_map CSV file to Amazon S3** 

1. Navigate to **Amazon S3** 

1. Select **Create Bucket** 

![\[Amazon S3 console with create bucket button highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_create_bucket.png)


1. Name your bucket; we recommend a name starting with **cost-account-map-** to easily locate it

![\[Amazon S3 create bucket with bucket name field highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_name_bucket.png)


1. Scroll to the bottom and select **Create Bucket** 

![\[Amazon S3 create bucket with create bucket button highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_save_bucket.png)


1. Navigate to your newly created S3 bucket

![\[Amazon S3 bucket list with newly created bucket highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_select_bucket.png)


1. Select **Create folder** 

![\[Amazon S3 bucket object page with create folder button highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_create_folder.png)


1. Name your folder **account-map** and select **Create folder** 

![\[Create folder page with folder name field and create folder button highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_name_folder.png)


1. Click on your newly created **account-map** folder

![\[Amazon S3 bucket screen in cost-account-map folder with account-map folder highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_select_folder.png)


1. Select **Upload** 

![\[account-map folder page with upload button highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_upload.png)


1. In your newly created folder, **drag and drop** your account_map.csv file then select **Upload** 

![\[Amazon S3 upload page with the drag and drop file upload section and upload button highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_upload_csv.png)


1. Copy down the **S3 Destination** of the account-map.csv. You will need this to create your Athena table

![\[Amazon S3 upload status page with destination part highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur/view0_copy.png)


 **Create your account_mapping Athena table** 

1. Navigate to **Amazon Athena** 

1. Modify the query below with your account_map.csv information. Replace the **(S3.Destination)** value **in row 15** with the account-map folder S3 destination you copied in the previous section (e.g. cost-account-map-123456789012/account-map)

**Note**  
Validate that rows **2-6** match your CSV columns. If you removed one of the fields in the CSV, you will need to remove it in the query. If you added any additional fields, you will need to add the attribute to the query.

```
CREATE EXTERNAL TABLE `account_mapping`(
    `account_id` string,
    `account_name` string,
    `business_unit` string,
    `team` string,
    `cost_center` string
    )
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
STORED AS INPUTFORMAT
    'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
    'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
    '(S3.Destination)'
TBLPROPERTIES (
    'has_encrypted_data'='false',
    'skip.header.line.count'='1')
```
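
Once the table is created, you can confirm Athena reads the CSV correctly with a quick preview (assuming the table name from the query above):

```
SELECT *
FROM account_mapping
LIMIT 10
```

The rows returned should match the contents of your CSV file; if they look shifted or merged, re-check the delimiter and the skip.header.line.count setting.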

#### Step 2: Create your account_map view in Athena using the table created in the step above


 **Create your account_map Athena view** 

The account_map Athena view ensures any new accounts are not missed in your dashboard by joining your CUR table with the account_mapping Athena table.

Modify the following query with your table names:

1. Replace **(database).(tablename)** on line 11 with your CUR database and table name (e.g. cid_cur.cur)

1. Replace **(database).(tablename)** on line 20 with your account_mapping database and table name (e.g. cid_cur.account_mapping)

```
CREATE OR REPLACE VIEW account_map AS
SELECT DISTINCT
a.line_item_usage_account_id "account_id"
, b.account_name
, b.business_unit
, b.team
, b.cost_center
FROM
    ((
    SELECT DISTINCT line_item_usage_account_id
    FROM (database).(tablename) ) a
LEFT JOIN (
    SELECT DISTINCT
        "lpad"("account_id", 12, '0') "account_id"
    , account_name
    , business_unit
    , team
    , cost_center
    FROM
    (database).(tablename) ) b ON (b.account_id = a.line_item_usage_account_id))
```
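
Because the view uses a LEFT JOIN, any account present in the CUR but missing from your CSV appears with NULL attributes. A quick hedged check surfaces accounts that still need mapping:

```
SELECT account_id
FROM account_map
WHERE account_name IS NULL
```

Add any returned account IDs to your CSV and re-upload it to S3; the view picks up the change on the next query.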

You must update the role that Quick Sight uses to refresh datasets. This can be the standard Quick Sight role that you manage in the Quick Sight Admin space (Security and Permissions section), or a role named CidQuickSightDataSourceRole, which is managed by the Cloud-Intelligence-Dashboards stack in CloudFormation. Make sure you configure the same bucket name there as in the [Data Collection Lab](data-collection.md).

Alternatively, you can do a one-time update of your account map view using one of the options below

#### One-time update of the account map from CSV data with the cid-cmd tool


```
cid-cmd map --account-map-source csv --account-map-file FILE.CSV
```

### Final Steps


Once you have updated and tested the account_map view in Athena, make sure Quick Sight has access to the bucket containing the Optimization Data Collection data, then refresh the summary_view dataset in Quick Sight.

# Migration to CUR 2.0

## Migration Overview


AWS provides a [Cost and Usage Report 2.0](https://docs.aws.amazon.com/cur/latest/userguide/dataexports-migrate.html) that will gradually replace the [Legacy Cost and Usage Report](https://docs.aws.amazon.com/cur/latest/userguide/cur-overview.html). This guide helps you migrate existing CID dashboards to the new CUR 2.0.

Use this guide if you already have CID dashboards installed via CloudFormation or CLI methods.

![\[Migration Phases\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cur2/migration-phases.png)


Migration can be done in 3 steps:

1. Deploy CUR 2.0 via AWS Data Exports

1. Update Dashboards

1. (Optional) Decommission Legacy CUR

### Step 1 / 3: Deploy Data Exports


If you already have the Data Exports Stack deployed for other dashboards (CORA or FOCUS), just make sure you have the CUR2 option activated.

If you do not have the Data Exports Stack, install it using [this guide](data-exports.md). Do not forget to request backfill from source accounts for up to 36 months. If you need more than this, it is possible to migrate your Legacy CUR data into a dataset close enough to the CUR2 schema.

By the end of this step, once the backfill has completed, you should have a CUR2 table available in Athena.

You can query the data to make sure the two datasets are identical.

**Example**  

```
SELECT
    billing_period,
    sum(line_item_unblended_cost)
FROM cid_data_export.cur2
GROUP BY 1
```

```
SELECT
    concat("year", '-', "month") AS billing_period,
    sum(line_item_unblended_cost)
FROM cid_cur.cur -- replace with your legacy CUR table
GROUP BY 1
```
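
The two queries above can also be combined into a single comparison that reports the per-period difference (a sketch using the same table names; the legacy billing_period format may need adjusting if your month partition is not zero-padded):

```
WITH cur2 AS (
    SELECT billing_period, sum(line_item_unblended_cost) AS cost
    FROM cid_data_export.cur2
    GROUP BY 1
), legacy AS (
    SELECT concat("year", '-', "month") AS billing_period, sum(line_item_unblended_cost) AS cost
    FROM cid_cur.cur -- replace with your legacy CUR table
    GROUP BY 1
)
SELECT cur2.billing_period, cur2.cost - legacy.cost AS difference
FROM cur2
JOIN legacy ON cur2.billing_period = legacy.billing_period
ORDER BY 1
```

Differences at or near zero for each billing period indicate the two datasets match.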

Proceed to the next step once the Athena queries above return matching data for several billing periods.

### Step 2 / 3: Update Dashboards


Dashboards can be installed in two different ways: with a CloudFormation stack or with the command line tool. The update is done with the command line tool regardless of the deployment method, but if you used CloudFormation for the initial deployment, you also need to update the stack.

With a command line update you are in control of all modifications, and you can backport your customizations if needed.

#### Step 2.1: CloudFormation Update (If needed)


You can skip this if you used only Command Line for dashboards deployment (cid-cmd).

1. Download [CloudFormation Stack](https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cid-cfn.yml).

1. Open [CloudFormation Console](https://console.aws.amazon.com/cloudformation/home?#/stacks) in your Destination/Dashboard Account. Make sure to use the same region where you deployed the stack previously.

1. Update the stack (default name is Cloud-Intelligence-Dashboards) with the version you get from GitHub and set the parameter "CURVersion = 2". If you want to keep Legacy CUR, set "Keep Legacy CUR Table" (KeepLegacyCURTable) parameter to "Yes" and proceed with CLI update (Step 2.2).

These steps update the role used by Quick Sight DataSources to access Amazon S3 bucket and Athena Database with CUR2. At this point your dashboard will not be updated (if you choose to KeepLegacyCURTable).

Once CloudFormation Stack is updated you need to proceed to Command Line Update.

#### Step 2.2: Command line Update (Mandatory)


1. Open CloudShell in the same region and install the (cid-cmd) tool:

```
pip3 install -U cid-cmd
```

1. Run the tool to update your dashboard and dependencies:

```
cid-cmd update --force --recursive
```

Please select `cid_data_export.cur2` when asked to choose the CUR table.

The tool will provide you with a diff between the current Athena views and the updated view SQL. You can choose to "proceed and override", or you can adjust your Athena views manually using the diff information and choose "keep existing". Another option is to backport your changes after the migration.

### Step 3 / 3: Decommission (Optional)


Once your dashboards are updated to CUR2 you can delete the Legacy CUR using CloudFormation (CUR-Source/CUR-Destination) or manually depending on how it was created.

## Troubleshooting and FAQ


### I do not see data on dashboards after migration


1. Run `cid-cmd status` to get more info about dataset status, or check the dataset status manually in the Quick Sight UI

1. Double-check that Quick Sight has permissions to read from your CUR bucket. If you use the default Quick Sight role, manually add permissions to read from the `cid-ACCOUNTID-data-exports` bucket.

1. Check that data is present in the Athena table and view: `SELECT * FROM summary_view LIMIT 10` 

### How can I roll back


If you need to revert to the previous version:

```
pip3 install cid-cmd==0.3.10
```

```
cid-cmd update --force --recursive
```

Choose the table with the Legacy CUR when asked

## Feedback


Please [contact the team](feedback-support.md) if you encounter any issues.

# Deployment In China

**Note**  
For deployments in AWS China Regions, please note there are specific regional considerations and limitations. For all other AWS Regions, please follow the [standard deployment guide](deployment-in-global-regions.md).

## Architecture


There are two options for analyzing your cost and usage: you can consolidate all your Cost and Usage data to Global Regions (for example, using [Data Transfer Hub](https://github.com/aws-solutions/data-transfer-hub)), or you can deploy the Cloud Intelligence Dashboards in China Regions. Here we provide specific guidance for deployment in China Regions.

We recommend deploying the Dashboards in a dedicated Data Collection Account rather than your Management (Payer) Account. This guidance provides a CloudFormation template to copy Cost and Usage Report (CUR) data from your Management Account to the dedicated account. You can use it to aggregate data from multiple Management Accounts or multiple Linked Accounts.

If you do not have access to the Management/Payer Account, you can still collect the data across multiple Linked accounts using the same approach.

![\[Foundational Architecture\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/china/china-foundamental-architecture.png)


1.  [AWS Cost and Usage Report](https://aws.amazon.com/aws-cost-management/aws-data-exports/) delivers the Cost & Usage data daily to an [Amazon S3 Bucket](https://aws.amazon.com/s3/) in the Management Account.

1.  [Amazon S3](https://aws.amazon.com/s3/) replication rule copies CUR data to a dedicated Data Collection Account S3 bucket automatically.

1.  [Amazon Athena](https://aws.amazon.com/athena/) allows querying data directly from the S3 bucket using an [AWS Glue](https://aws.amazon.com/glue/) table schema definition.

1.  [Amazon Quick Sight](https://aws.amazon.com/quicksight/) creates datasets from [Amazon Athena](https://aws.amazon.com/athena/), refreshes them daily, and caches them in [SPICE](https://docs.aws.amazon.com/quicksight/latest/user/spice.html) (Super-fast, Parallel, In-memory Calculation Engine).

1. User Teams (Executives, FinOps, Engineers) can access Cloud Intelligence Dashboards in [Amazon Quick Sight](https://aws.amazon.com/quicksight/). Access is secured through [AWS IAM](https://aws.amazon.com/iam/), [AWS IAM Identity Center](https://aws.amazon.com/iam/identity-center/) (formerly AWS SSO), and optional [Row Level Security](https://catalog.workshops.aws/awscid/en-US/customizations/row-level-security).

## Deployment


![\[Deployment Steps\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/china/china-deploy-simple.png)


The deployment process consists of three main steps:

1. Deploy Amazon S3 Bucket and Athena Tables in the **Data Collection Account** 

1. Deploy an Amazon S3 Bucket and a replication policy in the **Source** Accounts (one or many)

1. Deploy Cloud Intelligence Dashboards (CID) Stack in the **Data Collection Account** 



### Before you start


1. Choose **Beijing Region (cn-north-1)** for your deployment as Quick Sight is only available in this region for AWS China.

1. Define your Data Collection Account. Create or reuse an existing shared account. We do not recommend using the Management (Payer) Account for data collection.

1. Make sure you have permissions for deploying CloudFormation Stacks.

#### See Required Permissions

+ In the Management/Payer Account you will need permission to access AWS CloudFormation, AWS Cost & Usage Reports, AWS IAM, AWS Lambda and Amazon S3.
+ In the Data Collection Account you will need permission to access Amazon Athena, AWS CloudFormation, AWS Directory Service, Amazon EventBridge, AWS Glue, AWS IAM, AWS Lambda, Amazon Quick Sight, and Amazon S3 via both the console and the Command Line Tool.
+ For a CLI deployment, you will not require CloudFormation permissions.
+ You can use this CloudFormation template to provision an IAM role with minimal permissions required for dashboard deployment. It takes an IAM role name as a parameter and adds the required policies to the role.

### Step 1. [Data Collection Account] Create Destination For CUR Aggregation


1. Sign in to your Data Collection Account.

1. Click the Launch Stack button below to open the **pre-populated stack template** in your CloudFormation console. This stack creates a bucket open for replication, and Athena tables.

    [https://console.amazonaws.cn/cloudformation/home?region=cn-north-1#/stacks/quickcreate?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cur-aggregation.yaml&stackName=CID-CUR-Destination&param_CreateCUR=False&param_DestinationAccountId=REPLACE%20WITH%20THE%20CURRENT%20ACCOUNT%20ID&param_SourceAccountIds=PUT%20HERE%20PAYER%20ACCOUNT%20ID](https://console.amazonaws.cn/cloudformation/home?region=cn-north-1#/stacks/quickcreate?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cur-aggregation.yaml&stackName=CID-CUR-Destination&param_CreateCUR=False&param_DestinationAccountId=REPLACE%20WITH%20THE%20CURRENT%20ACCOUNT%20ID&param_SourceAccountIds=PUT%20HERE%20PAYER%20ACCOUNT%20ID) 

### Step 2. [Source/Management Account] Create CUR and Configure Replication


1. Sign in to your Source Account (Management/Payer Account).

1. Click the Launch Stack button below to open the **pre-populated stack template** in your CloudFormation console.

    [https://console.amazonaws.cn/cloudformation/home?region=cn-north-1#/stacks/quickcreate?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cur-aggregation.yaml&stackName=CID-CUR-Replication&param_CreateCUR=True&param_DestinationAccountId=REPLACE%20WITH%20DATA%20COLLECTION%20ACCOUNT%20ID&param_SourceAccountIds=](https://console.amazonaws.cn/cloudformation/home?region=cn-north-1#/stacks/quickcreate?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cur-aggregation.yaml&stackName=CID-CUR-Replication&param_CreateCUR=True&param_DestinationAccountId=REPLACE%20WITH%20DATA%20COLLECTION%20ACCOUNT%20ID&param_SourceAccountIds=) 

### Step 3. [Data Collection Account] Deploy Dashboards


#### 3.1 - Prepare Amazon Quick Sight (Quick Suite)


##### Amazon Quick Suite Sign-Up Workflow for AWS China Beijing Region


**Note**  
Quick Suite is only available in cn-north-1 Beijing region for AWS China

1. Sign in to your Data Collection Account and navigate to the AWS Management Console and search for **Quick Suite** in the services menu.

1. Select **Sign up for Quick Suite** if this is your first time accessing the service.

1. On the Quick Suite setup page, you’ll need to choose an authentication method:
   +  **IAM Identity Center** - Recommended for simplified user management and SSO capabilities
   +  **Active Directory** - Suitable for enterprises with existing AD infrastructure

     You cannot change the authentication method after the initial setup. You would need to re-create the Amazon Quick Suite account.

1. If selecting IAM Identity Center:
   + Configure user groups for Quick Suite access levels (Admin/Reader)
   + Follow the [IAM Identity Center user management guide](https://docs.aws.amazon.com/singlesignon/latest/userguide/addusers.html) to set up groups and permissions

Note: Choose your authentication method based on your organization’s requirements and existing identity management infrastructure.

1. At the bottom of the sign up page, there is an optional add-on for Pixel-Perfect Reports:

**Note**  
Make sure to uncheck the Pixel-Perfect Reports option unless specifically needed, as it incurs additional charges. This feature can be enabled later if needed.

![\[Quick Sight configuration page - uncheck Pixel-Perfect Reports option\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/pixel-perfect-china.png)


1. Complete the account creation:
   + Select the appropriate Authentication method
   + Enter a unique name for your Quick Suite account
   + Enter an email address for notifications
   + (Optional) Click Select S3 buckets and choose all cid buckets (cid-*)
   + Click Finish and wait for the congratulations screen
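If you prefer the command line, you can confirm the Quick Suite (QuickSight) subscription exists before moving on to the dashboard deployment. This is an optional sketch, not part of the official workflow; it assumes AWS CLI v2 with credentials for the Data Collection Account, and the account ID is a placeholder you must replace.

```shell
# Sketch (assumptions: AWS CLI v2, credentials for the Data Collection
# Account; the account ID is a placeholder). Confirms a Quick Suite
# (QuickSight) subscription exists before deploying dashboards.
# The command is only printed here; remove the leading "echo" to execute it.
ACCOUNT_ID="REPLACE_WITH_DATA_COLLECTION_ACCOUNT_ID"
echo aws quicksight describe-account-subscription \
  --aws-account-id "$ACCOUNT_ID" \
  --region cn-north-1
```

If the subscription does not exist yet, the real call returns a `ResourceNotFoundException`, which means you should complete the sign-up workflow above first.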

#### 3.2 - Deploy Foundational Dashboards


**Note**  
To avoid cross-region data transfer costs, use the Beijing Region (cn-north-1) - the only region where Quick Suite is available in China.

1. Sign in to your Data Collection Account.

1. Click the Launch Stack button below to open the **pre-populated stack template** in your CloudFormation console.

    [https://console.amazonaws.cn/cloudformation/home?region=cn-north-1#/stacks/quickcreate?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cid-cfn.yml&stackName=Cloud-Intelligence-Dashboards&param_DeployCUDOSv5=yes&param_DeployKPIDashboard=yes&param_DeployCostIntelligenceDashboard=yes&param_CreateLocalAssetsBucket=yes&param_CURVersion=1.0&param_KeepLegacyCURTable=yes&param_CurrencySymbol=JPY](https://console.amazonaws.cn/cloudformation/home?region=cn-north-1#/stacks/quickcreate?&templateURL=https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cid-cfn.yml&stackName=Cloud-Intelligence-Dashboards&param_DeployCUDOSv5=yes&param_DeployKPIDashboard=yes&param_DeployCostIntelligenceDashboard=yes&param_CreateLocalAssetsBucket=yes&param_CURVersion=1.0&param_KeepLegacyCURTable=yes&param_CurrencySymbol=JPY) 

1. Configure stack parameters:


+ Enter a Stack name for your template such as Cloud-Intelligence-Dashboards
+ Review Common Parameters and confirm the prerequisites before specifying the other parameters. You must answer "yes" to both prerequisite questions.
+ Copy and paste your **QuickSightUserName** into the parameter text box. To find your Quick Sight username:
  + Open a new tab or window and navigate to the **Quick Sight** console
  + Find your username from the person icon in the top right corner  
![\[Quick Sight page with username drop down in the top right highlighted\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/cd_dash_qs_china.png)
+ Select the Dashboards you want to install. We recommend deploying all three: Cost Intelligence Dashboard, CUDOS, and the KPI Dashboard.
+ Make sure the **CreateLocalAssetsBucket** parameter is set to **yes** and **CURVersion** is set to **1.0**.
+ The **CurrencySymbol** parameter defaults to JPY (Japanese Yen, ¥). Select the appropriate symbol from the dropdown to match your CUR settings.
+ Review the configuration, select the checkbox **I acknowledge that AWS CloudFormation might create IAM resources with custom names**, and click **Create stack**.
+ The stack will start in **CREATE_IN_PROGRESS** status. This step can take around 20 minutes. Once complete, the stack will show **CREATE_COMPLETE**.
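The console steps above can also be driven with the AWS CLI. The sketch below mirrors the parameters in the quick-create link; the `QuickSightUser` parameter key and its value are assumptions, so verify them against the current cid-cfn.yml template before use. The commands are only printed here as a dry run.

```shell
# Sketch: a CLI equivalent of the quick-create link above. Parameter keys are
# copied from that link; the QuickSightUser key and the username value are
# assumptions - verify both against the current cid-cfn.yml template.
# Commands are only printed; remove the leading "echo" to execute them.
STACK_NAME="Cloud-Intelligence-Dashboards"
TEMPLATE_URL="https://aws-managed-cost-intelligence-dashboards.s3.amazonaws.com/cfn/cid-cfn.yml"

echo aws cloudformation create-stack \
  --region cn-north-1 \
  --stack-name "$STACK_NAME" \
  --template-url "$TEMPLATE_URL" \
  --capabilities CAPABILITY_NAMED_IAM \
  --parameters \
    ParameterKey=DeployCUDOSv5,ParameterValue=yes \
    ParameterKey=DeployKPIDashboard,ParameterValue=yes \
    ParameterKey=DeployCostIntelligenceDashboard,ParameterValue=yes \
    ParameterKey=CreateLocalAssetsBucket,ParameterValue=yes \
    ParameterKey=CURVersion,ParameterValue=1.0 \
    ParameterKey=KeepLegacyCURTable,ParameterValue=yes \
    ParameterKey=CurrencySymbol,ParameterValue=JPY \
    ParameterKey=QuickSightUser,ParameterValue=REPLACE_WITH_QUICKSIGHT_USERNAME

# Block until the stack reaches CREATE_COMPLETE:
echo aws cloudformation wait stack-create-complete \
  --region cn-north-1 --stack-name "$STACK_NAME"
```

The `wait stack-create-complete` subcommand polls the stack status and exits non-zero if creation fails, which makes it convenient for scripting the deployment end to end.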

**Note**  
Dashboards will be empty initially. We recommend initiating a backfill via an AWS Support case (see Step 4).

### Step 4 (optional). Request Data Backfill


You can create an AWS Support case requesting a backfill of your Cost and Usage Report with up to 36 months of historical data. A case must be created from each of your Source Accounts (typically Management/Payer Accounts).

## Post-Deployment Steps


After successful deployment:

1. Check stack outputs for dashboard URLs

1. Verify Quick Sight access

1. Wait for data to populate (typically 24-48 hours for first data delivery)

1. Consider requesting a backfill through AWS Support if you need historical data
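To check the stack outputs for dashboard URLs without opening the console, you can query them with the AWS CLI. This is a sketch under the assumption that you kept the default stack name; the command is only printed as a dry run.

```shell
# Sketch: list the stack outputs (including dashboard URLs) once the stack
# reaches CREATE_COMPLETE. Assumes the default stack name from Step 3.2.
# The command is only printed; remove the leading "echo" to execute it.
STACK_NAME="Cloud-Intelligence-Dashboards"
echo aws cloudformation describe-stacks \
  --region cn-north-1 \
  --stack-name "$STACK_NAME" \
  --query "Stacks[0].Outputs[].[OutputKey,OutputValue]" \
  --output table
```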

## FAQ


### How can I see AWS Usage in China and other Partitions?

+ You can consolidate Cost and Usage Reports from China and Global regions into a single account (in any partition of your choice). We recommend using [Data Transfer Hub](https://github.com/aws-solutions/data-transfer-hub). Consult your legal team before moving data across AWS partitions. If you aggregate data in different currencies, you may also need a [currency conversion](spend-in-local-currency.md).

#### See Sample Architecture


![\[Data Transfer Architecture\]](http://docs.aws.amazon.com/guidance/latest/cloud-intelligence-dashboards/images/china/china-cur-transfer.png)


1. Amazon S3 replicates AWS CUR data from a Management account in the Global Region to a Data Collection Account.

1. Cloud Intelligence Dashboards leverage Amazon Athena and Amazon Quick Sight for visualization.

1.  [Data Transfer Hub](https://github.com/aws-solutions/data-transfer-hub) moves data from the China Region to the Data Collection Account in the Global Region.

1. An additional solution can be used to pull up-to-date exchange rate information from a third-party source.

### What dashboards are available in China?

+ At the moment, only the Foundational Dashboards (CUDOS, CID, KPI) are available. We are working on other dashboards as well.

Other questions? Visit our [FAQs](faq.md).