

# Ingest data to AWS IoT SiteWise
<a name="industrial-data-ingestion"></a>

AWS IoT SiteWise is designed to efficiently collect and correlate industrial data with corresponding assets, representing various aspects of industrial operations. This documentation focuses on the practical aspects of ingesting data into AWS IoT SiteWise, offering multiple methods tailored to diverse industrial use cases. For instructions to build your virtual industrial operation, see [Model industrial assets](industrial-asset-models.md).

You can send industrial data to AWS IoT SiteWise using any of the following options:
+ **AWS IoT SiteWise Edge**–Use [SiteWise Edge gateway](gateways.md) as an intermediary between AWS IoT SiteWise and your data servers. AWS IoT SiteWise provides AWS IoT Greengrass components that you can deploy on any platform that can run AWS IoT Greengrass to set up a SiteWise Edge gateway. This option supports linking with [OPC UA](https://en.wikipedia.org/wiki/OPC_Unified_Architecture) server protocol.
+ **AWS IoT SiteWise API**–Use the [AWS IoT SiteWise API](ingest-api.md) to upload data from any other source. Use our streaming [BatchPutAssetPropertyValue](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_BatchPutAssetPropertyValue.html) API for ingestion within seconds, or the batch-oriented [CreateBulkImportJob](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_CreateBulkImportJob.html) API to facilitate cost-effective ingestion in larger batches.
+ **AWS IoT Core rules**–Use [AWS IoT Core rules](iot-rules.md) to upload data from MQTT messages published by an AWS IoT thing or another AWS service.
+ **AWS IoT Events actions**–Use [AWS IoT Events actions](iot-events.md) triggered by specific events in AWS IoT Events. This method is suitable for scenarios where data upload is tied to event occurrences.
+ **AWS IoT Greengrass stream manager**–Use [AWS IoT Greengrass stream manager](greengrass-stream-manager.md) to upload data from local data sources using an edge device. This option caters to situations where data originates from on-premises or edge locations.

These methods offer a range of solutions for managing data from different sources. Delve into the details of each option to gain a comprehensive understanding of the data ingestion capabilities AWS IoT SiteWise provides.

# Manage data streams for AWS IoT SiteWise
<a name="manage-data-streams"></a>

A data stream is the resource that contains historical time series data. Each data stream is identified by a unique alias, making it easier to keep track of the origin of each piece of data. Data streams are automatically created in AWS IoT SiteWise when the first time series data is received. If the first time series data is identified with an alias, AWS IoT SiteWise creates a new data stream with that alias, provided no asset property is already assigned that alias. Alternatively, if the first time series data is identified with an asset ID and property ID, AWS IoT SiteWise creates a new data stream and associates that data stream with the asset property.

There are two ways to assign an alias to an asset property. The method used depends on whether data is sent to AWS IoT SiteWise first, or an asset is created first.
+ If data is sent to AWS IoT SiteWise first, this automatically creates a data stream with the assigned alias. When the asset is created later, use the [AssociateTimeSeriesToAssetProperty](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_AssociateTimeSeriesToAssetProperty.html) API to associate the data stream and its alias with the asset property.
+ If an asset is created first, use the [UpdateAssetProperty](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_UpdateAssetProperty.html) API to assign an alias to an asset property. When data is later sent to AWS IoT SiteWise, the data stream is automatically created and associated with the asset property.

Currently, you can only associate data streams with measurements. *Measurements* are a type of asset property that represent devices' raw sensor data streams, such as timestamped temperature values or timestamped rotations per minute (RPM) values.

When metrics or transforms are defined on these measurements, incoming data triggers the corresponding computations. Note that an asset property can be linked to only one data stream at a time.

AWS IoT SiteWise uses `TimeSeries` for the Amazon Resource Name (ARN) resource to determine your storage charges. For more information, see [AWS IoT SiteWise Pricing](https://aws.amazon.com/iot-sitewise/pricing/).

The following sections show you how to use the AWS IoT SiteWise console or API to manage data streams.

**Topics**
+ [Configure permissions and settings](manage-data-streams-configuration.md)
+ [Associate a data stream to an asset property](manage-data-streams-method.md)
+ [Disassociate a data stream from an asset property](disassociate-data-streams-method.md)
+ [Delete a data stream](delete-data-streams-method.md)
+ [Update an asset property alias](update-data-streams-method.md)
+ [Common scenarios](data-ingestion-scenarios.md)

# Configure permissions and settings
<a name="manage-data-streams-configuration"></a>

Data streams are automatically created in AWS IoT SiteWise when the first time series data is received. If the ingested data isn't associated with an asset property, AWS IoT SiteWise creates a new disassociated data stream, which you can later associate with an asset property. To control the type of data a gateway can ingest, configure the gateway's access to AWS IoT SiteWise with IAM policies.

 The following IAM policy disables disassociated data ingestion from the gateway, while still allowing data ingestion to data streams associated with an asset property: 

**Example IAM user policy that disables disassociated data ingestion from the gateway**

```
{
    "Version": "2012-10-17",
    "Statement": [
    {
      "Sid": "AllowPutAssetPropertyValuesUsingAssetIdAndPropertyId",
      "Effect": "Allow",
      "Action": "iotsitewise:BatchPutAssetPropertyValue",
      "Resource": "arn:aws:iotsitewise:*:*:asset/*"
    },
    {
      "Sid": "AllowPutAssetPropertyValuesUsingAliasWithAssociatedAssetProperty",
      "Effect": "Allow",
      "Action": "iotsitewise:BatchPutAssetPropertyValue",
      "Resource": "arn:aws:iotsitewise:*:*:time-series/*",
      "Condition": {
        "StringLikeIfExists": {
          "iotsitewise:isAssociatedWithAssetProperty": "true"
        }
      }
    },
    {
      "Sid": "DenyPutAssetPropertyValuesUsingAliasWithNoAssociatedAssetProperty",
      "Effect": "Deny",
      "Action": "iotsitewise:BatchPutAssetPropertyValue",
      "Resource": "arn:aws:iotsitewise:*:*:time-series/*",
      "Condition": {
        "StringLikeIfExists": {
          "iotsitewise:isAssociatedWithAssetProperty": "false"
        }
      }
    }
  ]
}
```
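The way these three statements combine can be reasoned about with a small simulation. The sketch below is a simplified model of IAM's deny-overrides evaluation applied to this policy, not a real policy engine, and the helper name is hypothetical:

```python
# Simplified model of how the policy above evaluates a
# BatchPutAssetPropertyValue call. Illustrates IAM's deny-overrides
# logic for this specific policy; not a general policy evaluator.

def is_put_allowed(resource_type: str, associated: bool) -> bool:
    """resource_type is 'asset' or 'time-series'."""
    if resource_type == "asset":
        # First statement allows asset/* unconditionally.
        return True
    # time-series/*: the explicit Deny matches disassociated streams
    # and overrides any Allow.
    if not associated:
        return False
    # Allow statement matches when isAssociatedWithAssetProperty=true.
    return True

# Ingestion by assetId/propertyId is allowed:
print(is_put_allowed("asset", associated=True))         # True
# Ingestion by alias into an associated stream is allowed:
print(is_put_allowed("time-series", associated=True))   # True
# Ingestion by alias into a disassociated stream is denied:
print(is_put_allowed("time-series", associated=False))  # False
```

The key design point is that the Deny statement makes the restriction robust: even if another attached policy grants a broad Allow on `time-series/*`, the explicit Deny on disassociated streams still wins.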

**Example IAM user policy that disables all data ingestion from the gateway**

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyPutAssetPropertyValues",
            "Effect": "Deny",
            "Action": "iotsitewise:BatchPutAssetPropertyValue",
            "Resource": [
                "arn:aws:iotsitewise:*:*:asset/*",
                "arn:aws:iotsitewise:*:*:time-series/*"
            ]
        }
    ]
}
```

# Associate a data stream to an asset property
<a name="manage-data-streams-method"></a>

Manage your data streams using the AWS IoT SiteWise console or AWS CLI.

------
#### [ Console ]

Use the AWS IoT SiteWise console to manage your data streams.

**To manage data streams (console)**

1. <a name="sitewise-open-console"></a>Navigate to the [AWS IoT SiteWise console](https://console.aws.amazon.com/iotsitewise/).

1. In the navigation pane, choose **Data streams**.

1. Choose a data stream by either filtering on data stream alias, or selecting **Disassociated data streams** in the filter dropdown menu.

1. Select the data stream to update. You may select multiple data streams. Click **Manage data streams** on the upper right. 

1. Select the data stream to be associated from **Update data stream associations**, and click the **Choose measurement** button.

1.  In the **Choose measurement** section, find the corresponding asset measurement property. Select the measurement then click **Choose**. 

1.  Perform steps 5 and 6 for the other data streams selected in step 4. Assign asset properties to all the data streams. 

1.  Choose **Update** to commit the changes. A confirmation banner displays when the update succeeds. 

------
#### [ AWS CLI ]

 To associate a data stream (identified by its alias) to an asset property (identified by its IDs), run the following command: 

```
aws iotsitewise associate-time-series-to-asset-property \
    --alias <data-stream-alias> \
    --asset-id <asset-ID> \
    --property-id <property-ID>
```

------

# Disassociate a data stream from an asset property
<a name="disassociate-data-streams-method"></a>

------
#### [ Console ]

Use the AWS IoT SiteWise console to disassociate your data stream from an asset property.

**To disassociate data streams from an asset property (console)**

1. <a name="sitewise-open-console"></a>Navigate to the [AWS IoT SiteWise console](https://console.aws.amazon.com/iotsitewise/).

1. In the navigation pane, choose **Data streams**.

1. Choose a data stream by either filtering on data stream alias, or selecting **Associated data streams** in the filter dropdown menu.

1. Select the data stream to disassociate. The **Data stream alias** column must contain an alias. The **Asset name** and **Asset property name** columns must contain the values of the asset property the data stream is associated with. You can select multiple data streams.

1.  Click **Manage data streams** on the upper right. 

1.  In the **Update data stream associations** section, click **X** in the **Measurement name** column. A `submitted` status should appear in the **Status** column. 

1.  Choose **Update** to commit the changes. The data stream is now disassociated from the asset property, and the alias is now used to identify the data stream. 

------
#### [ AWS CLI ]

To disassociate a data stream (identified by its alias) from an asset property (identified by its asset and property IDs), run the following command: 

```
aws iotsitewise disassociate-time-series-from-asset-property \
    --alias <asset-property-alias> \
    --asset-id <asset-ID> \
    --property-id <property-ID>
```

The data stream is now disassociated from the asset property, and the alias identifies the data stream instead of the asset property.

------

# Delete a data stream
<a name="delete-data-streams-method"></a>

When a property is removed from an asset model, AWS IoT SiteWise deletes that property and its data stream from all assets that are based on the asset model. It also deletes all properties and their data streams of an asset when the asset is deleted. If a data stream's data must be preserved, disassociate the data stream from the asset property before the property or asset is deleted.

**Warning**  
When a property is deleted from an asset, the associated data stream is also deleted. To preserve the data stream, disassociate it from the asset property before deleting the property from the asset model or deleting the asset.

------
#### [ Console ]

Use the AWS IoT SiteWise console to delete a data stream.

**To delete a data stream (console)**

1. <a name="sitewise-open-console"></a>Navigate to the [AWS IoT SiteWise console](https://console.aws.amazon.com/iotsitewise/).

1.  In the navigation pane, choose **Data streams**. 

1.  Choose a data stream by filtering on data stream alias. 

1.  Select the data stream to delete. You may select multiple data streams. 

1.  Choose the **Delete** button to delete the data stream. 

------
#### [ AWS CLI ]

Use the [DeleteTimeSeries](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_DeleteTimeSeries.html) API to delete a specific data stream, identified by its alias.

```
aws iotsitewise delete-time-series \
    --alias <data-stream-alias>
```

------

# Update an asset property alias
<a name="update-data-streams-method"></a>

Aliases must be unique within an AWS Region. This applies to aliases of both asset properties and data streams. Do not assign an alias to an asset property if another property or data stream is already using that alias.

------
#### [ Console ]

Use the AWS IoT SiteWise console to update an asset property alias.

**To update an asset property alias (console)**

1. <a name="sitewise-open-console"></a>Navigate to the [AWS IoT SiteWise console](https://console.aws.amazon.com/iotsitewise/).

1.  In the navigation pane, choose **Assets**. 

1.  Select the asset from the table. 

1.  Click the **Edit** button. 

1.  Select the **Property type** in the **Properties** table. 

1.  Find the property, and type the new alias in the property alias text field. 

1.  Click the **Save** button to save the changes. 

------
#### [ AWS CLI ]

 To update an alias on an asset property, run the following command: 

```
aws iotsitewise update-asset-property \
    --asset-id <asset-ID> \
    --property-id <property-ID> \
    --property-alias <asset-property-alias> \
    --property-notification-state <ENABLED|DISABLED>
```

**Note**  
If property notifications are currently enabled, include `--property-notification-state ENABLED` again in the update request so that notifications remain enabled.
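One way to avoid accidentally dropping the notification state is to always build the update parameters through a helper that carries it forward explicitly. This is a sketch under that assumption; the helper name and defaults are illustrative, not part of any SDK:

```python
# Sketch: assemble UpdateAssetProperty parameters so the notification
# state is always re-sent with every alias update. Illustrative helper,
# not part of the AWS SDK.

def build_update_asset_property(asset_id, property_id, alias,
                                notifications_enabled=True):
    return {
        "assetId": asset_id,
        "propertyId": property_id,
        "propertyAlias": alias,
        # Carry the current state forward explicitly on every update.
        "propertyNotificationState":
            "ENABLED" if notifications_enabled else "DISABLED",
    }

params = build_update_asset_property(
    "asset-123", "prop-456",
    "/company/windfarm/3/turbine/7/temperature")
print(params["propertyNotificationState"])  # ENABLED
```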

------

# Common scenarios
<a name="data-ingestion-scenarios"></a>

## Move a data stream
<a name="data-ingestion-scenario-move-data-stream"></a>

 To change a data stream’s association to another asset property, first disassociate the data stream from the current asset property. When disassociating a data stream from an asset property, there must be an alias assigned to that asset property. 

```
aws iotsitewise disassociate-time-series-from-asset-property \
    --alias <asset-property-alias> \
    --asset-id <asset-ID> \
    --property-id <property-ID>
```

 Now re-assign the data stream to the new asset property. 

```
aws iotsitewise associate-time-series-to-asset-property \
    --alias <data-stream-alias> \
    --asset-id <new-asset-ID> \
    --property-id <new-property-ID>
```
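The two-step move above can be expressed as a dry-run "plan" before any calls are made: each element names an AWS IoT SiteWise API operation and the parameters it would receive, in call order. The plan-builder function is hypothetical; a real implementation would hand each step to a boto3 `iotsitewise` client:

```python
# Sketch: a dry-run plan for moving a data stream to another asset
# property. Each tuple is (API operation name, parameters).

def plan_move_data_stream(alias, old_asset_id, old_property_id,
                          new_asset_id, new_property_id):
    return [
        # Step 1: detach the stream from the current property.
        ("DisassociateTimeSeriesFromAssetProperty", {
            "alias": alias,
            "assetId": old_asset_id,
            "propertyId": old_property_id,
        }),
        # Step 2: attach it to the new property.
        ("AssociateTimeSeriesToAssetProperty", {
            "alias": alias,
            "assetId": new_asset_id,
            "propertyId": new_property_id,
        }),
    ]

plan = plan_move_data_stream("/plant/line1/temp",
                             "asset-old", "prop-old",
                             "asset-new", "prop-new")
for api, params in plan:
    print(api, params["assetId"])
```

Building the plan first makes it easy to log or review the exact operations before mutating anything.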

## Error when assigning an alias to an asset property
<a name="data-ingestion-scenario-assetid-contains-data"></a>

 When using the `UpdateAssetProperty` API to assign an alias to a property, you may see the following error message: 

```
Given alias <data-stream-alias> for property <property-name> with ID <property-ID> already in use by another property or data stream
```

This error indicates that the alias was not assigned to the property because it is already in use by another property or data stream.

This happens if data is being ingested to AWS IoT SiteWise with an alias. When data is sent with an alias that isn't used by another data stream or asset property, a new data stream is created with that alias. Either of the following options resolves the issue.
+ Use the `AssociateTimeSeriesToAssetProperty` API to associate the data stream and its alias with the asset property.
+ Temporarily stop the data ingestion and delete the data stream. Use the `UpdateAssetProperty` API to assign the alias to the asset property, and then turn data ingestion back on.

## Error when associating a data stream to an asset property
<a name="data-ingestion-scenario-move-data-stream"></a>

When associating a data stream to an asset property, you might see the following error message.

```
assetProperty <property-name> with assetId <asset-ID> propertyId <property-ID> contains data
```

This error indicates that the asset property is already associated with a data stream that contains data. That data stream must be disassociated or deleted before another data stream can be associated with the asset property.

**Note**  
 When disassociating a data stream from an asset property, the alias assigned to the property is given to the data stream. For that alias to remain assigned to the property, assign a new alias to that property before disassociating the data stream. 

To preserve the data stored in the asset property, do the following:
+ Ensure no data is being ingested to the asset property, to prevent creating a new data stream.
+ Use the `UpdateAssetProperty` API to set a new alias, which is given to the currently assigned data stream.
+ Use the `DisassociateTimeSeriesFromAssetProperty` API to disassociate the current data stream from the asset property.
+ Use the `AssociateTimeSeriesToAssetProperty` API to associate the desired data stream with the asset property.

If the data stored in the asset property must be deleted, do the following:
+ Ensure no data is being ingested to the asset property, to prevent creating a new data stream.
+ Use the `DeleteTimeSeries` API to delete the currently assigned data stream.
+ Use the `AssociateTimeSeriesToAssetProperty` API to associate the desired data stream with the asset property.

# Ingest data with AWS IoT SiteWise APIs
<a name="ingest-api"></a>

Use AWS IoT SiteWise APIs to send timestamped industrial data to your assets' attribute and measurement properties. The APIs accept payloads that contain timestamp-quality-value (TQV) structures.

# BatchPutAssetPropertyValue API
<a name="ingest-api-batch-putasset"></a>

Use the [BatchPutAssetPropertyValue](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_BatchPutAssetPropertyValue.html) operation to upload your data. With this operation, you can upload multiple data entries at a time to collect data from several devices and send it all in a single request.

**Important**  
The [BatchPutAssetPropertyValue](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_BatchPutAssetPropertyValue.html) operation is subject to the following quotas:   
Up to 10 [entries](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_BatchPutAssetPropertyValue.html#API_BatchPutAssetPropertyValue_RequestSyntax) per request.
Up to 10 [property values](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_PutAssetPropertyValueEntry.html#iotsitewise-Type-PutAssetPropertyValueEntry-propertyValues) (TQV data points) per entry. 
AWS IoT SiteWise rejects any data with a timestamp dated to more than 7 days in the past or more than 10 minutes in the future.
 For more information about these quotas, see [BatchPutAssetPropertyValue](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_BatchPutAssetPropertyValue.html) in the *AWS IoT SiteWise API Reference*.

To identify an asset property, specify one of the following:
+ The `assetId` and `propertyId` of the asset property that data is sent to.
+ The `propertyAlias`, which is a data stream alias (for example, `/company/windfarm/3/turbine/7/temperature`). To use this option, you must first set your asset property's alias. To set property aliases, see [Manage data streams for AWS IoT SiteWise](manage-data-streams.md).

The following example demonstrates how to send a wind turbine's temperature and rotations per minute (RPM) readings from a payload stored in a JSON file.

```
aws iotsitewise batch-put-asset-property-value --cli-input-json file://batch-put-payload.json
```

The example payload in `batch-put-payload.json` has the following content.

```
{
  "enablePartialEntryProcessing": true,      
  "entries": [
    {
      "entryId": "unique entry ID",
      "propertyAlias": "/company/windfarm/3/turbine/7/temperature",
      "propertyValues": [
        {
          "value": {
            "integerValue": 38
          },
          "timestamp": {
            "timeInSeconds": 1575691200
          }
        }
      ]
    },
    {
      "entryId": "unique entry ID",
      "propertyAlias": "/company/windfarm/3/turbine/7/rpm",
      "propertyValues": [
        {
          "value": {
            "doubleValue": 15.09
          },
          "timestamp": {
            "timeInSeconds": 1575691200
          },
          "quality": "GOOD"
        }
      ]
    },
    {
      "entryId": "unique entry ID",
      "propertyAlias": "/company/windfarm/3/turbine/7/rpm",
      "propertyValues": [
        {
          "value": {
            "nullValue": {"valueType": "D"}
          },
          "timestamp": {
            "timeInSeconds": 1575691200
          },
          "quality": "BAD"
        }
      ]
    }
  ]
}
```

Setting `enablePartialEntryProcessing` to `true` allows AWS IoT SiteWise to ingest all values that don't result in an error. The default is `false`, in which case a single invalid value causes the entire entry to fail ingestion.

Each entry in the payload contains an `entryId` that you can define as any unique string. If any request entries fail, each error will contain the `entryId` of the corresponding request so that you know which requests to retry.

Each structure in the list of `propertyValues` is a timestamp-quality-value (TQV) structure that contains a `value`, a `timestamp`, and optionally a `quality`.
+ `value` – A structure that contains one of the following fields, depending on the type of the property being set:
  + `booleanValue`
  + `doubleValue`
  + `integerValue`
  + `stringValue`
  + `nullValue`
+ `nullValue` – A structure that specifies the data type of a property value of null, used with a quality of `BAD` or `UNCERTAIN`.
  + `valueType` – One of `"B"` (Boolean), `"D"` (double), `"S"` (string), or `"I"` (integer).
+ `timestamp` – A structure that contains the current Unix epoch time in seconds, `timeInSeconds`. You can also set the `offsetInNanos` key in the `timestamp` structure if you have temporally precise data. AWS IoT SiteWise rejects any data points with timestamps older than 7 days in the past or newer than 10 minutes in the future.
+ `quality` – (Optional) One of the following quality strings:
  + `GOOD` – (Default) The data isn't affected by any issues.
  + `BAD` – The data is affected by an issue such as sensor failure.
  + `UNCERTAIN` – The data is affected by an issue such as sensor inaccuracy.

  For more information about how AWS IoT SiteWise handles data quality in computations, see [Data quality in formula expressions](expression-tutorials.md#data-quality).
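As a pre-flight sanity check, the quotas and timestamp window described above can be enforced client-side before calling the API. The following sketch is illustrative (the validator function is hypothetical, not part of any SDK), and checks only the limits stated in this section:

```python
import time

# Minimal client-side check of a BatchPutAssetPropertyValue payload:
# at most 10 entries per request, at most 10 TQVs per entry, and
# timestamps no older than 7 days and no more than 10 minutes ahead.

def validate_batch_put_payload(payload, now=None):
    now = now if now is not None else time.time()
    errors = []
    entries = payload.get("entries", [])
    if len(entries) > 10:
        errors.append("more than 10 entries in request")
    for entry in entries:
        eid = entry.get("entryId")
        values = entry.get("propertyValues", [])
        if len(values) > 10:
            errors.append(f"entry {eid}: more than 10 property values")
        for tqv in values:
            ts = tqv["timestamp"]["timeInSeconds"]
            if ts < now - 7 * 24 * 3600:
                errors.append(f"entry {eid}: timestamp older than 7 days")
            elif ts > now + 600:
                errors.append(f"entry {eid}: timestamp over 10 minutes ahead")
    return errors

now = 1575691500  # fixed clock so the example is deterministic
ok = {"entries": [{"entryId": "e1", "propertyValues": [
    {"value": {"doubleValue": 15.09},
     "timestamp": {"timeInSeconds": 1575691200}}]}]}
print(validate_batch_put_payload(ok, now=now))       # []
stale = {"entries": [{"entryId": "e2", "propertyValues": [
    {"value": {"integerValue": 38},
     "timestamp": {"timeInSeconds": now - 8 * 24 * 3600}}]}]}
print(len(validate_batch_put_payload(stale, now=now)))  # 1
```

Validating locally first avoids burning request quota on payloads the service would reject anyway.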

# CreateBulkImportJob API
<a name="ingest-bulkImport"></a>

Use the `CreateBulkImportJob` API to import large amounts of data from Amazon S3. Your data must be saved in the CSV format in Amazon S3. Data files can have the following columns.

**Note**  
 Data older than 1 January 1970 00:00:00 UTC is not supported.   
To identify an asset property, specify one of the following.  
The `ASSET_ID` and `PROPERTY_ID` of the asset property that you're sending data to.
The `ALIAS`, which is a data stream alias (for example, `/company/windfarm/3/turbine/7/temperature`). To use this option, you must first set your asset property's alias. To learn how to set property aliases, see [Manage data streams for AWS IoT SiteWise](manage-data-streams.md).
+ `ALIAS` – The alias that identifies the property, such as an OPC UA server data stream path (for example, `/company/windfarm/3/turbine/7/temperature`). For more information, see [Manage data streams for AWS IoT SiteWise](manage-data-streams.md).
+ `ASSET_ID` – The ID of the asset.
+ `PROPERTY_ID` – The ID of the asset property.
+ `DATA_TYPE` – The property's data type can be one of the following.
  + `STRING` – A string with up to 1024 bytes.
  + `INTEGER` – A signed 32-bit integer with range [-2,147,483,648, 2,147,483,647].
  + `DOUBLE` – A floating point number with range [-10^100, 10^100] and IEEE 754 double precision.
  + `BOOLEAN` – `true` or `false`.
+ `TIMESTAMP_SECONDS` – The timestamp of the data point, in Unix epoch time.
+ `TIMESTAMP_NANO_OFFSET` – The nanosecond offset of the timestamp, added to `TIMESTAMP_SECONDS`.
+ `QUALITY` – (Optional) The quality of the asset property value. The value can be one of the following.
  + `GOOD` – (Default) The data isn't affected by any issues.
  + `BAD` – The data is affected by an issue such as sensor failure.
  + `UNCERTAIN` – The data is affected by an issue such as sensor inaccuracy.

  For more information about how AWS IoT SiteWise handles data quality in computations, see [Data quality in formula expressions](expression-tutorials.md#data-quality).
+ `VALUE` – The value of the asset property.

**Example data file(s) in the .csv format**  

```
asset_id,property_id,DOUBLE,1635201373,0,GOOD,1.0
asset_id,property_id,DOUBLE,1635201374,0,GOOD,2.0
asset_id,property_id,DOUBLE,1635201375,0,GOOD,3.0
```

```
unmodeled_alias1,DOUBLE,1635201373,0,GOOD,1.0
unmodeled_alias1,DOUBLE,1635201374,0,GOOD,2.0
unmodeled_alias1,DOUBLE,1635201375,0,GOOD,3.0
unmodeled_alias1,DOUBLE,1635201376,0,GOOD,4.0
unmodeled_alias1,DOUBLE,1635201377,0,GOOD,5.0
unmodeled_alias1,DOUBLE,1635201378,0,GOOD,6.0
unmodeled_alias1,DOUBLE,1635201379,0,GOOD,7.0
unmodeled_alias1,DOUBLE,1635201380,0,GOOD,8.0
unmodeled_alias1,DOUBLE,1635201381,0,GOOD,9.0
unmodeled_alias1,DOUBLE,1635201382,0,GOOD,10.0
```
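Data files like the ones above can be generated programmatically, as long as the column order matches the `columnNames` declared in the job configuration. This sketch uses the modeled-data column order from the list above; the writer helper itself is illustrative:

```python
import csv
import io

# Column order for modeled data, matching the columns documented above.
# This order must agree with the columnNames in the job configuration.
MODELED_COLUMNS = ["ASSET_ID", "PROPERTY_ID", "DATA_TYPE",
                   "TIMESTAMP_SECONDS", "TIMESTAMP_NANO_OFFSET",
                   "QUALITY", "VALUE"]

def write_rows(rows):
    """Render a list of row dicts as bulk-import CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for row in rows:
        writer.writerow([row[col] for col in MODELED_COLUMNS])
    return buf.getvalue()

rows = [{"ASSET_ID": "asset_id", "PROPERTY_ID": "property_id",
         "DATA_TYPE": "DOUBLE", "TIMESTAMP_SECONDS": 1635201373,
         "TIMESTAMP_NANO_OFFSET": 0, "QUALITY": "GOOD", "VALUE": 1.0}]
print(write_rows(rows).strip())
# asset_id,property_id,DOUBLE,1635201373,0,GOOD,1.0
```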

AWS IoT SiteWise provides the following API operations to create a bulk import job and get information about an existing job.
+ [CreateBulkImportJob](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_CreateBulkImportJob.html) – Creates a new bulk import job.
+ [DescribeBulkImportJob](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_DescribeBulkImportJob.html) – Retrieves information about a bulk import job.
+ [ListBulkImportJobs](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_ListBulkImportJobs.html) – Retrieves a paginated list of summaries of all bulk import jobs.

# Create an AWS IoT SiteWise bulk import job (AWS CLI)
<a name="CreateBulkImportJob"></a>

Use the [CreateBulkImportJob](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_CreateBulkImportJob.html) API operation to transfer data from Amazon S3 to AWS IoT SiteWise. The [CreateBulkImportJob](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_CreateBulkImportJob.html) API enables ingestion of large volumes of historical data, and buffered ingestion of analytical data streams in small batches. It provides a cost-effective primitive for data ingestion. The following example uses the AWS CLI.

**Important**  
Before creating a bulk import job, you must enable AWS IoT SiteWise warm tier or AWS IoT SiteWise cold tier. For more information, see [Configure storage settings in AWS IoT SiteWise](configure-storage.md).  
The [CreateBulkImportJob](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_CreateBulkImportJob.html) API supports ingestion of historical data into AWS IoT SiteWise with the option to set the `adaptiveIngestion` parameter.  
When set to `false`, the API ingests historical data without triggering computations or notifications.
When set to `true`, the API ingests new data, calculating metrics and transforming the data to optimize ongoing analytics and notifications within seven days.

Run the following command. Replace *file-name* with the name of the file that contains the bulk import job configuration.

```
aws iotsitewise create-bulk-import-job --cli-input-json file://file-name.json
```

**Example Bulk import job configuration**  
The following are examples of configuration settings:  
+ Replace *adaptive-ingestion-flag* with `true` or `false`.
  + If set to `false`, the bulk import job ingests historical data into AWS IoT SiteWise.
  + If set to `true`, the bulk import job does the following:
    + Ingests new data into AWS IoT SiteWise.
    + Calculates metrics and transforms, and supports notifications for data with a time stamp that's within seven days.
+ Replace *delete-files-after-import-flag* with `true` to delete the data from the Amazon S3 data bucket after ingesting into AWS IoT SiteWise warm tier storage.
+ Replace *amzn-s3-demo-bucket-for-errors* with the name of the Amazon S3 bucket to which errors associated with this bulk import job are sent.
+ Replace *amzn-s3-demo-bucket-for-errors-prefix* with the prefix of the Amazon S3 bucket to which errors associated with this bulk import job are sent. 

  Amazon S3 uses the prefix as a folder name to organize data in the bucket. Each Amazon S3 object has a key that is its unique identifier in the bucket. Each object in a bucket has exactly one key. The prefix must end with a forward slash (/). For more information, see [Organizing objects using prefixes](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-prefixes.html) in the *Amazon Simple Storage Service User Guide*.
+ Replace *amzn-s3-demo-bucket-data* with the name of the Amazon S3 bucket from which data is imported.
+ Replace *data-bucket-key* with the key of the Amazon S3 object that contains your data. Each object has a key that is a unique identifier. Each object has exactly one key.
+ Replace *data-bucket-version-id* with the version ID to identify a specific version of the Amazon S3 object that contains your data. This parameter is optional.
+ Replace *column-name* with the column name specified in the .csv file.
+ Replace *job-name* with a unique name that identifies the bulk import job.
+ Replace *job-role-arn* with the IAM role that allows AWS IoT SiteWise to read Amazon S3 data.
Make sure that your role has the permissions shown in the following example. Replace *amzn-s3-demo-bucket-data* with the name of the Amazon S3 bucket that contains your data. Also, replace *amzn-s3-demo-bucket-for-errors* with the name of the Amazon S3 bucket to which errors associated with this bulk import job are sent.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:GetObject",
                "s3:GetBucketLocation"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket-data",
                "arn:aws:s3:::amzn-s3-demo-bucket-data/*"
            ],
            "Effect": "Allow"
        },
        {
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:GetBucketLocation"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket-for-errors",
                "arn:aws:s3:::amzn-s3-demo-bucket-for-errors/*"
            ],
            "Effect": "Allow"
        }
    ]
}
```

```
{
   "adaptiveIngestion": adaptive-ingestion-flag,
   "deleteFilesAfterImport": delete-files-after-import-flag,       
   "errorReportLocation": { 
      "bucket": "amzn-s3-demo-bucket-for-errors",
      "prefix": "amzn-s3-demo-bucket-for-errors-prefix"
   },
   "files": [ 
      { 
         "bucket": "amzn-s3-demo-bucket-data",
         "key": "data-bucket-key",
         "versionId": "data-bucket-version-id"
      }
   ],
   "jobConfiguration": { 
      "fileFormat": { 
         "csv": { 
            "columnNames": [ "column-name" ]
         }
      }
   },
   "jobName": "job-name",
   "jobRoleArn": "job-role-arn"    
}
```

**Example response**  

```
{
   "jobId":"f8c031d0-01d1-4b94-90b1-afe8bb93b7e5",
   "jobStatus":"PENDING",
   "jobName":"myBulkImportJob"
}
```
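The request body above can also be assembled from plain parameters before serializing it to JSON. The field names below mirror the documented configuration; the builder function itself is hypothetical:

```python
# Sketch: assemble a CreateBulkImportJob request body from parameters.
# Field names mirror the documented JSON configuration.

def build_bulk_import_job(job_name, role_arn, data_bucket, data_key,
                          error_bucket, error_prefix, column_names,
                          adaptive=False, delete_after=False):
    # The error-report prefix must end with a forward slash.
    if not error_prefix.endswith("/"):
        raise ValueError("S3 prefix must end with '/'")
    return {
        "adaptiveIngestion": adaptive,
        "deleteFilesAfterImport": delete_after,
        "errorReportLocation": {"bucket": error_bucket,
                                "prefix": error_prefix},
        "files": [{"bucket": data_bucket, "key": data_key}],
        "jobConfiguration": {
            "fileFormat": {"csv": {"columnNames": column_names}}},
        "jobName": job_name,
        "jobRoleArn": role_arn,
    }

job = build_bulk_import_job(
    "myBulkImportJob", "arn:aws:iam::123456789012:role/DemoRole",
    "amzn-s3-demo-bucket-data", "100Tags12Hours.csv",
    "amzn-s3-demo-bucket-for-errors", "errors/",
    ["ALIAS", "DATA_TYPE", "TIMESTAMP_SECONDS",
     "TIMESTAMP_NANO_OFFSET", "QUALITY", "VALUE"])
print(job["jobName"])  # myBulkImportJob
```

Checking the prefix rule at build time surfaces the most common configuration mistake before the job is submitted.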

# Describe an AWS IoT SiteWise bulk import job (AWS CLI)
<a name="DescribeBulkImportJob"></a>

Use the [DescribeBulkImportJob](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_DescribeBulkImportJob.html) API operation to retrieve information about a specific bulk import job in AWS IoT SiteWise. This operation returns details such as the job's status, creation time, and error information if the job failed. You can use this operation to monitor job progress and troubleshoot issues. To use `DescribeBulkImportJob`, you need the job ID from the `CreateBulkImportJob` operation. The API returns the following information:
+ List of files being imported, including their Amazon S3 bucket locations and keys
+ Error report location (if applicable)
+ Job configuration details, such as file format and CSV column names
+ Job creation and last update timestamps
+ Current job status (for example, whether the job is in progress, completed, or failed)
+ IAM role ARN used for the import job

For completed jobs, review the results to confirm successful data integration. If a job fails, examine the error details to diagnose and resolve issues.

Replace *job-ID* with the ID of the bulk import job that you want to retrieve.

```
aws iotsitewise describe-bulk-import-job --job-id job-ID
```

**Example response**  

```
{
   "files":[
      {
         "bucket":"amzn-s3-demo-bucket1",
         "key":"100Tags12Hours.csv"
      },
      {
         "bucket":"amzn-s3-demo-bucket2",
         "key":"BulkImportData1MB.csv"
      },
      {
         "bucket":"	amzn-s3-demo-bucket3",
         "key":"UnmodeledBulkImportData1MB.csv"
      }
   ],
   "errorReportLocation":{
      "prefix":"errors/",
      "bucket":"amzn-s3-demo-bucket-for-errors"
   },
   "jobConfiguration":{
      "fileFormat":{
         "csv":{
            "columnNames":[
               "ALIAS",
               "DATA_TYPE",
               "TIMESTAMP_SECONDS",
               "TIMESTAMP_NANO_OFFSET",
               "QUALITY",
               "VALUE"
            ]
         }
      }
   },
   "jobCreationDate":1645745176.498,
   "jobStatus":"COMPLETED",
   "jobName":"myBulkImportJob",
   "jobLastUpdateDate":1645745279.968,
   "jobRoleArn":"arn:aws:iam::123456789012:role/DemoRole",
   "jobId":"f8c031d0-01d1-4b94-90b1-afe8bb93b7e5"
}
```
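After calling `DescribeBulkImportJob`, you typically check whether the job has reached a terminal state before acting on the results. The following is a minimal sketch of that check in Python; `summarize_job` is a hypothetical helper (not part of the AWS SDK), and it operates on a response dictionary shaped like the example above.

```python
# Job states that no longer change, per the AWS IoT SiteWise API reference.
TERMINAL_STATES = {"COMPLETED", "COMPLETED_WITH_FAILURES", "FAILED", "CANCELLED"}

def summarize_job(response: dict) -> dict:
    """Extract the fields most useful for monitoring from a describe response."""
    return {
        "jobId": response["jobId"],
        "status": response["jobStatus"],
        "isTerminal": response["jobStatus"] in TERMINAL_STATES,
        "fileCount": len(response.get("files", [])),
    }

# A trimmed version of the example response above.
example = {
    "jobId": "f8c031d0-01d1-4b94-90b1-afe8bb93b7e5",
    "jobStatus": "COMPLETED",
    "files": [{"bucket": "amzn-s3-demo-bucket1", "key": "100Tags12Hours.csv"}],
}
print(summarize_job(example))
```

If `isTerminal` is false, poll again later; if the status is `FAILED` or `COMPLETED_WITH_FAILURES`, inspect the error report location from the full response.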

# List AWS IoT SiteWise bulk import jobs (AWS CLI)
<a name="ListBulkImportJobs"></a>

Use the [ListBulkImportJobs](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_ListBulkImportJobs.html) API operation to retrieve a list of summaries for bulk import jobs in AWS IoT SiteWise. This operation provides an efficient way to monitor and manage your data import processes. It returns the following key information for each job:
+ **Job ID** – A unique identifier for each bulk import job
+ **Job name** – The name you assigned to the job when creating it
+ **Current status** – The job's current state (for example, `COMPLETED`, `RUNNING`, or `FAILED`)

`ListBulkImportJobs` is particularly useful for getting a comprehensive overview of all your bulk import jobs. It helps you track multiple data imports, identify jobs that require attention, and maintain an organized workflow.

You can use the job IDs returned by this operation with the [DescribeBulkImportJob](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_DescribeBulkImportJob.html) operation to retrieve more detailed information about specific jobs. This two-step process gives you a high-level view of all jobs first, and then lets you drill down into the jobs of interest.

When using `ListBulkImportJobs`, you can apply filters to narrow down the results. For example, you can filter jobs by status to retrieve only completed jobs or only running jobs. The operation also supports pagination: if more results are available, it returns a `nextToken` that you can pass in subsequent calls to retrieve the next set of job summaries, even if you have a large number of jobs.

The following example demonstrates how to use `ListBulkImportJobs` with the AWS CLI to retrieve a list of completed jobs.

```
aws iotsitewise list-bulk-import-jobs --filter COMPLETED
```

**Example Response for completed jobs filter**  

```
{
   "jobSummaries":[
      {
         "id":"bdbbfa52-d775-4952-b816-13ba1c7cb9da",
         "name":"myBulkImportJob",
         "status":"COMPLETED"
      },
      {
         "id":"15ffc641-dbd8-40c6-9983-5cb3b0bc3e6b",
         "name":"myBulkImportJob2",
         "status":"COMPLETED"
      }
   ]
}
```

This command demonstrates how to use `ListBulkImportJobs` to retrieve jobs that completed with failures. The maximum is set to 50 results per page, and a next token from a previous response retrieves the next page.

```
aws iotsitewise list-bulk-import-jobs --filter COMPLETED_WITH_FAILURES --max-results 50 --next-token "string"
```
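The `nextToken` loop described above can be sketched as follows. To keep the example self-contained, `list_page` is a hypothetical stand-in for a real call such as `list_bulk_import_jobs`; only the pagination pattern itself is the point.

```python
# Stand-in for a paginated list call: returns canned pages keyed by token.
def list_page(next_token=None):
    pages = {
        None: {"jobSummaries": [{"id": "job-1"}], "nextToken": "page-2"},
        "page-2": {"jobSummaries": [{"id": "job-2"}]},  # last page: no nextToken
    }
    return pages[next_token]

def list_all_jobs():
    """Follow nextToken until the service stops returning one."""
    summaries, token = [], None
    while True:
        page = list_page(token)
        summaries.extend(page["jobSummaries"])
        token = page.get("nextToken")
        if token is None:
            return summaries

print([job["id"] for job in list_all_jobs()])  # ['job-1', 'job-2']
```

The same loop applies unchanged to an SDK client: replace `list_page` with the real API call and pass the token through each request.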

# Ingest data to AWS IoT SiteWise using AWS IoT Core rules
<a name="iot-rules"></a>

Send data to AWS IoT SiteWise from AWS IoT things and other AWS services by using rules in AWS IoT Core. Rules transform MQTT messages and perform actions to interact with AWS services. The AWS IoT SiteWise rule action forwards message data to the [BatchPutAssetPropertyValue](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_BatchPutAssetPropertyValue.html) operation from the AWS IoT SiteWise API. For more information, see [Rules](https://docs.aws.amazon.com/iot/latest/developerguide/iot-rules.html) and [AWS IoT SiteWise action](https://docs.aws.amazon.com/iot/latest/developerguide/iot-rule-actions.html#iotsitewise-rule) in the *AWS IoT Developer Guide*.

To follow a tutorial that walks through the steps required to set up a rule that ingests data through device shadows, see [Ingest data to AWS IoT SiteWise from AWS IoT things](ingest-data-from-iot-things.md).

You can also send data from AWS IoT SiteWise to other AWS services. For more information, see [Interact with other AWS services](interact-with-other-services.md).

**Topics**
+ [Grant AWS IoT the required access](grant-rule-access.md)
+ [Configure the AWS IoT SiteWise rule action](configure-rule-action.md)
+ [Reduce costs with Basic Ingest in AWS IoT SiteWise](basic-ingest-rules.md)

# Grant AWS IoT the required access
<a name="grant-rule-access"></a>

You use IAM roles to control the AWS resources to which each rule has access. Before you create a rule, you must create an IAM role with a policy that allows the rule to perform actions on the required AWS resource. AWS IoT assumes this role when running a rule.

If you create the rule action in the AWS IoT console, you can choose a root asset to create a role that has access to a selected asset hierarchy. For more information about how to manually define a role for a rule, see [Granting AWS IoT the required access](https://docs.aws.amazon.com/iot/latest/developerguide/iot-create-role.html) and [Pass role permissions](https://docs.aws.amazon.com/iot/latest/developerguide/pass-role.html) in the *AWS IoT Developer Guide*.

For the AWS IoT SiteWise rule action, you must define a role that allows `iotsitewise:BatchPutAssetPropertyValue` access to the asset properties to which the rule sends data. To improve security, you can specify an AWS IoT SiteWise asset hierarchy path in the `Condition` property. 

The following example permissions policy allows access to a specific asset and its children.

------
#### [ JSON ]

****  

```
{
  "Version":"2012-10-17",		 	 	 
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iotsitewise:BatchPutAssetPropertyValue",
      "Resource": "*",
      "Condition": {
        "StringLike": {
          "iotsitewise:assetHierarchyPath": [
            "/root node asset ID",
            "/root node asset ID/*"
          ]
        }
      }
    }
  ]
}
```

------

Remove the `Condition` from the policy to allow access to all of your assets. The following example permissions policy allows access to all of your assets in the current Region.

------
#### [ JSON ]

****  

```
{
  "Version":"2012-10-17",		 	 	 
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iotsitewise:BatchPutAssetPropertyValue",
      "Resource": "*"
    }
  ]
}
```

------

# Configure the AWS IoT SiteWise rule action
<a name="configure-rule-action"></a>

The AWS IoT SiteWise rule action sends data from the MQTT message that initiated the rule to asset properties in AWS IoT SiteWise. You can upload multiple data entries to different asset properties at the same time, for example to send updates for all sensors of a device in one message. You can also upload multiple data points at once for each data entry.

**Note**  
When you send data to AWS IoT SiteWise with the rule action, your data must meet all of the requirements of the `BatchPutAssetPropertyValue` operation. For example, your data can't have a timestamp more than 7 days before the current Unix epoch time. For more information, see [Ingesting data with the AWS IoT SiteWise API]().

For each data entry in the rule action, you identify an asset property and specify the timestamp, quality, and value of each data point for that asset property. The rule action expects strings for all parameters.

To identify an asset property in an entry, specify one of the following:
+ The **Asset ID** (`assetId`) and **Property ID** (`propertyId`) of the asset property that you're sending data to. You can find the Asset ID and Property ID using the AWS IoT SiteWise console. If you know the Asset ID, you can use the AWS CLI to call [DescribeAsset](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_DescribeAsset.html) to find the Property ID.
+ The **Property alias** (`propertyAlias`), which is a data stream alias (for example, `/company/windfarm/3/turbine/7/temperature`). To use this option, you must first set your asset property's alias. To learn how to set property aliases, see [Manage data streams for AWS IoT SiteWise](manage-data-streams.md).

For the timestamp in each entry, use the timestamp reported by your equipment or the timestamp provided by AWS IoT Core. The timestamp has two parameters:
+ **Time in seconds** (`timeInSeconds`) – The Unix epoch time, in seconds, at which the sensor or equipment reported the data.
+ **Offset in nanos** (`offsetInNanos`) – (Optional) The nanosecond offset from the time in seconds.

**Important**  
If your timestamp is a string, has a decimal portion, or isn't in seconds, AWS IoT SiteWise rejects the request. You must convert the timestamp to seconds and nanosecond offset. Use features of the AWS IoT rules engine to convert the timestamp. For more information, see the following:  
[Getting timestamps for devices that don't report accurate time](#rule-timestamp-function)
[Converting timestamps that are in string format](#rule-time-to-epoch-function)
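As a sketch of the conversion the note above requires, the following splits a fractional Unix epoch timestamp (which AWS IoT SiteWise would reject as-is) into the two accepted fields. `split_epoch` is a hypothetical helper, not an AWS API.

```python
import math

def split_epoch(timestamp: float) -> dict:
    """Split a fractional Unix epoch time into whole seconds and a nanosecond offset."""
    time_in_seconds = math.floor(timestamp)
    offset_in_nanos = round((timestamp - time_in_seconds) * 1e9)
    return {"timeInSeconds": time_in_seconds, "offsetInNanos": offset_in_nanos}

print(split_epoch(1581368533.25))  # {'timeInSeconds': 1581368533, 'offsetInNanos': 250000000}
```

The same split is what the rules-engine expressions in the following sections compute, just starting from milliseconds instead of fractional seconds.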

You can use substitution templates for several parameters in the action to perform calculations, invoke functions, and pull values from the message payload. For more information, see [Substitution templates](https://docs.aws.amazon.com/iot/latest/developerguide/iot-substitution-templates.html) in the *AWS IoT Developer Guide*.

**Note**  <a name="substitution-template-limitations"></a>
Because an expression in a substitution template is evaluated separately from the `SELECT` statement, you can't use a substitution template to reference an alias created using an `AS` clause. You can reference only information present in the original payload, in addition to supported functions and operators.

**Topics**
+ [Getting timestamps for devices that don't report accurate time](#rule-timestamp-function)
+ [Converting timestamps that are in string format](#rule-time-to-epoch-function)
+ [Converting nanosecond-precision timestamp strings](#rule-convert-precise-timestamp-string)
+ [Example rule configurations](#rule-action-examples)
+ [Troubleshooting the rule action](#troubleshoot-rule-action)

## Getting timestamps for devices that don't report accurate time
<a name="rule-timestamp-function"></a>

If your sensor or equipment doesn't report accurate time data, get the current Unix epoch time from the AWS IoT rules engine with [timestamp()](https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-functions.html#iot-function-timestamp). This function outputs time in milliseconds, so you must convert the value to time in seconds and offset in nanoseconds. To do so, use the following conversions:
+ For **Time in seconds** (`timeInSeconds`), use **`${floor(timestamp() / 1E3)}`** to convert the time from milliseconds to seconds.
+ For **Offset in nanos** (`offsetInNanos`), use **`${(timestamp() % 1E3) * 1E6}`** to calculate the nanosecond offset of the timestamp.
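The same arithmetic as those templates, sketched in Python with an example millisecond value of the kind `timestamp()` returns:

```python
timestamp_ms = 1581368533699  # example rules-engine timestamp() output, in milliseconds

time_in_seconds = timestamp_ms // 1000                # floor(timestamp() / 1E3)
offset_in_nanos = (timestamp_ms % 1000) * 1_000_000   # (timestamp() % 1E3) * 1E6

print(time_in_seconds, offset_in_nanos)  # 1581368533 699000000
```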

## Converting timestamps that are in string format
<a name="rule-time-to-epoch-function"></a>

If your sensor or equipment reports time data in string format (for example, `2020-03-03T14:57:14.699Z`), use [time_to_epoch(String, String)](https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-functions.html#iot-sql-function-time-to-epoch). This function takes the timestamp and its format pattern as parameters and outputs time in milliseconds. Then, you must convert that value to time in seconds and offset in nanoseconds. To do so, use the following conversions:
+ For **Time in seconds** (`timeInSeconds`), use **`${floor(time_to_epoch("2020-03-03T14:57:14.699Z", "yyyy-MM-dd'T'HH:mm:ss'Z'") / 1E3)}`** to convert the timestamp string to milliseconds, and then to seconds.
+ For **Offset in nanos** (`offsetInNanos`), use **`${(time_to_epoch("2020-03-03T14:57:14.699Z", "yyyy-MM-dd'T'HH:mm:ss'Z'") % 1E3) * 1E6}`** to calculate the nanosecond offset of the timestamp string.

**Note**  
The `time_to_epoch` function supports up to millisecond-precision timestamp strings. To convert strings with microsecond or nanosecond precision, configure an AWS Lambda function that your rule calls to convert the timestamp into numerical values. For more information, see [Converting nanosecond-precision timestamp strings](#rule-convert-precise-timestamp-string).
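Outside the rules engine, the equivalent millisecond-precision conversion can be sketched in Python. `string_to_sitewise_time` is a hypothetical helper, not an AWS API; it parses the same example string used above and produces the two timestamp fields.

```python
from datetime import datetime, timezone

def string_to_sitewise_time(timestamp_str: str) -> dict:
    """Parse a millisecond-precision ISO 8601 string into SiteWise timestamp fields."""
    dt = datetime.strptime(timestamp_str, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)
    return {
        "timeInSeconds": int(dt.timestamp()),       # whole seconds since the Unix epoch
        "offsetInNanos": dt.microsecond * 1000,     # sub-second part, as nanoseconds
    }

print(string_to_sitewise_time("2020-03-03T14:57:14.699Z"))
```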

## Converting nanosecond-precision timestamp strings
<a name="rule-convert-precise-timestamp-string"></a>

If your device sends timestamp information in string format with nanosecond precision (for example, `2020-03-03T14:57:14.699728491Z`), use the following procedure to configure your rule action. You can create an AWS Lambda function that converts the timestamp from a string into **Time in seconds** (`timeInSeconds`) and **Offset in nanos** (`offsetInNanos`). Then, use [aws_lambda(functionArn, inputJson)](https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-functions.html#iot-func-aws-lambda) in your rule action parameters to invoke that Lambda function and use the output in your rule.

**Note**  
This section contains advanced instructions that assume that you're familiar with how to create the following resources:  
Lambda functions. For more information, see [Create your first Lambda function](https://docs.aws.amazon.com/lambda/latest/dg/getting-started.html) in the *AWS Lambda Developer Guide*.
AWS IoT rules with the AWS IoT SiteWise rule action. For more information, see [Ingest data to AWS IoT SiteWise using AWS IoT Core rules](iot-rules.md).

**To create an AWS IoT SiteWise rule action that parses timestamp strings**

1. Create a Lambda function with the following properties:
   + **Function name** – Use a descriptive function name (for example, **ConvertNanosecondTimestampFromString**).
   + **Runtime** – Use a Python 3 runtime, such as **Python 3.11** (`python3.11`).
   + **Permissions** – Create a role with basic Lambda permissions (**AWSLambdaBasicExecutionRole**).
   + **Layers** – Add the **AWSSDKPandas-Python311** layer for the Lambda function to use `numpy`.
   + **Function code** – Use the following function code, which consumes a string argument named `timestamp` and outputs `timeInSeconds` and `offsetInNanos` values for that timestamp.

     ```
     import json
     import math
     import numpy
     
     # Converts a timestamp string into timeInSeconds and offsetInNanos in Unix epoch time.
     # The input timestamp string can have up to nanosecond precision.
     def lambda_handler(event, context):
         timestamp_str = event['timestamp']
         # Parse the timestamp string as nanoseconds since Unix epoch.
         nanoseconds = numpy.datetime64(timestamp_str, 'ns').item()
         time_in_seconds = math.floor(nanoseconds / 1E9)
         # Slice to avoid precision issues.
         offset_in_nanos = int(str(nanoseconds)[-9:])
         return {
             'timeInSeconds': time_in_seconds,
             'offsetInNanos': offset_in_nanos
         }
     ```

     This Lambda function inputs timestamp strings in [ISO 8601](https://en.wikipedia.org/wiki/ISO_8601) format using [datetime64](https://numpy.org/doc/stable/reference/arrays.datetime.html) from NumPy.
**Note**  
If your timestamp strings aren't in ISO 8601 format, you can implement a solution with pandas that defines the timestamp format. For more information, see [pandas.to_datetime](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.to_datetime.html).

1. When you configure the AWS IoT SiteWise action for your rule, use the following substitution templates for **Time in seconds** (`timeInSeconds`) and **Offset in nanos** (`offsetInNanos`). These substitution templates assume that your message payload contains the timestamp string in `timestamp`. The `aws_lambda` function consumes a JSON structure for its second parameter, so you can modify these substitution templates if needed.
   + For **Time in seconds** (`timeInSeconds`), use the following substitution template.

     ```
     ${aws_lambda('arn:aws:lambda:region:account-id:function:ConvertNanosecondTimestampFromString', {'timestamp': timestamp}).timeInSeconds}
     ```
   + For **Offset in nanos** (`offsetInNanos`), use the following substitution template.

     ```
     ${aws_lambda('arn:aws:lambda:region:account-id:function:ConvertNanosecondTimestampFromString', {'timestamp': timestamp}).offsetInNanos}
     ```

   For each parameter, replace *region* and *account-id* with your Region and AWS account ID. If you used a different name for your Lambda function, change that as well.

1. Grant AWS IoT permissions to invoke your function with the `lambda:InvokeFunction` permission. For more information, see [aws_lambda(functionArn, inputJson)](https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-functions.html#iot-func-aws-lambda).

1. Test your rule (for example, use the AWS IoT MQTT test client) and verify that AWS IoT SiteWise receives the data that you send.

   If your rule doesn't work as expected, see [Troubleshoot an AWS IoT SiteWise rule action](troubleshoot-rule.md).

**Note**  
This solution invokes the Lambda function twice for each timestamp string. You can create another rule to reduce the number of Lambda function invocations if your rule handles multiple data points that have the same timestamp in each payload.  
To do so, create a rule with a republish action that invokes the Lambda and publishes the original payload with the timestamp string converted to `timeInSeconds` and `offsetInNanos`. Then, create a rule with an AWS IoT SiteWise rule action to consume the converted payload. With this approach, you reduce the number of times that the rule invokes the Lambda but increase the number of AWS IoT rule actions run. Consider the pricing of each service if you apply this solution to your use case.

## Example rule configurations
<a name="rule-action-examples"></a>

This section contains example rule configurations to create a rule with an AWS IoT SiteWise action.

**Example rule action that uses property aliases as message topics**  
The following example creates a rule with an AWS IoT SiteWise action that uses the topic (through [topic()](https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-functions.html#iot-function-topic)) as the property alias to identify asset properties. Use this example to define one rule for ingesting double-type data to all wind turbines in all wind farms. This example requires that you define property aliases on all turbine assets' properties. You would need to define a second, similar rule to ingest integer-type data.  

```
aws iot create-topic-rule \
  --rule-name SiteWiseWindFarmRule \
  --topic-rule-payload file://sitewise-rule-payload.json
```
The example payload in `sitewise-rule-payload.json` contains the following content.  

```
{
  "sql": "SELECT * FROM '/company/windfarm/+/turbine/+/+' WHERE type = 'double'",
  "description": "Sends data to the wind turbine asset property with the same alias as the topic",
  "ruleDisabled": false,
  "awsIotSqlVersion": "2016-03-23",
  "actions": [
    {
      "iotSiteWise": {
        "putAssetPropertyValueEntries": [
          {
            "propertyAlias": "${topic()}",
            "propertyValues": [
              {
                "timestamp": {
                  "timeInSeconds": "${timeInSeconds}"
                },
                "value": {
                  "doubleValue": "${value}"
                }
              }
            ]
          }
        ],
        "roleArn": "arn:aws:iam::account-id:role/MySiteWiseActionRole"
      }
    }
  ]
}
```
With this rule action, send the following message to a wind turbine property alias (for example, `/company/windfarm/3/turbine/7/temperature`) as a topic to ingest data.  

```
{
  "type": "double",
  "value": "38.3",
  "timeInSeconds": "1581368533"
}
```
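A message like the one above can be assembled programmatically before publishing. The alias scheme and field names below follow the example; `turbine_message` is a hypothetical helper, and actually publishing the result (for example, over MQTT) is left out of the sketch.

```python
import json

def turbine_message(farm: int, turbine: int, measurement: str,
                    value: float, time_in_seconds: int):
    """Build the topic (which doubles as the property alias) and the JSON payload."""
    topic = f"/company/windfarm/{farm}/turbine/{turbine}/{measurement}"
    # The rule action expects strings for all parameters, so stringify the values.
    payload = json.dumps({
        "type": "double",
        "value": str(value),
        "timeInSeconds": str(time_in_seconds),
    })
    return topic, payload

topic, payload = turbine_message(3, 7, "temperature", 38.3, 1581368533)
print(topic)  # /company/windfarm/3/turbine/7/temperature
```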

**Example rule action that uses timestamp() to determine time**  
The following example creates a rule with an AWS IoT SiteWise action that identifies an asset property by IDs and uses [timestamp()](https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-functions.html#iot-function-timestamp) to determine the current time.  

```
aws iot create-topic-rule \
  --rule-name SiteWiseAssetPropertyRule \
  --topic-rule-payload file://sitewise-rule-payload.json
```
The example payload in `sitewise-rule-payload.json` contains the following content.  

```
{
  "sql": "SELECT * FROM 'my/asset/property/topic'",
  "description": "Sends device data to an asset property",
  "ruleDisabled": false,
  "awsIotSqlVersion": "2016-03-23",
  "actions": [
    {
      "iotSiteWise": {
        "putAssetPropertyValueEntries": [
          {
            "assetId": "a1b2c3d4-5678-90ab-cdef-22222EXAMPLE",
            "propertyId": "a1b2c3d4-5678-90ab-cdef-33333EXAMPLE",
            "propertyValues": [
              {
                "timestamp": {
                  "timeInSeconds": "${floor(timestamp() / 1E3)}",
                  "offsetInNanos": "${(timestamp() % 1E3) * 1E6}"
                },
                "value": {
                  "doubleValue": "${value}"
                }
              }
            ]
          }
        ],
        "roleArn": "arn:aws:iam::account-id:role/MySiteWiseActionRole"
      }
    }
  ]
}
```
With this rule action, send the following message to the `my/asset/property/topic` to ingest data.  

```
{
  "type": "double",
  "value": "38.3"
}
```

## Troubleshooting the rule action
<a name="troubleshoot-rule-action"></a>

To troubleshoot your AWS IoT SiteWise rule action in AWS IoT Core, configure CloudWatch Logs or configure a republish error action for your rule. For more information, see [Troubleshoot an AWS IoT SiteWise rule action](troubleshoot-rule.md).

# Reduce costs with Basic Ingest in AWS IoT SiteWise
<a name="basic-ingest-rules"></a>

AWS IoT Core provides a feature called Basic Ingest that you can use to send data through AWS IoT Core without incurring [AWS IoT messaging costs](https://aws.amazon.com/iot-core/pricing/). Basic Ingest optimizes data flow for high volume data ingestion workloads by removing the publish/subscribe message broker from the ingestion path. You can use Basic Ingest if you know which rules your messages should be routed to.

To use Basic Ingest, you send messages directly to a specific rule using a special topic, `$aws/rules/rule-name`. For example, to send a message to a rule named `SiteWiseWindFarmRule`, you send a message to the topic `$aws/rules/SiteWiseWindFarmRule`.

If your rule action uses substitution templates that contain [topic(Decimal)](https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-functions.html#iot-function-topic), you can pass the original topic at the end of the Basic Ingest special topic, such as `$aws/rules/rule-name/original-topic`. For example, to use Basic Ingest with the wind farm property alias example from the previous section, you can send messages to the following topic.

```
$aws/rules/SiteWiseWindFarmRule//company/windfarm/3/turbine/7/temperature
```

**Note**  
The above example includes a second slash (`//`) because AWS IoT removes the Basic Ingest prefix (`$aws/rules/rule-name/`) from the topic that's visible to the rule action. In this example, the rule receives the topic `/company/windfarm/3/turbine/7/temperature`.
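The prefix removal described in the note can be sketched as a small function. This is an illustration of the behavior, not AWS IoT code:

```python
def rule_visible_topic(published_topic: str) -> str:
    """Strip the Basic Ingest prefix ($aws/rules/rule-name/) as AWS IoT does
    before the rule action sees the topic."""
    prefix = "$aws/rules/"
    rest = published_topic[len(prefix):]   # "SiteWiseWindFarmRule//company/..."
    _, _, original = rest.partition("/")   # drop the rule name, keep the remainder
    return original

print(rule_visible_topic(
    "$aws/rules/SiteWiseWindFarmRule//company/windfarm/3/turbine/7/temperature"
))  # /company/windfarm/3/turbine/7/temperature
```

Note how the double slash in the published topic is what preserves the leading slash of the original property alias.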

For more information, see [Reducing messaging costs with basic ingest](https://docs.aws.amazon.com/iot/latest/developerguide/iot-basic-ingest.html) in the *AWS IoT Developer Guide*.

# Ingest data to AWS IoT SiteWise from AWS IoT Events
<a name="iot-events"></a>

With AWS IoT Events, you can build complex event monitoring applications for your IoT fleet in the AWS Cloud. Use the IoT SiteWise action in AWS IoT Events to send data to asset properties in AWS IoT SiteWise when an event occurs.

**Note**  
End of support notice: AWS ended support for AWS IoT Events. For more information, see [AWS IoT Events end of support](https://docs.aws.amazon.com/iotevents/latest/developerguide/iotevents-end-of-support.html).

AWS IoT Events is designed to streamline the development of event monitoring applications for IoT devices and systems within the AWS Cloud. Using AWS IoT Events, you can:
+ Detect and respond to changes, anomalies, or specific conditions across your IoT fleet.
+ Enhance your operational efficiency and enable proactive management of your IoT ecosystem.

By integrating with AWS IoT SiteWise through the AWS IoT SiteWise action, AWS IoT Events extends its capabilities, allowing you to automatically update asset properties in AWS IoT SiteWise in response to specific events. This interaction can simplify data ingestion and management. It can also empower you with actionable insights.

For more information, see the following topics in the *AWS IoT Events Developer Guide*:
+ [What is AWS IoT Events?](https://docs.aws.amazon.com/iotevents/latest/developerguide/)
+ [AWS IoT Events actions](https://docs.aws.amazon.com/iotevents/latest/developerguide/iotevents-supported-actions.html)
+ [IoT SiteWise action](https://docs.aws.amazon.com/iotevents/latest/developerguide/iotevents-other-aws-services.html#iotevents-iotsitewise)

# Use AWS IoT Greengrass stream manager in AWS IoT SiteWise
<a name="greengrass-stream-manager"></a>

AWS IoT Greengrass stream manager is an integration feature that facilitates the transfer of data streams from local sources to the AWS Cloud. It acts as an intermediary layer that manages data flows, enabling devices operating at the edge to gather and store data before it is sent to AWS IoT SiteWise for further analysis and processing.

Add a data destination by configuring a local source on the AWS IoT SiteWise console. You can also use stream manager in your custom AWS IoT Greengrass solution to ingest data to AWS IoT SiteWise.

**Note**  
To ingest data from OPC UA sources, configure an AWS IoT SiteWise Edge gateway that runs on AWS IoT Greengrass. For more information, see [Use AWS IoT SiteWise Edge gateways](gateways.md).

For more information about how to **configure a destination** for local source data, see [Understand AWS IoT SiteWise Edge destinations](gw-destinations.md#source-destination).

For more information about how to **ingest data using stream manager** in a custom AWS IoT Greengrass solution, see the following topics in the *AWS IoT Greengrass Version 2 Developer Guide*:
+ [What is AWS IoT Greengrass?](https://docs.aws.amazon.com/greengrass/v2/developerguide/)
+ [Manage data streams on the AWS IoT Greengrass core](https://docs.aws.amazon.com/greengrass/v2/developerguide/manage-data-streams.html)
+ [Exporting data to AWS IoT SiteWise asset properties](https://docs.aws.amazon.com/greengrass/v2/developerguide/stream-export-configurations.html#export-to-iot-sitewise)