

# Fanout to Firehose delivery streams
<a name="sns-firehose-as-subscriber"></a>

You can subscribe [Firehose delivery streams](https://docs.aws.amazon.com/firehose/latest/dev/what-is-this-service.html) to Amazon SNS topics, allowing you to send notifications to additional storage and analytics endpoints. Messages published to an Amazon SNS topic are sent to the subscribed Firehose delivery stream and delivered to the destination as configured in Firehose. A subscription owner can subscribe up to five Firehose delivery streams to an Amazon SNS topic. Each Firehose delivery stream has a [default quota](https://docs.aws.amazon.com/firehose/latest/dev/limits.html) for requests and throughput per second. Because of this quota, messages can be published (inbound traffic) faster than they are delivered (outbound traffic). When inbound traffic exceeds outbound traffic, your subscription can accumulate a large message backlog, causing high message delivery latency. To avoid adverse impact on your workload, you can request an [increase in quota](https://support.console.aws.amazon.com/support/home#/case/create?issueType=service-limit-increase) based on your publish rate.

Through Firehose delivery streams, you can fan out Amazon SNS notifications to Amazon Simple Storage Service (Amazon S3), Amazon Redshift, Amazon OpenSearch Service (OpenSearch Service), and to third-party service providers such as Datadog, New Relic, MongoDB, and Splunk.

For example, you can use this functionality to permanently store messages sent to a topic in an Amazon S3 bucket for compliance, archival, or other purposes. To do this, create a Firehose delivery stream with an Amazon S3 bucket destination, and subscribe that delivery stream to the Amazon SNS topic. As another example, to perform analysis on messages sent to an Amazon SNS topic, create a delivery stream with an OpenSearch Service index destination. You can then subscribe the Firehose delivery stream to the Amazon SNS topic.

Amazon SNS also supports message delivery status logging for notifications sent to Firehose endpoints. For more information, see [Amazon SNS message delivery status](sns-topic-attributes.md).

# Prerequisites for subscribing Firehose delivery streams to Amazon SNS topics
<a name="prereqs-kinesis-data-firehose"></a>

To subscribe a Firehose delivery stream to an Amazon SNS topic, your AWS account must have:
+ A standard SNS topic. For more information, see [Creating an Amazon SNS topic](sns-create-topic.md).
+ A Firehose delivery stream. For more information, see [Creating a Delivery Stream](https://docs.aws.amazon.com/firehose/latest/dev/basic-create.html) and [Grant Your Application Access to Your Firehose Resources](https://docs.aws.amazon.com/firehose/latest/dev/controlling-access.html#access-to-firehose) in the *Amazon Data Firehose Developer Guide*.
+ An AWS Identity and Access Management (IAM) role that trusts the Amazon SNS service principal and has permission to write to the delivery stream. You'll enter this role's Amazon Resource Name (ARN) as the `SubscriptionRoleARN` when you create the subscription. Amazon SNS assumes this role, which allows Amazon SNS to put records in the Firehose delivery stream.

  The following example policy shows the recommended permissions:

------
#### [ JSON ]


  ```
  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Action": [
          "firehose:DescribeDeliveryStream",
          "firehose:ListDeliveryStreams",
          "firehose:ListTagsForDeliveryStream",
          "firehose:PutRecord",
          "firehose:PutRecordBatch"
        ],
        "Resource": [
          "arn:aws:firehose:us-east-1:111111111111:deliverystream/firehose-sns-delivery-stream"
        ],
        "Effect": "Allow"
      }
    ]
  }
  ```

------

  To grant full permissions for using Firehose, you can use the AWS managed policy `AmazonKinesisFirehoseFullAccess`. To grant stricter permissions, you can create your own policy. At minimum, the policy must allow the `PutRecord` action on a specific delivery stream.

  In all cases, you must also edit the trust relationship to include the Amazon SNS service principal. For example:

------
#### [ JSON ]


  ```
  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": {
          "Service": "sns.amazonaws.com"
        },
        "Action": "sts:AssumeRole"
      }
    ]
  }
  ```

------

  For more information on creating roles, see [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html) in the *IAM User Guide*.

After you've completed these requirements, you can [subscribe the delivery stream to the SNS topic](firehose-endpoints-subscribe.md).

# Subscribing a Firehose delivery stream to an Amazon SNS topic
<a name="firehose-endpoints-subscribe"></a>

To deliver Amazon SNS notifications to [Firehose delivery streams](sns-firehose-as-subscriber.md), first make sure that you've addressed all the [prerequisites](prereqs-kinesis-data-firehose.md). For a list of supported endpoints, see [Amazon Data Firehose endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/fh.html) in the *Amazon Web Services General Reference*.

**To subscribe a Firehose delivery stream to a topic**

1. Sign in to the [Amazon SNS console](https://console.aws.amazon.com/sns/home).

1. In the navigation pane, choose **Subscriptions**.

1. On the **Subscriptions** page, choose **Create subscription**.

1. On the **Create subscription** page, in the **Details** section, do the following:

   1. For **Topic ARN**, choose the Amazon Resource Name (ARN) of a standard topic.

   1. For **Protocol**, choose **Firehose**.

   1. For **Endpoint**, choose the ARN of a Firehose delivery stream that can receive notifications from Amazon SNS.

   1. For **Subscription role ARN**, specify the ARN of the AWS Identity and Access Management (IAM) role that you created for writing to Firehose delivery streams. For more information, see [Prerequisites for subscribing Firehose delivery streams to Amazon SNS topics](prereqs-kinesis-data-firehose.md).

   1. (Optional) To remove any Amazon SNS metadata from published messages, choose **Enable raw message delivery**. For more information, see [Amazon SNS raw message delivery](sns-large-payload-raw-message-delivery.md).

1. (Optional) To configure a filter policy, expand the **Subscription filter policy** section. For more information, see [Amazon SNS subscription filter policies](sns-subscription-filter-policies.md).

1. (Optional) To configure a dead-letter queue for the subscription, expand the **Redrive policy (dead-letter queue)** section. For more information, see [Amazon SNS dead-letter queues](sns-dead-letter-queues.md).

1. Choose **Create subscription**.

The console creates the subscription and opens the subscription's **Details** page.
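The console steps above correspond to a single SNS `Subscribe` API call that uses the `firehose` protocol, the delivery stream ARN as the endpoint, and the IAM role ARN in the `SubscriptionRoleArn` attribute. The following minimal sketch shows the same call made programmatically with the AWS SDK for Python; all ARNs are placeholders for your own resources:

```python
# Sketch: creating the same Firehose subscription with the SNS Subscribe API.
# All ARNs below are placeholders for your own resources.

def firehose_subscribe_args(topic_arn, stream_arn, role_arn):
    """Build the keyword arguments for SNS Subscribe with the firehose protocol."""
    return {
        "TopicArn": topic_arn,
        "Protocol": "firehose",
        "Endpoint": stream_arn,  # ARN of the Firehose delivery stream
        "Attributes": {"SubscriptionRoleArn": role_arn},  # role that SNS assumes to call PutRecord
        "ReturnSubscriptionArn": True,
    }

args = firehose_subscribe_args(
    "arn:aws:sns:us-east-1:111111111111:my-topic",
    "arn:aws:firehose:us-east-1:111111111111:deliverystream/firehose-sns-delivery-stream",
    "arn:aws:iam::111111111111:role/sns-firehose-subscription-role",
)

# To actually create the subscription (requires AWS credentials):
# import boto3
# response = boto3.client("sns").subscribe(**args)
```

Creating the subscription fails if the role in `SubscriptionRoleArn` doesn't trust the Amazon SNS service principal or lacks `firehose:PutRecord` permission on the stream, so verify the prerequisites before subscribing.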

# Managing Amazon SNS messages across multiple delivery stream destinations
<a name="firehose-working-with-destinations"></a>

[Firehose delivery streams](sns-firehose-as-subscriber.md) allow you to manage Amazon SNS messages across multiple destinations, enabling integration with Amazon S3, Amazon OpenSearch Service, Amazon Redshift, and HTTP endpoints for storage, indexing, and analysis. By properly configuring message formatting and delivery, you can store Amazon SNS notifications in Amazon S3 for later processing, analyze structured message data using Amazon Athena, index messages in OpenSearch for real-time search and visualization, and structure archives in Amazon Redshift for advanced querying.

# Storing and analyzing Amazon SNS messages in Amazon S3 destinations
<a name="firehose-s3-destinations"></a>

This topic explains how Firehose delivery streams publish data to Amazon Simple Storage Service (Amazon S3).

![\[The integration and workflow of Amazon services for message handling. It shows how a publisher sends messages to an Amazon SNS topic, which then fans out messages to multiple Amazon SQS queues and an Amazon Data Firehose delivery stream. From there, messages can be processed by Lambda functions or stored persistently in an Amazon S3 bucket.\]](http://docs.aws.amazon.com/sns/latest/dg/images/firehose-architecture-s3.png)


**Topics**
+ [Formatting notifications for storage in Amazon S3 destinations](firehose-archived-message-format-S3.md)
+ [Analyzing messages stored in Amazon S3 using Athena](firehose-message-analysis-s3.md)

# Formatting Amazon SNS notifications for storage in Amazon S3 destinations
<a name="firehose-archived-message-format-S3"></a>

The following example shows an Amazon SNS notification sent to an Amazon Simple Storage Service (Amazon S3) bucket, with indentation for readability.

**Note**  
In this example, raw message delivery is disabled for the published message. When raw message delivery is disabled, Amazon SNS adds JSON metadata to the message, including these properties:  
`Type`
`MessageId`
`TopicArn`
`Subject`
`Timestamp`
`UnsubscribeURL`
`MessageAttributes`
For more information about raw delivery, see [Amazon SNS raw message delivery](sns-large-payload-raw-message-delivery.md).

```
{
    "Type": "Notification",
    "MessageId": "719a6bbf-f51b-5320-920f-3385b5e9aa56",
    "TopicArn": "arn:aws:sns:us-east-1:333333333333:my-kinesis-test-topic",     
    "Subject": "My 1st subject",
    "Message": "My 1st body",
    "Timestamp": "2020-11-26T23:48:02.032Z",
    "UnsubscribeURL": "https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:333333333333:my-kinesis-test-topic:0b410f3c-ee5e-49d8-b59b-3b4aa6d8fcf5",
    "MessageAttributes": {
        "myKey1": {
            "Type": "String",
            "Value": "myValue1"
        },
        "myKey2": {
            "Type": "String",
            "Value": "myValue2"
        }
    }
 }
```

The following example shows three SNS messages sent through a Firehose delivery stream to the same Amazon S3 bucket. Buffering is applied, and line breaks separate the messages.

```
{"Type":"Notification","MessageId":"d7d2513e-6126-5d77-bbe2-09042bd0a03a","TopicArn":"arn:aws:sns:us-east-1:333333333333:my-kinesis-test-topic","Subject":"My 1st subject","Message":"My 1st body","Timestamp":"2020-11-27T00:30:46.100Z","UnsubscribeURL":"https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:313276652360:my-kinesis-test-topic:0b410f3c-ee5e-49d8-b59b-3b4aa6d8fcf5","MessageAttributes":{"myKey1":{"Type":"String","Value":"myValue1"},"myKey2":{"Type":"String","Value":"myValue2"}}}
{"Type":"Notification","MessageId":"0c0696ab-7733-5bfb-b6db-ce913c294d56","TopicArn":"arn:aws:sns:us-east-1:333333333333:my-kinesis-test-topic","Subject":"My 2nd subject","Message":"My 2nd body","Timestamp":"2020-11-27T00:31:22.151Z","UnsubscribeURL":"https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:313276652360:my-kinesis-test-topic:0b410f3c-ee5e-49d8-b59b-3b4aa6d8fcf5","MessageAttributes":{"myKey1":{"Type":"String","Value":"myValue1"}}}
{"Type":"Notification","MessageId":"816cd54d-8cfa-58ad-91c9-8d77c7d173aa","TopicArn":"arn:aws:sns:us-east-1:333333333333:my-kinesis-test-topic","Subject":"My 3rd subject","Message":"My 3rd body","Timestamp":"2020-11-27T00:31:39.755Z","UnsubscribeURL":"https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:313276652360:my-kinesis-test-topic:0b410f3c-ee5e-49d8-b59b-3b4aa6d8fcf5"}
```
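Because Firehose buffers records, a single Amazon S3 object can contain many newline-delimited notifications like the ones above. A minimal sketch of parsing such an object body (the sample records here are illustrative, not the exact messages shown above):

```python
import json

# Sample S3 object body: newline-delimited SNS notifications, as produced
# by a buffered Firehose delivery (illustrative values).
body = "\n".join([
    json.dumps({"Type": "Notification", "MessageId": "id-1",
                "Subject": "My 1st subject", "Message": "My 1st body"}),
    json.dumps({"Type": "Notification", "MessageId": "id-2",
                "Subject": "My 2nd subject", "Message": "My 2nd body"}),
])

# Parse each non-empty line as one notification.
notifications = [json.loads(line) for line in body.splitlines() if line.strip()]

for n in notifications:
    print(n["MessageId"], n["Subject"])
```

Note that the object as a whole is not valid JSON; each line is a separate JSON document, so tools reading the archive must split on line breaks before parsing.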

# Analyzing Amazon SNS messages stored in Amazon S3 using Athena
<a name="firehose-message-analysis-s3"></a>

This page explains how to analyze Amazon SNS messages that are sent through Firehose delivery streams to Amazon Simple Storage Service (Amazon S3) destinations.

**To analyze SNS messages sent through Firehose delivery streams to Amazon S3 destinations**

1. Configure your Amazon S3 resources. For instructions, see [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/CreatingABucket.html) in the *Amazon Simple Storage Service User Guide* and [Working with Amazon S3 Buckets](https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingBucket.html) in the *Amazon Simple Storage Service User Guide*.

1. Configure your delivery stream. For instructions, see [Choose Amazon S3 for Your Destination](https://docs.aws.amazon.com/firehose/latest/dev/create-destination.html#create-destination-s3) in the *Amazon Data Firehose Developer Guide*.

1. Use [Amazon Athena](https://console.aws.amazon.com/athena) to query the Amazon S3 objects using standard SQL. For more information, see [Getting Started](https://docs.aws.amazon.com/athena/latest/ug/getting-started.html) in the *Amazon Athena User Guide*.

## Example query
<a name="example-s3-query"></a>

For this example query, assume the following:
+ Messages are stored in the `notifications` table in the `default` schema.
+ The `notifications` table includes a `timestamp` column with a type of `string`.

The following query returns all SNS messages received in the specified date range:

```
SELECT * 
FROM default.notifications
WHERE from_iso8601_timestamp(timestamp) BETWEEN TIMESTAMP '2020-12-01 00:00:00' AND TIMESTAMP '2020-12-02 00:00:00';
```
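The query above uses `from_iso8601_timestamp` because the `timestamp` column stores the ISO-8601 string that SNS writes into each notification (for example, `2020-11-26T23:48:02.032Z`), which must be converted before range comparisons. The same filter can be sketched locally in Python with illustrative data:

```python
from datetime import datetime, timezone

def parse_sns_timestamp(ts):
    """Parse the ISO-8601 'Timestamp' string that SNS adds to each notification."""
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)

# Illustrative rows, mirroring the `notifications` table used in the Athena query.
rows = [
    {"MessageId": "id-1", "timestamp": "2020-11-26T23:48:02.032Z"},
    {"MessageId": "id-2", "timestamp": "2020-12-01T12:00:00.000Z"},
]

# Same date range as the example query: December 1 to December 2, 2020 (UTC).
start = datetime(2020, 12, 1, tzinfo=timezone.utc)
end = datetime(2020, 12, 2, tzinfo=timezone.utc)
matches = [r for r in rows if start <= parse_sns_timestamp(r["timestamp"]) <= end]
```

Only the second row falls inside the range, matching what the Athena query would return for the same data.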

# Integrating Amazon SNS messages with Amazon OpenSearch Service destinations
<a name="firehose-elasticsearch-destinations"></a>

This section explains how Firehose delivery streams publish data to Amazon OpenSearch Service (OpenSearch Service).

![\[A publisher sends messages to an Amazon SNS topic, which then distributes these messages to multiple Amazon SQS queues. Messages from these queues can be processed by Lambda functions or sent through an Amazon Data Firehose delivery stream to Amazon OpenSearch Service, creating a searchable message index. This setup demonstrates an advanced message routing and processing scenario using AWS services.\]](http://docs.aws.amazon.com/sns/latest/dg/images/firehose-architecture-es.png)


**Topics**
+ [Archived message format](firehose-archived-message-format-elasticsearch.md)
+ [Analyzing messages](firehose-message-analysis-elasticsearch.md)

# Storing and formatting Amazon SNS notifications in OpenSearch Service indices
<a name="firehose-archived-message-format-elasticsearch"></a>

The following example demonstrates an Amazon SNS notification sent to an Amazon OpenSearch Service (OpenSearch Service) index called `my-index`. This index is configured with a time filter on the `Timestamp` field. The SNS notification is placed in the `_source` property of the payload.

**Note**  
In this example, raw message delivery is disabled for the published message. When raw message delivery is disabled, Amazon SNS adds JSON metadata to the message, including these properties:  
`Type`
`MessageId`
`TopicArn`
`Subject`
`Timestamp`
`UnsubscribeURL`
`MessageAttributes`
For more information about raw delivery, see [Amazon SNS raw message delivery](sns-large-payload-raw-message-delivery.md).

```
{
  "_index": "my-index",
  "_type": "_doc",
  "_id": "49613100963111323203250405402193283794773886550985932802.0",
  "_version": 1,
  "_score": null,
  "_source": {
    "Type": "Notification",
    "MessageId": "bf32e294-46e3-5dd5-a6b3-bad65162e136",
    "TopicArn": "arn:aws:sns:us-east-1:111111111111:my-topic",
    "Subject": "Sample subject",
    "Message": "Sample message",
    "Timestamp": "2020-12-02T22:29:21.189Z",
    "UnsubscribeURL": "https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:111111111111:my-topic:b5aa9bc1-9c3d-452b-b402-aca2cefc63c9",
    "MessageAttributes": {
      "my_attribute": {
        "Type": "String",
        "Value": "my_value"
      }
    }
  },
  "fields": {
    "Timestamp": [
      "2020-12-02T22:29:21.189Z"
    ]
  },
  "sort": [
    1606948161189
  ]
}
```

# Analyzing Amazon SNS messages for OpenSearch Service destinations
<a name="firehose-message-analysis-elasticsearch"></a>

This topic explains how to analyze Amazon SNS messages sent through Firehose delivery streams to Amazon OpenSearch Service (OpenSearch Service) destinations.

**To analyze SNS messages sent through Firehose delivery streams to OpenSearch Service destinations**

1. Configure your OpenSearch Service resources. For instructions, see [Getting Started with Amazon OpenSearch Service](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/es-gsg.html) in the *Amazon OpenSearch Service Developer Guide*.

1. Configure your delivery stream. For instructions, see [Choose OpenSearch Service for Your Destination](https://docs.aws.amazon.com/firehose/latest/dev/create-destination.html#create-destination-elasticsearch) in the *Amazon Data Firehose Developer Guide*.

1. Run a query using OpenSearch Service queries and Kibana. For more information, see [Step 3: Search Documents in an OpenSearch Service Domain](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/es-gsg-search.html) and [Kibana](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/es-kibana.html) in the *Amazon OpenSearch Service Developer Guide*.

## Example query
<a name="example-es-query"></a>

The following example queries the `my-index` index for all SNS messages received in the specified date range:

```
POST https://search-my-domain.us-east-1.es.amazonaws.com/my-index/_search
{
  "query": {
    "bool": {
      "filter": [
        {
          "range": {
            "Timestamp": {
              "gte": "2020-12-08T00:00:00.000Z",
              "lte": "2020-12-09T00:00:00.000Z",
              "format": "strict_date_optional_time"
            }
          }
        }
      ]
    }
  }
}
```

# Configuring Amazon SNS message delivery and analysis in Amazon Redshift destinations
<a name="firehose-redshift-destinations"></a>

This topic explains how to fan out Amazon SNS notifications to a Firehose delivery stream, which then publishes the data to Amazon Redshift. With this setup, you can connect to the Amazon Redshift database and use a SQL query tool to retrieve Amazon SNS messages that match specific criteria.

![\[Messages published by a sender to an Amazon SNS topic are distributed to multiple Amazon SQS queues for processing by Lambda functions, and also sent through an Amazon Data Firehose delivery stream to an Amazon Redshift cluster for storage and analysis in a message data warehouse. This setup demonstrates a robust message handling and data warehousing architecture using AWS services.\]](http://docs.aws.amazon.com/sns/latest/dg/images/firehose-architecture-rs.png)


**Topics**
+ [Structuring message archives in Amazon Redshift tables](firehose-archive-table-structure-redshift.md)
+ [Analyzing messages stored in Amazon Redshift destinations](firehose-message-analysis-redshift.md)

# Structuring Amazon SNS message archives in Amazon Redshift tables
<a name="firehose-archive-table-structure-redshift"></a>

For Amazon Redshift endpoints, Amazon SNS messages are archived as rows in a table. Here's an example of how the data is stored:

**Note**  
In this example, raw message delivery is disabled for the published message. When raw message delivery is disabled, Amazon SNS adds JSON metadata to the message, including these properties:  
`Type`
`MessageId`
`TopicArn`
`Subject`
`Message`
`Timestamp`
`UnsubscribeURL`
`MessageAttributes`
For more information about raw delivery, see [Amazon SNS raw message delivery](sns-large-payload-raw-message-delivery.md).  
Although Amazon SNS adds properties to the message using the capitalization shown in this list, column names in Amazon Redshift tables appear in all lowercase characters. To transform the JSON metadata for the Amazon Redshift endpoint, you can use the SQL `COPY` command. For more information, see [Copy from JSON examples](https://docs.aws.amazon.com/redshift/latest/dg/r_COPY_command_examples.html#r_COPY_command_examples-copy-from-json) and [Load from JSON data using the 'auto ignorecase' option](https://docs.aws.amazon.com/redshift/latest/dg/r_COPY_command_examples.html#copy-from-json-examples-using-auto-ignorecase) in the *Amazon Redshift Database Developer Guide*.


|  type  |  messageid  |  topicarn  |  subject  |  message  |  timestamp  |  unsubscribeurl  |  messageattributes  | 
| --- | --- | --- | --- | --- | --- | --- | --- | 
|  Notification  |  ea544832-a0d8-581d-9275-108243c46103  |  arn:aws:sns:us-east-1:111111111111:my-topic  |  Sample subject  |  Sample message  |  2020-12-02T00:33:32.272Z  |  https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:111111111111:my-topic:326deeeb-cbf4-45da-b92b-ca77a247813b  |  `{"my_attribute": {"Type": "String", "Value": "my_value"}}`  | 
|  Notification  |  ab124832-a0d8-581d-9275-108243c46114  |  arn:aws:sns:us-east-1:111111111111:my-topic  |  Sample subject 2  |  Sample message 2  |  2020-12-03T00:18:11.129Z  |  https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:111111111111:my-topic:326deeeb-cbf4-45da-b92b-ca77a247813b  |  `{"my_attribute2": {"Type": "String", "Value": "my_value"}}`  | 
|  Notification  |  ce644832-a0d8-581d-9275-108243c46125  |  arn:aws:sns:us-east-1:111111111111:my-topic  |  Sample subject 3  |  Sample message 3  |  2020-12-09T00:08:44.405Z  |  https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:111111111111:my-topic:326deeeb-cbf4-45da-b92b-ca77a247813b  |  `{"my_attribute3": {"Type": "String", "Value": "my_value"}}`  | 

For more information about fanning out notifications to Amazon Redshift endpoints, see [Configuring Amazon SNS message delivery and analysis in Amazon Redshift destinations](firehose-redshift-destinations.md).

# Analyzing Amazon SNS messages stored in Amazon Redshift destinations
<a name="firehose-message-analysis-redshift"></a>

This topic describes how to analyze Amazon SNS messages that are sent through Firehose delivery streams to Amazon Redshift destinations.

**To analyze SNS messages sent through Firehose delivery streams to Amazon Redshift destinations**

1. Configure your Amazon Redshift resources. For instructions, see [Getting started with Amazon Redshift](https://docs.aws.amazon.com/redshift/latest/gsg/getting-started.html) in the *Amazon Redshift Getting Started Guide*.

1. Configure your delivery stream. For instructions, see [Choose Amazon Redshift for Your Destination](https://docs.aws.amazon.com/firehose/latest/dev/create-destination.html#create-destination-redshift) in the *Amazon Data Firehose Developer Guide*.

1. Run a query. For more information, see [Querying a database using the query editor](https://docs.aws.amazon.com/redshift/latest/mgmt/query-editor.html) in the *Amazon Redshift Management Guide*.

## Example query
<a name="example-rs-query"></a>

For this example query, assume the following:
+ Messages are stored in the `notifications` table in the default `public` schema.
+ The `Timestamp` property from the SNS message is stored in the table's `timestamp` column with a column data type of `timestamptz`.
**Note**  
To transform the JSON metadata for the Amazon Redshift endpoint, you can use the SQL `COPY` command. For more information, see [Copy from JSON examples](https://docs.aws.amazon.com/redshift/latest/dg/r_COPY_command_examples.html#r_COPY_command_examples-copy-from-json) and [Load from JSON data using the 'auto ignorecase' option](https://docs.aws.amazon.com/redshift/latest/dg/r_COPY_command_examples.html#copy-from-json-examples-using-auto-ignorecase) in the *Amazon Redshift Database Developer Guide*.

The following query returns all SNS messages received in the specified date range:

```
SELECT *
FROM public.notifications
WHERE timestamp > '2020-12-01T09:00:00.000Z' AND timestamp < '2020-12-02T09:00:00.000Z';
```

# Configuring Amazon SNS message delivery to HTTP destinations using Amazon Data Firehose
<a name="firehose-http-destinations"></a>

This topic explains how Firehose delivery streams publish data to HTTP endpoints.

![\[A publisher sends messages to an Amazon SNS topic, which then distributes the messages to multiple Amazon SQS queues. These messages are processed by Lambda functions and also sent through an Amazon Data Firehose delivery stream to an HTTP endpoint. This setup showcases how AWS services work together to facilitate message handling and integration with external HTTP services.\]](http://docs.aws.amazon.com/sns/latest/dg/images/firehose-architecture-http.png)


**Topics**
+ [Notification format for delivery to HTTP destinations](firehose-delivered-message-format-http.md)

# Amazon SNS notification format for delivery to HTTP destinations
<a name="firehose-delivered-message-format-http"></a>

Here’s an example of an HTTP POST request body from Amazon SNS, sent through a Firehose delivery stream to an HTTP endpoint. The Amazon SNS notification is encoded as a base64 payload within the `records` property.

**Note**  
In this example, raw message delivery is disabled for the published message. For more information about raw delivery, see [Amazon SNS raw message delivery](sns-large-payload-raw-message-delivery.md).

```
"body": {
    "requestId": "ebc9e8b2-fce3-4aef-a8f1-71698bf8175f",
    "timestamp": 1606255960435,
    "records": [
      {
        "data": "eyJUeXBlIjoiTm90aWZpY2F0aW9uIiwiTWVzc2FnZUlkIjoiMjFkMmUzOGQtMmNhYi01ZjYxLTliYTItYmJiYWFhYzg0MGY2IiwiVG9waWNBcm4iOiJhcm46YXdzOnNuczp1cy1lYXN0LTE6MTExMTExMTExMTExOm15LXRvcGljIiwiTWVzc2FnZSI6IlNhbXBsZSBtZXNzYWdlIGZvciBBbWF6b24gS2luZXNpcyBEYXRhIEZpcmVob3NlIGVuZHBvaW50cyIsIlRpbWVzdGFtcCI6IjIwMjAtMTEtMjRUMjI6MDc6MzEuNjY3WiIsIlVuc3Vic2NyaWJlVVJMIjoiaHR0cHM6Ly9zbnMudXMtZWFzdC0xLmFtYXpvbmF3cy5jb20vP0FjdGlvbj1VbnN1YnNjcmliZSZTdWJzY3JpcHRpb25Bcm49YXJuOmF3czpzbnM6MTExMTExMTExMTExOm15LXRvcGljOjAxYjY5MTJjLTAwNzAtNGQ4Yi04YjEzLTU1NWJmYjc2ZTdkNCJ9"
      }
    ]
  }
```
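The `data` field of each record is the base64-encoded SNS notification JSON. A minimal sketch of decoding a record at the receiving endpoint (using a small illustrative notification rather than the exact payload above):

```python
import base64
import json

# Illustrative record: a small SNS notification, base64-encoded the way
# Firehose delivers it in the "data" field of each record.
notification = {"Type": "Notification", "MessageId": "example-id",
                "Message": "Sample message"}
record = {"data": base64.b64encode(
    json.dumps(notification).encode("utf-8")).decode("ascii")}

# At the HTTP endpoint: decode the record's data and parse the notification.
decoded = json.loads(base64.b64decode(record["data"]))
```

With raw message delivery enabled, the decoded payload would contain only the message body instead of the full notification JSON.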

# Amazon SNS message archiving and analytics: An example use case for airline ticketing platforms
<a name="firehose-example-use-case"></a>

This topic provides a tutorial for a common use case of archiving and analyzing Amazon SNS messages. 

The setting of this use case is an airline ticketing platform that operates in a regulated environment.

1. The platform is subject to a compliance framework that requires the company to archive all ticket sales for at least five years.

1. To meet the compliance goal on data retention, the company subscribes a Firehose delivery stream to an existing Amazon SNS topic.

1. The destination for the delivery stream is an Amazon Simple Storage Service (Amazon S3) bucket. With this configuration, all events published to the SNS topic are archived in the Amazon S3 bucket.

The following diagram shows the architecture of this configuration:

![\[An AWS architecture for an airline ticketing platform, illustrating how ticket sales data is processed and archived. It shows the flow of data from a Lambda function through an Amazon SNS topic, which then distributes messages to Amazon SQS queues for payment processing and fraud detection, handled by respective Lambda functions. The data is also streamed via Data Firehose to an Amazon S3 bucket for long-term archival, supporting compliance with data retention requirements. This setup enables the platform to run detailed analytics on ticket sales data using tools like Amazon Athena.\]](http://docs.aws.amazon.com/sns/latest/dg/images/sns-archiving-use-case.png)


To run analytics and gain insights on ticket sales, the company runs SQL queries using Amazon Athena. For example, the company can query to learn about the most popular destinations and the most frequent flyers.

To create the AWS resources for this use case, you can use the AWS Management Console or an AWS CloudFormation template.

**Topics**
+ [Setting up initial AWS resources for message archiving and analytics](firehose-example-initial-resources.md)
+ [Setting up a Firehose delivery stream for message archiving](firehose-example-create-delivery-stream.md)
+ [Subscribing the delivery stream to the topic](firehose-example-subscribe-delivery-stream-to-topic.md)
+ [Testing and querying a configuration for effective data management](firehose-example-test-and-query.md)
+ [Automating message archiving with an AWS CloudFormation template](firehose-example-cfn.md)

# Setting up initial AWS resources for Amazon SNS message archiving and analytics
<a name="firehose-example-initial-resources"></a>

This topic describes how to create the resources needed for the [message archiving and analytics example use case](firehose-example-use-case.md):
+ An Amazon Simple Storage Service (Amazon S3) bucket
+ Two Amazon Simple Queue Service (Amazon SQS) queues
+ An Amazon SNS topic
+ Two Amazon SQS subscriptions to the Amazon SNS topic

**To create the initial resources**

1. Create the Amazon S3 bucket:<a name="firehose-use-case-create-bucket"></a>

   1. Open the [Amazon S3 console](https://console.aws.amazon.com/s3/home).

   1. Choose **Create bucket**. 

   1. For **Bucket name**, enter a globally unique name. Keep the other fields as the defaults.

   1. Choose **Create bucket**.

   For more information about Amazon S3 buckets, see [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/CreatingABucket.html) in the *Amazon Simple Storage Service User Guide* and [Working with Amazon S3 Buckets](https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingBucket.html) in the *Amazon Simple Storage Service User Guide*.

1. Create the two Amazon SQS queues:

   1. Open the [Amazon SQS console](https://console.aws.amazon.com/sqs/home).

   1. Choose **Create queue**.

   1. For **Type**, choose **Standard**.

   1. For **Name**, enter **ticketPaymentQueue**.

   1. Under **Access policy**, for **Choose method**, choose **Advanced**.

   1. In the JSON policy box, paste the following policy:

------
#### [ JSON ]


      ```
      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Principal": {
              "Service": "sns.amazonaws.com"
            },
            "Action": "sqs:SendMessage",
            "Resource": "*",
            "Condition": {
              "ArnEquals": {
                "aws:SourceArn": "arn:aws:sns:us-east-1:123456789012:ticketTopic"
              }
            }
          }
        ]
      }
      ```

------

      In this access policy, replace the AWS account number (*123456789012*) with your own, and change the AWS Region (*us-east-1*) accordingly.

   1. Choose **Create queue**.

   1. Repeat these steps to create a second SQS queue named **ticketFraudQueue**.

   For more information on creating SQS queues, see [Creating an Amazon SQS queue (console)](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-configure-create-queue.html) in the *Amazon Simple Queue Service Developer Guide*.

1. Create the SNS topic:

   1. Open the [Topics page](https://console.aws.amazon.com/sns/home#/topics) of the Amazon SNS console.

   1. Choose **Create topic**.

   1. Under **Details**, for **Type**, choose **Standard**.

   1. For **Name**, enter **ticketTopic**.

   1. Choose **Create topic**.

   For more information on creating SNS topics, see [Creating an Amazon SNS topic](sns-create-topic.md).

1. Subscribe both SQS queues to the SNS topic:

   1. In the [Amazon SNS console](https://console.aws.amazon.com/sns/home#/topics), on the **ticketTopic** topic's details page, choose **Create subscription**.

   1. Under **Details**, for **Protocol**, choose **Amazon SQS**.

   1. For **Endpoint**, choose the Amazon Resource Name (ARN) of the **ticketPaymentQueue** queue.

   1. Choose **Create subscription**.

   1. Repeat these steps to create a second subscription using the ARN of the **ticketFraudQueue** queue.

      For more information on subscribing to SNS topics, see [Creating a subscription to an Amazon SNS topic](sns-create-subscribe-endpoint-to-topic.md). You can also subscribe SQS queues to SNS topics from the Amazon SQS console. For more information, see [Subscribing an Amazon SQS queue to an Amazon SNS topic (console)](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-configure-subscribe-queue-sns-topic.html) in the *Amazon Simple Queue Service Developer Guide*.

You've created the initial resources for this example use case. To continue, see [Setting up an Amazon Data Firehose delivery stream for Amazon SNS message archiving](firehose-example-create-delivery-stream.md).

# Setting up an Amazon Data Firehose delivery stream for Amazon SNS message archiving
<a name="firehose-example-create-delivery-stream"></a>

This topic explains how to create the Amazon Data Firehose delivery stream for the [message archiving and analytics example use case](firehose-example-use-case.md).

**To create the Amazon Data Firehose delivery stream**

1. Open the [Amazon Data Firehose console](https://console.aws.amazon.com/kinesis/home).

1. Choose **Firehose** and then choose **Create delivery stream**.

1. On the **New delivery stream** page, for **Delivery stream name**, enter **ticketUploadStream**, and then choose **Next**.

1. On the **Process records** page, choose **Next**.

1. On the **Choose a destination** page, do the following:

   1. For **Destination**, choose **Amazon S3**.

   1. Under **S3 destination**, for **S3 bucket**, choose the S3 bucket that you [created initially](firehose-example-initial-resources.md).

   1. Choose **Next**.

1. On the **Configure settings** page, for **S3 buffer conditions**, do the following:
   + For **Buffer size**, enter **1**.
   + For **Buffer interval**, enter **60**.

   Using these values for the Amazon S3 buffer lets you quickly test the configuration. The first condition that is satisfied triggers data delivery to the S3 bucket.

1. On the **Configure settings** page, for **Permissions**, choose to create an AWS Identity and Access Management (IAM) role with the required permissions assigned automatically. Then choose **Next**.

1. On the **Review** page, choose **Create delivery stream**.

1. From the **Amazon Data Firehose delivery streams** page, choose the delivery stream you just created (**ticketUploadStream**). On the **Details** tab, note the stream's Amazon Resource Name (ARN) for later.

For more information on creating delivery streams, see [Creating an Amazon Data Firehose Delivery Stream](https://docs.aws.amazon.com/firehose/latest/dev/basic-create.html) in the *Amazon Data Firehose Developer Guide*. For more information on creating IAM roles, see [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html) in the *IAM User Guide*.

You've created the Firehose delivery stream with the required permissions. To continue, see [Subscribing the Firehose delivery stream to the Amazon SNS topic](firehose-example-subscribe-delivery-stream-to-topic.md).

# Subscribing the Firehose delivery stream to the Amazon SNS topic
<a name="firehose-example-subscribe-delivery-stream-to-topic"></a>

This topic explains how to create the following resources for the [message archiving and analytics example use case](firehose-example-use-case.md):
+ The AWS Identity and Access Management (IAM) role that allows the Amazon SNS subscription to put records on the delivery stream.
+ The Firehose delivery stream subscription to the Amazon SNS topic.

**To create the IAM role for the Amazon SNS subscription**

1. Open the [Roles page](https://console.aws.amazon.com/iam/home?#/roles) of the IAM console.

1. Choose **Create role**.

1. For **Select type of trusted entity**, choose **AWS service**.

1. For **Choose a use case**, choose **SNS**. Then choose **Next: Permissions**.

1. Choose **Next: Tags**.

1. Choose **Next: Review**.

1. On the **Review** page, for **Role name**, enter **ticketUploadStreamSubscriptionRole**. Then choose **Create role**.

1. When the role is created, choose its name (**ticketUploadStreamSubscriptionRole**).

1. On the role's **Summary** page, choose **Add inline policy**.

1. On the **Create policy** page, choose the **JSON** tab, and then paste the following policy into the box:

------
#### [ JSON ]


   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Action": [
                   "firehose:DescribeDeliveryStream",
                   "firehose:ListDeliveryStreams",
                   "firehose:ListTagsForDeliveryStream",
                   "firehose:PutRecord",
                   "firehose:PutRecordBatch"
               ],
               "Resource": [
                   "arn:aws:firehose:us-east-1:123456789012:deliverystream/ticketUploadStream"
               ],
               "Effect": "Allow"
           }
       ]
   }
   ```

------

   In this policy, replace the AWS account number (*123456789012*) with your own, and change the AWS Region (*us-east-1*) accordingly.

1. Choose **Review policy**.

1. On the **Review policy** page, for **Name**, enter **FirehoseSnsPolicy**. Then choose **Create policy**.

1. On the role's **Summary** page, note the **Role ARN** for later.

For more information on creating IAM roles, see [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html) in the *IAM User Guide*.
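If you create this role outside the console (for example, with the AWS CLI or an infrastructure-as-code tool), the **SNS** use case you chose in the wizard corresponds to a trust policy that allows Amazon SNS to assume the role. A minimal version looks like the following:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "sns.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```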

**To subscribe the Firehose delivery stream to the SNS topic**

1. Open the [Topics page](https://console.aws.amazon.com/sns/home#/topics) of the Amazon SNS console.

1. On the **Subscriptions** tab, choose **Create subscription**.

1. Under **Details**, for **Protocol**, choose **Amazon Data Firehose**.

1. For **Endpoint**, enter the Amazon Resource Name (ARN) of the **ticketUploadStream** delivery stream that you created earlier. For example, enter **arn:aws:firehose:us-east-1:123456789012:deliverystream/ticketUploadStream**.

1. For **Subscription role ARN**, enter the ARN of the **ticketUploadStreamSubscriptionRole** IAM role that you created earlier. For example, enter **arn:aws:iam::123456789012:role/ticketUploadStreamSubscriptionRole**.

1. Select the **Enable raw message delivery** check box.

1. Choose **Create subscription**.

You've created the IAM role and SNS topic subscription. To continue, see [Testing and querying an Amazon SNS configuration for effective data management](firehose-example-test-and-query.md).

# Testing and querying an Amazon SNS configuration for effective data management
<a name="firehose-example-test-and-query"></a>

This topic explains how to test the [message archiving and analytics example use case](firehose-example-use-case.md) by publishing a message to the Amazon SNS topic. The instructions include an example query that you can run and adapt to your own needs.

**To test your configuration**

1. Open the [Topics page](https://console.aws.amazon.com/sns/home#/topics) of the Amazon SNS console.

1. Choose the **ticketTopic** topic.

1. Choose **Publish message**.

1. On the **Publish message to topic** page, enter the following for the message body. Add a newline character at the end of the message.

   ```
   {"BookingDate":"2020-12-15","BookingTime":"2020-12-15 04:15:05","Destination":"Miami","FlyingFrom":"Vancouver","TicketNumber":"abcd1234"}
   ```

   Keep all other options as their defaults.

1. Choose **Publish message**.

   For more information on publishing messages, see [Publishing an Amazon SNS message](sns-publishing.md).

1. After the delivery stream's 60-second buffer interval elapses, open the [Amazon Simple Storage Service (Amazon S3) console](https://console.aws.amazon.com/s3/home) and choose the Amazon S3 bucket that you [created initially](firehose-example-initial-resources.md).

   The published message appears in the bucket.
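Firehose concatenates records when it writes them to the Amazon S3 object, so without the trailing newline the JSON objects run together and become hard to query. If you publish programmatically rather than through the console, you can build the newline-terminated message body like this (a minimal sketch; `build_ticket_message` is an illustrative helper, not part of any AWS SDK):

```python
import json

def build_ticket_message(ticket: dict) -> str:
    """Serialize a ticket record as a single JSON line.

    The trailing newline keeps records separated when Firehose
    concatenates them into one Amazon S3 object.
    """
    return json.dumps(ticket, separators=(",", ":")) + "\n"

message = build_ticket_message({
    "BookingDate": "2020-12-15",
    "BookingTime": "2020-12-15 04:15:05",
    "Destination": "Miami",
    "FlyingFrom": "Vancouver",
    "TicketNumber": "abcd1234",
})
print(message.endswith("\n"))  # True: record is newline-terminated
```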

**To query the data**

1. Open the [Amazon Athena console](https://console.aws.amazon.com/athena/home).

1. Run a query.

   For example, assume that the `notifications` table in the `default` schema contains the following data:

   ```
   {"BookingDate":"2020-12-15","BookingTime":"2020-12-15 04:15:05","Destination":"Miami","FlyingFrom":"Vancouver","TicketNumber":"abcd1234"}
   {"BookingDate":"2020-12-15","BookingTime":"2020-12-15 11:30:15","Destination":"Miami","FlyingFrom":"Omaha","TicketNumber":"efgh5678"}
   {"BookingDate":"2020-12-15","BookingTime":"2020-12-15 03:30:10","Destination":"Miami","FlyingFrom":"NewYork","TicketNumber":"ijkl9012"}
   {"BookingDate":"2020-12-15","BookingTime":"2020-12-15 12:30:05","Destination":"Delhi","FlyingFrom":"Omaha","TicketNumber":"mnop3456"}
   ```

   To find the top destination, run the following query:

   ```
   SELECT destination
   FROM default.notifications
   GROUP BY destination
   ORDER BY count(*) desc
   LIMIT 1;
   ```

   To query for tickets sold during a specific date and time range, run a query like the following:

   ```
   SELECT * 
   FROM default.notifications 
   WHERE bookingtime 
     BETWEEN TIMESTAMP '2020-12-15 10:00:00' 
     AND TIMESTAMP '2020-12-15 12:00:00';
   ```

   You can adapt both sample queries for your own needs. For more information on using Athena to run queries, see [Getting Started](https://docs.aws.amazon.com/athena/latest/ug/getting-started.html) in the *Amazon Athena User Guide*.
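Athena uses a Presto-based SQL dialect, but the logic of both sample queries is plain SQL. As a quick local sanity check of that logic (not a substitute for Athena), you can run the same aggregation and range filter against the sample rows with SQLite. Note that SQLite has no `TIMESTAMP` literal, so this sketch compares the booking times as strings, which works here because the values are in `YYYY-MM-DD HH:MM:SS` form:

```python
import sqlite3

# Sample rows from the table above (only the columns the queries use).
rows = [
    ("2020-12-15 04:15:05", "Miami"),
    ("2020-12-15 11:30:15", "Miami"),
    ("2020-12-15 03:30:10", "Miami"),
    ("2020-12-15 12:30:05", "Delhi"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notifications (bookingtime TEXT, destination TEXT)")
conn.executemany("INSERT INTO notifications VALUES (?, ?)", rows)

# Same aggregation as the first Athena query: most frequent destination.
top = conn.execute(
    "SELECT destination FROM notifications "
    "GROUP BY destination ORDER BY count(*) DESC LIMIT 1"
).fetchone()[0]

# Same range filter as the second Athena query, on string timestamps.
in_window = conn.execute(
    "SELECT count(*) FROM notifications WHERE bookingtime "
    "BETWEEN '2020-12-15 10:00:00' AND '2020-12-15 12:00:00'"
).fetchone()[0]

print(top, in_window)  # Miami 1
```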

## Cleaning up
<a name="firehose-example-cleanup"></a>

To avoid incurring usage charges after you're done testing, delete the following resources that you created during the tutorial:
+ Amazon SNS subscriptions
+ Amazon SNS topic
+ Amazon Simple Queue Service (Amazon SQS) queues
+ Amazon S3 bucket
+ Amazon Data Firehose delivery stream
+ AWS Identity and Access Management (IAM) roles and policies

# Automating Amazon SNS message archiving with an AWS CloudFormation template
<a name="firehose-example-cfn"></a>

To automate the deployment of the Amazon SNS [message archiving and analytics example use case](firehose-example-use-case.md), you can use the following YAML template:

```
---
AWSTemplateFormatVersion: '2010-09-09'
Description: Template for creating an SNS archiving use case
Resources:
  ticketUploadStream:
    DependsOn:
    - ticketUploadStreamRolePolicy
    Type: AWS::KinesisFirehose::DeliveryStream
    Properties:
      S3DestinationConfiguration:
        BucketARN: !Sub 'arn:${AWS::Partition}:s3:::${ticketArchiveBucket}'
        BufferingHints:
          IntervalInSeconds: 60
          SizeInMBs: 1
        CompressionFormat: UNCOMPRESSED
        RoleARN: !GetAtt ticketUploadStreamRole.Arn
  ticketArchiveBucket:
    Type: AWS::S3::Bucket
  ticketTopic:
    Type: AWS::SNS::Topic
  ticketPaymentQueue:
    Type: AWS::SQS::Queue
  ticketFraudQueue:
    Type: AWS::SQS::Queue
  ticketQueuePolicy:
    Type: AWS::SQS::QueuePolicy
    Properties:
      PolicyDocument:
        Statement:
          Effect: Allow
          Principal:
            Service: sns.amazonaws.com
          Action:
            - sqs:SendMessage
          Resource: '*'
          Condition:
            ArnEquals:
              aws:SourceArn: !Ref ticketTopic
      Queues:
        - !Ref ticketPaymentQueue
        - !Ref ticketFraudQueue
  ticketUploadStreamSubscription:
    Type: AWS::SNS::Subscription
    Properties:
      TopicArn: !Ref ticketTopic
      Endpoint: !GetAtt ticketUploadStream.Arn
      Protocol: firehose
      SubscriptionRoleArn: !GetAtt ticketUploadStreamSubscriptionRole.Arn
  ticketPaymentQueueSubscription:
    Type: AWS::SNS::Subscription
    Properties:
      TopicArn: !Ref ticketTopic
      Endpoint: !GetAtt ticketPaymentQueue.Arn
      Protocol: sqs
  ticketFraudQueueSubscription:
    Type: AWS::SNS::Subscription
    Properties:
      TopicArn: !Ref ticketTopic
      Endpoint: !GetAtt ticketFraudQueue.Arn
      Protocol: sqs
  ticketUploadStreamRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Sid: ''
          Effect: Allow
          Principal:
            Service: firehose.amazonaws.com
          Action: sts:AssumeRole
  ticketUploadStreamRolePolicy:
    Type: AWS::IAM::Policy
    Properties:
      PolicyName: FirehoseticketUploadStreamRolePolicy
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Action:
          - s3:AbortMultipartUpload
          - s3:GetBucketLocation
          - s3:GetObject
          - s3:ListBucket
          - s3:ListBucketMultipartUploads
          - s3:PutObject
          Resource:
          - !Sub 'arn:${AWS::Partition}:s3:::${ticketArchiveBucket}'
          - !Sub 'arn:${AWS::Partition}:s3:::${ticketArchiveBucket}/*'
      Roles:
      - !Ref ticketUploadStreamRole
  ticketUploadStreamSubscriptionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Principal:
            Service:
            - sns.amazonaws.com
          Action:
          - sts:AssumeRole
      Policies:
      - PolicyName: SNSKinesisFirehoseAccessPolicy
        PolicyDocument:
          Version: '2012-10-17'
          Statement:
          - Action:
            - firehose:DescribeDeliveryStream
            - firehose:ListDeliveryStreams
            - firehose:ListTagsForDeliveryStream
            - firehose:PutRecord
            - firehose:PutRecordBatch
            Effect: Allow
            Resource:
            - !GetAtt ticketUploadStream.Arn
```
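
Assuming you save the template as `template.yaml` (a hypothetical file name), one way to deploy it is with the AWS CLI. Because the template creates IAM roles, the command must acknowledge IAM capabilities:

```shell
# Deploy the archiving stack; the stack name shown here is an example.
aws cloudformation deploy \
  --template-file template.yaml \
  --stack-name sns-message-archiving \
  --capabilities CAPABILITY_IAM
```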