

# Write to Kinesis Data Streams using other AWS services
<a name="using-other-services"></a>

The following AWS services can integrate directly with Amazon Kinesis Data Streams to write data to Kinesis data streams. Review the information for each service that you're interested in, and follow the provided references for details.

**Topics**
+ [Write to Kinesis Data Streams using AWS Amplify](using-other-services-amplify.md)
+ [Write to Kinesis Data Streams using Amazon Aurora](using-other-services-aurora.md)
+ [Write to Kinesis Data Streams using Amazon CloudFront](using-other-services-CloudFront.md)
+ [Write to Kinesis Data Streams using Amazon CloudWatch Logs](using-other-services-cw-logs.md)
+ [Write to Kinesis Data Streams using Amazon Connect](using-other-services-connect.md)
+ [Write to Kinesis Data Streams using AWS Database Migration Service](using-other-services-migration.md)
+ [Write to Kinesis Data Streams using Amazon DynamoDB](using-other-services-ddb.md)
+ [Write to Kinesis Data Streams using Amazon EventBridge](using-other-services-eventbridges.md)
+ [Write to Kinesis Data Streams using AWS IoT Core](using-other-services-iot-core.md)
+ [Write to Kinesis Data Streams using Amazon Relational Database Service](using-other-services-rds.md)
+ [Write to Kinesis Data Streams using Amazon Pinpoint](using-other-services-pinpoint.md)
+ [Write to Kinesis Data Streams using Amazon Quantum Ledger Database (Amazon QLDB)](using-other-services-quantum-ledger.md)

# Write to Kinesis Data Streams using AWS Amplify
<a name="using-other-services-amplify"></a>

You can use Amazon Kinesis Data Streams to stream data from your mobile applications built with AWS Amplify for real-time processing. You can then build real-time dashboards, capture exceptions and generate alerts, drive recommendations, and make other real-time business or operational decisions. You can also send data to other services such as Amazon Simple Storage Service, Amazon DynamoDB, and Amazon Redshift.

For more information, see [Using Amazon Kinesis](https://docs.amplify.aws/react/build-a-backend/more-features/analytics/streaming-data/) in the *AWS Amplify Developer Center*. 

# Write to Kinesis Data Streams using Amazon Aurora
<a name="using-other-services-aurora"></a>

You can use Amazon Kinesis Data Streams to monitor activities on your Amazon Aurora DB clusters. With Database Activity Streams, your Aurora DB cluster pushes activities to an Amazon Kinesis data stream in real time. You can then build compliance-management applications that consume these activities, audit them, and generate alerts. You can also use Amazon Data Firehose to store the data.

For more information, see [Database Activity Streams](https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/DBActivityStreams.html) in the *Amazon Aurora User Guide*. 
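If you enable the activity stream programmatically, the boto3 call looks roughly like the following sketch. The cluster ARN and KMS key are hypothetical placeholders; note that Aurora creates the Kinesis data stream for you and returns its name in the response.

```python
# Sketch: start a database activity stream on an Aurora DB cluster.
# The ARNs below are placeholders, not values from this guide.
CLUSTER_ARN = "arn:aws:rds:us-east-1:111122223333:cluster:example-cluster"
KMS_KEY_ARN = "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"

def activity_stream_params() -> dict:
    """Build parameters for RDS.Client.start_activity_stream."""
    return {
        "ResourceArn": CLUSTER_ARN,
        "Mode": "async",           # "sync" or "async" event delivery
        "KmsKeyId": KMS_KEY_ARN,   # activity events are always encrypted
        "ApplyImmediately": True,
    }

def start_activity_stream(rds_client) -> dict:
    """Start the stream; the response includes KinesisStreamName."""
    return rds_client.start_activity_stream(**activity_stream_params())
```

Pass a configured client, for example `start_activity_stream(boto3.client("rds"))`, once credentials are set up.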

# Write to Kinesis Data Streams using Amazon CloudFront
<a name="using-other-services-CloudFront"></a>

You can use Amazon Kinesis Data Streams with CloudFront real-time logs and get information about requests made to a distribution in real time. You can then build your own [Kinesis data stream consumer](https://docs.aws.amazon.com/streams/latest/dev/building-consumers.html), or use Amazon Data Firehose to send the log data to Amazon S3, Amazon Redshift, Amazon OpenSearch Service, or a third-party log processing service.

For more information, see [Real-time logs](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/real-time-logs.html) in the *Amazon CloudFront Developer Guide*. 
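Creating a real-time log configuration with boto3 might look like the following sketch. The stream and role ARNs, the configuration name, and the field list are illustrative placeholders; choose the fields your consumers actually need.

```python
# Sketch: a CloudFront real-time log configuration that delivers request
# logs to a Kinesis data stream. All ARNs and names are placeholders.
STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/cf-logs"
ROLE_ARN = "arn:aws:iam::111122223333:role/CloudFrontRealtimeLogRole"

def realtime_log_config_params() -> dict:
    """Build parameters for CloudFront.Client.create_realtime_log_config."""
    return {
        "Name": "example-realtime-logs",
        "SamplingRate": 100,  # percentage of viewer requests to log (1-100)
        "Fields": ["timestamp", "c-ip", "cs-uri-stem", "sc-status"],
        "EndPoints": [{
            "StreamType": "Kinesis",
            "KinesisStreamConfig": {"RoleARN": ROLE_ARN, "StreamARN": STREAM_ARN},
        }],
    }
```

Pass the dict to `boto3.client("cloudfront").create_realtime_log_config(**realtime_log_config_params())`, then attach the configuration to a cache behavior on your distribution.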

# Write to Kinesis Data Streams using Amazon CloudWatch Logs
<a name="using-other-services-cw-logs"></a>

You can use CloudWatch subscriptions to get access to a real-time feed of log events from Amazon CloudWatch Logs and have it delivered to a Kinesis data stream for processing, analysis, and loading into other systems. 

For more information, see [Real-time processing of log data with subscriptions](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/Subscriptions.html) in the *Amazon CloudWatch Logs User Guide*. 
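If you manage subscriptions programmatically, the boto3 call looks roughly like this sketch. The stream ARN, role ARN, and log group name are hypothetical placeholders; the IAM role must allow CloudWatch Logs to write to the stream.

```python
# Sketch: subscribe a CloudWatch Logs log group to a Kinesis data stream.
# The ARNs and names below are placeholders, not values from this guide.
STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/example-stream"
ROLE_ARN = "arn:aws:iam::111122223333:role/CWLtoKinesisRole"

def subscription_filter_params(log_group: str) -> dict:
    """Build parameters for CloudWatchLogs.Client.put_subscription_filter."""
    return {
        "logGroupName": log_group,
        "filterName": "all-events",
        "filterPattern": "",  # an empty pattern forwards every log event
        "destinationArn": STREAM_ARN,
        "roleArn": ROLE_ARN,
    }
```

Pass the dict to `boto3.client("logs").put_subscription_filter(**subscription_filter_params("/aws/example-app"))` after configuring credentials.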

# Write to Kinesis Data Streams using Amazon Connect
<a name="using-other-services-connect"></a>

You can use Kinesis Data Streams to export contact records and agent events in real time from your Amazon Connect instance. You can also enable data streaming from Amazon Connect Customer Profiles to automatically receive updates to a Kinesis data stream when new profiles are created or existing ones change.

You can then build consumer applications to process and analyze the data in real time. For example, using contact records and customer profile data, you can keep your source systems, such as CRMs and marketing automation tools, up to date with the latest information. Using the agent event data, you can create dashboards that display agent information and events, and trigger custom notifications of specific agent activity.

For more information, see [data streaming for your instance](https://docs.aws.amazon.com/connect/latest/adminguide/data-streaming.html), [set up real-time export](https://docs.aws.amazon.com/connect/latest/adminguide/set-up-real-time-export.html), and [agent event streams](https://docs.aws.amazon.com/connect/latest/adminguide/agent-event-streams.html) in the *Amazon Connect Administrator Guide*. 

# Write to Kinesis Data Streams using AWS Database Migration Service
<a name="using-other-services-migration"></a>

You can use AWS Database Migration Service to migrate data to a Kinesis data stream. You can then build consumer applications that process the data records in real time. You can also send data downstream to other services such as Amazon Simple Storage Service, Amazon DynamoDB, and Amazon Redshift.

For more information, see [Using Kinesis Data Streams](https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Target.Kinesis.html) in the *AWS Database Migration Service User Guide*. 
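A DMS migration task targets Kinesis through an endpoint of engine type `kinesis`. Defining one with boto3 might look like the following sketch; the stream ARN, role ARN, and identifier are hypothetical placeholders.

```python
# Sketch: a Kinesis target endpoint for an AWS DMS migration task.
# The ARNs and identifier below are placeholders.
STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/dms-target"
ROLE_ARN = "arn:aws:iam::111122223333:role/DmsKinesisAccessRole"

def kinesis_endpoint_params() -> dict:
    """Build parameters for DatabaseMigrationService.Client.create_endpoint."""
    return {
        "EndpointIdentifier": "example-kinesis-target",
        "EndpointType": "target",
        "EngineName": "kinesis",
        "KinesisSettings": {
            "StreamArn": STREAM_ARN,
            "MessageFormat": "json",  # records are serialized as JSON
            "ServiceAccessRoleArn": ROLE_ARN,
        },
    }
```

Pass the dict to `boto3.client("dms").create_endpoint(**kinesis_endpoint_params())`, then reference the endpoint from a replication task.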

# Write to Kinesis Data Streams using Amazon DynamoDB
<a name="using-other-services-ddb"></a>

You can use Amazon Kinesis Data Streams to capture changes to Amazon DynamoDB. Kinesis Data Streams captures item-level modifications in any DynamoDB table and replicates them to a Kinesis data stream. Your consumer applications can access this stream to view item-level changes in real time and deliver those changes downstream or take action based on the content.

For more information, see [how Kinesis Data Streams work with DynamoDB](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/kds.html) in the *Amazon DynamoDB Developer Guide*. 
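Turning on the streaming destination for a table is a single API call. The sketch below uses boto3 with a placeholder table name and stream ARN; both the table and the stream must already exist.

```python
# Sketch: enable a Kinesis data stream as the streaming destination for a
# DynamoDB table. The stream ARN is a placeholder.
STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/ddb-changes"

def streaming_destination_params(table_name: str) -> dict:
    """Build parameters for DynamoDB.Client.enable_kinesis_streaming_destination."""
    return {"TableName": table_name, "StreamArn": STREAM_ARN}

def enable_streaming(ddb_client, table_name: str) -> dict:
    """Start replicating item-level changes from the table to the stream."""
    return ddb_client.enable_kinesis_streaming_destination(
        **streaming_destination_params(table_name))
```

Call it as `enable_streaming(boto3.client("dynamodb"), "Orders")` once credentials are configured.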

# Write to Kinesis Data Streams using Amazon EventBridge
<a name="using-other-services-eventbridges"></a>

Using Kinesis Data Streams, you can send AWS API call [events](https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-events.html) in EventBridge to a stream, build consumer applications, and process large amounts of data. You can also use Kinesis Data Streams as a target in EventBridge Pipes and deliver records to a stream from one of the available sources, after optional filtering and enrichment.

For more information, see [Send events to an Amazon Kinesis stream](https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-relay-events-kinesis-stream.html) and [EventBridge Pipes](https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-pipes.html) in the *Amazon EventBridge User Guide*. 
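Adding a stream as the target of an existing EventBridge rule might look like the following boto3 sketch. The rule name, ARNs, and partition-key path are hypothetical placeholders; `PartitionKeyPath` is a JSON path into the event that determines the Kinesis partition key.

```python
# Sketch: add a Kinesis data stream as the target of an EventBridge rule.
# The ARNs and the partition-key path below are placeholders.
STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/event-stream"
ROLE_ARN = "arn:aws:iam::111122223333:role/EventBridgeToKinesisRole"

def kinesis_target_params(rule_name: str) -> dict:
    """Build parameters for EventBridge.Client.put_targets."""
    return {
        "Rule": rule_name,
        "Targets": [{
            "Id": "kinesis-target",
            "Arn": STREAM_ARN,
            "RoleArn": ROLE_ARN,
            # Derive the Kinesis partition key from a field in the event.
            "KinesisParameters": {"PartitionKeyPath": "$.detail.id"},
        }],
    }
```

Pass the dict to `boto3.client("events").put_targets(**kinesis_target_params("example-rule"))` for a rule you have already created.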

# Write to Kinesis Data Streams using AWS IoT Core
<a name="using-other-services-iot-core"></a>

You can write data in real time from MQTT messages in AWS IoT Core by using AWS IoT rule actions. You can then build applications that process the data, analyze its contents and generate alerts, and deliver it to analytics applications or other AWS services. 

For more information, see [Kinesis Data Streams](https://docs.aws.amazon.com/iot/latest/developerguide/kinesis-rule-action.html) in the *AWS IoT Core Developer Guide*. 
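A topic rule with a `kinesis` action forwards matching MQTT messages to a stream. The sketch below builds the rule payload with boto3; the rule name, topic filter, stream name, and role ARN are hypothetical placeholders.

```python
# Sketch: an AWS IoT topic rule that forwards MQTT messages to a Kinesis
# data stream. Names and the ARN below are placeholders.
ROLE_ARN = "arn:aws:iam::111122223333:role/IotToKinesisRole"

def topic_rule_params() -> dict:
    """Build parameters for IoT.Client.create_topic_rule."""
    return {
        "ruleName": "sensor_to_kinesis",
        "topicRulePayload": {
            # Select every message published under sensors/<device>/telemetry.
            "sql": "SELECT * FROM 'sensors/+/telemetry'",
            "ruleDisabled": False,
            "actions": [{
                "kinesis": {
                    "roleArn": ROLE_ARN,
                    "streamName": "iot-telemetry",
                    # Substitution template: partition by the MQTT topic.
                    "partitionKey": "${topic()}",
                },
            }],
        },
    }
```

Pass the dict to `boto3.client("iot").create_topic_rule(**topic_rule_params())` once credentials are configured.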

# Write to Kinesis Data Streams using Amazon Relational Database Service
<a name="using-other-services-rds"></a>

You can use Amazon Kinesis Data Streams to monitor activities on your Amazon RDS instances. With Database Activity Streams, Amazon RDS pushes activities to a Kinesis data stream in real time. You can then build compliance-management applications that consume these activities, audit them, and generate alerts. You can also use Amazon Data Firehose to store the data.

For more information, see [Database Activity Streams](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/DBActivityStreams.html) in the *Amazon RDS User Guide*. 

# Write to Kinesis Data Streams using Amazon Pinpoint
<a name="using-other-services-pinpoint"></a>

You can set up Amazon Pinpoint to send event data to Amazon Kinesis Data Streams. Amazon Pinpoint can send event data for campaigns, journeys, and transactional email and SMS messages. You can then ingest the data into analytics applications or build your own consumer applications that take actions based on the contents of the events.

For more information, see [Streaming Events](https://docs.aws.amazon.com/pinpoint/latest/developerguide/event-streams.html) in the *Amazon Pinpoint Developer Guide*. 
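Configuring the event stream for a Pinpoint project might look like the following boto3 sketch. The application ID, stream ARN, and role ARN are hypothetical placeholders.

```python
# Sketch: configure an Amazon Pinpoint project to stream event data to a
# Kinesis data stream. The ARNs below are placeholders.
STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/pinpoint-events"
ROLE_ARN = "arn:aws:iam::111122223333:role/PinpointToKinesisRole"

def event_stream_params(application_id: str) -> dict:
    """Build parameters for Pinpoint.Client.put_event_stream."""
    return {
        "ApplicationId": application_id,
        "WriteEventStream": {
            "DestinationStreamArn": STREAM_ARN,
            "RoleArn": ROLE_ARN,
        },
    }
```

Pass the dict to `boto3.client("pinpoint").put_event_stream(**event_stream_params("exampleappid"))` for your project's application ID.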

# Write to Kinesis Data Streams using Amazon Quantum Ledger Database (Amazon QLDB)
<a name="using-other-services-quantum-ledger"></a>

You can create a stream in Amazon QLDB that captures every document revision that is committed to your journal and delivers this data to Amazon Kinesis Data Streams in real time. A QLDB stream is a continuous flow of data from your ledger's journal to a Kinesis data stream resource. Then, you can use the Kinesis streaming platform or the Kinesis Client Library to consume your stream, process the data records, and analyze the data contents. A QLDB stream writes your data to Kinesis Data Streams in three types of records: `control`, `block summary`, and `revision details`. 

For more information, see [Streams](https://docs.aws.amazon.com/qldb/latest/developerguide/streams.html) in the *Amazon QLDB Developer Guide*. 
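Starting a journal stream with boto3 might look like the following sketch. The ledger name, ARNs, stream name, and start time are hypothetical placeholders.

```python
# Sketch: start streaming a QLDB ledger's journal to a Kinesis data stream.
# The ARNs, names, and start time below are placeholders.
from datetime import datetime, timezone

STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/qldb-journal"
ROLE_ARN = "arn:aws:iam::111122223333:role/QldbToKinesisRole"

def journal_stream_params(ledger_name: str) -> dict:
    """Build parameters for QLDB.Client.stream_journal_to_kinesis."""
    return {
        "LedgerName": ledger_name,
        "RoleArn": ROLE_ARN,
        "StreamName": "example-journal-stream",
        # Stream revisions committed from this point in time onward.
        "InclusiveStartTime": datetime(2024, 1, 1, tzinfo=timezone.utc),
        "KinesisConfiguration": {
            "StreamArn": STREAM_ARN,
            # Aggregation packs multiple QLDB records into one Kinesis record.
            "AggregationEnabled": True,
        },
    }
```

Pass the dict to `boto3.client("qldb").stream_journal_to_kinesis(**journal_stream_params("example-ledger"))` once credentials are configured.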