

# AWS and SAP JRA
<a name="rise-jra"></a>

The AWS and SAP Joint Reference Architecture (JRA) is a framework designed to guide customers on how to effectively integrate and use AWS and SAP services to achieve specific business outcomes. It provides architectural guidance and best practices for common scenarios, helping customers optimize their SAP solutions on AWS and leverage the strengths of both platforms.

The AWS and SAP JRA was developed to address common questions from joint customers and partners on how to use SAP and AWS services for different business solution scenarios. As we dive deeper into each use case, we will see that the services complement each other, working together to solve customers' business challenges holistically. When you apply the AWS and SAP JRA to RISE with SAP on AWS, you can unlock more value from your investment.

**Topics**
+ [Data to Value](rise-jra-datatovalue.md)
+ [Artificial Intelligence](rise-jra-ai.md)
+ [Integration](rise-jra-integration.md)
+ [Custom Application](rise-jra-customapps.md)
+ [Operational Reliability](rise-jra-operational-reliability.md)
+ [Internet of Things](rise-jra-iot.md)

# Data to Value
<a name="rise-jra-datatovalue"></a>

Enterprises need data-driven intelligence that delivers measurable business outcomes. Running SAP on AWS provides a scalable, secure, and flexible foundation to transform raw data into actionable value. The SAP and AWS Joint Reference Architecture (JRA) provides a framework for connecting data sources, harmonizing SAP and non-SAP data, and enabling AI and analytics-driven innovation through [SAP Business Data Cloud (SAP BDC)](https://www.sap.com/products/data-cloud.html) and [Amazon SageMaker](https://aws.amazon.com/sagemaker/).

This guide outlines two key joint reference architectures that exemplify how organizations can leverage SAP and AWS services to maximize the value of their enterprise data through AI-powered insights, while maintaining flexibility, scalability, and cost efficiency.

**Topics**
+ [Integrating data in SAP BDC with AWS data sources](rise-jra-datatovalue-bdc-aws.md)
+ [AI Innovation with FedML-AWS and SageMaker](rise-jra-datatovalue-fedml-aws.md)

# Integrating data in SAP BDC with AWS data sources
<a name="rise-jra-datatovalue-bdc-aws"></a>

Non-SAP data from AWS data sources can be harmonized with SAP data via the SAP Datasphere data fabric architecture with SAP BDC. The integration architecture supports multiple AWS services, each with specific modes of integration based on live data or replication:

![\[SAP BDC with Managed Services\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-datatovalue-01.png)


 **A. Integration with Amazon Athena** 

Mode of Integration: Federating data live into SAP Datasphere

Amazon Athena is an interactive query service that makes it easy to query and analyze data in Amazon S3. Non-SAP data from Athena can be federated live into remote tables in SAP Datasphere and augmented with SAP data for real-time analytics in [SAP Analytics Cloud](https://www.sap.com/products/data-cloud/cloud-analytics.html).

Here are the steps to integrate Athena with SAP Datasphere:

1. Prepare source with non-SAP and third party data

1. Configure Athena

1. Configure necessary IAM user and authorizations

1. Setup SAP Datasphere Connection to Athena

1. Build models in SAP Datasphere

This enables live data federation without replicating data, which reduces cost, speeds time to insight, and preserves enterprise-grade security. For detailed step-by-step instructions, visit [Federating Queries from SAP Datasphere to Amazon S3 via Amazon Athena](https://github.com/SAP-samples/sap-bdc-explore-hyperscaler-data/blob/main/AWS/athena-integration.md).
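The IAM setup in step 3 typically spans Athena, the AWS Glue Data Catalog, and the S3 buckets holding the source data and query results. The sketch below builds a minimal illustrative policy; the bucket name and exact action list are assumptions, so treat the permissions in the linked SAP-samples guide as authoritative.

```python
import json

def athena_federation_policy(bucket: str) -> dict:
    """Minimal illustrative IAM policy for live federation via Athena.

    The bucket name and action list are assumptions for illustration only;
    the linked SAP-samples guide documents the authoritative permissions.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {   # Run queries and fetch their results
                "Effect": "Allow",
                "Action": ["athena:StartQueryExecution",
                           "athena:GetQueryExecution",
                           "athena:GetQueryResults"],
                "Resource": "*",
            },
            {   # Read table metadata from the Glue Data Catalog
                "Effect": "Allow",
                "Action": ["glue:GetDatabase", "glue:GetTable", "glue:GetPartitions"],
                "Resource": "*",
            },
            {   # Read source data and write query results in S3
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket", "s3:PutObject"],
                "Resource": [f"arn:aws:s3:::{bucket}",
                             f"arn:aws:s3:::{bucket}/*"],
            },
        ],
    }

policy_json = json.dumps(athena_federation_policy("my-athena-data"), indent=2)
```
Scoping the S3 statement to the specific bucket (rather than `*`) keeps the federation user least-privileged.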

 **B. Integration with Amazon Redshift** 

Mode of Integration: Federating data live into SAP Datasphere

Amazon Redshift is a fully managed, petabyte-scale data warehouse service optimized for analytical workloads. Through SAP Datasphere data federation architecture, Redshift data can be augmented with SAP data to build unified data models and analytics in SAP Analytics Cloud. [Smart Data Integration (SDI)](https://help.sap.com/docs/HANA_SMART_DATA_INTEGRATION/bf2f0282053648f8a1ef873e65ded81a/323ff4c3c12040bab8f1222a901dd95d.html) connects SAP Datasphere with Redshift via [Camel JDBC Adapter](https://help.sap.com/docs/HANA_SMART_DATA_INTEGRATION/7952ef28a6914997abc01745fef1b607/598cdd48941a41128751892fe68393f4.html?locale=en-US), enabling the creation of virtual tables and real-time or snapshot replication.

Here are the steps to integrate Redshift with SAP Datasphere:

1. Create On-Premise Agent in SAP Datasphere

1. Set Up Redshift Access

1. Configure SAP SDI DP Agent

1. Register Camel JDBC Adapter in SAP Datasphere

1. Upload Third-Party Drivers in SAP Datasphere

1. Create Local Connection to Redshift in SAP Datasphere

1. Import Remote Tables from Redshift

This setup enables live federated queries from SAP Datasphere to Redshift without replicating the data. Benefits include real-time access to Redshift data, pushdown queries for performance optimization, and no data duplication in SAP Datasphere. For detailed step-by-step instructions, visit [Data Federation between SAP Datasphere and Amazon Redshift](https://github.com/SAP-samples/sap-bdc-explore-hyperscaler-data/blob/main/AWS/redshift-integration.md).
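Steps 4 through 6 revolve around giving the Camel JDBC adapter a working connection definition. The sketch below assembles the kind of properties involved; the host, database, and user are placeholders, 5439 is Redshift's default port, and the driver class should be verified against the third-party driver you upload in step 5.

```python
def camel_jdbc_connection(host: str, port: int = 5439,
                          database: str = "dev", user: str = "dsp_reader") -> dict:
    """Illustrative connection properties for the Camel JDBC adapter setup.

    Host, database, and user are placeholders; 5439 is Redshift's default
    port. Verify the driver class against the JDBC driver you upload.
    """
    return {
        "url": f"jdbc:redshift://{host}:{port}/{database}",
        "driverClass": "com.amazon.redshift.jdbc42.Driver",  # from the uploaded driver
        "user": user,
    }

cfg = camel_jdbc_connection("my-cluster.abc123.us-east-1.redshift.amazonaws.com")
```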

 **C. Integration with Amazon S3** 

Modes of Integration: Replicating data with Replication Flows; importing data into SAP Datasphere using Data Flows

Amazon S3 is an object storage service that is highly scalable, durable, available, and secure. Non-SAP data from S3 buckets can be imported into SAP Datasphere through the Data Flow feature for use with applications such as financial planning or business analytics in SAP Analytics Cloud.

Here are the steps to integrate Amazon S3 with SAP Datasphere:

1. Prepare source data in an S3 bucket

1. Configure necessary IAM user and authorizations

1. Create S3 Connection in SAP Datasphere

1. Create a Data Flow

This process allows SAP Datasphere to connect to S3, access non-SAP data, and use that data in combination with internal SAP datasets via Data Flows. For detailed step-by-step instructions, visit [Data integration between SAP Datasphere and Amazon S3](https://github.com/SAP-samples/sap-bdc-explore-hyperscaler-data/blob/main/AWS/s3-integration.md).
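For step 2, the IAM user behind the Datasphere S3 connection needs read access to the bucket (plus write access if a Replication Flow targets it). The sketch below is a minimal read-only policy with placeholder bucket and prefix names; the linked guide documents the exact permission set.

```python
def s3_read_policy(bucket: str, prefix: str = "") -> dict:
    """Minimal illustrative read-only policy for the Datasphere S3 connection.

    Bucket and prefix are placeholders; extend with write actions if a
    Replication Flow targets the bucket.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {   # List the bucket so Datasphere can browse source files
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{bucket}",
            },
            {   # Read the objects under the chosen prefix
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}*",
            },
        ],
    }
```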

You can find out more from SAP Architecture Center under [Integration with AWS data sources](https://architecture.learning.sap.com/docs/ref-arch/a07a316077/1).

# AI Innovation with FedML-AWS and SageMaker
<a name="rise-jra-datatovalue-fedml-aws"></a>

In today’s data-driven enterprises, machine learning models are only as powerful as the data they can access. However, business-critical data often resides within SAP systems like SAP BDC, while advanced model development typically takes place in cloud-native platforms like Amazon SageMaker.

FedML-AWS for Amazon SageMaker bridges this gap by providing a secure, efficient, and unified framework for federated model training and deployment across SAP and AWS ecosystems. By eliminating data duplication and enabling real-time access to SAP data, FedML-AWS helps accelerate AI initiatives, ensure data governance, and reduce operational complexity, all while leveraging the scalability and performance of AWS and the business context of SAP. With minimal setup, FedML-AWS enables data discovery, model training, and deployment across both SAP and AWS environments to extract value from data.

![\[FedML and Amazon Sagemaker\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-datatovalue-02.png)


FedML, a Python library, is directly imported into Amazon SageMaker notebook instances. When most training data resides in AWS but critical SAP data with business semantics is also needed for training, FedML securely connects to SAP Datasphere (part of BDC) via Python/SQLDBC connectivity, enabling federated access to the SAP business data required for model training in SageMaker.
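Conceptually, the federated pattern looks like the sketch below: training rows that live in AWS are enriched in memory with SAP business attributes read live from Datasphere, rather than replicating SAP data into AWS. The field names are illustrative, and the in-memory join merely stands in for the FedML library's actual data access.

```python
def enrich_training_rows(aws_rows: list, sap_rows: list, key: str) -> list:
    """Enrich AWS-resident training rows with SAP business attributes.

    A stand-in for FedML's federated access: sap_rows represents data read
    live from SAP Datasphere, so nothing is replicated. Field names are
    illustrative.
    """
    sap_by_key = {row[key]: row for row in sap_rows}
    enriched = []
    for row in aws_rows:
        merged = dict(row)                           # AWS feature data
        merged.update(sap_by_key.get(row[key], {}))  # SAP business semantics
        enriched.append(merged)
    return enriched

aws_rows = [{"material": "M-100", "clicks": 120}]
sap_rows = [{"material": "M-100", "plant": "0001", "material_group": "PUMPS"}]
training_rows = enrich_training_rows(aws_rows, sap_rows, "material")
```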

For more technical details on the methods that enable training data to be read from SAP Datasphere (part of BDC) and trained using a machine learning model on Amazon SageMaker, visit [FedML-AWS](https://github.com/SAP-samples/datasphere-fedml/tree/main/AWS). You can find out more from SAP Architecture Center under [Integration with FedML-AWS for Amazon SageMaker](https://architecture.learning.sap.com/docs/ref-arch/8e1a5fbce3/1).

By combining the strengths of SAP Business Data Cloud (BDC) and AWS services, organizations can unlock the full potential of their enterprise data. From operational systems to advanced AI and analytics, whether harmonizing datasets across Amazon S3, Redshift, and Athena or enabling federated model training with FedML-AWS and Amazon SageMaker, these architectures provide a scalable and secure foundation for innovation. Together, SAP and AWS empower businesses to move from data silos to data-driven intelligence, accelerating time to insight, optimizing decision-making, and driving measurable business value across the enterprise.

# Artificial Intelligence
<a name="rise-jra-ai"></a>

 [Amazon Bedrock](https://aws.amazon.com/bedrock/) and [SAP Generative AI Hub](https://help.sap.com/docs/ai-launchpad/sap-ai-launchpad/generative-ai-hub) combine through the Joint Reference Architecture (JRA) to provide enterprise-grade AI capabilities for RISE with SAP environments. This integration addresses the need for intelligent process automation while maintaining system security and clean core principles.

Amazon Bedrock serves as the foundational AI service layer, providing managed access to various foundation models including Anthropic Claude and Amazon Nova. The service enables organizations to fine-tune these models with proprietary data and implement Retrieval Augmented Generation (RAG) within a secure computing environment.

SAP Generative AI Hub complements this foundation by providing enterprise-specific governance and control mechanisms. The hub manages model selection, knowledge base indexing, and retrieval operations while enforcing necessary safety guardrails and risk controls. This ensures AI deployments remain compliant with enterprise standards and business requirements.

In this documentation, we will focus on the JRA aspects, as these components create a robust framework for implementing AI capabilities across SAP processes and AWS services, from customer order management to production design, while maintaining enterprise security and reliability standards.

 **AWS-SAP Joint Reference Architecture in Generative AI** 

![\[Joint Reference Architecture in Generative AI Hub\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-ai-genaihub.png)


Key components from the architecture:
+  [Amazon Bedrock](https://aws.amazon.com/bedrock/) is a service that provides access to various Foundational Models (FMs) through API interfaces. It features models like [Amazon Titan](https://aws.amazon.com/bedrock/amazon-models/titan/), [Amazon Nova](https://aws.amazon.com/ai/generative-ai/nova/) and [Anthropic Claude](https://www.anthropic.com/claude), which are comprehensive new-generation FMs with industry-leading price performance. These models are versatile and can handle many different applications.
+  [SAP AI Core](https://www.sap.com/india/products/artificial-intelligence/ai-core.html) with [Generative AI Hub](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/generative-ai-hub-in-sap-ai-core) provides customers access to AI capabilities, including FMs, and offers standardized interfaces for SAP BTP applications. It serves as a management layer that controls access to Bedrock and creates endpoints for applications to utilize FMs. Generative AI Hub enforces centralized safety controls and risk mitigation measures to ensure secure and compliant AI in enterprise deployment. For further details on the SAP’s Generative AI Hub supported models through Bedrock, please refer to [SAP Note 3437766](https://me.sap.com/notes/3437766).
+  [SAP HANA Cloud](https://discovery-center.cloud.sap/serviceCatalog/sap-hana-cloud?region=all) as database management with [vector engine support](https://help.sap.com/docs/hana-cloud-database/sap-hana-cloud-sap-hana-database-vector-engine-guide/introduction?locale=en-US) for RAG implementation that can be used for grounding capabilities by efficiently finding and fetching relevant business documents that relate to specific questions or tasks. This information is then used as context for the foundational model by enhancing its ability to provide accurate and context-specific responses.
+  [SAP Cloud Application Programming (CAP)](https://pages.community.sap.com/topics/cloud-application-programming) Model is a development framework that provides a structured approach to enterprise services and applications. CAP simplifies development by providing integrated frameworks with [SAP UI5](https://sapui5.hana.ondemand.com/) frontend.
+  [SAP Identity Provisioning Services](https://help.sap.com/docs/identity-provisioning) is used for authentication and access management to secure the delivery of these AI capabilities.

The diagram above provides a reference architecture for consuming the generative AI capabilities of Amazon Bedrock with SAP Generative AI Hub. Using this, SAP workloads can be supplemented with Foundational Models to harness the power of SAP data, resulting in improved business insights and operational efficiencies at a lower cost.
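As a rough illustration of the request side of this architecture, the sketch below assembles the kind of body a BTP application could send to Amazon Bedrock's Converse API (the boto3 `bedrock-runtime` client's `converse` call), with grounding passages from the HANA Cloud vector engine prepended as context. The model ID, prompt, inference settings, and context documents are all placeholders, and in practice the call would be brokered through Generative AI Hub rather than made directly.

```python
def build_converse_request(model_id, prompt, context_docs=None):
    """Assemble an illustrative Bedrock Converse-style request body.

    model_id, prompt, and inference settings are placeholders; context_docs
    stands in for passages retrieved from the HANA Cloud vector engine (RAG).
    """
    context = "\n".join(context_docs or [])
    text = f"Context:\n{context}\n\nQuestion: {prompt}" if context else prompt
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": text}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }

request = build_converse_request(
    "anthropic.claude-3-5-sonnet-20240620-v1:0",   # placeholder model ID
    "What is the status of order 42?",
    ["Order 42 shipped on May 2 from plant 0001."],
)
```
Prepending retrieved passages as plain text is the simplest grounding shape; production RAG pipelines usually add citation metadata and guardrail checks around this request.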

You can find out more from SAP Architecture Center under [Generative AI and SAP BTP](https://architecture.learning.sap.com/docs/ref-arch/e5eb3b9b1d).

 **AWS-SAP Joint Reference Architecture in Agent2Agent** 

![\[Joint Reference Architecture in Agent2Agent\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-ai-a2a.png)


Key components from the architecture:
+  [Amazon Bedrock Agents](https://aws.amazon.com/bedrock/) is a service that uses the reasoning capability of foundation models, APIs, and data to break down user requests, gather relevant information, and efficiently complete tasks. With multi-agent collaboration, it allows developers to build, deploy, and manage multiple specialized agents that seamlessly work together to address increasingly complex business workflows.
+  [Amazon Bedrock AgentCore](https://aws.amazon.com/bedrock/agentcore/) enables you to deploy and operate highly capable AI agents securely, at scale. AgentCore services can be used together or independently and work with any framework including CrewAI, LangGraph, LlamaIndex, and Strands Agents, as well as any foundation model in or outside of Amazon Bedrock, giving you ultimate flexibility. AgentCore eliminates the undifferentiated heavy lifting of building specialized agent infrastructure, so you can accelerate agents to production.

You can find out more from SAP Architecture Center under [Agent2Agent (A2A) Interoperability in Enterprise AI](https://architecture.learning.sap.com/docs/ref-arch/e5eb3b9b1d/8).

# Integration
<a name="rise-jra-integration"></a>

In the RISE with SAP landscape, SAP Business Technology Platform (BTP), particularly the [SAP Integration Suite](https://help.sap.com/docs/integration-suite/sap-integration-suite/what-is-sap-integration-suite?locale=en-US), often facilitates integration scenarios. This service is capable of supporting integrations across cloud, on-premises, and hybrid environments within the SAP ecosystem.

There are two deployment options for SAP Integration Suite:

 **A. Standard Deployment** 

In SAP Integration Suite, integration developers create integration flows and Application Programming Interfaces (APIs). The created integration and API content is deployed to SAP’s Integration Suite runtime environment. Once deployed, the integration content (e.g., a set of integration flows) becomes operational, enabling data exchange with connected sender and receiver systems.

 **B. Hybrid Deployment Using Edge Integration Cell** 

 [Edge Integration Cell](https://help.sap.com/docs/integration-suite/sap-integration-suite/what-is-sap-integration-suite-edge-integration-cell?locale=en-US) is an optional hybrid integration runtime offered as part of SAP Integration Suite, which enables you to manage APIs and run integration scenarios within your private landscape. The hybrid deployment model of Edge Integration Cell enables you to design and monitor your integration content in the cloud. It also allows you to deploy and run your integration content in your private landscape. Its runtime environment is realized as a Kubernetes container, facilitating secure, internal data exchange.

For more detailed information, you can refer to [SAP Note 3426066 FAQ: Edge Integration Cell simple questions](https://me.sap.com/notes/3426066/E) and [SAP Note 3391207 SAP Integration Suite : restrictions for the Edge Integration Cell](https://me.sap.com/notes/3391207/E).

 **Deploy Edge Integration Cell on AWS** 

Edge Integration Cell (EIC) can be deployed on AWS to leverage its scalable infrastructure while maintaining secure and controlled execution in a customer-managed environment. This architecture combines AWS-native services with EIC’s hybrid capabilities, ensuring a seamless integration experience. EIC on AWS can be deployed in a standard or High Availability (HA) architecture.

You can refer to the detailed EIC architecture, SAP prerequisites, and AWS prerequisites in [this SAP-samples GitHub repository](https://github.com/SAP-samples/btp-edge-integration-cell-aws).

![\[Joint Reference Architecture in Edge Integration Cell\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-integration.png)


Key Components
+  **Edge Integration Cell** is a unified runtime pipeline consisting of the following key components:
  +  **Worker** is a Camel-based runtime of Integration Suite that executes integration flows.
  +  **Policy Engine** is an Envoy-based runtime with SAP-built extensions for enforcing policies like security or traffic management on API proxies.
+  **The Message Service** implements an asynchronous integration pattern based on the JMS protocol. For the cloud offering, this instance is managed by SAP.
+  **The PostgreSQL database** is a relational database system for storing structured data and is managed by SAP for the public cloud offering.
+  **Redis** is an in-memory data store used for caching.

 **Edge Integration Cell Sizing** 

Detailed below is the minimum sizing for Edge Integration Cell (EIC). For more detailed sizing based on scenarios, you can refer to [SAP Note 3247839](https://me.sap.com/notes/3247839) and the [Sizing Guide for Edge Integration Cell](https://help.sap.com/docs/integration-suite/sap-integration-suite/sizing-guidelines).

 **Sizing of worker node** : Minimum CPU and Memory requirements for High Availability (HA) and non-HA (agent or worker nodes)


| Deployment Type | CPU/Memory | Persistence Storage | 
| --- | --- | --- | 
|  Non-HA  |  8 vCPU/32 GiB (m6a.2xlarge)  |  101 GiB of Amazon EBS GP3  | 
|  HA  |  16 vCPU/64 GiB (m6a.4xlarge)  |  204 GiB of Amazon EBS GP3  | 

A minimum of 3 worker nodes is required in both HA and non-HA configurations.
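The per-node minimums in the table and the 3-node floor combine into simple cluster totals, sketched below. The instance types are the examples from the table, not the only valid choices; always validate a concrete sizing against SAP Note 3247839 and the sizing guide.

```python
# Minimum per-node sizing from the worker-node table above.
EIC_MIN_NODE = {
    "non-ha": {"instance": "m6a.2xlarge", "vcpu": 8,  "memory_gib": 32, "ebs_gib": 101},
    "ha":     {"instance": "m6a.4xlarge", "vcpu": 16, "memory_gib": 64, "ebs_gib": 204},
}
MIN_WORKER_NODES = 3  # required in both HA and non-HA configurations

def cluster_minimums(deployment: str) -> dict:
    """Total minimum capacity for an EIC worker-node pool."""
    node = EIC_MIN_NODE[deployment.lower()]
    return {
        "nodes": MIN_WORKER_NODES,
        "instance": node["instance"],
        "total_vcpu": node["vcpu"] * MIN_WORKER_NODES,
        "total_memory_gib": node["memory_gib"] * MIN_WORKER_NODES,
        "total_ebs_gib": node["ebs_gib"] * MIN_WORKER_NODES,
    }
```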

 **External Storage** : Minimum Sizing for Postgres and Redis for HA


| Database | CPU/Memory | Persistence Storage | 
| --- | --- | --- | 
|  Postgres  |  1 CPU / 2 GiB (db.t2.small)  |  50 GiB of EBS GP3  | 
|  Redis  |  1 CPU / 1 GiB (cache.t2.small)  |  N/A  | 


 **Pricing example** 

With the minimum configuration, we calculated an indicative monthly cost in USD to deploy SAP Edge Integration Cell in the us-east-1 region:
+ Load balancer (NLB), with 10 GB/hour of data = \$160.23
+ Amazon EKS cluster = \$173.00
+ Three worker nodes with m6a.2xlarge = \$1,421.75 (3-year No Upfront EC2 Instance Savings Plan)
+ Amazon RDS PostgreSQL Multi-AZ = \$1,104.21
+ Amazon ElastiCache Redis = \$124.82

The total cost for running EIC in HA mode is approximately \$2,984 per month, billed to the AWS account managed by the customer.

You can find out more from SAP Architecture Center under [Edge Integration Cell on AWS](https://architecture.learning.sap.com/docs/ref-arch/263f576c90/1).

# Custom Application
<a name="rise-jra-customapps"></a>

Custom applications are created by customers to address their unique business needs and challenges that cannot be fully met by off-the-shelf software solutions. Organizations often require specific functionality, workflows, or integrations that align precisely with their business processes, industry regulations, or competitive advantages. By developing custom applications, companies can maintain complete control over their software’s features, security requirements, and user experience while ensuring seamless integration with their existing systems and databases. Custom applications also allow businesses to adapt quickly to changing market conditions and scale their solutions as they grow, ultimately providing them with a tailored tool that directly supports their operational efficiency and strategic objectives.

When developing custom applications that interact with SAP systems, it’s crucial to adhere to [SAP’s clean core concept](https://www.sap.com/sea/products/erp/rise/methodology/clean-core.html), which emphasizes keeping the core SAP system as clean as possible while building extensions and customizations outside the core. This approach ensures long-term maintainability and reduces the total cost of ownership by making it easier to implement SAP updates, upgrades, and innovations without disrupting custom functionality. By leveraging [SAP Business Technology Platform (BTP)](https://www.sap.com/sea/products/technology-platform.html), [AWS Cloud Services](https://aws.amazon.com/products/) and following clean core principles, organizations can create side-by-side extensions, custom applications, and integrations that preserve system stability while maintaining the agility to adapt to changing business requirements. This architectural strategy enables businesses to benefit from both customization and standardization, ensuring their applications remain sustainable and future-proof within the SAP ecosystem.

Some of the key AWS services that can help with these custom applications:
+  [Amazon Simple Notification Service (Amazon SNS)](https://aws.amazon.com/sns/) is a web service that makes it easy to set up, operate, and send notifications from the cloud. It provides developers with a highly scalable, flexible, and cost-effective capability to publish messages from an application and immediately deliver them to subscribers or other applications. For example, you can send an email to notify about a failed delivery of goods, trigger event-based programs, and more.
+  [Amazon Simple Queue Service (Amazon SQS)](https://aws.amazon.com/sqs/) lets you send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available. For example, you can queue bursts of high-volume incoming messages for sequential processing.
+  [Amazon EventBridge](https://aws.amazon.com/eventbridge) is a service that provides [real-time access to changes](https://aws.amazon.com/eventbridge/integrations/) in data in AWS services, your own applications, and software as a service (SaaS) applications without writing code. For example, you can trigger near-real-time, event-based ordering through an API Gateway to an external SaaS from SAP when an out-of-stock situation happens in a warehouse.
+  [AWS SDK for SAP ABAP](https://aws.amazon.com/sdk-for-sap-abap/) simplifies the use of AWS services alongside SAP applications with a client library of modules that are consistent and familiar to ABAP developers. For example, you can use it to automatically validate the mailing address in the SAP Business Partner maintenance screen using Amazon Location Service.
+  [AWS AI Services](https://aws.amazon.com/ai/services/), such as : [Amazon Polly](https://aws.amazon.com/polly/) to turn text to lifelike speech, [Amazon Transcribe](https://aws.amazon.com/transcribe/) to convert speech to text, [Amazon Rekognition](https://aws.amazon.com/rekognition/) to extract information and insights from images and videos.
+ For more AWS services that you can use, please refer to [this link](https://aws.amazon.com/products/?nc2=h_prod).
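To make the EventBridge example concrete, the sketch below implements a tiny subset of event-pattern matching against a hypothetical SAP out-of-stock event; the source and detail-type names are assumptions, not a documented SAP event schema.

```python
def matches_pattern(event: dict, pattern: dict) -> bool:
    """Tiny subset of EventBridge pattern matching: each pattern key must be
    present in the event with a value from the pattern's allowed list."""
    return all(event.get(key) in allowed for key, allowed in pattern.items())

# Hypothetical out-of-stock event and the rule pattern that would route it.
pattern = {"source": ["sap.s4hana.inventory"], "detail-type": ["OutOfStock"]}
event = {
    "source": "sap.s4hana.inventory",
    "detail-type": "OutOfStock",
    "detail": {"material": "M-100", "plant": "0001"},
}
```
A matching event would then be routed by an EventBridge rule to a target such as an API Gateway endpoint for the external ordering SaaS.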

You can upskill yourself and your team members to [Build Resilient Applications on SAP BTP with Amazon Web Services](https://learning.sap.com/courses/build-resilient-applications-on-sap-btp-with-amazon-web-services) learning module which was jointly built by AWS and SAP.

In the following sections, we will cover architectural patterns and reference architectures that leverage SAP and AWS technologies to extend SAP processes while keeping the core clean.

 **Event-Based Application** 

In traditional business process architectures, systems often operate in silos, with tightly coupled components and rigid, predefined workflows. This approach struggles to keep pace with the dynamic nature of modern business environments. Event-based architecture emerged as a solution to these limitations, addressing several critical challenges.

With event-based architectures, you can implement end-to-end business processes by decoupling system components through asynchronous communication. With this approach, you can build more resilient systems and business processes that better handle network issues, service outages, and other disruptions, following the [AWS Well-Architected Framework SAP Lens](https://docs.aws.amazon.com/wellarchitected/latest/sap-lens/sap-lens.html).

Example of event-based notification through Amazon SNS:

![\[Event-based notification with SNS\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-sns.png)


In the architecture above, when a user updates a Business Partner in SAP S/4HANA, the update event is triggered through SAP Event Mesh. A CAP application enhanced with the AWS SDK for Java then publishes to an Amazon SNS topic, which notifies the data owner of the change through email, text message, or mobile push notification. You can find out more in [this GitHub repository](https://github.com/SAP-samples/cloud-cap-amazon-sns-integration).
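The fan-out behavior in this architecture can be sketched with an in-memory stand-in for the SNS topic: one published business-partner-change message reaches every subscribed channel. Channel names and message text are illustrative.

```python
class MiniTopic:
    """In-memory stand-in for an SNS topic, illustrating publish/subscribe
    fan-out; it performs no real delivery."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, channel, handler):
        self.subscribers.append((channel, handler))

    def publish(self, message):
        # Every subscriber receives the same message, like SNS fan-out.
        return [(channel, handler(message)) for channel, handler in self.subscribers]

topic = MiniTopic()
topic.subscribe("email", lambda m: f"email sent: {m}")
topic.subscribe("sms", lambda m: f"sms sent: {m}")
deliveries = topic.publish("Business Partner 4711 updated")
```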

Example of event-based notification through Amazon SQS and EventBridge, as well as [AWS IoT services](https://aws.amazon.com/iot/):

![\[Event-based notification with SQS and Event Bridge\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-sqs.png)


The architecture above shows an event-driven integration that leverages SAP BTP for Industry 4.0 scenarios, showcasing the versatility of SAP-AWS integration to support a predictive maintenance scenario that reduces downtime for your manufacturing line. It leverages AWS IoT services, Amazon SQS, and Amazon EventBridge to capture early sensor data, such as speed, temperature, and vibration, that indicates the need for maintenance before an outage or downtime occurs.
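A minimal sketch of the detection step: sensor readings that cross a threshold become the maintenance events queued in Amazon SQS and routed by EventBridge. The sensor fields and thresholds here are illustrative assumptions.

```python
def needs_maintenance(readings, vibration_limit=7.0, temp_limit_c=85.0):
    """Return readings that should raise a predictive-maintenance event.

    Thresholds and field names are illustrative; a real deployment would
    derive them from equipment specifications or an ML model.
    """
    return [r for r in readings
            if r["vibration"] > vibration_limit or r["temperature_c"] > temp_limit_c]

readings = [
    {"sensor": "pump-1", "vibration": 8.1, "temperature_c": 60.0},  # excess vibration
    {"sensor": "pump-2", "vibration": 3.2, "temperature_c": 90.5},  # overheating
    {"sensor": "pump-3", "vibration": 2.0, "temperature_c": 55.0},  # healthy
]
alerts = needs_maintenance(readings)
```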

 **Artificial Intelligence and Machine Learning Application** 

Safety hazards in every workplace come in many forms: sharp edges, falling objects, flying sparks, chemicals, noise, and other potentially dangerous situations. Safety regulators such as the Occupational Safety and Health Administration (OSHA) and the European Commission often require that businesses protect their employees and customers from hazards that can cause injury by providing them personal protective equipment (PPE) and ensuring its use. With Amazon Rekognition PPE detection, customers can analyze images from their on-premises cameras across all locations to automatically detect whether persons in the images are wearing the required PPE, such as face covers, hand covers, and head covers. SAP customers use the SAP Environment, Health, and Safety (EHS) module to record these detections manually as safety observations.

We provide an integration framework between [Amazon Rekognition](https://aws.amazon.com/rekognition/) and [SAP Environment, Health and Safety (EHS)](https://help.sap.com/docs/SAP_S4HANA_ON-PREMISE/1b3596cc5dd5428d887966a4193ddc29/5b22b8d6606b4d32b8af9283901d3bdc.html?locale=en-US) that adopts the open-source Events-to-Business-Actions framework, which automates the process of creating safety observations.

![\[Safety at scale with Amazon Rekognition PPE Detection\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-ppe.png)


In the architecture above, the information flow begins with CCTV cameras capturing images at a factory and storing them in [Amazon S3](https://aws.amazon.com/s3/). An [AWS Lambda](https://aws.amazon.com/pm/lambda/) function triggers Amazon Rekognition’s PPE detection model to inspect for safety equipment compliance. If violations are detected, the Lambda function retrieves credentials from AWS Secrets Manager and communicates with [SAP Integration Suite’s Advanced Event Mesh](https://www.sap.com/products/technology-platform/integration-suite/advanced-event-mesh.html). The event is then processed by the Event-to-Business-Action framework, which uses [SAP Build Process Automation](https://www.sap.com/sea/products/technology-platform/process-automation.html)'s Business Rules to determine appropriate actions. Finally, the system creates an EHS Incident Report Safety Observation in the SAP S/4HANA system through SAP Destination Service and Private Link Service. You can find out more in [this github repository](https://github.com/SAP-samples/btp-aws-ppe-detection-ehs).
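The Lambda function's violation check can be sketched as below, walking a simplified version of the response shape returned by Amazon Rekognition's `detect_protective_equipment` API; the real response carries additional fields (bounding boxes, covered-body-part details) that this sketch omits.

```python
def persons_missing_face_cover(detection: dict, min_confidence: float = 80.0) -> list:
    """Return IDs of detected persons without a confident FACE_COVER detection.

    `detection` mirrors a simplified Rekognition PPE response; the real
    response includes more fields than this sketch inspects.
    """
    violations = []
    for person in detection.get("Persons", []):
        covered = any(
            eq.get("Type") == "FACE_COVER" and eq.get("Confidence", 0.0) >= min_confidence
            for part in person.get("BodyParts", [])
            for eq in part.get("EquipmentDetections", [])
        )
        if not covered:
            violations.append(person.get("Id"))
    return violations

sample = {"Persons": [
    {"Id": 0, "BodyParts": [{"Name": "FACE", "EquipmentDetections": [
        {"Type": "FACE_COVER", "Confidence": 99.2}]}]},
    {"Id": 1, "BodyParts": [{"Name": "FACE", "EquipmentDetections": []}]},
]}
violations = persons_missing_face_cover(sample)
```
A non-empty result is what would trigger the event toward SAP Advanced Event Mesh in the flow above.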

# Operational Reliability
<a name="rise-jra-operational-reliability"></a>

Modern enterprises face significant hurdles in maintaining continuous availability of SAP services, particularly during regional outages or maintenance windows. Business continuity and operational reliability are critical concerns when deploying SAP Business Technology Platform (SAP BTP) and RISE with SAP.

 [Amazon Route 53](https://aws.amazon.com/route53/), a highly available, scalable, and globally distributed Domain Name System (DNS) web service, addresses these challenges effectively. It enables customers to implement [AWS multi-region architecture](https://docs.aws.amazon.com/prescriptive-guidance/latest/aws-multi-region-fundamentals/introduction.html) for their SAP environments, providing robust fault tolerance and enhanced reliability. By leveraging Route 53’s capabilities, organizations can build resilient SAP environments that meet stringent availability requirements. This DNS service seamlessly integrates with SAP BTP services, ensuring business operations continue smoothly even during regional disruptions.

 **Understanding Amazon Route 53 in the SAP Context** 

Amazon Route 53 serves as a foundational component for building resilient SAP environments by providing intelligent DNS routing capabilities. In the context of SAP BTP and RISE with SAP, Route 53 addresses critical reliability challenges that cannot be solved through standard Availability Zone (AZ) configurations alone. While SAP BTP services support multi-AZ deployments within a single region, this approach remains vulnerable to region-wide failures. Route 53 extends this resilience by enabling traffic routing across multiple geographic regions, effectively creating a global safety net for mission-critical SAP applications.

Route 53’s architecture is designed with maximum reliability in mind through the separation of control plane and data plane functions. The data plane is explicitly designed to be [statically stable](https://aws.amazon.com/builders-library/static-stability-using-availability-zones/) in the face of failures, such as a control plane failure or partition event. This architectural separation ensures that DNS resolution remains highly available, making Route 53 an ideal foundation for disaster recovery scenarios in SAP environments. The service continuously monitors endpoint health and automatically redirects users to healthy resources when failures are detected.

Beyond simple failover capabilities, Route 53 offers sophisticated routing policies that can be tailored to specific business requirements. These include latency-based routing to direct users to the lowest-latency endpoint, geolocation routing to comply with data sovereignty regulations, and weighted routing to distribute traffic according to defined proportions. For global organizations using SAP services, these capabilities translate into consistent performance and availability for users across different geographic locations, enhancing the overall user experience while maintaining system reliability.

 **Amazon Route 53 Architecture for SAP BTP Multi-Region Resiliency** 

The foundation of a resilient SAP BTP environment using Amazon Route 53 is a well-designed multi-region architecture. This approach begins with geographic redundancy, where critical application components are deployed across different regions to eliminate a [single point of failure](https://en.wikipedia.org/wiki/Single_point_of_failure). Route 53 serves as the intelligent traffic director in this architecture, continuously monitoring the health of endpoints and making real-time routing decisions based on availability and performance metrics. When [integrated with SAP BTP’s Custom Domain service](https://github.com/SAP-samples/btp-services-intelligent-routing/tree/launchpad_aws/04-Map%20Custom%20Domain%20Routes), Route 53 provides a seamless user experience through consistent URLs, even as traffic is redirected between regions during failover events.

You can find out more in [SAP Architecture Center – Architecting Multi-Region Resiliency – Load Balancers](https://architecture.learning.sap.com/docs/ref-arch/81805673c0/3).

 **Amazon Route 53 Routing Options** 

Route 53 offers various [routing policies](https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/routing-policy.html) for SAP BTP implementations:
+  **Simple routing**: Directs traffic to a single resource
+  **Weighted routing**: Distributes traffic across multiple resources in specified proportions
+  **Latency-based routing**: Routes users to the region with lowest network latency
+  **Failover routing**: Automatically redirects from unhealthy primary to healthy secondary resource
+  **Geolocation routing**: Directs traffic based on users' geographic locations
+  **Geoproximity routing**: Routes based on geographic location with optional biasing
+  **Multi-value answer routing**: Responds with up to eight healthy records selected randomly

These options can be combined to create sophisticated routing strategies tailored to specific SAP environment requirements.
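To make these policies concrete, the sketch below composes a latency-based routing configuration in the request shape expected by Route 53's `ChangeResourceRecordSets` API. The domain names, regional endpoints, and hosted zone ID are hypothetical placeholders, not values from this guide.

```python
# Sketch: latency-based routing records for two regions, in the shape
# expected by Route 53's ChangeResourceRecordSets API. All names and the
# hosted zone ID are hypothetical.

def latency_record(name, region, target, ttl=60):
    """Build one latency-routed CNAME record for a given AWS Region."""
    return {
        "Name": name,
        "Type": "CNAME",
        "SetIdentifier": f"{name}-{region}",   # must be unique per record
        "Region": region,                      # Route 53 picks the lowest-latency region
        "TTL": ttl,
        "ResourceRecords": [{"Value": target}],
    }

change_batch = {
    "Changes": [
        {"Action": "UPSERT", "ResourceRecordSet": latency_record(
            "app.example.com", "eu-central-1", "app-eu.example.com")},
        {"Action": "UPSERT", "ResourceRecordSet": latency_record(
            "app.example.com", "us-east-1", "app-us.example.com")},
    ]
}

# With AWS credentials configured, this payload would be applied with:
# import boto3
# boto3.client("route53").change_resource_record_sets(
#     HostedZoneId="Z0000000000000", ChangeBatch=change_batch)
```

The same payload structure accommodates the other policies by swapping the `Region` key for `Weight`, `Failover`, or `GeoLocation` as appropriate.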

 **Amazon Route 53 Implementation Patterns for SAP Environments** 

Two primary implementation patterns have emerged for SAP environments: active-passive and active-active configurations.

 **Pattern 1. Active-Passive Implementation** 

In an active-passive configuration, Route 53 directs all traffic to a primary SAP BTP region during normal operations, with a secondary region serving as a standby. This approach offers simplicity and cost-effectiveness while still providing disaster recovery capabilities. The active-passive pattern works particularly well for [SAP Build Work Zone](https://discovery-center.cloud.sap/serviceCatalog/sap-build-work-zone-standard-edition?region=all) deployments where consistent user experience is critical.

You implement this by deploying the Work Zone service in the primary region with all necessary configurations, and then replicating this setup to a secondary region using the [SAP Cloud Transport Management service](https://www.sap.com/sea/products/technology-platform/cloud-transport-management.html). Both regions are configured with identical domains using the SAP BTP Custom Domain service, while Route 53 is set up with a failover routing policy and health checks monitoring the primary endpoint. When issues occur in the primary region, Route 53 automatically redirects users to the secondary region with minimal disruption.

TTL optimization directly impacts failover speed and DNS query volume. Short TTL values enable fast failover but increase DNS query traffic. The specific TTL value should align with your Recovery Time Objective (RTO) requirements. For detailed implementation steps, refer to the SAP blog post [Route Multi-Region Traffic to SAP Build Work Zone using Amazon Route 53](https://community.sap.com/t5/technology-blog-posts-by-sap/route-multi-region-traffic-to-sap-build-work-zone-standard-edition-using/ba-p/13561468) and [this GitHub repository](https://github.com/SAP-samples/btp-services-intelligent-routing/tree/launchpad_aws).
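As an illustration of the active-passive setup, the sketch below builds an HTTPS health check configuration for the primary endpoint and a PRIMARY/SECONDARY failover record pair, using the request shapes of Route 53's `create_health_check` and `change_resource_record_sets` APIs. All hostnames, the health check ID, and the health path are hypothetical.

```python
# Sketch: health check on the primary endpoint plus a failover record
# pair. Only the PRIMARY record carries the health check; Route 53
# answers with the SECONDARY when the check reports unhealthy.

health_check_config = {
    "Type": "HTTPS",
    "FullyQualifiedDomainName": "primary.workzone.example.com",  # hypothetical
    "Port": 443,
    "ResourcePath": "/health",   # hypothetical health endpoint
    "RequestInterval": 30,       # seconds between checks
    "FailureThreshold": 3,       # consecutive failures before "unhealthy"
}

def failover_record(role, target, health_check_id=None, ttl=60):
    """Failover-routed CNAME; low TTL shortens failover at the cost of
    more DNS queries."""
    record = {
        "Name": "workzone.example.com",
        "Type": "CNAME",
        "SetIdentifier": f"workzone-{role.lower()}",
        "Failover": role,        # "PRIMARY" or "SECONDARY"
        "TTL": ttl,
        "ResourceRecords": [{"Value": target}],
    }
    if health_check_id:
        record["HealthCheckId"] = health_check_id
    return record

primary = failover_record("PRIMARY", "primary.workzone.example.com",
                          health_check_id="hc-primary-0000")
secondary = failover_record("SECONDARY", "secondary.workzone.example.com")

# Applied with boto3 (credentials required):
# route53 = boto3.client("route53")
# route53.create_health_check(CallerReference="wz-1",
#                             HealthCheckConfig=health_check_config)
# route53.change_resource_record_sets(HostedZoneId="Z0000000000000",
#                                     ChangeBatch={...})
```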

![\[Active-Passive Implementation\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-opsreliability-active-passive.png)


 **Pattern 2. Active-Active Implementation** 

The active-active pattern distributes traffic across multiple regions simultaneously, optimizing resource utilization and minimizing regional failure impact. This approach is ideal for global organizations with users across different geographic locations. A typical implementation for [SAP Cloud Application Programming (CAP)](https://pages.community.sap.com/topics/cloud-application-programming) involves deploying identical applications in multiple SAP BTP subaccounts across different regions, connected to an [Amazon Aurora](https://aws.amazon.com/rds/aurora/) global database, a high-performance database cluster that spans multiple regions.

Data consistency is maintained by configuring Aurora for "read local/write global" operations, directing all writes to the primary region while allowing reads from any region. Route 53 implements latency-based or geolocation routing policies to direct users to the nearest healthy region. This setup not only provides resilience against regional outages but also improves performance by reducing latency for globally distributed users.
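A minimal sketch of the "read local/write global" rule described above, assuming hypothetical Aurora endpoint names: writes always target the global writer endpoint in the primary region, while reads use the reader endpoint in the application's own region, falling back to the writer when no local reader exists.

```python
# Sketch of "read local / write global" endpoint selection. Aurora Global
# Database exposes one writer endpoint (primary region) and per-region
# reader endpoints; the names below are hypothetical.

AURORA_ENDPOINTS = {
    "writer": "app-global.cluster-xyz.eu-central-1.rds.amazonaws.com",
    "readers": {
        "eu-central-1": "app.cluster-ro-xyz.eu-central-1.rds.amazonaws.com",
        "us-east-1": "app.cluster-ro-xyz.us-east-1.rds.amazonaws.com",
    },
}

def pick_endpoint(operation, local_region):
    """Writes go to the global writer; reads stay in-region when possible."""
    if operation == "write":
        return AURORA_ENDPOINTS["writer"]
    # Fall back to the writer endpoint for regions without a local reader.
    return AURORA_ENDPOINTS["readers"].get(local_region,
                                           AURORA_ENDPOINTS["writer"])
```

In a CAP application this selection would typically live in the data source routing layer, as shown in the dynamic data source routing blog post linked below.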

For implementation details, see [Distributed Resiliency of SAP CAP applications using Amazon Aurora with Amazon Route 53](https://community.sap.com/t5/-/-/m-p/13570134) and [SAP CAP Application Dynamic Data Source Routing](https://community.sap.com/t5/-/-/m-p/13558920). You can also refer to this [GitHub repository](https://github.com/SAP-samples/cap-distributed-resiliency/tree/Data-Source-Routing/source).

![\[Active-Active Implementation\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-opsreliability-active-active.png)


 **Solution guidance and other considerations** 

Each implementation pattern requires careful consideration of data consistency, authentication mechanisms, and operational processes to ensure seamless user experiences during normal operations and failover events.

For broader architectural guidance, refer to [SAP BTP Multi-Region reference architectures for High Availability](https://community.sap.com/t5/-/-/m-p/13524196) and AWS's guide on [Creating Disaster Recovery Mechanisms Using Amazon Route 53](https://aws.amazon.com/blogs/networking-and-content-delivery/creating-disaster-recovery-mechanisms-using-amazon-route-53/).

# Internet of Things
<a name="rise-jra-iot"></a>

Internet of Things (IoT) refers to a network of interconnected physical devices, vehicles, home appliances, and other items embedded with electronics, software, sensors, and network connectivity, enabling these objects to collect and exchange data. IoT allows objects to be sensed and controlled remotely across existing network infrastructure, creating opportunities for direct integration between the physical world and computer-based systems.

 AWS IoT provides a comprehensive suite of services to connect, manage, and secure IoT devices at scale. At its core, [AWS IoT Core](https://aws.amazon.com/iot-core/) serves as the foundation, enabling secure device connectivity and message routing. [AWS IoT Device Management](https://aws.amazon.com/iot-device-management/) helps register, organize, monitor, and remotely manage IoT devices throughout their lifecycle. [AWS IoT Greengrass](https://aws.amazon.com/greengrass/) extends cloud capabilities to edge devices, allowing them to act locally on data while still maintaining cloud connectivity. Other complementary services in the AWS IoT family include [IoT Events](https://aws.amazon.com/iot-events/), [IoT TwinMaker](https://aws.amazon.com/iot-twinmaker/), [IoT ExpressLink](https://aws.amazon.com/iot-expresslink/), and [IoT FleetWise](https://aws.amazon.com/iot-fleetwise/), each serving specific IoT use cases and requirements.

 **AWS IoT with SAP** 

![\[IoT with SAP\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-iot-sap.png)


The combination of AWS IoT services and SAP business applications creates a powerful platform for digital transformation, enabling organizations to implement smart solutions across various domains - from connected products to smart city applications. This integration helps organizations harness real-time data for improved operational visibility, enhanced customer experiences, and innovative business models, driving efficiency and accelerating innovation across the enterprise ecosystem.

In [Smart Products & Services](https://aws.amazon.com/industrial/smart-products-and-services/) scenarios, AWS IoT services enable intelligent operations through [AWS IoT SiteWise](https://aws.amazon.com/iot-sitewise/) and other services, delivering real-time insights that integrate seamlessly with SAP business modules. AWS IoT Device Management provides comprehensive monitoring across connected devices, with continuous data streams enriching SAP systems for informed decision-making. Edge computing capabilities through AWS IoT Greengrass ensure efficient data processing at the source, enabling rapid response times and optimal performance, particularly valuable for remote operations.

 AWS IoT services can integrate with [SAP Business Technology Platform (BTP)](https://www.sap.com/products/technology-platform.html) to create powerful end-to-end IoT solutions. Through SAP BTP's event-driven architecture and Enterprise Messaging services, IoT data from AWS can be efficiently consumed by SAP applications in real time. The [Cloud Application Programming (CAP)](https://pages.community.sap.com/topics/cloud-application-programming) model in SAP BTP enables rapid development of IoT-enabled business applications that can process and act on IoT data from AWS. The integration can be achieved through various methods, such as using [SAP Cloud Integration](https://help.sap.com/docs/cloud-integration/sap-cloud-integration/sap-cloud-integration?locale=en-US), [API Management](https://help.sap.com/docs/sap-api-management/sap-api-management/what-is-api-management?locale=en-US), or direct REST APIs. For example, sensor data collected through AWS IoT Core can trigger events in SAP BTP, which can then be processed by CAP applications to update business processes, generate alerts, or trigger automated workflows in SAP systems.

 **AWS IoT Security** 

While AWS maintains robust cloud security mechanisms to protect data movement between AWS IoT and other AWS services, customers are responsible for managing device credentials (including X.509 certificates, AWS credentials, Amazon Cognito identities, federated identities, or custom authentication tokens) and implementing appropriate access policies.

 AWS IoT implements comprehensive security measures to ensure secure device connectivity and data transmission. Devices can connect to AWS IoT using X.509 certificates or Amazon Cognito identities over Transport Layer Security (TLS) connections, with additional authentication options available for development and specific API-based applications. The AWS IoT message broker handles device authentication and manages access permissions through AWS IoT policies, while custom authentication can be implemented using custom authorizers.
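As an illustration of this permission model, the sketch below composes a least-privilege AWS IoT policy that lets a device connect as itself and publish only to its own telemetry topic. The `${iot:Connection.Thing.ThingName}` policy variable follows AWS IoT's documented policy syntax; the region, account ID, topic layout, and policy name are hypothetical.

```python
import json

# Sketch: a least-privilege AWS IoT policy document. Region, account ID,
# and the "telemetry/<thing>" topic layout are hypothetical.

def device_policy(region="eu-central-1", account="123456789012"):
    """Build a policy scoping a device to its own client ID and topic."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": "iot:Connect",
                # Adjacent plain string keeps ${...} literal (not an f-string).
                "Resource": f"arn:aws:iot:{region}:{account}:client/"
                            "${iot:Connection.Thing.ThingName}",
            },
            {
                "Effect": "Allow",
                "Action": "iot:Publish",
                "Resource": f"arn:aws:iot:{region}:{account}:topic/telemetry/"
                            "${iot:Connection.Thing.ThingName}",
            },
        ],
    }

policy_json = json.dumps(device_policy())

# With boto3 (credentials required), the policy would be created and
# attached to the device's X.509 certificate:
# iot = boto3.client("iot")
# iot.create_policy(policyName="device-telemetry", policyDocument=policy_json)
# iot.attach_policy(policyName="device-telemetry", target=certificate_arn)
```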

Furthermore, the AWS IoT rules engine securely forwards device data to other devices or AWS services based on user-defined rules, utilizing AWS Identity and Access Management (IAM) to ensure secure data transfer to intended destinations. Customers can leverage [AWS IoT Device Defender](https://aws.amazon.com/iot-device-defender/), a fully managed service that helps you secure your fleet of IoT devices.

You can find out more in [Security in AWS IoT](https://docs.aws.amazon.com/iot/latest/developerguide/security.html).

 **AWS and SAP Joint Reference Architecture for Internet of Things** 

The JRA architecture below shows how AWS IoT services and SAP BTP services can be combined to build loosely coupled Edge-to-Business Process architectures.

![\[JRA for Internet Of Things\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-iot.png)


 **IoT events** - Edge locations can be environments like factories or shop floors where IoT devices such as cameras, PLCs, SCADA systems, IoT sensors or industrial assets collect data including temperature, vibration, and other metrics. The collected data is transmitted to AWS IoT services in the cloud using appropriate connectors running on edge runtime environments like AWS IoT Greengrass, with protocols specific to each device type. Customers have the option to sanitize data at the edge using AWS Edge computing services before transmission to the cloud. AWS IoT SiteWise Edge extends cloud capabilities to industrial edge environments, while AWS IoT Greengrass serves as a general-purpose edge framework. This edge processing helps reduce noise in data, improves data quality, and optimizes costs.

 **IoT Data Processing on AWS** - Data received from edge locations is first processed by AWS services such as Amazon Rekognition for computer vision use cases or other AWS services for data analysis, where IT (Information Technology) and OT (Operational Technology) data insights are combined to trigger intelligent workflow automation. AWS Lambda then triggers an event to SAP BTP for the next course of action.
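The Lambda handoff to SAP BTP can be sketched as below. The BTP endpoint URL and the event envelope fields are hypothetical; a production implementation would publish through Event Mesh or Advanced Event Mesh with proper authentication rather than a bare HTTP POST.

```python
# Sketch: a Lambda function that maps an enriched IoT insight into an
# event for SAP BTP. Endpoint and field names are hypothetical.
import json
import urllib.request

BTP_EVENT_ENDPOINT = "https://example.events.cfapps.eu10.hana.ondemand.com/ingest"  # hypothetical

def build_btp_event(detail):
    """Map an IoT insight into the envelope expected downstream."""
    return {
        "eventType": detail.get("category", "iot.anomaly"),
        "source": detail.get("deviceId", "unknown"),
        "payload": {"metric": detail.get("metric"),
                    "value": detail.get("value")},
    }

def lambda_handler(event, context):
    body = build_btp_event(event.get("detail", {}))
    req = urllib.request.Request(
        BTP_EVENT_ENDPOINT,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # urllib.request.urlopen(req)  # enable with a real endpoint and auth
    return {"statusCode": 202, "body": json.dumps(body)}
```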

 **SAP Business Workflow on BTP** - Control is transferred to SAP BTP services like [Event Mesh](https://www.sap.com/products/technology-platform/integration-suite/capabilities/event-mesh.html), which allows applications to communicate through asynchronous events, and the [Events-to-Business-Actions Framework](https://github.com/SAP-samples/btp-events-to-business-actions-framework). This framework responds to and integrates events generated from different sources, such as industrial production processes and warehouses, into enterprise business systems. Based on the event category and type, respective actions are triggered in SAP applications. The processor module leverages the [decisions](https://help.sap.com/docs/build-process-automation/sap-build-process-automation/create-decision) capability of [SAP Build Process Automation](https://www.sap.com/products/technology-platform/process-automation.html) to initiate business actions, and is supported by other BTP services, such as SAP HANA Cloud for storing application data. Customers can leverage private connectivity between SAP BTP and the RISE with SAP on AWS environment through the [SAP Private Link service](https://help.sap.com/docs/private-link/private-link1/what-is-sap-private-link-service) and [AWS PrivateLink](https://aws.amazon.com/privatelink/).

 **Business Actions on RISE with SAP** - Finally, based on the business rules, appropriate SAP business processes are triggered on the RISE with SAP systems, such as creating a maintenance order for predictive maintenance or a safety observation for Environment, Health, and Safety (EHS).

![\[JRA for Internet Of Things and Generative AI\]](http://docs.aws.amazon.com/sap/latest/general/images/rise-jra-iot-genai.png)


This is an alternative architecture to the one discussed in the previous section, with the following differences.

 **IoT events** – Same as in the previous architecture.

 **IoT Data Processing on AWS** – Data received from edge locations is forwarded directly to the SAP BTP layer for subsequent actions, including data transformation. In this case, we are using SAP Integration Suite, [Advanced Event Mesh](https://www.sap.com/products/technology-platform/integration-suite/advanced-event-mesh.html), which has an out-of-the-box connector for Amazon S3.

 **IoT Data Processing on SAP BTP** – Control is transferred to SAP BTP services like SAP Integration Suite, Advanced Event Mesh and the Events-to-Business-Actions Framework. Data transformation on SAP BTP is handled using GenAI services like [Generative AI Hub](https://help.sap.com/docs/ai-launchpad/sap-ai-launchpad/generative-ai-hub), which leverages AWS generative foundation models such as [Amazon Nova](https://aws.amazon.com/ai/generative-ai/nova/) to derive insights from the data for further processing. Based on the processed data, event categories, and types, respective actions are triggered in SAP applications. The processor module, part of the Events-to-Business-Actions Framework, leverages the Decisions capability of SAP Build Process Automation to initiate business actions. Additionally, SAP HANA Cloud can be used as a vector engine for a Retrieval-Augmented Generation (RAG) framework and as a knowledge graph, in addition to storing application data.
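For orientation, the model invocation behind this transformation step can be sketched in the shape of the Amazon Bedrock Converse API, which is one way to call an Amazon Nova model directly from AWS. The model ID and prompt are illustrative; Generative AI Hub on SAP BTP issues an equivalent call through its own model access layer.

```python
# Sketch: composing a Bedrock Converse API request that asks a Nova
# model to summarize raw sensor readings. Model ID and prompt are
# illustrative assumptions, not taken from this guide.

def build_converse_request(readings):
    """Compose a Converse request from a list of raw IoT readings."""
    prompt = (
        "Summarize these sensor readings and flag anomalies:\n"
        + "\n".join(f"{r['sensor']}: {r['value']}" for r in readings)
    )
    return {
        "modelId": "amazon.nova-lite-v1:0",   # example Nova model ID
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 300, "temperature": 0.2},
    }

request = build_converse_request(
    [{"sensor": "vibration", "value": 7.2}, {"sensor": "temp_c", "value": 81}]
)

# With AWS credentials configured, the call would be:
# import boto3
# response = boto3.client("bedrock-runtime").converse(**request)
```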

This integration enables scenarios such as predictive maintenance, real-time asset monitoring, and supply chain optimization by combining AWS's robust IoT and Generative AI capabilities with SAP’s enterprise business processes and data models.

You can find out more from SAP Architecture Center under [Build Events-to-Business Actions Scenarios with SAP BTP and AWS IoT SiteWise](https://architecture.learning.sap.com/docs/ref-arch/fbdc46aaae/3).