/AWS1/CL_CRLWORKERCOMPCONFPRPS

The configuration properties for the worker compute environment. These properties allow you to customize the compute settings for your Clean Rooms workloads.

CONSTRUCTOR

IMPORTING

Optional arguments:

it_spark TYPE /AWS1/CL_CRLSPARKPROPERTIES_W=>TT_SPARKPROPERTIES

The Spark configuration properties for SQL workloads. This map contains key-value pairs that configure Apache Spark settings to optimize performance for your data processing jobs. You can specify up to 50 Spark properties, with each key being 1-200 characters and each value being 0-500 characters. These properties allow you to adjust compute capacity for large datasets and complex workloads.
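A minimal sketch of building the Spark property map and passing it to the constructor. The map-row structure (KEY plus a VALUE reference to the `_W` wrapper class), the wrapper constructor parameter `iv_value`, and the Spark property key shown are assumptions based on common AWS SDK for SAP ABAP conventions, not confirmed by this page:

```abap
" Hypothetical sketch: populate the Spark property map (assumed row
" structure: KEY and a VALUE object reference to the wrapper class).
DATA lt_spark TYPE /aws1/cl_crlsparkproperties_w=>tt_sparkproperties.
INSERT VALUE #(
    key   = 'spark.sql.shuffle.partitions'          " assumed example key
    value = NEW /aws1/cl_crlsparkproperties_w(
              iv_value = '200' ) )                  " iv_value is assumed
  INTO TABLE lt_spark.

" Construct the worker compute configuration properties object.
DATA(lo_props) = NEW /aws1/cl_crlworkercompconfprps( it_spark = lt_spark ).
```

Because IT_SPARK is optional, the object can also be constructed with no arguments, in which case the SPARK attribute has no value.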


Queryable Attributes

spark

The Spark configuration properties for SQL workloads. This map contains key-value pairs that configure Apache Spark settings to optimize performance for your data processing jobs. You can specify up to 50 Spark properties, with each key being 1-200 characters and each value being 0-500 characters. These properties allow you to adjust compute capacity for large datasets and complex workloads.

Accessible with the following methods

Method Description
GET_SPARK() Getter for SPARK, with configurable default
ASK_SPARK() Getter for SPARK w/ exceptions if field has no value
HAS_SPARK() Determine if SPARK has a value
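The three accessors above can be combined as sketched below. The method names come from the table; the exception class caught around ASK_SPARK( ) is an assumption based on typical SDK for SAP ABAP error handling and may differ:

```abap
" Guarded read: only fetch SPARK when it has a value.
IF lo_props->has_spark( ) = abap_true.
  DATA(lt_spark) = lo_props->get_spark( ).
ENDIF.

" Strict read: ASK_SPARK( ) raises an exception when the field is unset.
" The exception class name below is assumed, not confirmed by this page.
TRY.
    lt_spark = lo_props->ask_spark( ).
  CATCH /aws1/cx_rt_value_missing.
    " Handle the missing-value case here.
ENDTRY.
```

Use GET_SPARK( ) when a default (initial) value is acceptable, and ASK_SPARK( ) when an unset field should be treated as an error.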