Represents a {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task google_dataplex_task}.
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTask(
scope: Construct,
id: str,
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection] = None,
count: typing.Union[typing.Union[int, float], TerraformCount] = None,
depends_on: typing.List[ITerraformDependable] = None,
for_each: ITerraformIterator = None,
lifecycle: TerraformResourceLifecycle = None,
provider: TerraformProvider = None,
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]] = None,
execution_spec: DataplexTaskExecutionSpec,
trigger_spec: DataplexTaskTriggerSpec,
description: str = None,
display_name: str = None,
id: str = None,
labels: typing.Mapping[str] = None,
lake: str = None,
location: str = None,
notebook: DataplexTaskNotebook = None,
project: str = None,
spark: DataplexTaskSpark = None,
task_id: str = None,
timeouts: DataplexTaskTimeouts = None
)
Name | Type | Description |
---|---|---|
scope | constructs.Construct | The scope in which to define this construct. |
id | str | The scoped construct ID. |
connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
depends_on | typing.List[cdktf.ITerraformDependable] | No description. |
for_each | cdktf.ITerraformIterator | No description. |
lifecycle | cdktf.TerraformResourceLifecycle | No description. |
provider | cdktf.TerraformProvider | No description. |
provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
execution_spec | DataplexTaskExecutionSpec | execution_spec block. |
trigger_spec | DataplexTaskTriggerSpec | trigger_spec block. |
description | str | User-provided description of the task. |
display_name | str | User-friendly display name. |
id | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#id DataplexTask#id}. |
labels | typing.Mapping[str] | User-defined labels for the task. |
lake | str | The lake in which the task will be created. |
location | str | The location in which the task will be created. |
notebook | DataplexTaskNotebook | notebook block. |
project | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#project DataplexTask#project}. |
spark | DataplexTaskSpark | spark block. |
task_id | str | The task ID of the task. |
timeouts | DataplexTaskTimeouts | timeouts block. |
- Type: constructs.Construct
The scope in which to define this construct.
- Type: str
The scoped construct ID.
Must be unique amongst siblings in the same scope.
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]
- Type: typing.List[cdktf.ITerraformDependable]
- Type: cdktf.ITerraformIterator
- Type: cdktf.TerraformResourceLifecycle
- Type: cdktf.TerraformProvider
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]
- Type: DataplexTaskExecutionSpec
execution_spec block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#execution_spec DataplexTask#execution_spec}
- Type: DataplexTaskTriggerSpec
trigger_spec block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#trigger_spec DataplexTask#trigger_spec}
- Type: str
User-provided description of the task.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#description DataplexTask#description}
- Type: str
User-friendly display name.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#display_name DataplexTask#display_name}
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#id DataplexTask#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
- Type: typing.Mapping[str]
User-defined labels for the task.
Note: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#labels DataplexTask#labels}
- Type: str
The lake in which the task will be created.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#lake DataplexTask#lake}
- Type: str
The location in which the task will be created.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#location DataplexTask#location}
- Type: DataplexTaskNotebook
notebook block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#notebook DataplexTask#notebook}
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#project DataplexTask#project}.
- Type: DataplexTaskSpark
spark block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#spark DataplexTask#spark}
- Type: str
The task ID of the task.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#task_id DataplexTask#task_id}
- Type: DataplexTaskTimeouts
timeouts block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#timeouts DataplexTask#timeouts}
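Putting the initializer together, here is a minimal synthesis sketch. It assumes a project named my-project, a lake my-lake in us-central1, a service account task-runner@my-project.iam.gserviceaccount.com, and a bucket gs://my-bucket; all of these identifiers are placeholders, not values from this page.

```python
from constructs import Construct
from cdktf import App, TerraformStack
from cdktf_cdktf_provider_google.provider import GoogleProvider
from cdktf_cdktf_provider_google import dataplex_task


class DataplexTaskStack(TerraformStack):
    def __init__(self, scope: Construct, id: str):
        super().__init__(scope, id)

        # Placeholder project/region; replace with your own values.
        GoogleProvider(self, "google", project="my-project", region="us-central1")

        dataplex_task.DataplexTask(
            self,
            "spark_task",
            lake="my-lake",
            location="us-central1",
            task_id="nightly-spark-job",
            execution_spec=dataplex_task.DataplexTaskExecutionSpec(
                service_account="task-runner@my-project.iam.gserviceaccount.com",
            ),
            trigger_spec=dataplex_task.DataplexTaskTriggerSpec(
                type="RECURRING",
                schedule="CRON_TZ=America/New_York 0 2 * * *",
            ),
            spark=dataplex_task.DataplexTaskSpark(
                python_script_file="gs://my-bucket/scripts/job.py",
            ),
            labels={"env": "dev"},
        )


app = App()
DataplexTaskStack(app, "dataplex-task-example")
app.synth()
```

The nested blocks can also be passed as plain dicts; the struct classes are used here only to make the expected shape explicit.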
Name | Description |
---|---|
to_string | Returns a string representation of this construct. |
add_override | No description. |
override_logical_id | Overrides the auto-generated logical ID with a specific ID. |
reset_override_logical_id | Resets a previously passed logical Id to use the auto-generated logical id again. |
to_hcl_terraform | No description. |
to_metadata | No description. |
to_terraform | Adds this resource to the terraform JSON output. |
add_move_target | Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move. |
get_any_map_attribute | No description. |
get_boolean_attribute | No description. |
get_boolean_map_attribute | No description. |
get_list_attribute | No description. |
get_number_attribute | No description. |
get_number_list_attribute | No description. |
get_number_map_attribute | No description. |
get_string_attribute | No description. |
get_string_map_attribute | No description. |
has_resource_move | No description. |
import_from | No description. |
interpolation_for_attribute | No description. |
move_from_id | Move the resource corresponding to "id" to this resource. |
move_to | Moves this resource to the target resource given by moveTarget. |
move_to_id | Moves this resource to the resource corresponding to "id". |
put_execution_spec | No description. |
put_notebook | No description. |
put_spark | No description. |
put_timeouts | No description. |
put_trigger_spec | No description. |
reset_description | No description. |
reset_display_name | No description. |
reset_id | No description. |
reset_labels | No description. |
reset_lake | No description. |
reset_location | No description. |
reset_notebook | No description. |
reset_project | No description. |
reset_spark | No description. |
reset_task_id | No description. |
reset_timeouts | No description. |
def to_string() -> str
Returns a string representation of this construct.
def add_override(
path: str,
value: typing.Any
) -> None
- Type: str
- Type: typing.Any
def override_logical_id(
new_logical_id: str
) -> None
Overrides the auto-generated logical ID with a specific ID.
- Type: str
The new logical ID to use for this stack element.
def reset_override_logical_id() -> None
Resets a previously passed logical Id to use the auto-generated logical id again.
def to_hcl_terraform() -> typing.Any
def to_metadata() -> typing.Any
def to_terraform() -> typing.Any
Adds this resource to the terraform JSON output.
def add_move_target(
move_target: str
) -> None
Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.
- Type: str
The string move target that will correspond to this resource.
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def has_resource_move() -> typing.Union[TerraformResourceMoveByTarget, TerraformResourceMoveById]
def import_from(
id: str,
provider: TerraformProvider = None
) -> None
- Type: str
- Type: cdktf.TerraformProvider
def interpolation_for_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def move_from_id(
id: str
) -> None
Move the resource corresponding to "id" to this resource.
Note that the resource being moved from must be marked as moved using its instance function.
- Type: str
Full id of resource being moved from, e.g. "aws_s3_bucket.example".
def move_to(
move_target: str,
index: typing.Union[str, typing.Union[int, float]] = None
) -> None
Moves this resource to the target resource given by moveTarget.
- Type: str
The previously set user defined string set by .addMoveTarget() corresponding to the resource to move to.
- Type: typing.Union[str, typing.Union[int, float]]
Optional. The index corresponding to the key the resource is to appear in the foreach of a resource to move to.
def move_to_id(
id: str
) -> None
Moves this resource to the resource corresponding to "id".
- Type: str
Full id of resource to move to, e.g. "aws_s3_bucket.example".
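A sketch of the move-target workflow, assuming old_task and new_task are two DataplexTask instances already defined in the same stack (both names are illustrative):

```python
# Mark the destination resource with a user-defined move target...
new_task.add_move_target("dataplex-task-moved")

# ...then move this resource onto that target. On the next cdktf plan/apply,
# the state of old_task is relocated to new_task instead of being recreated.
old_task.move_to("dataplex-task-moved")
```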
def put_execution_spec(
service_account: str,
args: typing.Mapping[str] = None,
kms_key: str = None,
max_job_execution_lifetime: str = None,
project: str = None
) -> None
- Type: str
Service account to use to execute a task.
If not provided, the default Compute service account for the project is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#service_account DataplexTask#service_account}
- Type: typing.Mapping[str]
The arguments to pass to the task.
The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${taskId} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument. An object containing a list of 'key': value pairs. Example: { 'name': 'wrench', 'mass': '1.3kg', 'count': '3' }.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#args DataplexTask#args}
- Type: str
The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{locationId}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#kms_key DataplexTask#kms_key}
- Type: str
The maximum duration after which the job execution is expired.
A duration in seconds with up to nine fractional digits, ending with 's'. Example: '3.5s'.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#max_job_execution_lifetime DataplexTask#max_job_execution_lifetime}
- Type: str
The project in which jobs are run.
By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#project DataplexTask#project}
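A hypothetical call, assuming task is an existing DataplexTask instance and the service account and bucket paths are placeholders:

```python
task.put_execution_spec(
    service_account="task-runner@my-project.iam.gserviceaccount.com",
    # Positional arguments are passed as one comma-separated string under TASK_ARGS.
    args={"TASK_ARGS": "--input,gs://my-bucket/raw,--output,gs://my-bucket/curated"},
    max_job_execution_lifetime="3600s",
)
```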
def put_notebook(
notebook: str,
archive_uris: typing.List[str] = None,
file_uris: typing.List[str] = None,
infrastructure_spec: DataplexTaskNotebookInfrastructureSpec = None
) -> None
- Type: str
Path to input notebook.
This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#notebook DataplexTask#notebook}
- Type: typing.List[str]
Cloud Storage URIs of archives to be extracted into the working directory of each executor.
Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#archive_uris DataplexTask#archive_uris}
- Type: typing.List[str]
Cloud Storage URIs of files to be placed in the working directory of each executor.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#file_uris DataplexTask#file_uris}
infrastructure_spec block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#infrastructure_spec DataplexTask#infrastructure_spec}
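A short sketch of configuring the notebook block on an existing task; the Cloud Storage paths are placeholders:

```python
task.put_notebook(
    notebook="gs://my-bucket/notebooks/analysis.ipynb",
    file_uris=["gs://my-bucket/config/settings.yaml"],
)
```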
def put_spark(
archive_uris: typing.List[str] = None,
file_uris: typing.List[str] = None,
infrastructure_spec: DataplexTaskSparkInfrastructureSpec = None,
main_class: str = None,
main_jar_file_uri: str = None,
python_script_file: str = None,
sql_script: str = None,
sql_script_file: str = None
) -> None
- Type: typing.List[str]
Cloud Storage URIs of archives to be extracted into the working directory of each executor.
Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#archive_uris DataplexTask#archive_uris}
- Type: typing.List[str]
Cloud Storage URIs of files to be placed in the working directory of each executor.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#file_uris DataplexTask#file_uris}
infrastructure_spec block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#infrastructure_spec DataplexTask#infrastructure_spec}
- Type: str
The name of the driver's main class.
The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#main_class DataplexTask#main_class}
- Type: str
The Cloud Storage URI of the jar file that contains the main class.
The execution args are passed in as a sequence of named process arguments (--key=value).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#main_jar_file_uri DataplexTask#main_jar_file_uri}
- Type: str
The Cloud Storage URI of the main Python file to use as the driver.
Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#python_script_file DataplexTask#python_script_file}
- Type: str
The query text. The execution args are used to declare a set of script variables (set key='value';).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#sql_script DataplexTask#sql_script}
- Type: str
A reference to a query file.
This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key='value';).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#sql_script_file DataplexTask#sql_script_file}
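A minimal sketch, assuming task is an existing DataplexTask and the jar path and class name are placeholders:

```python
task.put_spark(
    main_jar_file_uri="gs://my-bucket/jars/etl-job.jar",
    main_class="com.example.EtlJob",
    archive_uris=["gs://my-bucket/archives/deps.zip"],
)
```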
def put_timeouts(
create: str = None,
delete: str = None,
update: str = None
) -> None
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#create DataplexTask#create}.
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#delete DataplexTask#delete}.
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#update DataplexTask#update}.
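A brief sketch using Terraform-style duration strings; the specific values are arbitrary:

```python
task.put_timeouts(
    create="30m",
    update="30m",
    delete="20m",
)
```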
def put_trigger_spec(
type: str,
disabled: typing.Union[bool, IResolvable] = None,
max_retries: typing.Union[int, float] = None,
schedule: str = None,
start_time: str = None
) -> None
- Type: str
Trigger type of the user-specified Task. Possible values: ["ON_DEMAND", "RECURRING"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#type DataplexTask#type}
- Type: typing.Union[bool, cdktf.IResolvable]
Prevent the task from executing.
This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#disabled DataplexTask#disabled}
- Type: typing.Union[int, float]
Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#max_retries DataplexTask#max_retries}
- Type: str
Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: 'CRON_TZ=${IANA_TIME_ZONE}' or 'TZ=${IANA_TIME_ZONE}'. The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#schedule DataplexTask#schedule}
- Type: str
The first run of the task will be after this time.
If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#start_time DataplexTask#start_time}
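A sketch of a recurring trigger with an explicit time zone; the cron expression is illustrative:

```python
task.put_trigger_spec(
    type="RECURRING",
    # Run every day at 02:00 New York time; RECURRING tasks require a schedule.
    schedule="CRON_TZ=America/New_York 0 2 * * *",
    max_retries=3,
    disabled=False,
)
```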
def reset_description() -> None
def reset_display_name() -> None
def reset_id() -> None
def reset_labels() -> None
def reset_lake() -> None
def reset_location() -> None
def reset_notebook() -> None
def reset_project() -> None
def reset_spark() -> None
def reset_task_id() -> None
def reset_timeouts() -> None
Name | Description |
---|---|
is_construct | Checks if x is a construct. |
is_terraform_element | No description. |
is_terraform_resource | No description. |
generate_config_for_import | Generates CDKTF code for importing a DataplexTask resource upon running "cdktf plan <stack-name>". |
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTask.is_construct(
x: typing.Any
)
Checks if x is a construct.

Use this method instead of instanceof to properly detect Construct instances, even when the construct library is symlinked.

Explanation: in JavaScript, multiple copies of the constructs library on disk are seen as independent, completely different libraries. As a consequence, the class Construct in each copy of the constructs library is seen as a different class, and an instance of one class will not test as instanceof the other class. npm install will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the constructs library can be accidentally installed, and instanceof will behave unpredictably. It is safest to avoid using instanceof, and using this type-testing method instead.
- Type: typing.Any
Any object.
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTask.is_terraform_element(
x: typing.Any
)
- Type: typing.Any
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTask.is_terraform_resource(
x: typing.Any
)
- Type: typing.Any
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTask.generate_config_for_import(
scope: Construct,
import_to_id: str,
import_from_id: str,
provider: TerraformProvider = None
)
Generates CDKTF code for importing a DataplexTask resource upon running "cdktf plan <stack-name>".
- Type: constructs.Construct
The scope in which to define this construct.
- Type: str
The construct id used in the generated config for the DataplexTask to import.
- Type: str
The id of the existing DataplexTask that should be imported.
Refer to the {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#import import section} in the documentation of this resource for the id to use.
- Type: cdktf.TerraformProvider
Optional instance of the provider where the DataplexTask to import is found.
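A sketch of generating import configuration inside a stack (self refers to the enclosing TerraformStack); the resource path below is a placeholder for a real Dataplex task name:

```python
dataplex_task.DataplexTask.generate_config_for_import(
    self,
    "imported_task",
    "projects/my-project/locations/us-central1/lakes/my-lake/tasks/my-task",
)
```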
Name | Type | Description |
---|---|---|
node | constructs.Node | The tree node. |
cdktf_stack | cdktf.TerraformStack | No description. |
fqn | str | No description. |
friendly_unique_id | str | No description. |
terraform_meta_arguments | typing.Mapping[typing.Any] | No description. |
terraform_resource_type | str | No description. |
terraform_generator_metadata | cdktf.TerraformProviderGeneratorMetadata | No description. |
connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
depends_on | typing.List[str] | No description. |
for_each | cdktf.ITerraformIterator | No description. |
lifecycle | cdktf.TerraformResourceLifecycle | No description. |
provider | cdktf.TerraformProvider | No description. |
provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
create_time | str | No description. |
effective_labels | cdktf.StringMap | No description. |
execution_spec | DataplexTaskExecutionSpecOutputReference | No description. |
execution_status | DataplexTaskExecutionStatusList | No description. |
name | str | No description. |
notebook | DataplexTaskNotebookOutputReference | No description. |
spark | DataplexTaskSparkOutputReference | No description. |
state | str | No description. |
terraform_labels | cdktf.StringMap | No description. |
timeouts | DataplexTaskTimeoutsOutputReference | No description. |
trigger_spec | DataplexTaskTriggerSpecOutputReference | No description. |
uid | str | No description. |
update_time | str | No description. |
description_input | str | No description. |
display_name_input | str | No description. |
execution_spec_input | DataplexTaskExecutionSpec | No description. |
id_input | str | No description. |
labels_input | typing.Mapping[str] | No description. |
lake_input | str | No description. |
location_input | str | No description. |
notebook_input | DataplexTaskNotebook | No description. |
project_input | str | No description. |
spark_input | DataplexTaskSpark | No description. |
task_id_input | str | No description. |
timeouts_input | typing.Union[cdktf.IResolvable, DataplexTaskTimeouts] | No description. |
trigger_spec_input | DataplexTaskTriggerSpec | No description. |
description | str | No description. |
display_name | str | No description. |
id | str | No description. |
labels | typing.Mapping[str] | No description. |
lake | str | No description. |
location | str | No description. |
project | str | No description. |
task_id | str | No description. |
node: Node
- Type: constructs.Node
The tree node.
cdktf_stack: TerraformStack
- Type: cdktf.TerraformStack
fqn: str
- Type: str
friendly_unique_id: str
- Type: str
terraform_meta_arguments: typing.Mapping[typing.Any]
- Type: typing.Mapping[typing.Any]
terraform_resource_type: str
- Type: str
terraform_generator_metadata: TerraformProviderGeneratorMetadata
- Type: cdktf.TerraformProviderGeneratorMetadata
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection]
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]
count: typing.Union[typing.Union[int, float], TerraformCount]
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]
depends_on: typing.List[str]
- Type: typing.List[str]
for_each: ITerraformIterator
- Type: cdktf.ITerraformIterator
lifecycle: TerraformResourceLifecycle
- Type: cdktf.TerraformResourceLifecycle
provider: TerraformProvider
- Type: cdktf.TerraformProvider
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]]
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]
create_time: str
- Type: str
effective_labels: StringMap
- Type: cdktf.StringMap
execution_spec: DataplexTaskExecutionSpecOutputReference
execution_status: DataplexTaskExecutionStatusList
name: str
- Type: str
notebook: DataplexTaskNotebookOutputReference
spark: DataplexTaskSparkOutputReference
state: str
- Type: str
terraform_labels: StringMap
- Type: cdktf.StringMap
timeouts: DataplexTaskTimeoutsOutputReference
trigger_spec: DataplexTaskTriggerSpecOutputReference
uid: str
- Type: str
update_time: str
- Type: str
description_input: str
- Type: str
display_name_input: str
- Type: str
execution_spec_input: DataplexTaskExecutionSpec
id_input: str
- Type: str
labels_input: typing.Mapping[str]
- Type: typing.Mapping[str]
lake_input: str
- Type: str
location_input: str
- Type: str
notebook_input: DataplexTaskNotebook
- Type: DataplexTaskNotebook
project_input: str
- Type: str
spark_input: DataplexTaskSpark
- Type: DataplexTaskSpark
task_id_input: str
- Type: str
timeouts_input: typing.Union[IResolvable, DataplexTaskTimeouts]
- Type: typing.Union[cdktf.IResolvable, DataplexTaskTimeouts]
trigger_spec_input: DataplexTaskTriggerSpec
- Type: DataplexTaskTriggerSpec
description: str
- Type: str
display_name: str
- Type: str
id: str
- Type: str
labels: typing.Mapping[str]
- Type: typing.Mapping[str]
lake: str
- Type: str
location: str
- Type: str
project: str
- Type: str
task_id: str
- Type: str
Name | Type | Description |
---|---|---|
tfResourceType | str | No description. |
tfResourceType: str
- Type: str
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskConfig(
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection] = None,
count: typing.Union[typing.Union[int, float], TerraformCount] = None,
depends_on: typing.List[ITerraformDependable] = None,
for_each: ITerraformIterator = None,
lifecycle: TerraformResourceLifecycle = None,
provider: TerraformProvider = None,
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]] = None,
execution_spec: DataplexTaskExecutionSpec,
trigger_spec: DataplexTaskTriggerSpec,
description: str = None,
display_name: str = None,
id: str = None,
labels: typing.Mapping[str] = None,
lake: str = None,
location: str = None,
notebook: DataplexTaskNotebook = None,
project: str = None,
spark: DataplexTaskSpark = None,
task_id: str = None,
timeouts: DataplexTaskTimeouts = None
)
Name | Type | Description |
---|---|---|
connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
depends_on | typing.List[cdktf.ITerraformDependable] | No description. |
for_each | cdktf.ITerraformIterator | No description. |
lifecycle | cdktf.TerraformResourceLifecycle | No description. |
provider | cdktf.TerraformProvider | No description. |
provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
execution_spec | DataplexTaskExecutionSpec | execution_spec block. |
trigger_spec | DataplexTaskTriggerSpec | trigger_spec block. |
description | str | User-provided description of the task. |
display_name | str | User-friendly display name. |
id | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#id DataplexTask#id}. |
labels | typing.Mapping[str] | User-defined labels for the task. |
lake | str | The lake in which the task will be created. |
location | str | The location in which the task will be created. |
notebook | DataplexTaskNotebook | notebook block. |
project | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#project DataplexTask#project}. |
spark | DataplexTaskSpark | spark block. |
task_id | str | The task ID of the task. |
timeouts | DataplexTaskTimeouts | timeouts block. |
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection]
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]
count: typing.Union[typing.Union[int, float], TerraformCount]
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]
depends_on: typing.List[ITerraformDependable]
- Type: typing.List[cdktf.ITerraformDependable]
for_each: ITerraformIterator
- Type: cdktf.ITerraformIterator
lifecycle: TerraformResourceLifecycle
- Type: cdktf.TerraformResourceLifecycle
provider: TerraformProvider
- Type: cdktf.TerraformProvider
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]]
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]
execution_spec: DataplexTaskExecutionSpec
execution_spec block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#execution_spec DataplexTask#execution_spec}
trigger_spec: DataplexTaskTriggerSpec
- Type: DataplexTaskTriggerSpec
trigger_spec block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#trigger_spec DataplexTask#trigger_spec}
description: str
- Type: str
User-provided description of the task.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#description DataplexTask#description}
display_name: str
- Type: str
User-friendly display name.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#display_name DataplexTask#display_name}
id: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#id DataplexTask#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
labels: typing.Mapping[str]
- Type: typing.Mapping[str]
User-defined labels for the task.
Note: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#labels DataplexTask#labels}
lake: str
- Type: str
The lake in which the task will be created.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#lake DataplexTask#lake}
location: str
- Type: str
The location in which the task will be created.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#location DataplexTask#location}
notebook: DataplexTaskNotebook
- Type: DataplexTaskNotebook
notebook block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#notebook DataplexTask#notebook}
project: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#project DataplexTask#project}.
spark: DataplexTaskSpark
- Type: DataplexTaskSpark
spark block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#spark DataplexTask#spark}
task_id: str
- Type: str
The task ID of the task.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#task_id DataplexTask#task_id}
timeouts: DataplexTaskTimeouts
- Type: DataplexTaskTimeouts
timeouts block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#timeouts DataplexTask#timeouts}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskExecutionSpec(
service_account: str,
args: typing.Mapping[str] = None,
kms_key: str = None,
max_job_execution_lifetime: str = None,
project: str = None
)
Name | Type | Description |
---|---|---|
service_account | str | Service account to use to execute a task. |
args | typing.Mapping[str] | The arguments to pass to the task. |
kms_key | str | The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{locationId}/keyRings/{key-ring-name}/cryptoKeys/{key-name}. |
max_job_execution_lifetime | str | The maximum duration after which the job execution is expired. |
project | str | The project in which jobs are run. |
service_account: str
- Type: str
Service account to use to execute a task.
If not provided, the default Compute service account for the project is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#service_account DataplexTask#service_account}
args: typing.Mapping[str]
- Type: typing.Mapping[str]
The arguments to pass to the task.
The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${taskId} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument. An object containing a list of 'key': value pairs. Example: { 'name': 'wrench', 'mass': '1.3kg', 'count': '3' }.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#args DataplexTask#args}
kms_key: str
- Type: str
The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{locationId}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#kms_key DataplexTask#kms_key}
max_job_execution_lifetime: str
- Type: str
The maximum duration after which the job execution is expired.
A duration in seconds with up to nine fractional digits, ending with 's'. Example: '3.5s'.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#max_job_execution_lifetime DataplexTask#max_job_execution_lifetime}
project: str
- Type: str
The project in which jobs are run.
By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#project DataplexTask#project}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskExecutionStatus()
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskExecutionStatusLatestJob()
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebook(
notebook: str,
archive_uris: typing.List[str] = None,
file_uris: typing.List[str] = None,
infrastructure_spec: DataplexTaskNotebookInfrastructureSpec = None
)
Name | Type | Description |
---|---|---|
notebook | str | Path to input notebook. |
archive_uris | typing.List[str] | Cloud Storage URIs of archives to be extracted into the working directory of each executor. |
file_uris | typing.List[str] | Cloud Storage URIs of files to be placed in the working directory of each executor. |
infrastructure_spec | DataplexTaskNotebookInfrastructureSpec | infrastructure_spec block. |
notebook: str
- Type: str
Path to input notebook.
This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#notebook DataplexTask#notebook}
archive_uris: typing.List[str]
- Type: typing.List[str]
Cloud Storage URIs of archives to be extracted into the working directory of each executor.
Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#archive_uris DataplexTask#archive_uris}
file_uris: typing.List[str]
- Type: typing.List[str]
Cloud Storage URIs of files to be placed in the working directory of each executor.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#file_uris DataplexTask#file_uris}
infrastructure_spec: DataplexTaskNotebookInfrastructureSpec
infrastructure_spec block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#infrastructure_spec DataplexTask#infrastructure_spec}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookInfrastructureSpec(
batch: DataplexTaskNotebookInfrastructureSpecBatch = None,
container_image: DataplexTaskNotebookInfrastructureSpecContainerImage = None,
vpc_network: DataplexTaskNotebookInfrastructureSpecVpcNetwork = None
)
Name | Type | Description |
---|---|---|
batch | DataplexTaskNotebookInfrastructureSpecBatch | batch block. |
container_image | DataplexTaskNotebookInfrastructureSpecContainerImage | container_image block. |
vpc_network | DataplexTaskNotebookInfrastructureSpecVpcNetwork | vpc_network block. |
batch: DataplexTaskNotebookInfrastructureSpecBatch
batch block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#batch DataplexTask#batch}
container_image: DataplexTaskNotebookInfrastructureSpecContainerImage
container_image block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#container_image DataplexTask#container_image}
vpc_network: DataplexTaskNotebookInfrastructureSpecVpcNetwork
vpc_network block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#vpc_network DataplexTask#vpc_network}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookInfrastructureSpecBatch(
executors_count: typing.Union[int, float] = None,
max_executors_count: typing.Union[int, float] = None
)
Name | Type | Description |
---|---|---|
executors_count | typing.Union[int, float] | Total number of job executors. Executor Count should be between 2 and 100. [Default=2]. |
max_executors_count | typing.Union[int, float] | Max configurable executors. |
executors_count: typing.Union[int, float]
- Type: typing.Union[int, float]
Total number of job executors. Executor Count should be between 2 and 100. [Default=2].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#executors_count DataplexTask#executors_count}
max_executors_count: typing.Union[int, float]
- Type: typing.Union[int, float]
Max configurable executors.
If maxExecutorsCount > executorsCount, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. [Default=1000]
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#max_executors_count DataplexTask#max_executors_count}
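A small sketch of a batch spec with autoscaling enabled (max_executors_count greater than executors_count); the numbers are illustrative:

```python
batch_spec = dataplex_task.DataplexTaskNotebookInfrastructureSpecBatch(
    executors_count=2,
    # Greater than executors_count, so the job autoscales between 2 and 10 executors.
    max_executors_count=10,
)
```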
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookInfrastructureSpecContainerImage(
image: str = None,
java_jars: typing.List[str] = None,
properties: typing.Mapping[str] = None,
python_packages: typing.List[str] = None
)
Name | Type | Description |
---|---|---|
image | str | Container image to use. |
java_jars | typing.List[str] | A list of Java JARS to add to the classpath. |
properties | typing.Mapping[str] | Override to common configuration of open source components installed on the Dataproc cluster. |
python_packages | typing.List[str] | A list of python packages to be installed. |
image: str
- Type: str
Container image to use.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#image DataplexTask#image}
java_jars: typing.List[str]
- Type: typing.List[str]
A list of Java JARS to add to the classpath.
Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#java_jars DataplexTask#java_jars}
properties: typing.Mapping[str]
- Type: typing.Mapping[str]
Override to common configuration of open source components installed on the Dataproc cluster.
The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#properties DataplexTask#properties}
python_packages: typing.List[str]
- Type: typing.List[str]
A list of python packages to be installed.
Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#python_packages DataplexTask#python_packages}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookInfrastructureSpecVpcNetwork(
network: str = None,
network_tags: typing.List[str] = None,
sub_network: str = None
)
Name | Type | Description |
---|---|---|
network | str | The Cloud VPC network in which the job is run. |
network_tags | typing.List[str] | List of network tags to apply to the job. |
sub_network | str | The Cloud VPC sub-network in which the job is run. |
network: str
- Type: str
The Cloud VPC network in which the job is run.
By default, the Cloud VPC network named Default within the project is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#network DataplexTask#network}
network_tags: typing.List[str]
- Type: typing.List[str]
List of network tags to apply to the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#network_tags DataplexTask#network_tags}
sub_network: str
- Type: str
The Cloud VPC sub-network in which the job is run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#sub_network DataplexTask#sub_network}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSpark(
archive_uris: typing.List[str] = None,
file_uris: typing.List[str] = None,
infrastructure_spec: DataplexTaskSparkInfrastructureSpec = None,
main_class: str = None,
main_jar_file_uri: str = None,
python_script_file: str = None,
sql_script: str = None,
sql_script_file: str = None
)
Name | Type | Description |
---|---|---|
archive_uris | typing.List[str] | Cloud Storage URIs of archives to be extracted into the working directory of each executor. |
file_uris | typing.List[str] | Cloud Storage URIs of files to be placed in the working directory of each executor. |
infrastructure_spec | DataplexTaskSparkInfrastructureSpec | infrastructure_spec block. |
main_class | str | The name of the driver's main class. |
main_jar_file_uri | str | The Cloud Storage URI of the jar file that contains the main class. |
python_script_file | str | The Cloud Storage URI of the main Python file to use as the driver. |
sql_script | str | The query text. The execution args are used to declare a set of script variables (set key='value';). |
sql_script_file | str | A reference to a query file. |
archive_uris: typing.List[str]
- Type: typing.List[str]
Cloud Storage URIs of archives to be extracted into the working directory of each executor.
Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#archive_uris DataplexTask#archive_uris}
file_uris: typing.List[str]
- Type: typing.List[str]
Cloud Storage URIs of files to be placed in the working directory of each executor.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#file_uris DataplexTask#file_uris}
infrastructure_spec: DataplexTaskSparkInfrastructureSpec
infrastructure_spec block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#infrastructure_spec DataplexTask#infrastructure_spec}
main_class: str
- Type: str
The name of the driver's main class.
The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#main_class DataplexTask#main_class}
main_jar_file_uri: str
- Type: str
The Cloud Storage URI of the jar file that contains the main class.
The execution args are passed in as a sequence of named process arguments (--key=value).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#main_jar_file_uri DataplexTask#main_jar_file_uri}
python_script_file: str
- Type: str
The Cloud Storage URI of the main Python file to use as the driver.
Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#python_script_file DataplexTask#python_script_file}
sql_script: str
- Type: str
The query text. The execution args are used to declare a set of script variables (set key='value';).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#sql_script DataplexTask#sql_script}
sql_script_file: str
- Type: str
A reference to a query file.
This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key='value';).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#sql_script_file DataplexTask#sql_script_file}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkInfrastructureSpec(
batch: DataplexTaskSparkInfrastructureSpecBatch = None,
container_image: DataplexTaskSparkInfrastructureSpecContainerImage = None,
vpc_network: DataplexTaskSparkInfrastructureSpecVpcNetwork = None
)
Name | Type | Description |
---|---|---|
batch | DataplexTaskSparkInfrastructureSpecBatch | batch block. |
container_image | DataplexTaskSparkInfrastructureSpecContainerImage | container_image block. |
vpc_network | DataplexTaskSparkInfrastructureSpecVpcNetwork | vpc_network block. |
batch: DataplexTaskSparkInfrastructureSpecBatch
batch block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#batch DataplexTask#batch}
container_image: DataplexTaskSparkInfrastructureSpecContainerImage
container_image block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#container_image DataplexTask#container_image}
vpc_network: DataplexTaskSparkInfrastructureSpecVpcNetwork
vpc_network block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#vpc_network DataplexTask#vpc_network}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkInfrastructureSpecBatch(
executors_count: typing.Union[int, float] = None,
max_executors_count: typing.Union[int, float] = None
)
Name | Type | Description |
---|---|---|
executors_count | typing.Union[int, float] | Total number of job executors. Executor Count should be between 2 and 100. [Default=2]. |
max_executors_count | typing.Union[int, float] | Max configurable executors. |
executors_count: typing.Union[int, float]
- Type: typing.Union[int, float]
Total number of job executors. Executor Count should be between 2 and 100. [Default=2].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#executors_count DataplexTask#executors_count}
max_executors_count: typing.Union[int, float]
- Type: typing.Union[int, float]
Max configurable executors.
If maxExecutorsCount > executorsCount, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. [Default=1000]
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#max_executors_count DataplexTask#max_executors_count}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkInfrastructureSpecContainerImage(
image: str = None,
java_jars: typing.List[str] = None,
properties: typing.Mapping[str] = None,
python_packages: typing.List[str] = None
)
Name | Type | Description |
---|---|---|
image | str | Container image to use. |
java_jars | typing.List[str] | A list of Java JARS to add to the classpath. |
properties | typing.Mapping[str] | Override to common configuration of open source components installed on the Dataproc cluster. |
python_packages | typing.List[str] | A list of python packages to be installed. |
image: str
- Type: str
Container image to use.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#image DataplexTask#image}
java_jars: typing.List[str]
- Type: typing.List[str]
A list of Java JARS to add to the classpath.
Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#java_jars DataplexTask#java_jars}
properties: typing.Mapping[str]
- Type: typing.Mapping[str]
Override to common configuration of open source components installed on the Dataproc cluster.
The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#properties DataplexTask#properties}
python_packages: typing.List[str]
- Type: typing.List[str]
A list of python packages to be installed.
Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#python_packages DataplexTask#python_packages}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkInfrastructureSpecVpcNetwork(
network: str = None,
network_tags: typing.List[str] = None,
sub_network: str = None
)
Name | Type | Description |
---|---|---|
network | str | The Cloud VPC network in which the job is run. |
network_tags | typing.List[str] | List of network tags to apply to the job. |
sub_network | str | The Cloud VPC sub-network in which the job is run. |
network: str
- Type: str
The Cloud VPC network in which the job is run.
By default, the Cloud VPC network named Default within the project is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#network DataplexTask#network}
network_tags: typing.List[str]
- Type: typing.List[str]
List of network tags to apply to the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#network_tags DataplexTask#network_tags}
sub_network: str
- Type: str
The Cloud VPC sub-network in which the job is run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#sub_network DataplexTask#sub_network}
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskTimeouts(
create: str = None,
delete: str = None,
update: str = None
)
Name | Type | Description |
---|---|---|
create | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#create DataplexTask#create}. |
delete | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#delete DataplexTask#delete}. |
update | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#update DataplexTask#update}. |
create: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#create DataplexTask#create}.
delete: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#delete DataplexTask#delete}.
update: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#update DataplexTask#update}.
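As a quick illustration, a hedged sketch of a timeouts block; the duration strings are illustrative (Terraform accepts values such as "20m" or "1h"):

```python
from cdktf_cdktf_provider_google import dataplex_task

# Illustrative durations only; tune them to the latency of your environment.
timeouts = dataplex_task.DataplexTaskTimeouts(
    create="20m",
    update="20m",
    delete="20m",
)
```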
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskTriggerSpec(
type: str,
disabled: typing.Union[bool, IResolvable] = None,
max_retries: typing.Union[int, float] = None,
schedule: str = None,
start_time: str = None
)
Name | Type | Description |
---|---|---|
type | str | Trigger type of the user-specified Task. Possible values: ["ON_DEMAND", "RECURRING"]. |
disabled | typing.Union[bool, cdktf.IResolvable] | Prevent the task from executing. |
max_retries | typing.Union[int, float] | Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task. |
schedule | str | Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To set an explicit time zone, prefix the schedule with 'CRON_TZ=${IANA_TIME_ZONE}' or 'TZ=${IANA_TIME_ZONE}', where ${IANA_TIME_ZONE} must be a valid name from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. Required for RECURRING tasks. |
start_time | str | The first run of the task will be after this time. |
type: str
- Type: str
Trigger type of the user-specified Task. Possible values: ["ON_DEMAND", "RECURRING"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#type DataplexTask#type}
disabled: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
Prevent the task from executing.
This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#disabled DataplexTask#disabled}
max_retries: typing.Union[int, float]
- Type: typing.Union[int, float]
Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#max_retries DataplexTask#max_retries}
schedule: str
- Type: str
Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To set an explicit time zone, prefix the schedule with 'CRON_TZ=${IANA_TIME_ZONE}' or 'TZ=${IANA_TIME_ZONE}', where ${IANA_TIME_ZONE} must be a valid name from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#schedule DataplexTask#schedule}
start_time: str
- Type: str
The first run of the task will be after this time.
If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#start_time DataplexTask#start_time}
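A minimal sketch of a RECURRING trigger pinned to an explicit time zone; the cron expression and retry count are examples only:

```python
from cdktf_cdktf_provider_google import dataplex_task

# Runs daily at 06:00 New York time and retries a failed run up to three times.
trigger = dataplex_task.DataplexTaskTriggerSpec(
    type="RECURRING",
    schedule="CRON_TZ=America/New_York 0 6 * * *",
    max_retries=3,
)
```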
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskExecutionSpecOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
reset_args |
No description. |
reset_kms_key |
No description. |
reset_max_job_execution_lifetime |
No description. |
reset_project |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_args() -> None
def reset_kms_key() -> None
def reset_max_job_execution_lifetime() -> None
def reset_project() -> None
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
args_input |
typing.Mapping[str] |
No description. |
kms_key_input |
str |
No description. |
max_job_execution_lifetime_input |
str |
No description. |
project_input |
str |
No description. |
service_account_input |
str |
No description. |
args |
typing.Mapping[str] |
No description. |
kms_key |
str |
No description. |
max_job_execution_lifetime |
str |
No description. |
project |
str |
No description. |
service_account |
str |
No description. |
internal_value |
DataplexTaskExecutionSpec |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
args_input: typing.Mapping[str]
- Type: typing.Mapping[str]
kms_key_input: str
- Type: str
max_job_execution_lifetime_input: str
- Type: str
project_input: str
- Type: str
service_account_input: str
- Type: str
args: typing.Mapping[str]
- Type: typing.Mapping[str]
kms_key: str
- Type: str
max_job_execution_lifetime: str
- Type: str
project: str
- Type: str
service_account: str
- Type: str
internal_value: DataplexTaskExecutionSpec
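Output references are reached through the parent resource rather than constructed directly. A hedged sketch, assuming `task` is a `DataplexTask` defined elsewhere in the stack and that its `execution_spec` property returns this output reference:

```python
# `task` is assumed to be an existing DataplexTask construct.
exec_ref = task.execution_spec

# Reading a property yields a token that resolves during synthesis/apply.
service_account_token = exec_ref.service_account

# Optional arguments can be cleared with the reset_* helpers documented above.
exec_ref.reset_kms_key()
```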
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskExecutionStatusLatestJobList(
terraform_resource: IInterpolatingParent,
terraform_attribute: str,
wraps_set: bool
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
wraps_set |
bool |
whether the list is wrapping a set (will add tolist() to be able to access an item via an index). |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
- Type: bool
whether the list is wrapping a set (will add tolist() to be able to access an item via an index).
Name | Description |
---|---|
all_with_map_key |
Creating an iterator for this complex list. |
compute_fqn |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
get |
No description. |
def all_with_map_key(
map_key_attribute_name: str
) -> DynamicListTerraformIterator
Creates an iterator for this complex list.
The list is converted into a map keyed by the given map_key_attribute_name.
- Type: str
def compute_fqn() -> str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def get(
index: typing.Union[int, float]
) -> DataplexTaskExecutionStatusLatestJobOutputReference
- Type: typing.Union[int, float]
the index of the item to return.
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskExecutionStatusLatestJobOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str,
complex_object_index: typing.Union[int, float],
complex_object_is_from_set: bool
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
complex_object_index |
typing.Union[int, float] |
the index of this item in the list. |
complex_object_is_from_set |
bool |
whether the list is wrapping a set (will add tolist() to be able to access an item via an index). |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
- Type: typing.Union[int, float]
the index of this item in the list.
- Type: bool
whether the list is wrapping a set (will add tolist() to be able to access an item via an index).
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
end_time |
str |
No description. |
message |
str |
No description. |
name |
str |
No description. |
retry_count |
typing.Union[int, float] |
No description. |
service |
str |
No description. |
service_job |
str |
No description. |
start_time |
str |
No description. |
state |
str |
No description. |
uid |
str |
No description. |
internal_value |
DataplexTaskExecutionStatusLatestJob |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
end_time: str
- Type: str
message: str
- Type: str
name: str
- Type: str
retry_count: typing.Union[int, float]
- Type: typing.Union[int, float]
service: str
- Type: str
service_job: str
- Type: str
start_time: str
- Type: str
state: str
- Type: str
uid: str
- Type: str
internal_value: DataplexTaskExecutionStatusLatestJob
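These execution-status classes are read-only views over computed state. A hedged sketch of reading the most recent job, assuming `task` is a `DataplexTask` that exposes the computed `execution_status` list:

```python
# Index 0 addresses the first execution_status entry.
status_ref = task.execution_status.get(0)

# latest_job is itself a list; get(0) returns the output reference documented above.
latest_job = status_ref.latest_job.get(0)

state_token = latest_job.state            # resolves at apply time
retry_count_token = latest_job.retry_count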
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskExecutionStatusList(
terraform_resource: IInterpolatingParent,
terraform_attribute: str,
wraps_set: bool
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
wraps_set |
bool |
whether the list is wrapping a set (will add tolist() to be able to access an item via an index). |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
- Type: bool
whether the list is wrapping a set (will add tolist() to be able to access an item via an index).
Name | Description |
---|---|
all_with_map_key |
Creating an iterator for this complex list. |
compute_fqn |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
get |
No description. |
def all_with_map_key(
map_key_attribute_name: str
) -> DynamicListTerraformIterator
Creates an iterator for this complex list.
The list is converted into a map keyed by the given map_key_attribute_name.
- Type: str
def compute_fqn() -> str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def get(
index: typing.Union[int, float]
) -> DataplexTaskExecutionStatusOutputReference
- Type: typing.Union[int, float]
the index of the item to return.
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskExecutionStatusOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str,
complex_object_index: typing.Union[int, float],
complex_object_is_from_set: bool
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
complex_object_index |
typing.Union[int, float] |
the index of this item in the list. |
complex_object_is_from_set |
bool |
whether the list is wrapping a set (will add tolist() to be able to access an item via an index). |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
- Type: typing.Union[int, float]
the index of this item in the list.
- Type: bool
whether the list is wrapping a set (will add tolist() to be able to access an item via an index).
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
latest_job |
DataplexTaskExecutionStatusLatestJobList |
No description. |
update_time |
str |
No description. |
internal_value |
DataplexTaskExecutionStatus |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
latest_job: DataplexTaskExecutionStatusLatestJobList
update_time: str
- Type: str
internal_value: DataplexTaskExecutionStatus
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookInfrastructureSpecBatchOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
reset_executors_count |
No description. |
reset_max_executors_count |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_executors_count() -> None
def reset_max_executors_count() -> None
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
executors_count_input |
typing.Union[int, float] |
No description. |
max_executors_count_input |
typing.Union[int, float] |
No description. |
executors_count |
typing.Union[int, float] |
No description. |
max_executors_count |
typing.Union[int, float] |
No description. |
internal_value |
DataplexTaskNotebookInfrastructureSpecBatch |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
executors_count_input: typing.Union[int, float]
- Type: typing.Union[int, float]
max_executors_count_input: typing.Union[int, float]
- Type: typing.Union[int, float]
executors_count: typing.Union[int, float]
- Type: typing.Union[int, float]
max_executors_count: typing.Union[int, float]
- Type: typing.Union[int, float]
internal_value: DataplexTaskNotebookInfrastructureSpecBatch
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookInfrastructureSpecContainerImageOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
reset_image |
No description. |
reset_java_jars |
No description. |
reset_properties |
No description. |
reset_python_packages |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_image() -> None
def reset_java_jars() -> None
def reset_properties() -> None
def reset_python_packages() -> None
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
image_input |
str |
No description. |
java_jars_input |
typing.List[str] |
No description. |
properties_input |
typing.Mapping[str] |
No description. |
python_packages_input |
typing.List[str] |
No description. |
image |
str |
No description. |
java_jars |
typing.List[str] |
No description. |
properties |
typing.Mapping[str] |
No description. |
python_packages |
typing.List[str] |
No description. |
internal_value |
DataplexTaskNotebookInfrastructureSpecContainerImage |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
image_input: str
- Type: str
java_jars_input: typing.List[str]
- Type: typing.List[str]
properties_input: typing.Mapping[str]
- Type: typing.Mapping[str]
python_packages_input: typing.List[str]
- Type: typing.List[str]
image: str
- Type: str
java_jars: typing.List[str]
- Type: typing.List[str]
properties: typing.Mapping[str]
- Type: typing.Mapping[str]
python_packages: typing.List[str]
- Type: typing.List[str]
internal_value: DataplexTaskNotebookInfrastructureSpecContainerImage
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookInfrastructureSpecOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
put_batch |
No description. |
put_container_image |
No description. |
put_vpc_network |
No description. |
reset_batch |
No description. |
reset_container_image |
No description. |
reset_vpc_network |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def put_batch(
executors_count: typing.Union[int, float] = None,
max_executors_count: typing.Union[int, float] = None
) -> None
- Type: typing.Union[int, float]
Total number of job executors. Executor Count should be between 2 and 100. [Default=2].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#executors_count DataplexTask#executors_count}
- Type: typing.Union[int, float]
Max configurable executors.
If maxExecutorsCount > executorsCount, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. [Default=1000]
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#max_executors_count DataplexTask#max_executors_count}
def put_container_image(
image: str = None,
java_jars: typing.List[str] = None,
properties: typing.Mapping[str] = None,
python_packages: typing.List[str] = None
) -> None
- Type: str
Container image to use.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#image DataplexTask#image}
- Type: typing.List[str]
A list of Java JARs to add to the classpath.
Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#java_jars DataplexTask#java_jars}
- Type: typing.Mapping[str]
Override to common configuration of open source components installed on the Dataproc cluster.
The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#properties DataplexTask#properties}
- Type: typing.List[str]
A list of Python packages to be installed.
Valid formats include a Cloud Storage URI to a pip-installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#python_packages DataplexTask#python_packages}
def put_vpc_network(
network: str = None,
network_tags: typing.List[str] = None,
sub_network: str = None
) -> None
- Type: str
The Cloud VPC network in which the job is run.
By default, the Cloud VPC network named Default within the project is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#network DataplexTask#network}
- Type: typing.List[str]
List of network tags to apply to the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#network_tags DataplexTask#network_tags}
- Type: str
The Cloud VPC sub-network in which the job is run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#sub_network DataplexTask#sub_network}
def reset_batch() -> None
def reset_container_image() -> None
def reset_vpc_network() -> None
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
batch |
DataplexTaskNotebookInfrastructureSpecBatchOutputReference |
No description. |
container_image |
DataplexTaskNotebookInfrastructureSpecContainerImageOutputReference |
No description. |
vpc_network |
DataplexTaskNotebookInfrastructureSpecVpcNetworkOutputReference |
No description. |
batch_input |
DataplexTaskNotebookInfrastructureSpecBatch |
No description. |
container_image_input |
DataplexTaskNotebookInfrastructureSpecContainerImage |
No description. |
vpc_network_input |
DataplexTaskNotebookInfrastructureSpecVpcNetwork |
No description. |
internal_value |
DataplexTaskNotebookInfrastructureSpec |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
batch: DataplexTaskNotebookInfrastructureSpecBatchOutputReference
container_image: DataplexTaskNotebookInfrastructureSpecContainerImageOutputReference
vpc_network: DataplexTaskNotebookInfrastructureSpecVpcNetworkOutputReference
batch_input: DataplexTaskNotebookInfrastructureSpecBatch
container_image_input: DataplexTaskNotebookInfrastructureSpecContainerImage
vpc_network_input: DataplexTaskNotebookInfrastructureSpecVpcNetwork
internal_value: DataplexTaskNotebookInfrastructureSpec
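The put_* helpers replace a whole nested block after the resource has been defined. A hedged sketch, assuming `task` is a `DataplexTask` with a notebook block and that `task.notebook.infrastructure_spec` returns this output reference:

```python
# Enable autoscaling for the notebook task's serverless batch.
infra_ref = task.notebook.infrastructure_spec
infra_ref.put_batch(
    executors_count=2,
    max_executors_count=10,
)

# Drop a previously configured container image override.
infra_ref.reset_container_image()
```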
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookInfrastructureSpecVpcNetworkOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
reset_network |
No description. |
reset_network_tags |
No description. |
reset_sub_network |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_network() -> None
def reset_network_tags() -> None
def reset_sub_network() -> None
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
network_input |
str |
No description. |
network_tags_input |
typing.List[str] |
No description. |
sub_network_input |
str |
No description. |
network |
str |
No description. |
network_tags |
typing.List[str] |
No description. |
sub_network |
str |
No description. |
internal_value |
DataplexTaskNotebookInfrastructureSpecVpcNetwork |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
network_input: str
- Type: str
network_tags_input: typing.List[str]
- Type: typing.List[str]
sub_network_input: str
- Type: str
network: str
- Type: str
network_tags: typing.List[str]
- Type: typing.List[str]
sub_network: str
- Type: str
internal_value: DataplexTaskNotebookInfrastructureSpecVpcNetwork
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskNotebookOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
put_infrastructure_spec |
No description. |
reset_archive_uris |
No description. |
reset_file_uris |
No description. |
reset_infrastructure_spec |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def put_infrastructure_spec(
batch: DataplexTaskNotebookInfrastructureSpecBatch = None,
container_image: DataplexTaskNotebookInfrastructureSpecContainerImage = None,
vpc_network: DataplexTaskNotebookInfrastructureSpecVpcNetwork = None
) -> None
batch block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#batch DataplexTask#batch}
container_image block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#container_image DataplexTask#container_image}
vpc_network block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#vpc_network DataplexTask#vpc_network}
def reset_archive_uris() -> None
def reset_file_uris() -> None
def reset_infrastructure_spec() -> None
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
infrastructure_spec |
DataplexTaskNotebookInfrastructureSpecOutputReference |
No description. |
archive_uris_input |
typing.List[str] |
No description. |
file_uris_input |
typing.List[str] |
No description. |
infrastructure_spec_input |
DataplexTaskNotebookInfrastructureSpec |
No description. |
notebook_input |
str |
No description. |
archive_uris |
typing.List[str] |
No description. |
file_uris |
typing.List[str] |
No description. |
notebook |
str |
No description. |
internal_value |
DataplexTaskNotebook |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
infrastructure_spec: DataplexTaskNotebookInfrastructureSpecOutputReference
archive_uris_input: typing.List[str]
- Type: typing.List[str]
file_uris_input: typing.List[str]
- Type: typing.List[str]
infrastructure_spec_input: DataplexTaskNotebookInfrastructureSpec
notebook_input: str
- Type: str
archive_uris: typing.List[str]
- Type: typing.List[str]
file_uris: typing.List[str]
- Type: typing.List[str]
notebook: str
- Type: str
internal_value: DataplexTaskNotebook
- Type: DataplexTaskNotebook
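Nested specs can also be set in a single call through put_infrastructure_spec, reusing the struct classes documented earlier. A hedged sketch; the tag value is a placeholder and `task` is assumed to be an existing `DataplexTask` with a notebook block:

```python
from cdktf_cdktf_provider_google import dataplex_task

task.notebook.put_infrastructure_spec(
    batch=dataplex_task.DataplexTaskNotebookInfrastructureSpecBatch(
        executors_count=2,
        max_executors_count=10,
    ),
    vpc_network=dataplex_task.DataplexTaskNotebookInfrastructureSpecVpcNetwork(
        network_tags=["dataplex-notebook"],
    ),
)
```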
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkInfrastructureSpecBatchOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
reset_executors_count |
No description. |
reset_max_executors_count |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_executors_count() -> None
def reset_max_executors_count() -> None
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
executors_count_input |
typing.Union[int, float] |
No description. |
max_executors_count_input |
typing.Union[int, float] |
No description. |
executors_count |
typing.Union[int, float] |
No description. |
max_executors_count |
typing.Union[int, float] |
No description. |
internal_value |
DataplexTaskSparkInfrastructureSpecBatch |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
executors_count_input: typing.Union[int, float]
- Type: typing.Union[int, float]
max_executors_count_input: typing.Union[int, float]
- Type: typing.Union[int, float]
executors_count: typing.Union[int, float]
- Type: typing.Union[int, float]
max_executors_count: typing.Union[int, float]
- Type: typing.Union[int, float]
internal_value: DataplexTaskSparkInfrastructureSpecBatch
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkInfrastructureSpecContainerImageOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
reset_image |
No description. |
reset_java_jars |
No description. |
reset_properties |
No description. |
reset_python_packages |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_image() -> None
def reset_java_jars() -> None
def reset_properties() -> None
def reset_python_packages() -> None
Name | Type | Description |
---|---|---|
creation_stack |
typing.List[str] |
The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn |
str |
No description. |
image_input |
str |
No description. |
java_jars_input |
typing.List[str] |
No description. |
properties_input |
typing.Mapping[str] |
No description. |
python_packages_input |
typing.List[str] |
No description. |
image |
str |
No description. |
java_jars |
typing.List[str] |
No description. |
properties |
typing.Mapping[str] |
No description. |
python_packages |
typing.List[str] |
No description. |
internal_value |
DataplexTaskSparkInfrastructureSpecContainerImage |
No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
image_input: str
- Type: str
java_jars_input: typing.List[str]
- Type: typing.List[str]
properties_input: typing.Mapping[str]
- Type: typing.Mapping[str]
python_packages_input: typing.List[str]
- Type: typing.List[str]
image: str
- Type: str
java_jars: typing.List[str]
- Type: typing.List[str]
properties: typing.Mapping[str]
- Type: typing.Mapping[str]
python_packages: typing.List[str]
- Type: typing.List[str]
internal_value: DataplexTaskSparkInfrastructureSpecContainerImage
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkInfrastructureSpecOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource |
cdktf.IInterpolatingParent |
The parent resource. |
terraform_attribute |
str |
The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn |
No description. |
get_any_map_attribute |
No description. |
get_boolean_attribute |
No description. |
get_boolean_map_attribute |
No description. |
get_list_attribute |
No description. |
get_number_attribute |
No description. |
get_number_list_attribute |
No description. |
get_number_map_attribute |
No description. |
get_string_attribute |
No description. |
get_string_map_attribute |
No description. |
interpolation_for_attribute |
No description. |
resolve |
Produce the Token's value at resolution time. |
to_string |
Return a string representation of this resolvable object. |
put_batch |
No description. |
put_container_image |
No description. |
put_vpc_network |
No description. |
reset_batch |
No description. |
reset_container_image |
No description. |
reset_vpc_network |
No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def put_batch(
executors_count: typing.Union[int, float] = None,
max_executors_count: typing.Union[int, float] = None
) -> None
- Type: typing.Union[int, float]
Total number of job executors. Executor Count should be between 2 and 100. [Default=2].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#executors_count DataplexTask#executors_count}
- Type: typing.Union[int, float]
Max configurable executors.
If maxExecutorsCount > executorsCount, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. [Default=1000]
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#max_executors_count DataplexTask#max_executors_count}
def put_container_image(
image: str = None,
java_jars: typing.List[str] = None,
properties: typing.Mapping[str] = None,
python_packages: typing.List[str] = None
) -> None
- Type: str
Container image to use.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#image DataplexTask#image}
- Type: typing.List[str]
A list of Java JARs to add to the classpath.
Valid input includes Cloud Storage URIs to JAR binaries. For example, gs://bucket-name/my/path/to/file.jar
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#java_jars DataplexTask#java_jars}
- Type: typing.Mapping[str]
Override to common configuration of open source components installed on the Dataproc cluster.
The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#properties DataplexTask#properties}
- Type: typing.List[str]
A list of Python packages to be installed.
Valid formats include a Cloud Storage URI to a pip-installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#python_packages DataplexTask#python_packages}
def put_vpc_network(
network: str = None,
network_tags: typing.List[str] = None,
sub_network: str = None
) -> None
- Type: str
The Cloud VPC network in which the job is run.
By default, the Cloud VPC network named Default within the project is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#network DataplexTask#network}
- Type: typing.List[str]
List of network tags to apply to the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#network_tags DataplexTask#network_tags}
- Type: str
The Cloud VPC sub-network in which the job is run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#sub_network DataplexTask#sub_network}
def reset_batch() -> None
def reset_container_image() -> None
def reset_vpc_network() -> None
Name | Type | Description |
---|---|---|
creation_stack | typing.List[str] | The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn | str | No description. |
batch | DataplexTaskSparkInfrastructureSpecBatchOutputReference | No description. |
container_image | DataplexTaskSparkInfrastructureSpecContainerImageOutputReference | No description. |
vpc_network | DataplexTaskSparkInfrastructureSpecVpcNetworkOutputReference | No description. |
batch_input | DataplexTaskSparkInfrastructureSpecBatch | No description. |
container_image_input | DataplexTaskSparkInfrastructureSpecContainerImage | No description. |
vpc_network_input | DataplexTaskSparkInfrastructureSpecVpcNetwork | No description. |
internal_value | DataplexTaskSparkInfrastructureSpec | No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
batch: DataplexTaskSparkInfrastructureSpecBatchOutputReference
container_image: DataplexTaskSparkInfrastructureSpecContainerImageOutputReference
vpc_network: DataplexTaskSparkInfrastructureSpecVpcNetworkOutputReference
batch_input: DataplexTaskSparkInfrastructureSpecBatch
container_image_input: DataplexTaskSparkInfrastructureSpecContainerImage
vpc_network_input: DataplexTaskSparkInfrastructureSpecVpcNetwork
internal_value: DataplexTaskSparkInfrastructureSpec
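For orientation, a minimal sketch of how these properties are typically read (assuming the same hypothetical `task` resource): the plain properties return nested output references, while the `*_input` properties return the raw configured values, or `None` when unset:

```python
# Hypothetical sketch of reading the nested block properties.
infra = task.spark.infrastructure_spec   # assumed access path
batch_ref = infra.batch                  # nested output reference
configured_batch = infra.batch_input     # DataplexTaskSparkInfrastructureSpecBatch or None
```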
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkInfrastructureSpecVpcNetworkOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource | cdktf.IInterpolatingParent | The parent resource. |
terraform_attribute | str | The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn | No description. |
get_any_map_attribute | No description. |
get_boolean_attribute | No description. |
get_boolean_map_attribute | No description. |
get_list_attribute | No description. |
get_number_attribute | No description. |
get_number_list_attribute | No description. |
get_number_map_attribute | No description. |
get_string_attribute | No description. |
get_string_map_attribute | No description. |
interpolation_for_attribute | No description. |
resolve | Produce the Token's value at resolution time. |
to_string | Return a string representation of this resolvable object. |
reset_network | No description. |
reset_network_tags | No description. |
reset_sub_network | No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_network() -> None
def reset_network_tags() -> None
def reset_sub_network() -> None
Name | Type | Description |
---|---|---|
creation_stack | typing.List[str] | The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn | str | No description. |
network_input | str | No description. |
network_tags_input | typing.List[str] | No description. |
sub_network_input | str | No description. |
network | str | No description. |
network_tags | typing.List[str] | No description. |
sub_network | str | No description. |
internal_value | DataplexTaskSparkInfrastructureSpecVpcNetwork | No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
network_input: str
- Type: str
network_tags_input: typing.List[str]
- Type: typing.List[str]
sub_network_input: str
- Type: str
network: str
- Type: str
network_tags: typing.List[str]
- Type: typing.List[str]
sub_network: str
- Type: str
internal_value: DataplexTaskSparkInfrastructureSpecVpcNetwork
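A hedged sketch of working with this output reference directly (same hypothetical `task`); attribute values can be assigned instead of calling put_vpc_network, and optional values cleared with the reset_* methods:

```python
# Hypothetical sketch: set and clear VPC network attributes on the reference.
vpc = task.spark.infrastructure_spec.vpc_network  # assumed access path
vpc.network_tags = ["dataplex"]                   # override the configured tags
vpc.reset_sub_network()                           # clear an optional value
```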
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskSparkOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource | cdktf.IInterpolatingParent | The parent resource. |
terraform_attribute | str | The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn | No description. |
get_any_map_attribute | No description. |
get_boolean_attribute | No description. |
get_boolean_map_attribute | No description. |
get_list_attribute | No description. |
get_number_attribute | No description. |
get_number_list_attribute | No description. |
get_number_map_attribute | No description. |
get_string_attribute | No description. |
get_string_map_attribute | No description. |
interpolation_for_attribute | No description. |
resolve | Produce the Token's value at resolution time. |
to_string | Return a string representation of this resolvable object. |
put_infrastructure_spec | No description. |
reset_archive_uris | No description. |
reset_file_uris | No description. |
reset_infrastructure_spec | No description. |
reset_main_class | No description. |
reset_main_jar_file_uri | No description. |
reset_python_script_file | No description. |
reset_sql_script | No description. |
reset_sql_script_file | No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def put_infrastructure_spec(
batch: DataplexTaskSparkInfrastructureSpecBatch = None,
container_image: DataplexTaskSparkInfrastructureSpecContainerImage = None,
vpc_network: DataplexTaskSparkInfrastructureSpecVpcNetwork = None
) -> None
batch block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#batch DataplexTask#batch}
container_image block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#container_image DataplexTask#container_image}
vpc_network block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/6.37.0/docs/resources/dataplex_task#vpc_network DataplexTask#vpc_network}
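A hedged sketch of this method, composing the nested blocks with their config classes (module alias `dataplex_task` from the import above; values are placeholders):

```python
# Hypothetical sketch: attach an infrastructure_spec to the spark block.
task.spark.put_infrastructure_spec(
    batch=dataplex_task.DataplexTaskSparkInfrastructureSpecBatch(
        executors_count=2,
        max_executors_count=10,
    ),
    vpc_network=dataplex_task.DataplexTaskSparkInfrastructureSpecVpcNetwork(
        network_tags=["dataplex"],
    ),
)
```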
def reset_archive_uris() -> None
def reset_file_uris() -> None
def reset_infrastructure_spec() -> None
def reset_main_class() -> None
def reset_main_jar_file_uri() -> None
def reset_python_script_file() -> None
def reset_sql_script() -> None
def reset_sql_script_file() -> None
Name | Type | Description |
---|---|---|
creation_stack | typing.List[str] | The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn | str | No description. |
infrastructure_spec | DataplexTaskSparkInfrastructureSpecOutputReference | No description. |
archive_uris_input | typing.List[str] | No description. |
file_uris_input | typing.List[str] | No description. |
infrastructure_spec_input | DataplexTaskSparkInfrastructureSpec | No description. |
main_class_input | str | No description. |
main_jar_file_uri_input | str | No description. |
python_script_file_input | str | No description. |
sql_script_file_input | str | No description. |
sql_script_input | str | No description. |
archive_uris | typing.List[str] | No description. |
file_uris | typing.List[str] | No description. |
main_class | str | No description. |
main_jar_file_uri | str | No description. |
python_script_file | str | No description. |
sql_script | str | No description. |
sql_script_file | str | No description. |
internal_value | DataplexTaskSpark | No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
infrastructure_spec: DataplexTaskSparkInfrastructureSpecOutputReference
archive_uris_input: typing.List[str]
- Type: typing.List[str]
file_uris_input: typing.List[str]
- Type: typing.List[str]
infrastructure_spec_input: DataplexTaskSparkInfrastructureSpec
main_class_input: str
- Type: str
main_jar_file_uri_input: str
- Type: str
python_script_file_input: str
- Type: str
sql_script_file_input: str
- Type: str
sql_script_input: str
- Type: str
archive_uris: typing.List[str]
- Type: typing.List[str]
file_uris: typing.List[str]
- Type: typing.List[str]
main_class: str
- Type: str
main_jar_file_uri: str
- Type: str
python_script_file: str
- Type: str
sql_script: str
- Type: str
sql_script_file: str
- Type: str
internal_value: DataplexTaskSpark
- Type: DataplexTaskSpark
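As a hedged orientation (same hypothetical `task`): the Dataplex API typically expects the spark block to name a single entry point, so only one of main_jar_file_uri, main_class, python_script_file, sql_script or sql_script_file is usually set, and the others can be cleared via their reset_* methods:

```python
# Hypothetical sketch: drive the task from a Python script in Cloud Storage.
spark = task.spark                                                # assumed access path
spark.python_script_file = "gs://bucket-name/my/path/to/job.py"  # placeholder URI
spark.file_uris = ["gs://bucket-name/my/path/to/config.json"]    # placeholder URI
spark.reset_main_class()                                          # clear a competing entry point
```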
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskTimeoutsOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource | cdktf.IInterpolatingParent | The parent resource. |
terraform_attribute | str | The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn | No description. |
get_any_map_attribute | No description. |
get_boolean_attribute | No description. |
get_boolean_map_attribute | No description. |
get_list_attribute | No description. |
get_number_attribute | No description. |
get_number_list_attribute | No description. |
get_number_map_attribute | No description. |
get_string_attribute | No description. |
get_string_map_attribute | No description. |
interpolation_for_attribute | No description. |
resolve | Produce the Token's value at resolution time. |
to_string | Return a string representation of this resolvable object. |
reset_create | No description. |
reset_delete | No description. |
reset_update | No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_create() -> None
def reset_delete() -> None
def reset_update() -> None
Name | Type | Description |
---|---|---|
creation_stack | typing.List[str] | The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn | str | No description. |
create_input | str | No description. |
delete_input | str | No description. |
update_input | str | No description. |
create | str | No description. |
delete | str | No description. |
update | str | No description. |
internal_value | typing.Union[cdktf.IResolvable, DataplexTaskTimeouts] | No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
create_input: str
- Type: str
delete_input: str
- Type: str
update_input: str
- Type: str
create: str
- Type: str
delete: str
- Type: str
update: str
- Type: str
internal_value: typing.Union[IResolvable, DataplexTaskTimeouts]
- Type: typing.Union[cdktf.IResolvable, DataplexTaskTimeouts]
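A hedged sketch of adjusting operation timeouts through this reference (same hypothetical `task`); values are Terraform duration strings, and each can be cleared with its reset_* method:

```python
# Hypothetical sketch: lengthen create/update timeouts, fall back to the
# default for delete.
timeouts = task.timeouts      # assumed access path
timeouts.create = "30m"
timeouts.update = "20m"
timeouts.reset_delete()
```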
from cdktf_cdktf_provider_google import dataplex_task
dataplexTask.DataplexTaskTriggerSpecOutputReference(
terraform_resource: IInterpolatingParent,
terraform_attribute: str
)
Name | Type | Description |
---|---|---|
terraform_resource | cdktf.IInterpolatingParent | The parent resource. |
terraform_attribute | str | The attribute on the parent resource this class is referencing. |
- Type: cdktf.IInterpolatingParent
The parent resource.
- Type: str
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
compute_fqn | No description. |
get_any_map_attribute | No description. |
get_boolean_attribute | No description. |
get_boolean_map_attribute | No description. |
get_list_attribute | No description. |
get_number_attribute | No description. |
get_number_list_attribute | No description. |
get_number_map_attribute | No description. |
get_string_attribute | No description. |
get_string_map_attribute | No description. |
interpolation_for_attribute | No description. |
resolve | Produce the Token's value at resolution time. |
to_string | Return a string representation of this resolvable object. |
reset_disabled | No description. |
reset_max_retries | No description. |
reset_schedule | No description. |
reset_start_time | No description. |
def compute_fqn() -> str
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def interpolation_for_attribute(
property: str
) -> IResolvable
- Type: str
def resolve(
_context: IResolveContext
) -> typing.Any
Produce the Token's value at resolution time.
- Type: cdktf.IResolveContext
def to_string() -> str
Return a string representation of this resolvable object.
Returns a reversible string representation.
def reset_disabled() -> None
def reset_max_retries() -> None
def reset_schedule() -> None
def reset_start_time() -> None
Name | Type | Description |
---|---|---|
creation_stack | typing.List[str] | The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn | str | No description. |
disabled_input | typing.Union[bool, cdktf.IResolvable] | No description. |
max_retries_input | typing.Union[int, float] | No description. |
schedule_input | str | No description. |
start_time_input | str | No description. |
type_input | str | No description. |
disabled | typing.Union[bool, cdktf.IResolvable] | No description. |
max_retries | typing.Union[int, float] | No description. |
schedule | str | No description. |
start_time | str | No description. |
type | str | No description. |
internal_value | DataplexTaskTriggerSpec | No description. |
creation_stack: typing.List[str]
- Type: typing.List[str]
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
fqn: str
- Type: str
disabled_input: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
max_retries_input: typing.Union[int, float]
- Type: typing.Union[int, float]
schedule_input: str
- Type: str
start_time_input: str
- Type: str
type_input: str
- Type: str
disabled: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
max_retries: typing.Union[int, float]
- Type: typing.Union[int, float]
schedule: str
- Type: str
start_time: str
- Type: str
type: str
- Type: str
internal_value: DataplexTaskTriggerSpec
- Type: DataplexTaskTriggerSpec
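Finally, a hedged sketch of a recurring trigger configured through this reference (same hypothetical `task`); the `RECURRING` value and cron string are illustrative:

```python
# Hypothetical sketch: hourly recurring trigger with limited retries.
trigger = task.trigger_spec      # assumed access path
trigger.type = "RECURRING"
trigger.schedule = "0 * * * *"   # cron format, hourly
trigger.max_retries = 3
trigger.reset_start_time()       # clear the optional start time
```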