Instance(mapping=None, *, ignore_unknown_fields=False, **kwargs)
Represents a Data Fusion instance.
Attributes

| Name | Type | Description |
|---|---|---|
| name | str | Output only. The name of this instance is in the form of projects/{project}/locations/{location}/instances/{instance}. |
| description | str | A description of this instance. |
| type_ | google.cloud.data_fusion_v1.types.Instance.Type | Required. Instance type. |
| enable_stackdriver_logging | bool | Option to enable Stackdriver Logging. |
| enable_stackdriver_monitoring | bool | Option to enable Stackdriver Monitoring. |
| private_instance | bool | Specifies whether the Data Fusion instance should be private. If set to true, all Data Fusion nodes will have private IP addresses and will not be able to access the public internet. |
| network_config | google.cloud.data_fusion_v1.types.NetworkConfig | Network configuration options. These are required when a private Data Fusion instance is to be created. |
| labels | MutableMapping[str, str] | The resource labels for an instance to use to annotate any related underlying resources such as Compute Engine VMs. The character '=' is not allowed within the labels. |
| options | MutableMapping[str, str] | Map of additional options used to configure the behavior of the Data Fusion instance. |
| create_time | google.protobuf.timestamp_pb2.Timestamp | Output only. The time the instance was created. |
| update_time | google.protobuf.timestamp_pb2.Timestamp | Output only. The time the instance was last updated. |
| state | google.cloud.data_fusion_v1.types.Instance.State | Output only. The current state of this Data Fusion instance. |
| state_message | str | Output only. Additional information about the current state of this Data Fusion instance, if available. |
| service_endpoint | str | Output only. Endpoint on which the Data Fusion UI is accessible. |
| zone | str | Name of the zone in which the Data Fusion instance will be created. Only DEVELOPER instances use this field. |
| version | str | Current version of Data Fusion. Only specifiable in Update. |
| service_account | str | Output only. Deprecated. Use tenant_project_id instead to extract the tenant project ID. |
| display_name | str | Display name for an instance. |
| available_version | MutableSequence[google.cloud.data_fusion_v1.types.Version] | Available versions that the instance can be upgraded to using UpdateInstanceRequest. |
| api_endpoint | str | Output only. Endpoint on which the REST APIs are accessible. |
| gcs_bucket | str | Output only. Cloud Storage bucket generated by Data Fusion in the customer project. |
| accelerators | MutableSequence[google.cloud.data_fusion_v1.types.Accelerator] | List of accelerators enabled for this CDF instance. |
| p4_service_account | str | Output only. P4 service account for the customer project. |
| tenant_project_id | str | Output only. The name of the tenant project. |
| dataproc_service_account | str | User-managed service account to set on Dataproc when Cloud Data Fusion creates Dataproc to run data processing pipelines. This allows users to have fine-grained access control over Dataproc's access to cloud resources. |
| enable_rbac | bool | Option to enable granular role-based access control. |
| crypto_key_config | google.cloud.data_fusion_v1.types.CryptoKeyConfig | The crypto key configuration. This field is used by the Customer-Managed Encryption Keys (CMEK) feature. |
| disabled_reason | MutableSequence[google.cloud.data_fusion_v1.types.Instance.DisabledReason] | Output only. If the instance state is DISABLED, the reason for disabling the instance. |
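As a quick orientation, here is a minimal, hedged sketch of populating the caller-settable fields of an Instance message before passing it to a create or update request. All concrete values (description, labels) are placeholders, and output-only fields are omitted because the service sets them.

```python
from google.cloud import data_fusion_v1

# Minimal sketch: populate user-settable fields of an Instance message.
# The description and label values below are placeholders.
instance = data_fusion_v1.Instance(
    type_=data_fusion_v1.Instance.Type.ENTERPRISE,  # Required.
    description="Example pipeline environment",
    enable_stackdriver_logging=True,
    enable_stackdriver_monitoring=True,
    labels={"env": "dev"},  # '=' is not allowed within labels.
)

# Output-only fields (name, state, service_endpoint, create_time, ...) are
# populated by the service and should not be set by the caller.
```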
Classes
DisabledReason
DisabledReason(value)
The reason for disabling the instance if the state is DISABLED.
Values:
- DISABLED_REASON_UNSPECIFIED (0): This is an unknown reason for disabling.
- KMS_KEY_ISSUE (1): The KMS key used by the instance is either revoked or denied access to.
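As a rough sketch of how this enum might be consumed, the helper below translates disabled_reason values into messages; the function itself is illustrative, not part of the library.

```python
from google.cloud import data_fusion_v1

DisabledReason = data_fusion_v1.Instance.DisabledReason

def explain_disablement(instance: data_fusion_v1.Instance) -> list[str]:
    """Illustrative helper: describe why an instance is disabled."""
    messages = []
    for reason in instance.disabled_reason:  # Repeated enum field.
        if reason == DisabledReason.KMS_KEY_ISSUE:
            messages.append("KMS key is revoked or access to it is denied.")
        else:
            messages.append(f"Disabled for unspecified reason: {reason!r}")
    return messages
```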
LabelsEntry
LabelsEntry(mapping=None, *, ignore_unknown_fields=False, **kwargs)
The abstract base class for a message.
Parameters

| Name | Type | Description |
|---|---|---|
| kwargs | dict | Keys and values corresponding to the fields of the message. |
| mapping | Union[dict, Message] | A dictionary or message to be used to determine the values for this message. |
| ignore_unknown_fields | Optional(bool) | If True, do not raise errors for unknown fields. Only applied if `mapping` is a mapping type or there are keyword parameters. |
OptionsEntry
OptionsEntry(mapping=None, *, ignore_unknown_fields=False, **kwargs)
The abstract base class for a message.
Parameters

| Name | Type | Description |
|---|---|---|
| kwargs | dict | Keys and values corresponding to the fields of the message. |
| mapping | Union[dict, Message] | A dictionary or message to be used to determine the values for this message. |
| ignore_unknown_fields | Optional(bool) | If True, do not raise errors for unknown fields. Only applied if `mapping` is a mapping type or there are keyword parameters. |
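In practice, the LabelsEntry and OptionsEntry map-entry classes are rarely constructed directly. A small sketch, assuming the usual proto-plus map behavior, where labels and options are supplied as plain dicts (the key/value pairs are placeholders):

```python
from google.cloud import data_fusion_v1

# Sketch only: map fields accept plain dicts, so the generated *Entry
# classes normally do not need to be instantiated by callers.
instance = data_fusion_v1.Instance(
    labels={"team": "analytics"},
    options={"example-option": "value"},  # Placeholder key/value.
)
print(dict(instance.labels))  # {'team': 'analytics'}
```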
State
State(value)
Represents the state of a Data Fusion instance.
Values:
- STATE_UNSPECIFIED (0): Instance does not have a state yet.
- CREATING (1): Instance is being created.
- ACTIVE (2): Instance is active and ready for requests. This corresponds to 'RUNNING' in datafusion.v1beta1.
- FAILED (3): Instance creation failed.
- DELETING (4): Instance is being deleted.
- UPGRADING (5): Instance is being upgraded.
- RESTARTING (6): Instance is being restarted.
- UPDATING (7): Instance is being updated on customer request.
- AUTO_UPDATING (8): Instance is being auto-updated.
- AUTO_UPGRADING (9): Instance is being auto-upgraded.
- DISABLED (10): Instance is disabled.
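A small sketch of using the State enum to decide whether an instance is still transitioning. The grouping of "in-progress" states below is an assumption derived from the value descriptions, not something defined by the API.

```python
from google.cloud import data_fusion_v1

State = data_fusion_v1.Instance.State

# Assumed grouping of in-progress states, based on the descriptions above.
TRANSITIONAL_STATES = frozenset({
    State.CREATING,
    State.DELETING,
    State.UPGRADING,
    State.RESTARTING,
    State.UPDATING,
    State.AUTO_UPDATING,
    State.AUTO_UPGRADING,
})

def is_ready(instance: data_fusion_v1.Instance) -> bool:
    """True when the instance is ACTIVE (equivalent to v1beta1 'RUNNING')."""
    return instance.state == State.ACTIVE

def is_transitioning(instance: data_fusion_v1.Instance) -> bool:
    """True while the instance is in any of the in-progress states."""
    return instance.state in TRANSITIONAL_STATES
```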
Type
Type(value)
Represents the type of Data Fusion instance. Each type is configured with the default settings for processing and memory.
Values:
- TYPE_UNSPECIFIED (0): No type specified. The instance creation will fail.
- BASIC (1): Basic Data Fusion instance. In Basic type, the user will be able to create data pipelines using a point-and-click UI. However, there are certain limitations, such as fewer concurrent pipelines and no support for streaming pipelines.
- ENTERPRISE (2): Enterprise Data Fusion instance. In Enterprise type, the user will have all features available, such as support for streaming pipelines and a higher number of concurrent pipelines.
- DEVELOPER (3): Developer Data Fusion instance. In Developer type, the user will have all features available but with restricted capabilities. This helps enterprises design and develop their data ingestion and integration pipelines at low cost.
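A hedged sketch of choosing a Type when building an instance; the decision rules are only an illustration of the trade-offs described above, not logic provided by the library.

```python
from google.cloud import data_fusion_v1

Type = data_fusion_v1.Instance.Type

def choose_type(is_dev_sandbox: bool, needs_streaming: bool) -> data_fusion_v1.Instance.Type:
    """Illustrative only: pick an instance type from the feature notes above."""
    if is_dev_sandbox:
        # DEVELOPER offers all features with restricted capacity, at low cost.
        return Type.DEVELOPER
    if needs_streaming:
        # BASIC does not support streaming pipelines.
        return Type.ENTERPRISE
    return Type.BASIC

instance = data_fusion_v1.Instance(type_=choose_type(False, True))
```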