Reference documentation and code samples for the Dataplex V1 API class Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec.
Job specification for a metadata import job.
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
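The fields documented below can be set individually or at construction time. A minimal sketch of building the message, assuming the google-cloud-dataplex-v1 gem is installed; the Cloud Storage URI is a placeholder:

```ruby
require "google/cloud/dataplex/v1"

# Sketch only: the enum values mirror the field descriptions below, and the
# bucket URI is a placeholder.
spec = Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec.new(
  entry_sync_mode:    :FULL,        # only FULL is supported for entries
  aspect_sync_mode:   :INCREMENTAL, # only INCREMENTAL is supported for aspects
  log_level:          :INFO,
  source_storage_uri: "gs://my-bucket/metadata-import/"
)
```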
Methods
#aspect_sync_mode
def aspect_sync_mode() -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for aspects. Only INCREMENTAL mode is supported for aspects. An aspect is modified only if the metadata import file includes a reference to the aspect in the update_mask field and the aspect_keys field.
#aspect_sync_mode=
def aspect_sync_mode=(value) -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode
- value (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for aspects. Only INCREMENTAL mode is supported for aspects. An aspect is modified only if the metadata import file includes a reference to the aspect in the update_mask field and the aspect_keys field.
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for aspects. Only INCREMENTAL mode is supported for aspects. An aspect is modified only if the metadata import file includes a reference to the aspect in the update_mask field and the aspect_keys field.
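For example, a sketch of using the writer and reader, assuming spec is an existing ImportJobSpec instance:

```ruby
# `spec` is assumed to be an ImportJobSpec; enum fields accept symbols.
spec.aspect_sync_mode = :INCREMENTAL
spec.aspect_sync_mode # => :INCREMENTAL
```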
#entry_sync_mode
def entry_sync_mode() -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for entries. Only FULL mode is supported for entries. All entries in the job's scope are modified. If an entry exists in Dataplex but isn't included in the metadata import file, the entry is deleted when you run the metadata job.
#entry_sync_mode=
def entry_sync_mode=(value) -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode
- value (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for entries. Only FULL mode is supported for entries. All entries in the job's scope are modified. If an entry exists in Dataplex but isn't included in the metadata import file, the entry is deleted when you run the metadata job.
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for entries. Only FULL mode is supported for entries. All entries in the job's scope are modified. If an entry exists in Dataplex but isn't included in the metadata import file, the entry is deleted when you run the metadata job.
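A sketch of the corresponding call, again assuming an existing spec instance:

```ruby
# Entries support only FULL sync mode.
spec.entry_sync_mode = :FULL
spec.entry_sync_mode # => :FULL
```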
#log_level
def log_level() -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel) — Optional. The level of logs to write to Cloud Logging for this job. Debug-level logs provide highly detailed information for troubleshooting, but their increased verbosity could incur additional costs that might not be merited for all jobs. If unspecified, defaults to INFO.
#log_level=
def log_level=(value) -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel
- value (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel) — Optional. The level of logs to write to Cloud Logging for this job. Debug-level logs provide highly detailed information for troubleshooting, but their increased verbosity could incur additional costs that might not be merited for all jobs. If unspecified, defaults to INFO.
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel) — Optional. The level of logs to write to Cloud Logging for this job. Debug-level logs provide highly detailed information for troubleshooting, but their increased verbosity could incur additional costs that might not be merited for all jobs. If unspecified, defaults to INFO.
#scope
def scope() -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope) — Required. A boundary on the scope of impact that the metadata import job can have.
#scope=
def scope=(value) -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope
- value (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope) — Required. A boundary on the scope of impact that the metadata import job can have.
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope) — Required. A boundary on the scope of impact that the metadata import job can have.
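A sketch of assigning a scope; the ImportJobScope field name (entry_groups) and the resource name used here are assumptions for illustration, not part of this page:

```ruby
# Hypothetical scope: the field name and resource name are illustrative only.
spec.scope = Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope.new(
  entry_groups: ["projects/my-project/locations/us-central1/entryGroups/my-entry-group"]
)
```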
#source_create_time
def source_create_time() -> ::Google::Protobuf::Timestamp
- (::Google::Protobuf::Timestamp) — Optional. The time when the process that created the metadata import files began.
#source_create_time=
def source_create_time=(value) -> ::Google::Protobuf::Timestamp
- value (::Google::Protobuf::Timestamp) — Optional. The time when the process that created the metadata import files began.
- (::Google::Protobuf::Timestamp) — Optional. The time when the process that created the metadata import files began.
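A sketch of recording when the upstream export began, using a placeholder time:

```ruby
# Placeholder timestamp for when the process that produced the import
# files started.
spec.source_create_time = Google::Protobuf::Timestamp.new(
  seconds: Time.utc(2024, 1, 1).to_i
)
```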
#source_storage_uri
def source_storage_uri() -> ::String
- (::String) — Optional. The URI of a Cloud Storage bucket or folder (beginning with gs:// and ending with /) that contains the metadata import files for this job.
A metadata import file defines the values to set for each of the entries and aspects in a metadata job. For more information about how to create a metadata import file and the file requirements, see Metadata import file.
You can provide multiple metadata import files in the same metadata job. The bucket or folder must contain at least one metadata import file, in JSON Lines format (either the .json or .jsonl file extension).
In FULL entry sync mode, don't save the metadata import file in a folder named SOURCE_STORAGE_URI/deletions/.
Caution: If the metadata import file contains no data, all entries and aspects that belong to the job's scope are deleted.
#source_storage_uri=
def source_storage_uri=(value) -> ::String
- value (::String) — Optional. The URI of a Cloud Storage bucket or folder (beginning with gs:// and ending with /) that contains the metadata import files for this job.
A metadata import file defines the values to set for each of the entries and aspects in a metadata job. For more information about how to create a metadata import file and the file requirements, see Metadata import file.
You can provide multiple metadata import files in the same metadata job. The bucket or folder must contain at least one metadata import file, in JSON Lines format (either the .json or .jsonl file extension).
In FULL entry sync mode, don't save the metadata import file in a folder named SOURCE_STORAGE_URI/deletions/.
Caution: If the metadata import file contains no data, all entries and aspects that belong to the job's scope are deleted.
- (::String) — Optional. The URI of a Cloud Storage bucket or folder (beginning with gs:// and ending with /) that contains the metadata import files for this job.
A metadata import file defines the values to set for each of the entries and aspects in a metadata job. For more information about how to create a metadata import file and the file requirements, see Metadata import file.
You can provide multiple metadata import files in the same metadata job. The bucket or folder must contain at least one metadata import file, in JSON Lines format (either the .json or .jsonl file extension).
In FULL entry sync mode, don't save the metadata import file in a folder named SOURCE_STORAGE_URI/deletions/.
Caution: If the metadata import file contains no data, all entries and aspects that belong to the job's scope are deleted.
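A sketch with a placeholder bucket and folder; note the gs:// prefix and the trailing slash required by the description above:

```ruby
# Placeholder URI: must begin with gs:// and end with /.
spec.source_storage_uri = "gs://my-bucket/metadata-import/"
```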