Reference documentation and code samples for the google-cloud-bigquery class Google::Cloud::Bigquery::CopyJob::Updater.
Yielded to a block to accumulate changes for an API request.
Inherits
Google::Cloud::Bigquery::CopyJob
Methods
#cancel
def cancel()
#create=
def create=(new_create)
Sets the create disposition.
This specifies whether the job is allowed to create new tables. The default value is needed.
The following values are supported:
- needed - Create the table if it does not exist.
- never - The table must already exist. A 'notFound' error is raised if the table does not exist.
- new_create (String) — The new create disposition.
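For example, a minimal sketch of setting the create disposition (the dataset and table names here are placeholders, not part of the reference):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

# Allow the copy job to create the destination table if it does not exist.
job = table.copy_job "my_dataset.new_table" do |j|
  j.create = "needed"
end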
#encryption=
def encryption=(val)
Sets the encryption configuration of the destination table.
- val (Google::Cloud::Bigquery::EncryptionConfiguration) — Custom encryption configuration (e.g., Cloud KMS keys).
require "google/cloud/bigquery" bigquery = Google::Cloud::Bigquery.new dataset = bigquery.dataset "my_dataset" table = dataset.table "my_table" key_name = "projects/a/locations/b/keyRings/c/cryptoKeys/d" encrypt_config = bigquery.encryption kms_key: key_name job = table.copy_job "my_dataset.new_table" do |job| job.encryption = encrypt_config end
#labels=
def labels=(value)
Sets the labels to use for the job.
- value (Hash) — A hash of user-provided labels associated with the job. You can use these to organize and group your jobs.
The labels applied to a resource must meet the following requirements:
- Each resource can have multiple labels, up to a maximum of 64.
- Each label must be a key-value pair.
- Keys have a minimum length of 1 character and a maximum length of 63 characters, and cannot be empty. Values can be empty, and have a maximum length of 63 characters.
- Keys and values can contain only lowercase letters, numeric characters, underscores, and dashes. All characters must use UTF-8 encoding, and international characters are allowed.
- The key portion of a label must be unique. However, you can use the same key with multiple resources.
- Keys must start with a lowercase letter or international character.
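For example, a minimal sketch of attaching labels to a copy job (the dataset, table, and label values here are placeholders, not part of the reference):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

job = table.copy_job "my_dataset.new_table" do |j|
  # Keys and values must follow the label requirements listed above.
  j.labels = { "env" => "staging", "team" => "analytics" }
end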
#location=
def location=(value)
Sets the geographic location where the job should run. Required except for US and EU.
- value (String) — A geographic location, such as "US", "EU" or "asia-northeast1". Required except for US and EU.
require "google/cloud/bigquery" bigquery = Google::Cloud::Bigquery.new dataset = bigquery.dataset "my_dataset" table = dataset.table "my_table" destination_table = dataset.table "my_destination_table" copy_job = table.copy_job destination_table do |j| j.location = "EU" end copy_job.wait_until_done! copy_job.done? #=> true
#refresh!
def refresh!()
#reload!
def reload!()
#rerun!
def rerun!()
#wait_until_done!
def wait_until_done!()
#write=
def write=(new_write)
Sets the write disposition.
This specifies how to handle data already present in the destination table. The default value is empty.
The following values are supported:
- truncate - BigQuery overwrites the table data.
- append - BigQuery appends the data to the table.
- empty - An error will be returned if the table already contains data.
- new_write (String) — The new write disposition.
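For example, a minimal sketch of setting the write disposition (the dataset and table names here are placeholders, not part of the reference):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"

# Overwrite any data already present in the destination table.
job = table.copy_job "my_dataset.new_table" do |j|
  j.write = "truncate"
end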