StorageTransferJob

Property Value
Google Cloud Service Name: Storage Transfer
Google Cloud Service Documentation: /storage-transfer/docs/
Google Cloud REST Resource Name: v1.transferJobs
Google Cloud REST Resource Documentation: /storage-transfer/docs/reference/rest/v1/transferJobs
Config Connector Resource Short Names: gcpstoragetransferjob, gcpstoragetransferjobs, storagetransferjob
Config Connector Service Name: storagetransfer.googleapis.com
Config Connector Resource Fully Qualified Name: storagetransferjobs.storagetransfer.cnrm.cloud.google.com
Can Be Referenced by IAMPolicy/IAMPolicyMember: No

Custom Resource Definition Properties

Annotations

Fields
cnrm.cloud.google.com/project-id

Spec

Schema

  description: string
  schedule:
    scheduleEndDate:
      day: integer
      month: integer
      year: integer
    scheduleStartDate:
      day: integer
      month: integer
      year: integer
    startTimeOfDay:
      hours: integer
      minutes: integer
      nanos: integer
      seconds: integer
  status: string
  transferSpec:
    awsS3DataSource:
      awsAccessKey:
        accessKeyId:
          value: string
          valueFrom:
            secretKeyRef:
              key: string
              name: string
        secretAccessKey:
          value: string
          valueFrom:
            secretKeyRef:
              key: string
              name: string
      bucketName: string
    gcsDataSink:
      bucketRef:
        external: string
        name: string
        namespace: string
    gcsDataSource:
      bucketRef:
        external: string
        name: string
        namespace: string
    httpDataSource:
      listUrl: string
    objectConditions:
      excludePrefixes:
      - string
      includePrefixes:
      - string
      maxTimeElapsedSinceLastModification: string
      minTimeElapsedSinceLastModification: string
    transferOptions:
      deleteObjectsFromSourceAfterTransfer: boolean
      deleteObjectsUniqueInSink: boolean
      overwriteObjectsAlreadyExistingInSink: boolean
Fields

description

Required

string

Unique description to identify the Transfer Job.

schedule

Required

object

Schedule specification defining when the Transfer Job should start, when it should end, and what time of day it should run.

schedule.scheduleEndDate

Optional

object

The last day the recurring transfer will be run. If schedule_end_date is the same as schedule_start_date, the transfer will be executed only once.
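For example, a one-time transfer can be expressed by making the end date equal to the start date (the dates below are illustrative):

  schedule:
    scheduleStartDate:
      day: 15
      month: 6
      year: 2020
    scheduleEndDate:
      day: 15
      month: 6
      year: 2020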

schedule.scheduleEndDate.day

Required*

integer

Day of month. Must be from 1 to 31 and valid for the year and month.

schedule.scheduleEndDate.month

Required*

integer

Month of year. Must be from 1 to 12.

schedule.scheduleEndDate.year

Required*

integer

Year of date. Must be from 1 to 9999.

schedule.scheduleStartDate

Required

object

The first day the recurring transfer is scheduled to run. If schedule_start_date is in the past, the transfer will run for the first time on the following day.

schedule.scheduleStartDate.day

Required

integer

Day of month. Must be from 1 to 31 and valid for the year and month.

schedule.scheduleStartDate.month

Required

integer

Month of year. Must be from 1 to 12.

schedule.scheduleStartDate.year

Required

integer

Year of date. Must be from 1 to 9999.

schedule.startTimeOfDay

Optional

object

The time in UTC at which the transfer will be scheduled to start in a day. Transfers may start later than this time. If not specified, recurring and one-time transfers that are scheduled to run today will run immediately; recurring transfers that are scheduled to run on a future date will start at approximately midnight UTC on that date. Note that when configuring a transfer with the Cloud Platform Console, the transfer's start time in a day is specified in your local timezone.
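As a sketch, a job that should start at 02:30 UTC each day would set (values are illustrative):

  schedule:
    startTimeOfDay:
      hours: 2
      minutes: 30
      seconds: 0
      nanos: 0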

schedule.startTimeOfDay.hours

Required*

integer

Hours of day in 24 hour format. Should be from 0 to 23.

schedule.startTimeOfDay.minutes

Required*

integer

Minutes of hour of day. Must be from 0 to 59.

schedule.startTimeOfDay.nanos

Required*

integer

Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.

schedule.startTimeOfDay.seconds

Required*

integer

Seconds of the minute. Must normally be from 0 to 59.

status

Optional

string

Status of the job. Default: ENABLED. NOTE: The effect of the new job status takes place during a subsequent job run. For example, if you change the job status from ENABLED to DISABLED, and an operation spawned by the transfer is running, the status change would not affect the current operation.

transferSpec

Required

object

Transfer specification.

transferSpec.awsS3DataSource

Optional

object

An AWS S3 data source.

transferSpec.awsS3DataSource.awsAccessKey

Required*

object

AWS credentials block.

transferSpec.awsS3DataSource.awsAccessKey.accessKeyId

Required*

object

AWS Key ID.

transferSpec.awsS3DataSource.awsAccessKey.accessKeyId.value

Optional

string

Value of the field. Cannot be used if 'valueFrom' is specified.

transferSpec.awsS3DataSource.awsAccessKey.accessKeyId.valueFrom

Optional

object

Source for the field's value. Cannot be used if 'value' is specified.

transferSpec.awsS3DataSource.awsAccessKey.accessKeyId.valueFrom.secretKeyRef

Optional

object

Reference to a value with the given key in the given Secret in the resource's namespace.

transferSpec.awsS3DataSource.awsAccessKey.accessKeyId.valueFrom.secretKeyRef.key

Required*

string

Key that identifies the value to be extracted.

transferSpec.awsS3DataSource.awsAccessKey.accessKeyId.valueFrom.secretKeyRef.name

Required*

string

Name of the Secret to extract a value from.

transferSpec.awsS3DataSource.awsAccessKey.secretAccessKey

Required*

object

AWS Secret Access Key.

transferSpec.awsS3DataSource.awsAccessKey.secretAccessKey.value

Optional

string

Value of the field. Cannot be used if 'valueFrom' is specified.

transferSpec.awsS3DataSource.awsAccessKey.secretAccessKey.valueFrom

Optional

object

Source for the field's value. Cannot be used if 'value' is specified.

transferSpec.awsS3DataSource.awsAccessKey.secretAccessKey.valueFrom.secretKeyRef

Optional

object

Reference to a value with the given key in the given Secret in the resource's namespace.

transferSpec.awsS3DataSource.awsAccessKey.secretAccessKey.valueFrom.secretKeyRef.key

Required*

string

Key that identifies the value to be extracted.

transferSpec.awsS3DataSource.awsAccessKey.secretAccessKey.valueFrom.secretKeyRef.name

Required*

string

Name of the Secret to extract a value from.

transferSpec.awsS3DataSource.bucketName

Required*

string

S3 Bucket name.
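Putting the fields above together, an awsS3DataSource that reads its credentials from a Kubernetes Secret might look like the following sketch; the Secret name (aws-credentials), its keys, and the bucket name are placeholders, not values defined by this resource:

  transferSpec:
    awsS3DataSource:
      bucketName: my-aws-bucket
      awsAccessKey:
        accessKeyId:
          valueFrom:
            secretKeyRef:
              name: aws-credentials
              key: access_key_id
        secretAccessKey:
          valueFrom:
            secretKeyRef:
              name: aws-credentials
              key: secret_access_key

Referencing a Secret via valueFrom keeps the AWS credentials out of the resource manifest; alternatively, either field may set value directly, but value and valueFrom cannot be combined on the same field.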

transferSpec.gcsDataSink

Optional

object

A Google Cloud Storage data sink.

transferSpec.gcsDataSink.bucketRef

Required*

object

transferSpec.gcsDataSink.bucketRef.external

Optional

string

The name of a StorageBucket.

transferSpec.gcsDataSink.bucketRef.name

Optional

string

Name of the referent. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#names

transferSpec.gcsDataSink.bucketRef.namespace

Optional

string

Namespace of the referent. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/namespaces/

transferSpec.gcsDataSource

Optional

object

A Google Cloud Storage data source.

transferSpec.gcsDataSource.bucketRef

Required*

object

transferSpec.gcsDataSource.bucketRef.external

Optional

string

The name of a StorageBucket.

transferSpec.gcsDataSource.bucketRef.name

Optional

string

Name of the referent. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#names

transferSpec.gcsDataSource.bucketRef.namespace

Optional

string

Namespace of the referent. More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/namespaces/

transferSpec.httpDataSource

Optional

object

An HTTP URL data source.

transferSpec.httpDataSource.listUrl

Required*

string

The URL that points to the file that stores the object list entries. This file must allow public access. Currently, only URLs with HTTP and HTTPS schemes are supported.
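The referenced file is expected to follow the Storage Transfer Service URL-list convention (TsvHttpData-1.0): a version header line, then one tab-separated entry per object giving the URL, the size in bytes, and the base64-encoded MD5 hash. The URL, size, and hash below are placeholders; consult the Storage Transfer documentation for the authoritative format:

  TsvHttpData-1.0
  https://example.com/data/file1.csv	1357	BASE64_MD5_HASH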

transferSpec.objectConditions

Optional

object

Only objects that satisfy these object conditions are included in the set of data source and data sink objects. Object conditions based on objects' last_modification_time do not exclude objects in a data sink.

transferSpec.objectConditions.excludePrefixes

Optional

list (string)

exclude_prefixes must follow the requirements described for include_prefixes.

transferSpec.objectConditions.excludePrefixes.[]

Optional

string

transferSpec.objectConditions.includePrefixes

Optional

list (string)

If include_prefixes is specified, objects that satisfy the object conditions must have names that start with one of the include_prefixes and that do not start with any of the exclude_prefixes. If include_prefixes is not specified, all objects except those that have names starting with one of the exclude_prefixes must satisfy the object conditions.

transferSpec.objectConditions.includePrefixes.[]

Optional

string

transferSpec.objectConditions.maxTimeElapsedSinceLastModification

Optional

string

A duration in seconds with up to nine fractional digits, terminated by 's'. Example: "3.5s".

transferSpec.objectConditions.minTimeElapsedSinceLastModification

Optional

string

A duration in seconds with up to nine fractional digits, terminated by 's'. Example: "3.5s".
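As a sketch, an objectConditions block that transfers only objects under logs/ modified within the last day, while skipping logs/tmp/, could look like this (the prefixes and duration are illustrative):

  transferSpec:
    objectConditions:
      includePrefixes:
      - logs/
      excludePrefixes:
      - logs/tmp/
      maxTimeElapsedSinceLastModification: "86400s"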

transferSpec.transferOptions

Optional

object

Characteristics of how to treat files from the data source and data sink during the job. If the option delete_objects_unique_in_sink is true, object conditions based on objects' last_modification_time are ignored and do not exclude objects in a data source or a data sink.

transferSpec.transferOptions.deleteObjectsFromSourceAfterTransfer

Optional

boolean

Whether objects should be deleted from the source after they are transferred to the sink. Note that this option and delete_objects_unique_in_sink are mutually exclusive.

transferSpec.transferOptions.deleteObjectsUniqueInSink

Optional

boolean

Whether objects that exist only in the sink should be deleted. Note that this option and delete_objects_from_source_after_transfer are mutually exclusive.

transferSpec.transferOptions.overwriteObjectsAlreadyExistingInSink

Optional

boolean

Whether overwriting objects that already exist in the sink is allowed.
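For example, a "move"-style job that removes source objects once they have been transferred might set the options below; deleteObjectsUniqueInSink must then be left unset or false, since the two deletion options are mutually exclusive:

  transferSpec:
    transferOptions:
      overwriteObjectsAlreadyExistingInSink: true
      deleteObjectsFromSourceAfterTransfer: true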

* Field is required when parent field is specified

Status

Schema

  conditions:
  - lastTransitionTime: string
    message: string
    reason: string
    status: string
    type: string
  creationTime: string
  deletionTime: string
  lastModificationTime: string
  name: string
Fields
conditions

list (object)

Conditions represents the latest available observation of the resource's current state.

conditions.[]

object

conditions.[].lastTransitionTime

string

Last time the condition transitioned from one status to another.

conditions.[].message

string

Human-readable message indicating details about last transition.

conditions.[].reason

string

Unique, one-word, CamelCase reason for the condition's last transition.

conditions.[].status

string

Status is the status of the condition. Can be True, False, Unknown.

conditions.[].type

string

Type is the type of the condition.

creationTime

string

When the Transfer Job was created.

deletionTime

string

When the Transfer Job was deleted.

lastModificationTime

string

When the Transfer Job was last modified.

name

string

The name of the Transfer Job.

Sample YAML(s)

Typical Use Case

  # Copyright 2020 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
  # You may obtain a copy of the License at
  #
  #     http://www.apache.org/licenses/LICENSE-2.0
  #
  # Unless required by applicable law or agreed to in writing, software
  # distributed under the License is distributed on an "AS IS" BASIS,
  # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  # See the License for the specific language governing permissions and
  # limitations under the License.
  
  apiVersion: storagetransfer.cnrm.cloud.google.com/v1beta1
  kind: StorageTransferJob
  metadata:
    name: storagetransferjob-sample
  spec:
    description: "Sample storage transfer job"
    schedule:
      startTimeOfDay:
        seconds: 0
        hours: 0
        minutes: 0
        nanos: 0
      scheduleEndDate:
        day: 31
        month: 12
        year: 9999
      scheduleStartDate:
        day: 28
        month: 1
        year: 2020
    status: ENABLED
    transferSpec:
      gcsDataSink:
        bucketRef:
          name: ${PROJECT_ID?}-storagetransferjob-dep1
      gcsDataSource:
        bucketRef:
          name: ${PROJECT_ID?}-storagetransferjob-dep2
      objectConditions:
        maxTimeElapsedSinceLastModification: 5s
        minTimeElapsedSinceLastModification: 2s
      transferOptions:
        deleteObjectsUniqueInSink: false
        overwriteObjectsAlreadyExistingInSink: true
  ---
  apiVersion: iam.cnrm.cloud.google.com/v1beta1
  kind: IAMPolicyMember
  metadata:
    name: storagetransferjob-dep1
  spec:
    # replace ${PROJECT_NUMBER?} with your project number
    member: serviceAccount:project-${PROJECT_NUMBER?}@storage-transfer-service.iam.gserviceaccount.com
    role: roles/storage.admin
    resourceRef:
      apiVersion: storage.cnrm.cloud.google.com/v1beta1
      kind: StorageBucket
      name: ${PROJECT_ID?}-storagetransferjob-dep1
  ---
  apiVersion: iam.cnrm.cloud.google.com/v1beta1
  kind: IAMPolicyMember
  metadata:
    name: storagetransferjob-dep2
  spec:
    # replace ${PROJECT_NUMBER?} with your project number
    member: serviceAccount:project-${PROJECT_NUMBER?}@storage-transfer-service.iam.gserviceaccount.com
    role: roles/storage.admin
    resourceRef:
      apiVersion: storage.cnrm.cloud.google.com/v1beta1
      kind: StorageBucket
      name: ${PROJECT_ID?}-storagetransferjob-dep2
  ---
  apiVersion: storage.cnrm.cloud.google.com/v1beta1
  kind: StorageBucket
  metadata:
    # StorageBucket names must be globally unique. Replace ${PROJECT_ID?} with your project ID.
    name: ${PROJECT_ID?}-storagetransferjob-dep1
  ---
  apiVersion: storage.cnrm.cloud.google.com/v1beta1
  kind: StorageBucket
  metadata:
    # StorageBucket names must be globally unique. Replace ${PROJECT_ID?} with your project ID.
    name: ${PROJECT_ID?}-storagetransferjob-dep2