Create and manage data transfers programmatically

This page shows how to use Storage Transfer Service directly through the REST API, and programmatically with Java and Python, in several common scenarios. To create a transfer job using the Google Cloud Console, see Creating and managing transfers with the console.

When you configure or edit transfer jobs programmatically with the Storage Transfer API, the time must be in UTC. For more information about specifying a transfer job's schedule, see Schedule.
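For example, a local start time can be converted to the UTC fields that the schedule expects. The sketch below uses a hypothetical 20:30 start in a UTC-5 time zone:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical local start time: 20:30 in a UTC-5 time zone.
local = datetime(2020, 2, 14, 20, 30, tzinfo=timezone(timedelta(hours=-5)))
utc = local.astimezone(timezone.utc)  # 2020-02-15 01:30 UTC

# Schedule fields for the transfer job, all in UTC.
schedule = {
    'scheduleStartDate': {'day': utc.day, 'month': utc.month, 'year': utc.year},
    'startTimeOfDay': {'hours': utc.hour, 'minutes': utc.minute},
}
```

Note that crossing midnight in UTC also shifts scheduleStartDate, as in this example.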

Before you begin

Before you can configure transfer jobs in Storage Transfer Service, make sure you have the required access:

  • Storage Transfer Service access: you must be granted one of the following roles:

    • roles/owner
    • roles/editor
    • roles/storagetransfer.admin
    • roles/storagetransfer.user
    • A custom role that includes, at a minimum, the permissions of roles/storagetransfer.user.

      For more information about adding and viewing project-level permissions, see Using IAM permissions with projects.

    For more information about IAM roles and permissions in Storage Transfer Service, see Access control with IAM roles and permissions.

  • Access to the source and sink: Storage Transfer Service uses a service account to perform transfers. To access the data source and the data sink, this service account must have source permissions and sink permissions.

Transfer from Amazon S3 to Cloud Storage

In this example, you'll learn how to move files from Amazon S3 to a Cloud Storage bucket. Be sure to review Configure access and Pricing to understand what is involved in moving data from Amazon S3 to Cloud Storage.

To create the transfer job, follow these steps:

When creating transfer jobs, do not include the s3:// prefix in the bucketName value for Amazon S3 source bucket names.
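If your tooling stores full s3:// URIs, a small helper (hypothetical, not part of the API) can derive the bare bucket name:

```python
def s3_bucket_name(uri: str) -> str:
    """Strip an optional s3:// scheme and any trailing key path."""
    return uri.removeprefix('s3://').split('/', 1)[0]

print(s3_bucket_name('s3://my-source-bucket/logs/2020/'))  # my-source-bucket
```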

REST

Make the request using transferJobs create:
POST https://storagetransfer.googleapis.com/v1/transferJobs
{
    "description": "YOUR DESCRIPTION",
    "status": "ENABLED",
    "projectId": "PROJECT_ID",
    "schedule": {
        "scheduleStartDate": {
            "day": 1,
            "month": 1,
            "year": 2015
        },
        "scheduleEndDate": {
            "day": 1,
            "month": 1,
            "year": 2015
        },
        "startTimeOfDay": {
            "hours": 1,
            "minutes": 1
        }
    },
    "transferSpec": {
        "awsS3DataSource": {
            "bucketName": "AWS_SOURCE_NAME",
            "awsAccessKey": {
                "accessKeyId": "AWS_ACCESS_KEY_ID",
                "secretAccessKey": "AWS_SECRET_ACCESS_KEY"
            }
        },
        "gcsDataSink": {
            "bucketName": "GCS_SINK_NAME"
        }
    }
}
Response:
200 OK
{
    "transferJob": [
        {
            "creationTime": "2015-01-01T01:01:00.000000000Z",
            "description": "YOUR DESCRIPTION",
            "name": "transferJobs/JOB_ID",
            "status": "ENABLED",
            "lastModificationTime": "2015-01-01T01:01:00.000000000Z",
            "projectId": "PROJECT_ID",
            "schedule": {
                "scheduleStartDate": {
                    "day": 1,
                    "month": 1,
                    "year": 2015
                },
                "scheduleEndDate": {
                    "day": 1,
                    "month": 1,
                    "year": 2015
                },
                "startTimeOfDay": {
                    "hours": 1,
                    "minutes": 1
                }
            },
            "transferSpec": {
                "awsS3DataSource": {
                    "bucketName": "AWS_SOURCE_NAME"
                },
                "gcsDataSink": {
                    "bucketName": "GCS_SINK_NAME"
                },
                "objectConditions": {},
                "transferOptions": {}
            }
        }
    ]
}

Java

For information on creating a Storage Transfer Service client, see Create a client for a Google API library.


package com.google.cloud.storage.storagetransfer.samples;

import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.model.AwsAccessKey;
import com.google.api.services.storagetransfer.v1.model.AwsS3Data;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import java.io.IOException;
import java.io.PrintStream;

/** Creates a one-off transfer job from Amazon S3 to Google Cloud Storage. */
public final class AwsRequester {
  /**
   * Creates and executes a request for a TransferJob from Amazon S3 to Cloud Storage.
   *
   * The {@code startDate} and {@code startTime} parameters should be set according to the UTC
   * Time Zone. See:
   * https://developers.google.com/resources/api-libraries/documentation/storagetransfer/v1/java/latest/com/google/api/services/storagetransfer/v1/model/Schedule.html#getStartTimeOfDay()
   *
   * @return the response TransferJob if the request is successful
   * @throws InstantiationException if instantiation fails when building the TransferJob
   * @throws IllegalAccessException if an illegal access occurs when building the TransferJob
   * @throws IOException if the client failed to complete the request
   */
  public static TransferJob createAwsTransferJob(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      String startDate,
      String startTime,
      String awsAccessKeyId,
      String awsSecretAccessKey)
      throws InstantiationException, IllegalAccessException, IOException {
    Date date = TransferJobUtils.createDate(startDate);
    TimeOfDay time = TransferJobUtils.createTimeOfDay(startTime);
    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                new TransferSpec()
                    .setAwsS3DataSource(
                        new AwsS3Data()
                            .setBucketName(awsSourceBucket)
                            .setAwsAccessKey(
                                new AwsAccessKey()
                                    .setAccessKeyId(awsAccessKeyId)
                                    .setSecretAccessKey(awsSecretAccessKey)))
                    .setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket)))
            .setSchedule(
                new Schedule()
                    .setScheduleStartDate(date)
                    .setScheduleEndDate(date)
                    .setStartTimeOfDay(time))
            .setStatus("ENABLED");

    Storagetransfer client = TransferClientCreator.createStorageTransferClient();
    return client.transferJobs().create(transferJob).execute();
  }

  public static void run(PrintStream out)
      throws InstantiationException, IllegalAccessException, IOException {
    String projectId = TransferJobUtils.getPropertyOrFail("projectId");
    String jobDescription = TransferJobUtils.getPropertyOrFail("jobDescription");
    String awsSourceBucket = TransferJobUtils.getPropertyOrFail("awsSourceBucket");
    String gcsSinkBucket = TransferJobUtils.getPropertyOrFail("gcsSinkBucket");
    String startDate = TransferJobUtils.getPropertyOrFail("startDate");
    String startTime = TransferJobUtils.getPropertyOrFail("startTime");
    String awsAccessKeyId = TransferJobUtils.getEnvOrFail("AWS_ACCESS_KEY_ID");
    String awsSecretAccessKey = TransferJobUtils.getEnvOrFail("AWS_SECRET_ACCESS_KEY");

    TransferJob responseT =
        createAwsTransferJob(
            projectId,
            jobDescription,
            awsSourceBucket,
            gcsSinkBucket,
            startDate,
            startTime,
            awsAccessKeyId,
            awsSecretAccessKey);
    out.println("Return transferJob: " + responseT.toPrettyString());
  }

  /** Output the contents of a successfully created TransferJob. */
  public static void main(String[] args) {
    try {
      run(System.out);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}

Python

For information on creating a Storage Transfer Service client, see Create a client for a Google API library.

"""Command-line sample that creates a one-time transfer from Amazon S3 to
Google Cloud Storage.

This sample is used on this page:

    https://cloud.google.com/storage/transfer/create-transfer

For more information, see README.md.
"""

import argparse
import datetime
import json

import googleapiclient.discovery

def main(description, project_id, start_date, start_time, source_bucket,
         access_key_id, secret_access_key, sink_bucket):
    """Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
    storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Edit this template with desired parameters.
    transfer_job = {
        'description': description,
        'status': 'ENABLED',
        'projectId': project_id,
        'schedule': {
            'scheduleStartDate': {
                'day': start_date.day,
                'month': start_date.month,
                'year': start_date.year
            },
            'scheduleEndDate': {
                'day': start_date.day,
                'month': start_date.month,
                'year': start_date.year
            },
            'startTimeOfDay': {
                'hours': start_time.hour,
                'minutes': start_time.minute,
                'seconds': start_time.second
            }
        },
        'transferSpec': {
            'awsS3DataSource': {
                'bucketName': source_bucket,
                'awsAccessKey': {
                    'accessKeyId': access_key_id,
                    'secretAccessKey': secret_access_key
                }
            },
            'gcsDataSink': {
                'bucketName': sink_bucket
            }
        }
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print('Returned transferJob: {}'.format(
        json.dumps(result, indent=4)))

if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('description', help='Transfer description.')
    parser.add_argument('project_id', help='Your Google Cloud project ID.')
    parser.add_argument('start_date', help='Date YYYY/MM/DD.')
    parser.add_argument('start_time', help='UTC Time (24hr) HH:MM:SS.')
    parser.add_argument('source_bucket', help='AWS source bucket name.')
    parser.add_argument('access_key_id', help='Your AWS access key id.')
    parser.add_argument(
        'secret_access_key',
        help='Your AWS secret access key.'
    )
    parser.add_argument('sink_bucket', help='GCS sink bucket name.')

    args = parser.parse_args()
    start_date = datetime.datetime.strptime(args.start_date, '%Y/%m/%d')
    start_time = datetime.datetime.strptime(args.start_time, '%H:%M:%S')

    main(
        args.description,
        args.project_id,
        start_date,
        start_time,
        args.source_bucket,
        args.access_key_id,
        args.secret_access_key,
        args.sink_bucket)

Transfer between Microsoft Azure Blob Storage and Cloud Storage

In this example, you'll learn how to move files from Microsoft Azure Storage to a Cloud Storage bucket. Be sure to review Configure access and Pricing to understand what is involved in moving data from Microsoft Azure Storage to Cloud Storage.

REST

Make the request using transferJobs create:
POST https://storagetransfer.googleapis.com/v1/transferJobs
{
    "description": "YOUR DESCRIPTION",
    "status": "ENABLED",
    "projectId": "PROJECT_ID",
    "schedule": {
        "scheduleStartDate": {
            "day": 14,
            "month": 2,
            "year": 2020
        },
        "scheduleEndDate": {
            "day": 14,
            "month": 2,
            "year": 2020
        },
        "startTimeOfDay": {
            "hours": 1,
            "minutes": 1
        }
    },
    "transferSpec": {
        "azureBlobStorageDataSource": {
            "storageAccount": "AZURE_SOURCE_NAME",
            "azureCredentials": {
                "sasToken": "AZURE_SAS_TOKEN"
            },
            "container": "AZURE_CONTAINER"
        },
        "gcsDataSink": {
            "bucketName": "GCS_SINK_NAME"
        }
    }
}
Response:
200 OK
{
    "transferJob": [
        {
            "creationTime": "2020-02-14T01:01:00.000000000Z",
            "description": "YOUR DESCRIPTION",
            "name": "transferJobs/JOB_ID",
            "status": "ENABLED",
            "lastModificationTime": "2020-02-14T01:01:00.000000000Z",
            "projectId": "PROJECT_ID",
            "schedule": {
                "scheduleStartDate": {
                    "day": 14,
                    "month": 2,
                    "year": 2020
                },
                "scheduleEndDate": {
                    "day": 14,
                    "month": 2,
                    "year": 2020
                },
                "startTimeOfDay": {
                    "hours": 1,
                    "minutes": 1
                }
            },
            "transferSpec": {
                "azureBlobStorageDataSource": {
                    "storageAccount": "AZURE_SOURCE_NAME",
                    "azureCredentials": {
                        "sasToken": "AZURE_SAS_TOKEN"
                    },
                    "container": "AZURE_CONTAINER"
                },
                "gcsDataSink": {
                    "bucketName": "GCS_SINK_NAME"
                },
                "objectConditions": {},
                "transferOptions": {}
            }
        }
    ]
}

Transfer between Cloud Storage buckets

In this example, you'll learn how to move files from one Cloud Storage bucket to another. For example, you can replicate data from a bucket into one in another location.

To create the transfer job, follow these steps:

REST

Make the request using transferJobs create:
POST https://storagetransfer.googleapis.com/v1/transferJobs
{
    "description": "YOUR DESCRIPTION",
    "status": "ENABLED",
    "projectId": "PROJECT_ID",
    "schedule": {
        "scheduleStartDate": {
            "day": 1,
            "month": 1,
            "year": 2015
        },
        "startTimeOfDay": {
            "hours": 1,
            "minutes": 1
        }
    },
    "transferSpec": {
        "gcsDataSource": {
            "bucketName": "GCS_SOURCE_NAME"
        },
        "gcsDataSink": {
            "bucketName": "GCS_NEARLINE_SINK_NAME"
        },
        "objectConditions": {
            "minTimeElapsedSinceLastModification": "2592000s"
        },
        "transferOptions": {
            "deleteObjectsFromSourceAfterTransfer": true
        }
    }
}
Response:
200 OK
{
    "transferJob": [
        {
            "creationTime": "2015-01-01T01:01:00.000000000Z",
            "description": "YOUR DESCRIPTION",
            "name": "transferJobs/JOB_ID",
            "status": "ENABLED",
            "lastModificationTime": "2015-01-01T01:01:00.000000000Z",
            "projectId": "PROJECT_ID",
            "schedule": {
                "scheduleStartDate": {
                    "day": 1,
                    "month": 1,
                    "year": 2015
                },
                "startTimeOfDay": {
                    "hours": 1,
                    "minutes": 1
                }
            },
            "transferSpec": {
                "gcsDataSource": {
                    "bucketName": "GCS_SOURCE_NAME"
                },
                "gcsDataSink": {
                    "bucketName": "GCS_NEARLINE_SINK_NAME"
                },
                "objectConditions": {
                    "minTimeElapsedSinceLastModification": "2592000.000s"
                },
                "transferOptions": {
                    "deleteObjectsFromSourceAfterTransfer": true
                }
            }
        }
    ]
}
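The "2592000s" value used for minTimeElapsedSinceLastModification above is simply 30 days expressed in seconds, which is easy to double-check:

```python
from datetime import timedelta

# 30 days * 24 h * 60 min * 60 s = 2,592,000 s
thirty_days = int(timedelta(days=30).total_seconds())
print(f'{thirty_days}s')  # 2592000s
```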

Java

For information on creating a Storage Transfer Service client, see Create a client for a Google API library.


package com.google.cloud.storage.storagetransfer.samples;

import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.ObjectConditions;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferOptions;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import java.io.IOException;
import java.io.PrintStream;

/**
 * Creates a daily transfer from a standard Cloud Storage bucket to a Cloud Storage Nearline bucket
 * for files untouched for 30 days.
 */
public final class NearlineRequester {

  /**
   * Creates and executes a request for a TransferJob to Cloud Storage Nearline.
   *
   * The {@code startDate} and {@code startTime} parameters should be set according to the UTC
   * Time Zone. See:
   * https://developers.google.com/resources/api-libraries/documentation/storagetransfer/v1/java/latest/com/google/api/services/storagetransfer/v1/model/Schedule.html#getStartTimeOfDay()
   *
   * @return the response TransferJob if the request is successful
   * @throws InstantiationException if instantiation fails when building the TransferJob
   * @throws IllegalAccessException if an illegal access occurs when building the TransferJob
   * @throws IOException if the client failed to complete the request
   */
  public static TransferJob createNearlineTransferJob(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      String startDate,
      String startTime)
      throws InstantiationException, IllegalAccessException, IOException {
    Date date = TransferJobUtils.createDate(startDate);
    TimeOfDay time = TransferJobUtils.createTimeOfDay(startTime);
    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                new TransferSpec()
                    .setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        new ObjectConditions()
                            .setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
                    .setTransferOptions(
                        new TransferOptions().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus("ENABLED");

    Storagetransfer client = TransferClientCreator.createStorageTransferClient();
    return client.transferJobs().create(transferJob).execute();
  }

  public static void run(PrintStream out)
      throws InstantiationException, IllegalAccessException, IOException {
    String projectId = TransferJobUtils.getPropertyOrFail("projectId");
    String jobDescription = TransferJobUtils.getPropertyOrFail("jobDescription");
    String gcsSourceBucket = TransferJobUtils.getPropertyOrFail("gcsSourceBucket");
    String gcsNearlineSinkBucket = TransferJobUtils.getPropertyOrFail("gcsNearlineSinkBucket");
    String startDate = TransferJobUtils.getPropertyOrFail("startDate");
    String startTime = TransferJobUtils.getPropertyOrFail("startTime");

    TransferJob responseT =
        createNearlineTransferJob(
            projectId,
            jobDescription,
            gcsSourceBucket,
            gcsNearlineSinkBucket,
            startDate,
            startTime);
    out.println("Return transferJob: " + responseT.toPrettyString());
  }

  /**
   * Output the contents of a successfully created TransferJob.
   *
   * @param args arguments from the command line
   */
  public static void main(String[] args) {
    try {
      run(System.out);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}

Python

For information on creating a Storage Transfer Service client, see Create a client for a Google API library.


"""Command-line sample that creates a daily transfer from a standard
GCS bucket to a Nearline GCS bucket for objects untouched for 30 days.

This sample is used on this page:

    https://cloud.google.com/storage/transfer/create-transfer

For more information, see README.md.
"""

import argparse
import datetime
import json

import googleapiclient.discovery

def main(description, project_id, start_date, start_time, source_bucket,
         sink_bucket):
    """Create a daily transfer from Standard to Nearline Storage class."""
    storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Edit this template with desired parameters.
    transfer_job = {
        'description': description,
        'status': 'ENABLED',
        'projectId': project_id,
        'schedule': {
            'scheduleStartDate': {
                'day': start_date.day,
                'month': start_date.month,
                'year': start_date.year
            },
            'startTimeOfDay': {
                'hours': start_time.hour,
                'minutes': start_time.minute,
                'seconds': start_time.second
            }
        },
        'transferSpec': {
            'gcsDataSource': {
                'bucketName': source_bucket
            },
            'gcsDataSink': {
                'bucketName': sink_bucket
            },
            'objectConditions': {
                'minTimeElapsedSinceLastModification': '2592000s'  # 30 days
            },
            'transferOptions': {
                'deleteObjectsFromSourceAfterTransfer': True  # boolean, not string
            }
        }
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print('Returned transferJob: {}'.format(
        json.dumps(result, indent=4)))

if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('description', help='Transfer description.')
    parser.add_argument('project_id', help='Your Google Cloud project ID.')
    parser.add_argument('start_date', help='Date YYYY/MM/DD.')
    parser.add_argument('start_time', help='UTC Time (24hr) HH:MM:SS.')
    parser.add_argument('source_bucket', help='Standard GCS bucket name.')
    parser.add_argument('sink_bucket', help='Nearline GCS bucket name.')

    args = parser.parse_args()
    start_date = datetime.datetime.strptime(args.start_date, '%Y/%m/%d')
    start_time = datetime.datetime.strptime(args.start_time, '%H:%M:%S')

    main(
        args.description,
        args.project_id,
        start_date,
        start_time,
        args.source_bucket,
        args.sink_bucket)

Check the status of a transfer operation

You may want to check the status of your transfer operations for any of the preceding examples. The following sample code shows the status of a transfer operation based on a job ID and your project ID.

REST

Make the request using transferOperations list:
GET https://storagetransfer.googleapis.com/v1/transferOperations?filter=%7B"project_id":"PROJECT_ID","job_names":%5B"transferJobs/JOB_ID"%5D%7D
Response:

Cloud Storage

If your source is a Cloud Storage bucket, the response should be similar to the following:

200 OK
{
    "operations": [
        {
            "done": true,
            "metadata": {
                "@type": "type.googleapis.com/google.storagetransfer.v1.TransferOperation",
                "counters": {},
                "endTime": "2015-01-01T01:01:00.000Z",
                "name": "transferOperations/000000000000000000",
                "projectId": "PROJECT_ID",
                "startTime": "2015-01-01T01:01:00.000Z",
                "transferSpec": {
                    "gcsDataSink": {
                        "bucketName": "GCS_NEARLINE_SINK_NAME"
                    },
                    "gcsDataSource": {
                        "bucketName": "GCS_SOURCE_NAME"
                    },
                    "objectConditions": {
                        "minTimeElapsedSinceLastModification": "2592000.000s"
                    },
                    "transferOptions": {
                        "deleteObjectsFromSourceAfterTransfer": true
                    }
                },
                "transferStatus": "SUCCESS"
            },
            "name": "transferOperations/000000000000000000",
            "response": {
                "@type": "type.googleapis.com/google.protobuf.Empty"
            }
        }
    ]
}

Amazon S3

If your source is an Amazon S3 bucket, the response should be similar to the following:

200 OK
{
    "operations": [
        {
            "done": true,
            "metadata": {
                "@type": "type.googleapis.com/google.storagetransfer.v1.TransferOperation",
                "counters": {},
                "endTime": "2015-01-01T01:01:00.000Z",
                "name": "transferOperations/000000000000000000",
                "projectId": "PROJECT_ID",
                "startTime": "2015-01-01T01:01:00.000Z",
                "transferSpec": {
                    "awsS3DataSource": {
                        "bucketName": "AWS_SOURCE_NAME"
                    },
                    "gcsDataSink": {
                        "bucketName": "GCS_SINK_NAME"
                    },
                    "objectConditions": {},
                    "transferOptions": {}
                },
                "transferStatus": "SUCCESS"
            },
            "name": "transferOperations/000000000000000000",
            "response": {
                "@type": "type.googleapis.com/google.protobuf.Empty"
            }
        }
    ]
}

Microsoft Azure Blob Storage

If your source is a Microsoft Azure Storage container, the response should be similar to the following:

200 OK
{
    "operations": [
        {
            "done": true,
            "metadata": {
                "@type": "type.googleapis.com/google.storagetransfer.v1.TransferOperation",
                "counters": {},
                "endTime": "2020-02-14T01:01:00.000Z",
                "name": "transferOperations/000000000000000000",
                "projectId": "PROJECT_ID",
                "startTime": "2020-02-14T01:01:00.000Z",
                "transferSpec": {
                    "azureBlobStorageDataSource": {
                        "storageAccount": "AZURE_SOURCE_NAME"
                    },
                    "gcsDataSink": {
                        "bucketName": "GCS_SINK_NAME"
                    },
                    "objectConditions": {},
                    "transferOptions": {}
                },
                "transferStatus": "SUCCESS"
            },
            "name": "transferOperations/000000000000000000",
            "response": {
                "@type": "type.googleapis.com/google.protobuf.Empty"
            }
        }
    ]
}
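The filter query parameter in the transferOperations list request shown earlier is a URL-encoded JSON object. A sketch of building it in Python (the helper name is our own):

```python
import json
from urllib.parse import quote

def operations_filter(project_id: str, job_id: str) -> str:
    """Build the URL-encoded filter value for transferOperations list."""
    raw = json.dumps(
        {'project_id': project_id, 'job_names': ['transferJobs/' + job_id]},
        separators=(',', ':'))
    return quote(raw, safe='')

print(operations_filter('PROJECT_ID', 'JOB_ID'))
```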

Java

For information on creating a Storage Transfer Service client, see Set up your application.


package com.google.cloud.storage.storagetransfer.samples;

import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.model.ListOperationsResponse;
import java.io.IOException;
import java.util.logging.Logger;

/**
 * Queries for TransferOperations associated with a specific TransferJob. A TransferJob is done when
 * all of its associated TransferOperations have completed.
 */
public final class RequestChecker {

  private static final String PROJECT_ID = "YOUR_PROJECT_ID";
  private static final String JOB_NAME = "YOUR_JOB_NAME";

  private static final Logger LOG = Logger.getLogger(RequestChecker.class.getName());

  /**
   * Creates and executes a query for all associated TransferOperations.
   *
   * @param client a Storagetransfer client, for interacting with the Storage Transfer API
   * @param projectId the project to query within
   * @param jobName the job Name of the relevant TransferJob
   * @return an object containing information on associated TransferOperations
   * @throws IOException if the client failed to complete the request
   */
  public static ListOperationsResponse checkTransfer(
      Storagetransfer client, String projectId, String jobName) throws IOException {
    return client
        .transferOperations()
        .list("transferOperations")
        .setFilter("{\"project_id\": \"" + projectId + "\", \"job_names\": [\"" + jobName + "\"] }")
        .execute();
  }

  /**
   * Output the returned list of TransferOperations.
   *
   * @param args arguments from the command line
   */
  public static void main(String[] args) {
    try {
      ListOperationsResponse resp =
          checkTransfer(TransferClientCreator.createStorageTransferClient(), PROJECT_ID, JOB_NAME);
      LOG.info("Result of transferOperations/list: " + resp.toPrettyString());
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}

Python

For information on creating a Storage Transfer Service client, see Set up your application.


"""Command-line sample that checks the status of an in-process transfer.

This sample is used on this page:

    https://cloud.google.com/storage/transfer/create-transfer

For more information, see README.md.
"""

import argparse
import json

import googleapiclient.discovery

def main(project_id, job_name):
    """Review the transfer operations associated with a transfer job."""
    storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')

    filterString = (
        '{{"project_id": "{project_id}", '
        '"job_names": ["{job_name}"]}}'
    ).format(project_id=project_id, job_name=job_name)

    result = storagetransfer.transferOperations().list(
        name="transferOperations",
        filter=filterString).execute()
    print('Result of transferOperations/list: {}'.format(
        json.dumps(result, indent=4, sort_keys=True)))

if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('project_id', help='Your Google Cloud project ID.')
    parser.add_argument('job_name', help='Your job name.')

    args = parser.parse_args()

    main(args.project_id, args.job_name)

Cancel transfer operations

To cancel a single transfer operation, use the transferOperations cancel method. To delete an entire transfer job, including any future transfer operations scheduled for it, set the transfer job's status to DELETED using the transferJobs patch method. Updating a job's status does not affect transfer operations that are already running; to cancel an in-progress operation, use transferOperations cancel.
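As a sketch, marking a job DELETED from Python could look like the following; the request-body builder is our own helper, and the client call assumes the same googleapiclient setup as the samples above:

```python
def delete_job_body(project_id: str) -> dict:
    """Request body for transferJobs patch that marks a job DELETED."""
    return {
        'projectId': project_id,
        'transferJob': {'status': 'DELETED'},
        'updateTransferJobFieldMask': 'status',
    }

def delete_transfer_job(project_id: str, job_name: str) -> dict:
    """Delete a transfer job, e.g. job_name='transferJobs/JOB_ID'."""
    import googleapiclient.discovery  # same client library as the samples above
    client = googleapiclient.discovery.build('storagetransfer', 'v1')
    return client.transferJobs().patch(
        jobName=job_name, body=delete_job_body(project_id)).execute()
```

An in-progress operation is cancelled separately, for example with client.transferOperations().cancel(name='transferOperations/...').execute().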

What's next

Learn how to work with Cloud Storage.