Migrate to the Storage Transfer Service Cloud Client Library

To ensure high quality and consistency with our other Cloud libraries, the documentation for Storage Transfer Service now uses the Cloud Client Libraries instead of the Google API Client Libraries. For more information about the two options, see Client libraries explained.

The Google API Client Library continues to receive updates, but it is no longer referenced in the documentation.

This guide covers the main differences when working with Storage Transfer Service, and provides instructions on updating your clients as you migrate to the Cloud Client Library.

Java

Update dependencies

To switch to the new library, replace your dependency on google-api-services-storagetransfer with google-cloud-storage-transfer.

If you are using Maven without the BOM, add this to your dependencies:

<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
    <version>0.2.3</version>
</dependency>

If you are using Gradle without the BOM, add this to your dependencies:

implementation 'com.google.cloud:google-cloud-storage-transfer:0.2.3'
If you are using Maven with the BOM, add the following to your pom.xml:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>24.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
  </dependency>
</dependencies>
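
If you are using Gradle with the BOM, a roughly equivalent setup (a sketch, assuming the same libraries-bom version as above) is:

implementation platform('com.google.cloud:libraries-bom:24.1.0')
implementation 'com.google.cloud:google-cloud-storage-transfer'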

In most cases, code can be converted from the API Client Library to the Cloud Client Library with little effort. The following are some key differences between the two Java clients.

Client instantiation

The Cloud Client Library cuts down on much of the boilerplate associated with client instantiation by handling it behind the scenes.

API Client Library

GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
  credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
    new Storagetransfer.Builder(
            Utils.getDefaultTransport(),
            Utils.getDefaultJsonFactory(),
            new HttpCredentialsAdapter(credential))
        .build();

Cloud Client Library

StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
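
The Cloud client also holds background resources (threads and connections). Since StorageTransferServiceClient implements AutoCloseable, a try-with-resources block is a convenient way to release them; a minimal sketch:

// Closing the client releases its background resources when you are done.
try (StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create()) {
  // Use the client here.
}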

Builders for model classes

Model classes in the Cloud Client Library are created with builders rather than constructors.

API Client Library

TransferJob transferJob =
    new TransferJob()
        .setStatus("ENABLED");

Cloud Client Library

TransferJob transferJob =
    TransferJob.newBuilder()
        .setStatus(Status.ENABLED)
        .build();
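
Because these model classes are protocol buffer messages, an existing instance can be modified by converting it back into a builder; a minimal sketch using the transferJob from above:

// toBuilder() copies the immutable message into a fresh builder.
TransferJob updatedJob = transferJob.toBuilder().setStatus(Status.DISABLED).build();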

List operations return iterables

List operations in the Cloud Client Library return a simple iterable, instead of the paginated results returned by the API Client Library.

API Client Library

public class StoragetransferExample {
  public static void main(String[] args) throws IOException, GeneralSecurityException {
    Storagetransfer storagetransferService = createStoragetransferService();
    Storagetransfer.TransferJobs.List request = storagetransferService.transferJobs().list();

    ListTransferJobsResponse response;
    do {
      response = request.execute();
      if (response.getTransferJobs() == null) {
        continue;
      }
      for (TransferJob transferJob : response.getTransferJobs()) {
        System.out.println(transferJob);
      }
      request.setPageToken(response.getNextPageToken());
    } while (response.getNextPageToken() != null);
  }

  public static Storagetransfer createStoragetransferService()
      throws IOException, GeneralSecurityException {
    HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
    JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();

    GoogleCredential credential = GoogleCredential.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }

    return new Storagetransfer.Builder(httpTransport, jsonFactory, credential)
        .build();
  }
}

Cloud Client Library

public class StoragetransferExample {
  public static void main(String[] args) throws Exception {
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
    ListTransferJobsRequest request = ListTransferJobsRequest.newBuilder().build();
    for (TransferJob job : storageTransfer.listTransferJobs(request).iterateAll()) {
      System.out.println(job);
    }
  }
}
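
If you still need page-level access (for example, to inspect page tokens), the returned object also exposes the underlying pages; a sketch reusing the request from above:

// iteratePages() walks page by page instead of flattening the results.
for (StorageTransferServiceClient.ListTransferJobsPage page :
    storageTransfer.listTransferJobs(request).iteratePages()) {
  for (TransferJob job : page.getValues()) {
    System.out.println(job);
  }
}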

Sample comparisons

Here we present the old API Client Library samples alongside their equivalents written with the Cloud Client Library. If you have used these samples before, this comparison shows how to move your code to the new Cloud Client Library.

Transfer from Amazon S3

API Client Library


import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.AwsAccessKey;
import com.google.api.services.storagetransfer.v1.model.AwsS3Data;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;

public class TransferFromAwsApiary {

  // Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  public static void transferFromAws(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job from S3 to GCS.";

    // The name of the source AWS bucket to transfer data from
    // String awsSourceBucket = "yourAwsSourceBucket";

    // The name of the GCS bucket to transfer data to
    // String gcsSinkBucket = "your-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // The ID used to access your AWS account. Should be accessed via environment variable.
    String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

    // The Secret Key used to access your AWS account. Should be accessed via environment variable.
    String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

    // Set up source and sink
    TransferSpec transferSpec =
        new TransferSpec()
            .setAwsS3DataSource(
                new AwsS3Data()
                    .setBucketName(awsSourceBucket)
                    .setAwsAccessKey(
                        new AwsAccessKey()
                            .setAccessKeyId(awsAccessKeyId)
                            .setSecretAccessKey(awsSecretAccessKey)))
            .setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket));

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date startDate =
        new Date()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
    TimeOfDay startTime =
        new TimeOfDay()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND));
    Schedule schedule =
        new Schedule()
            .setScheduleStartDate(startDate)
            .setScheduleEndDate(startDate)
            .setStartTimeOfDay(startTime);

    // Set up the transfer job
    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(transferSpec)
            .setSchedule(schedule)
            .setStatus("ENABLED");

    // Create a Transfer Service client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Create the transfer job
    TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();

    System.out.println("Created transfer job from AWS to GCS:");
    System.out.println(response.toPrettyString());
  }
}

Cloud Client Library


import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsAccessKey;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsS3Data;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;

public class TransferFromAws {

  // Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  public static void transferFromAws(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job from S3 to GCS.";

    // The name of the source AWS bucket to transfer data from
    // String awsSourceBucket = "yourAwsSourceBucket";

    // The name of the GCS bucket to transfer data to
    // String gcsSinkBucket = "your-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // The ID used to access your AWS account. Should be accessed via environment variable.
    String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

    // The Secret Key used to access your AWS account. Should be accessed via environment variable.
    String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

    // Set up source and sink
    TransferSpec transferSpec =
        TransferSpec.newBuilder()
            .setAwsS3DataSource(
                AwsS3Data.newBuilder()
                    .setBucketName(awsSourceBucket)
                    .setAwsAccessKey(
                        AwsAccessKey.newBuilder()
                            .setAccessKeyId(awsAccessKeyId)
                            .setSecretAccessKey(awsSecretAccessKey)))
            .setGcsDataSink(GcsData.newBuilder().setBucketName(gcsSinkBucket))
            .build();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date startDate =
        Date.newBuilder()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
            .build();
    TimeOfDay startTime =
        TimeOfDay.newBuilder()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND))
            .build();
    Schedule schedule =
        Schedule.newBuilder()
            .setScheduleStartDate(startDate)
            .setScheduleEndDate(startDate)
            .setStartTimeOfDay(startTime)
            .build();

    // Set up the transfer job
    TransferJob transferJob =
        TransferJob.newBuilder()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(transferSpec)
            .setSchedule(schedule)
            .setStatus(Status.ENABLED)
            .build();

    // Create a Transfer Service client
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Create the transfer job
    TransferJob response =
        storageTransfer.createTransferJob(
            CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());

    System.out.println("Created transfer job from AWS to GCS:");
    System.out.println(response.toString());
  }
}
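
For reference, a hypothetical invocation of this sample (all argument values are placeholders):

TransferFromAws.transferFromAws(
    "my-project-id",
    "Sample transfer job from S3 to GCS.",
    "my-s3-source-bucket",
    "my-gcs-destination-bucket",
    System.currentTimeMillis());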

Transfer to Nearline

API Client Library

import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.ObjectConditions;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferOptions;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;

public class TransferToNearlineApiary {
  /**
   * Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
   * than 30 days old to a Nearline GCS bucket.
   */
  public static void transferToNearlineApiary(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";

    // The name of the source GCS bucket to transfer data from
    // String gcsSourceBucket = "your-gcs-source-bucket";

    // The name of the Nearline GCS bucket to transfer old objects to
    // String gcsSinkBucket = "your-nearline-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date date =
        new Date()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
    TimeOfDay time =
        new TimeOfDay()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND));

    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                new TransferSpec()
                    .setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        new ObjectConditions()
                            .setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
                    .setTransferOptions(
                        new TransferOptions().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus("ENABLED");

    // Create a Transfer Service client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Create the transfer job
    TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();

    System.out.println("Created transfer job from standard bucket to Nearline bucket:");
    System.out.println(response.toPrettyString());
  }
}

Cloud Client Library

import com.google.protobuf.Duration;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.ObjectConditions;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOptions;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;

public class TransferToNearline {
  /**
   * Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
   * than 30 days old to a Nearline GCS bucket.
   */
  public static void transferToNearline(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";

    // The name of the source GCS bucket to transfer data from
    // String gcsSourceBucket = "your-gcs-source-bucket";

    // The name of the Nearline GCS bucket to transfer old objects to
    // String gcsSinkBucket = "your-nearline-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date date =
        Date.newBuilder()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
            .build();
    TimeOfDay time =
        TimeOfDay.newBuilder()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND))
            .build();

    TransferJob transferJob =
        TransferJob.newBuilder()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                TransferSpec.newBuilder()
                    .setGcsDataSource(GcsData.newBuilder().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(GcsData.newBuilder().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        ObjectConditions.newBuilder()
                            .setMinTimeElapsedSinceLastModification(
                                Duration.newBuilder().setSeconds(2592000 /* 30 days */)))
                    .setTransferOptions(
                        TransferOptions.newBuilder().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(Schedule.newBuilder().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus(Status.ENABLED)
            .build();

    // Create a Transfer Service client
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Create the transfer job
    TransferJob response =
        storageTransfer.createTransferJob(
            CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());

    System.out.println("Created transfer job from standard bucket to Nearline bucket:");
    System.out.println(response.toString());
  }
}

Check the latest transfer operation

API Client Library


import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Operation;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;

public class CheckLatestTransferOperationApiary {

  // Gets the requested transfer job and checks its latest operation
  public static void checkLatestTransferOperationApiary(String projectId, String jobName)
      throws IOException {
    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // The name of the job to check
    // String jobName = "myJob/1234567890";

    // Create Storage Transfer client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Get transfer job and check latest operation
    TransferJob transferJob = storageTransfer.transferJobs().get(jobName, projectId).execute();
    String latestOperationName = transferJob.getLatestOperationName();

    if (latestOperationName != null) {
      Operation latestOperation =
          storageTransfer.transferOperations().get(latestOperationName).execute();
      System.out.println("The latest operation for transfer job " + jobName + " is:");
      System.out.println(latestOperation.toPrettyString());

    } else {
      System.out.println(
          "Transfer job "
              + jobName
              + " does not have an operation scheduled yet,"
              + " try again once the job starts running.");
    }
  }
}

Cloud Client Library


import com.google.longrunning.Operation;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.GetTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOperation;
import java.io.IOException;

public class CheckLatestTransferOperation {

  // Gets the requested transfer job and checks its latest operation
  public static void checkLatestTransferOperation(String projectId, String jobName)
      throws IOException {
    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // The name of the job to check
    // String jobName = "myJob/1234567890";

    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Get transfer job and check latest operation
    TransferJob transferJob =
        storageTransfer.getTransferJob(
            GetTransferJobRequest.newBuilder().setJobName(jobName).setProjectId(projectId).build());
    String latestOperationName = transferJob.getLatestOperationName();

    if (!latestOperationName.isEmpty()) {
      Operation operation = storageTransfer.getOperationsClient().getOperation(latestOperationName);
      TransferOperation latestOperation =
          TransferOperation.parseFrom(operation.getMetadata().getValue());

      System.out.println("The latest operation for transfer job " + jobName + " is:");
      System.out.println(latestOperation.toString());

    } else {
      System.out.println(
          "Transfer job "
              + jobName
              + " hasn't run yet,"
              + " try again once the job starts running.");
    }
  }
}
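
The sample above deserializes the operation metadata from its packed bytes. Equivalently, the Any metadata can be unpacked directly (a sketch, assuming the standard protobuf Any API):

// unpack() checks the type URL and parses the metadata in one step.
TransferOperation latestOperation =
    operation.getMetadata().unpack(TransferOperation.class);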

Python

Update dependencies

To use the new library, add a dependency on google-cloud-storage-transfer. It is used in place of the discovery-based client from google-api-python-client.

pip install --upgrade google-cloud-storage-transfer

Client instantiation

Use the storage_transfer module instead of googleapiclient.discovery.

API Client Library

"""A sample for creating a Storage Transfer Service client."""

import googleapiclient.discovery

def create_transfer_client():
    return googleapiclient.discovery.build("storagetransfer", "v1")

Cloud Client Library

"""A sample for creating a Storage Transfer Service client."""

from google.cloud import storage_transfer

def create_transfer_client():
    return storage_transfer.StorageTransferServiceClient()
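
As in Java, list operations on the new client return an iterable pager rather than explicit pages; a minimal sketch (assuming a project ID of my-project-id):

from google.cloud import storage_transfer

client = storage_transfer.StorageTransferServiceClient()

# The filter is a JSON string; the pager fetches further pages transparently.
for job in client.list_transfer_jobs({"filter": '{"projectId": "my-project-id"}'}):
    print(job.name)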

Sample comparisons

To illustrate the differences between the two libraries, the old API client samples are shown here alongside their Cloud Client Library equivalents.

Transfer from Amazon S3

API Client Library

import json

import googleapiclient.discovery


def main(
    description,
    project_id,
    start_date,
    start_time,
    source_bucket,
    access_key_id,
    secret_access_key,
    sink_bucket,
):
    """Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    # Edit this template with desired parameters.
    transfer_job = {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            "scheduleStartDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "scheduleEndDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "startTimeOfDay": {
                "hours": start_time.hour,
                "minutes": start_time.minute,
                "seconds": start_time.second,
            },
        },
        "transferSpec": {
            "awsS3DataSource": {
                "bucketName": source_bucket,
                "awsAccessKey": {
                    "accessKeyId": access_key_id,
                    "secretAccessKey": secret_access_key,
                },
            },
            "gcsDataSink": {"bucketName": sink_bucket},
        },
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print("Returned transferJob: {}".format(json.dumps(result, indent=4)))

Cloud Client Library

from datetime import datetime

from google.cloud import storage_transfer

def create_one_time_aws_transfer(
    project_id: str,
    description: str,
    source_bucket: str,
    aws_access_key_id: str,
    aws_secret_access_key: str,
    sink_bucket: str,
):
    """Creates a one-time transfer job from Amazon S3 to Google Cloud
    Storage."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A useful description for your transfer job
    # description = 'My transfer job'

    # AWS S3 source bucket name
    # source_bucket = 'my-s3-source-bucket'

    # AWS Access Key ID
    # aws_access_key_id = 'AKIA...'

    # AWS Secret Access Key
    # aws_secret_access_key = 'HEAoMK2.../...ku8'

    # Google Cloud Storage destination bucket name
    # sink_bucket = 'my-gcs-destination-bucket'

    now = datetime.utcnow()
    # Setting the start date and the end date as
    # the same time creates a one-time transfer
    one_time_schedule = {"day": now.day, "month": now.month, "year": now.year}

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": one_time_schedule,
                    "schedule_end_date": one_time_schedule,
                },
                "transfer_spec": {
                    "aws_s3_data_source": {
                        "bucket_name": source_bucket,
                        "aws_access_key": {
                            "access_key_id": aws_access_key_id,
                            "secret_access_key": aws_secret_access_key,
                        },
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")
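
A hypothetical invocation, reading the AWS keys from environment variables as the samples above recommend:

import os

create_one_time_aws_transfer(
    project_id="my-project-id",
    description="Sample transfer job from S3 to GCS",
    source_bucket="my-s3-source-bucket",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    sink_bucket="my-gcs-destination-bucket",
)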

Transfer to Nearline

API Client Library

import json

import googleapiclient.discovery


def main(description, project_id, start_date, start_time, source_bucket, sink_bucket):
    """Create a daily transfer from Standard to Nearline Storage class."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    # Edit this template with desired parameters.
    transfer_job = {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            "scheduleStartDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "startTimeOfDay": {
                "hours": start_time.hour,
                "minutes": start_time.minute,
                "seconds": start_time.second,
            },
        },
        "transferSpec": {
            "gcsDataSource": {"bucketName": source_bucket},
            "gcsDataSink": {"bucketName": sink_bucket},
            "objectConditions": {
                "minTimeElapsedSinceLastModification": "2592000s"  # 30 days
            },
            "transferOptions": {"deleteObjectsFromSourceAfterTransfer": "true"},
        },
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print("Returned transferJob: {}".format(json.dumps(result, indent=4)))

Cloud Client Library

Note the import of google.protobuf.duration_pb2.Duration.

from datetime import datetime

from google.cloud import storage_transfer
from google.protobuf.duration_pb2 import Duration

def create_daily_nearline_30_day_migration(
    project_id: str,
    description: str,
    source_bucket: str,
    sink_bucket: str,
    start_date: datetime,
):
    """Create a daily migration from a GCS bucket to a Nearline GCS bucket
    for objects untouched for 30 days."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A useful description for your transfer job
    # description = 'My transfer job'

    # Google Cloud Storage source bucket name
    # source_bucket = 'my-gcs-source-bucket'

    # Google Cloud Storage destination bucket name
    # sink_bucket = 'my-gcs-destination-bucket'

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": {
                        "day": start_date.day,
                        "month": start_date.month,
                        "year": start_date.year,
                    }
                },
                "transfer_spec": {
                    "gcs_data_source": {
                        "bucket_name": source_bucket,
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                    "object_conditions": {
                        "min_time_elapsed_since_last_modification": Duration(
                            seconds=2592000  # 30 days
                        )
                    },
                    "transfer_options": {
                        "delete_objects_from_source_after_transfer": True
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")

Check the latest transfer operation

API Client Library


"""Command-line sample that checks the latest operation of a transfer.
This sample is used on this page:
    https://cloud.google.com/storage/transfer/create-transfer
For more information, see README.md.
"""

import argparse
import json

import googleapiclient.discovery

def check_latest_transfer_operation(project_id, job_name):
    """Check the latest transfer operation associated with a transfer job."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    transferJob = (
        storagetransfer.transferJobs()
        .get(projectId=project_id, jobName=job_name)
        .execute()
    )
    latestOperationName = transferJob.get("latestOperationName")

    if latestOperationName:
        result = (
            storagetransfer.transferOperations().get(name=latestOperationName).execute()
        )
        print(
            "The latest operation for job"
            + job_name
            + " is: {}".format(json.dumps(result, indent=4, sort_keys=True))
        )

    else:
        print(
            "Transfer job "
            + job_name
            + " does not have an operation scheduled yet, "
            + "try again once the job starts running."
        )

if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description=__doc__, formatter_class=argparse.RawDescriptionHelpFormatter
    )
    parser.add_argument("project_id", help="Your Google Cloud project ID.")
    parser.add_argument("job_name", help="Your job name.")

    args = parser.parse_args()

    check_latest_transfer_operation(args.project_id, args.job_name)

Cloud Client Library

Note the use of storage_transfer.TransferOperation.deserialize.

from google.cloud import storage_transfer

def check_latest_transfer_operation(project_id: str, job_name: str):
    """Checks the latest transfer operation for a given transfer job."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # Storage Transfer Service job name
    # job_name = 'transferJobs/1234567890'

    transfer_job = client.get_transfer_job(
        {
            "project_id": project_id,
            "job_name": job_name,
        }
    )

    if transfer_job.latest_operation_name:
        response = client.transport.operations_client.get_operation(
            transfer_job.latest_operation_name
        )
        operation = storage_transfer.TransferOperation.deserialize(
            response.metadata.value
        )

        print(f"Latest transfer operation for `{job_name}`: {operation}")
    else:
        print(f"Transfer job {job_name} has not run yet.")
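
A hypothetical invocation, mirroring the argparse entry point of the old sample:

if __name__ == "__main__":
    # Placeholder values; substitute your own project ID and job name.
    check_latest_transfer_operation("my-project-id", "transferJobs/1234567890")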