To maintain quality and consistency with other Google Cloud libraries, the Storage Transfer Service documentation uses the Cloud Client Libraries instead of the Google API Client Libraries. For more information about the two options, see the Client Libraries explanation.
The Google API Client Libraries continue to receive updates, but are no longer referenced in the documentation.
This guide covers the major differences when working with Storage Transfer Service, and provides instructions on updating your clients when migrating to the Cloud Client Libraries.
Java
Update dependencies

To switch to the new library, replace your google-api-services-storagetransfer dependency with google-cloud-storage-transfer.

If you are using Maven without the BOM, add the following to your dependencies:

<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>google-cloud-storage-transfer</artifactId>
  <version>0.2.3</version>
</dependency>
If you are using Gradle without the BOM, add this to your dependencies:
implementation 'com.google.cloud:google-cloud-storage-transfer:0.2.3'
If you are using Maven with the BOM, add the following to your pom.xml:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>24.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
  </dependency>
</dependencies>
In most cases, code is straightforward to convert from the API Client Library to the Cloud Client Library. The key differences between the two Java clients are:
Client instantiation

The Cloud Client Library significantly reduces the boilerplate associated with client instantiation by handling it behind the scenes.

API Client Library
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer = new Storagetransfer.Builder(Utils.getDefaultTransport(),
Utils.getDefaultJsonFactory(), new HttpCredentialsAdapter(credential))
.build();
Cloud Client Library
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
Builders for model classes

Model classes in the Cloud Client Library use builders rather than constructors.

API Client Library
TransferJob transferJob =
new TransferJob()
.setStatus("ENABLED");
Cloud Client Library
TransferJob transferJob =
TransferJob.newBuilder()
.setStatus(Status.ENABLED)
.build();
List operations return iterables

List operations in the Cloud Client Library return simple iterables, rather than the paginated results returned by the API Client Library.

API Client Library
public class StoragetransferExample {
  public static void main(String[] args) throws IOException, GeneralSecurityException {
    Storagetransfer storagetransferService = createStoragetransferService();
    Storagetransfer.TransferJobs.List request = storagetransferService.transferJobs().list();
    ListTransferJobsResponse response;
    do {
      response = request.execute();
      if (response.getTransferJobs() == null) {
        continue;
      }
      for (TransferJob transferJob : response.getTransferJobs()) {
        System.out.println(transferJob);
      }
      request.setPageToken(response.getNextPageToken());
    } while (response.getNextPageToken() != null);
  }

  public static Storagetransfer createStoragetransferService()
      throws IOException, GeneralSecurityException {
    HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
    JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();
    GoogleCredential credential = GoogleCredential.getApplicationDefault();
    return new Storagetransfer.Builder(httpTransport, jsonFactory, credential).build();
  }
}
Cloud Client Library
public class StoragetransferExample {
  public static void main(String[] args) throws Exception {
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
    ListTransferJobsRequest request = ListTransferJobsRequest.newBuilder().build();
    for (TransferJob job : storageTransfer.listTransferJobs(request).iterateAll()) {
      System.out.println(job);
    }
  }
}
Sample comparison

The following compares the older API Client Library samples with equivalent samples using the Cloud Client Library. If you have used these samples before, you can use this comparison to understand how to move your code to the new Cloud Client Library.

Transfer from Amazon S3

API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.AwsAccessKey;
import com.google.api.services.storagetransfer.v1.model.AwsS3Data;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;
public class TransferFromAwsApiary {
// Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
public static void transferFromAws(
String projectId,
String jobDescription,
String awsSourceBucket,
String gcsSinkBucket,
long startDateTime)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// A short description of this job
// String jobDescription = "Sample transfer job from S3 to GCS.";
// The name of the source AWS bucket to transfer data from
// String awsSourceBucket = "yourAwsSourceBucket";
// The name of the GCS bucket to transfer data to
// String gcsSinkBucket = "your-gcs-bucket";
// What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
// If this is in the past relative to when the job is created, it will run the next day.
// long startDateTime =
// new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
// The ID used to access your AWS account. Should be accessed via environment variable.
String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");
// The Secret Key used to access your AWS account. Should be accessed via environment variable.
String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");
// Set up source and sink
TransferSpec transferSpec =
new TransferSpec()
.setAwsS3DataSource(
new AwsS3Data()
.setBucketName(awsSourceBucket)
.setAwsAccessKey(
new AwsAccessKey()
.setAccessKeyId(awsAccessKeyId)
.setSecretAccessKey(awsSecretAccessKey)))
.setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket));
// Parse epoch timestamp into the model classes
Calendar startCalendar = Calendar.getInstance();
startCalendar.setTimeInMillis(startDateTime);
// Note that this is a Date from the model class package, not a java.util.Date
Date startDate =
new Date()
.setYear(startCalendar.get(Calendar.YEAR))
.setMonth(startCalendar.get(Calendar.MONTH) + 1)
.setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
TimeOfDay startTime =
new TimeOfDay()
.setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
.setMinutes(startCalendar.get(Calendar.MINUTE))
.setSeconds(startCalendar.get(Calendar.SECOND));
Schedule schedule =
new Schedule()
.setScheduleStartDate(startDate)
.setScheduleEndDate(startDate)
.setStartTimeOfDay(startTime);
// Set up the transfer job
TransferJob transferJob =
new TransferJob()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(transferSpec)
.setSchedule(schedule)
.setStatus("ENABLED");
// Create a Transfer Service client
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
new Storagetransfer.Builder(
Utils.getDefaultTransport(),
Utils.getDefaultJsonFactory(),
new HttpCredentialsAdapter(credential))
.build();
// Create the transfer job
TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();
System.out.println("Created transfer job from AWS to GCS:");
System.out.println(response.toPrettyString());
}
}
Cloud Client Library
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsAccessKey;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsS3Data;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;
public class TransferFromAws {
// Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
public static void transferFromAws(
String projectId,
String jobDescription,
String awsSourceBucket,
String gcsSinkBucket,
long startDateTime)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// A short description of this job
// String jobDescription = "Sample transfer job from S3 to GCS.";
// The name of the source AWS bucket to transfer data from
// String awsSourceBucket = "yourAwsSourceBucket";
// The name of the GCS bucket to transfer data to
// String gcsSinkBucket = "your-gcs-bucket";
// What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
// If this is in the past relative to when the job is created, it will run the next day.
// long startDateTime =
// new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
// The ID used to access your AWS account. Should be accessed via environment variable.
String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");
// The Secret Key used to access your AWS account. Should be accessed via environment variable.
String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");
// Set up source and sink
TransferSpec transferSpec =
TransferSpec.newBuilder()
.setAwsS3DataSource(
AwsS3Data.newBuilder()
.setBucketName(awsSourceBucket)
.setAwsAccessKey(
AwsAccessKey.newBuilder()
.setAccessKeyId(awsAccessKeyId)
.setSecretAccessKey(awsSecretAccessKey)))
.setGcsDataSink(GcsData.newBuilder().setBucketName(gcsSinkBucket))
.build();
// Parse epoch timestamp into the model classes
Calendar startCalendar = Calendar.getInstance();
startCalendar.setTimeInMillis(startDateTime);
// Note that this is a Date from the model class package, not a java.util.Date
Date startDate =
Date.newBuilder()
.setYear(startCalendar.get(Calendar.YEAR))
.setMonth(startCalendar.get(Calendar.MONTH) + 1)
.setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
.build();
TimeOfDay startTime =
TimeOfDay.newBuilder()
.setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
.setMinutes(startCalendar.get(Calendar.MINUTE))
.setSeconds(startCalendar.get(Calendar.SECOND))
.build();
Schedule schedule =
Schedule.newBuilder()
.setScheduleStartDate(startDate)
.setScheduleEndDate(startDate)
.setStartTimeOfDay(startTime)
.build();
// Set up the transfer job
TransferJob transferJob =
TransferJob.newBuilder()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(transferSpec)
.setSchedule(schedule)
.setStatus(Status.ENABLED)
.build();
// Create a Transfer Service client
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
// Create the transfer job
TransferJob response =
storageTransfer.createTransferJob(
CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());
System.out.println("Created transfer job from AWS to GCS:");
System.out.println(response.toString());
}
}
Transfer to Nearline

API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.ObjectConditions;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferOptions;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;
public class TransferToNearlineApiary {
/**
* Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
* than 30 days old to a Nearline GCS bucket.
*/
public static void transferToNearlineApiary(
String projectId,
String jobDescription,
String gcsSourceBucket,
String gcsNearlineSinkBucket,
long startDateTime)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// A short description of this job
// String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";
// The name of the source GCS bucket to transfer data from
// String gcsSourceBucket = "your-gcs-source-bucket";
// The name of the Nearline GCS bucket to transfer old objects to
// String gcsSinkBucket = "your-nearline-gcs-bucket";
// What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
// If this is in the past relative to when the job is created, it will run the next day.
// long startDateTime =
// new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
// Parse epoch timestamp into the model classes
Calendar startCalendar = Calendar.getInstance();
startCalendar.setTimeInMillis(startDateTime);
// Note that this is a Date from the model class package, not a java.util.Date
Date date =
new Date()
.setYear(startCalendar.get(Calendar.YEAR))
.setMonth(startCalendar.get(Calendar.MONTH) + 1)
.setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
TimeOfDay time =
new TimeOfDay()
.setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
.setMinutes(startCalendar.get(Calendar.MINUTE))
.setSeconds(startCalendar.get(Calendar.SECOND));
TransferJob transferJob =
new TransferJob()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(
new TransferSpec()
.setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
.setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
.setObjectConditions(
new ObjectConditions()
.setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
.setTransferOptions(
new TransferOptions().setDeleteObjectsFromSourceAfterTransfer(true)))
.setSchedule(new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
.setStatus("ENABLED");
// Create a Transfer Service client
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
new Storagetransfer.Builder(
Utils.getDefaultTransport(),
Utils.getDefaultJsonFactory(),
new HttpCredentialsAdapter(credential))
.build();
// Create the transfer job
TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();
System.out.println("Created transfer job from standard bucket to Nearline bucket:");
System.out.println(response.toPrettyString());
}
}
Cloud Client Library
import com.google.protobuf.Duration;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.ObjectConditions;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOptions;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;
public class TransferToNearline {
/**
* Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
* than 30 days old to a Nearline GCS bucket.
*/
public static void transferToNearline(
String projectId,
String jobDescription,
String gcsSourceBucket,
String gcsNearlineSinkBucket,
long startDateTime)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// A short description of this job
// String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";
// The name of the source GCS bucket to transfer data from
// String gcsSourceBucket = "your-gcs-source-bucket";
// The name of the Nearline GCS bucket to transfer old objects to
// String gcsSinkBucket = "your-nearline-gcs-bucket";
// What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
// If this is in the past relative to when the job is created, it will run the next day.
// long startDateTime =
// new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
// Parse epoch timestamp into the model classes
Calendar startCalendar = Calendar.getInstance();
startCalendar.setTimeInMillis(startDateTime);
// Note that this is a Date from the model class package, not a java.util.Date
Date date =
Date.newBuilder()
.setYear(startCalendar.get(Calendar.YEAR))
.setMonth(startCalendar.get(Calendar.MONTH) + 1)
.setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
.build();
TimeOfDay time =
TimeOfDay.newBuilder()
.setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
.setMinutes(startCalendar.get(Calendar.MINUTE))
.setSeconds(startCalendar.get(Calendar.SECOND))
.build();
TransferJob transferJob =
TransferJob.newBuilder()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(
TransferSpec.newBuilder()
.setGcsDataSource(GcsData.newBuilder().setBucketName(gcsSourceBucket))
.setGcsDataSink(GcsData.newBuilder().setBucketName(gcsNearlineSinkBucket))
.setObjectConditions(
ObjectConditions.newBuilder()
.setMinTimeElapsedSinceLastModification(
Duration.newBuilder().setSeconds(2592000 /* 30 days */)))
.setTransferOptions(
TransferOptions.newBuilder().setDeleteObjectsFromSourceAfterTransfer(true)))
.setSchedule(Schedule.newBuilder().setScheduleStartDate(date).setStartTimeOfDay(time))
.setStatus(Status.ENABLED)
.build();
// Create a Transfer Service client
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
// Create the transfer job
TransferJob response =
storageTransfer.createTransferJob(
CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());
System.out.println("Created transfer job from standard bucket to Nearline bucket:");
System.out.println(response.toString());
}
}
Check the latest transfer operation

API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Operation;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
public class CheckLatestTransferOperationApiary {
// Gets the requested transfer job and checks its latest operation
public static void checkLatestTransferOperationApiary(String projectId, String jobName)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// The name of the job to check
// String jobName = "myJob/1234567890";
// Create Storage Transfer client
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
new Storagetransfer.Builder(
Utils.getDefaultTransport(),
Utils.getDefaultJsonFactory(),
new HttpCredentialsAdapter(credential))
.build();
// Get transfer job and check latest operation
TransferJob transferJob = storageTransfer.transferJobs().get(jobName, projectId).execute();
String latestOperationName = transferJob.getLatestOperationName();
if (latestOperationName != null) {
Operation latestOperation =
storageTransfer.transferOperations().get(latestOperationName).execute();
System.out.println("The latest operation for transfer job " + jobName + " is:");
System.out.println(latestOperation.toPrettyString());
} else {
System.out.println(
"Transfer job "
+ jobName
+ " does not have an operation scheduled yet,"
+ " try again once the job starts running.");
}
}
}
Cloud Client Library
import com.google.longrunning.Operation;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.GetTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOperation;
import java.io.IOException;
public class CheckLatestTransferOperation {
// Gets the requested transfer job and checks its latest operation
public static void checkLatestTransferOperation(String projectId, String jobName)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// The name of the job to check
// String jobName = "myJob/1234567890";
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
// Get transfer job and check latest operation
TransferJob transferJob =
storageTransfer.getTransferJob(
GetTransferJobRequest.newBuilder().setJobName(jobName).setProjectId(projectId).build());
String latestOperationName = transferJob.getLatestOperationName();
if (!latestOperationName.isEmpty()) {
Operation operation = storageTransfer.getOperationsClient().getOperation(latestOperationName);
TransferOperation latestOperation =
TransferOperation.parseFrom(operation.getMetadata().getValue());
System.out.println("The latest operation for transfer job " + jobName + " is:");
System.out.println(latestOperation.toString());
} else {
System.out.println(
"Transfer job "
+ jobName
+ " hasn't run yet,"
+ " try again once the job starts running.");
}
}
}
Python
Update dependencies

To use the new library, add a dependency on google-cloud-storage-transfer. This is used in place of the discovery client from google-api-python-client.
pip install --upgrade google-cloud-storage-transfer
Client instantiation

Use the storage_transfer module instead of googleapiclient.discovery.

API Client Library
"""A sample for creating a Storage Transfer Service client."""
import googleapiclient.discovery
def create_transfer_client():
return googleapiclient.discovery.build("storagetransfer", "v1")
Cloud Client Library
"""A sample for creating a Storage Transfer Service client."""
from google.cloud import storage_transfer
def create_transfer_client():
return storage_transfer.StorageTransferServiceClient()
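The Java section above notes that list operations in the Cloud Client Library return iterables rather than page tokens; the same applies in Python, where list methods return a pager you can iterate directly. One detail worth calling out is that transferJobs.list takes a filter expressed as a JSON string naming the project. A minimal sketch of building that filter (the helper name is our own, not part of the library):

```python
import json


def make_list_filter(project_id: str) -> str:
    """Build the JSON filter string expected by list_transfer_jobs."""
    return json.dumps({"projectId": project_id})


# With the Cloud client (requires credentials; shown for illustration only):
#
#   from google.cloud import storage_transfer
#   client = storage_transfer.StorageTransferServiceClient()
#   request = storage_transfer.ListTransferJobsRequest(
#       filter=make_list_filter("my-project-id"))
#   for job in client.list_transfer_jobs(request):  # pager fetches pages lazily
#       print(job.name)
```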
Sample comparison

The following compares the old API client samples with equivalent samples using the Cloud Client Library, to illustrate the differences between the two libraries.

Transfer from Amazon S3

API Client Library
import json

import googleapiclient.discovery


def main(
description,
project_id,
start_date,
start_time,
source_bucket,
access_key_id,
secret_access_key,
sink_bucket,
):
"""Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")
# Edit this template with desired parameters.
transfer_job = {
"description": description,
"status": "ENABLED",
"projectId": project_id,
"schedule": {
"scheduleStartDate": {
"day": start_date.day,
"month": start_date.month,
"year": start_date.year,
},
"scheduleEndDate": {
"day": start_date.day,
"month": start_date.month,
"year": start_date.year,
},
"startTimeOfDay": {
"hours": start_time.hour,
"minutes": start_time.minute,
"seconds": start_time.second,
},
},
"transferSpec": {
"awsS3DataSource": {
"bucketName": source_bucket,
"awsAccessKey": {
"accessKeyId": access_key_id,
"secretAccessKey": secret_access_key,
},
},
"gcsDataSink": {"bucketName": sink_bucket},
},
}
result = storagetransfer.transferJobs().create(body=transfer_job).execute()
print("Returned transferJob: {}".format(json.dumps(result, indent=4)))
Cloud Client Library
from datetime import datetime
from google.cloud import storage_transfer
def create_one_time_aws_transfer(
project_id: str,
description: str,
source_bucket: str,
aws_access_key_id: str,
aws_secret_access_key: str,
sink_bucket: str,
):
"""Creates a one-time transfer job from Amazon S3 to Google Cloud
Storage."""
client = storage_transfer.StorageTransferServiceClient()
# The ID of the Google Cloud Platform Project that owns the job
# project_id = 'my-project-id'
# A useful description for your transfer job
# description = 'My transfer job'
# AWS S3 source bucket name
# source_bucket = 'my-s3-source-bucket'
# AWS Access Key ID
# aws_access_key_id = 'AKIA...'
# AWS Secret Access Key
# aws_secret_access_key = 'HEAoMK2.../...ku8'
# Google Cloud Storage destination bucket name
# sink_bucket = 'my-gcs-destination-bucket'
now = datetime.utcnow()
# Setting the start date and the end date as
# the same time creates a one-time transfer
one_time_schedule = {"day": now.day, "month": now.month, "year": now.year}
transfer_job_request = storage_transfer.CreateTransferJobRequest(
{
"transfer_job": {
"project_id": project_id,
"description": description,
"status": storage_transfer.TransferJob.Status.ENABLED,
"schedule": {
"schedule_start_date": one_time_schedule,
"schedule_end_date": one_time_schedule,
},
"transfer_spec": {
"aws_s3_data_source": {
"bucket_name": source_bucket,
"aws_access_key": {
"access_key_id": aws_access_key_id,
"secret_access_key": aws_secret_access_key,
},
},
"gcs_data_sink": {
"bucket_name": sink_bucket,
},
},
}
}
)
result = client.create_transfer_job(transfer_job_request)
print(f"Created transferJob: {result.name}")
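The sample above relies on the convention that setting the schedule start and end dates to the same day creates a one-time transfer. That idiom can be isolated in a small helper, sketched here with names of our own choosing:

```python
from datetime import datetime


def one_time_schedule(when: datetime) -> dict:
    """Build a schedule mapping that runs exactly once on the given UTC date.

    Storage Transfer Service treats a schedule whose start and end dates
    are equal as a one-time transfer.
    """
    date = {"day": when.day, "month": when.month, "year": when.year}
    return {"schedule_start_date": date, "schedule_end_date": date}
```

The returned dict can be dropped into the "schedule" key of a CreateTransferJobRequest mapping like the one above.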
Transfer to Nearline

API Client Library
import json

import googleapiclient.discovery


def main(description, project_id, start_date, start_time, source_bucket, sink_bucket):
"""Create a daily transfer from Standard to Nearline Storage class."""
storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")
# Edit this template with desired parameters.
transfer_job = {
"description": description,
"status": "ENABLED",
"projectId": project_id,
"schedule": {
"scheduleStartDate": {
"day": start_date.day,
"month": start_date.month,
"year": start_date.year,
},
"startTimeOfDay": {
"hours": start_time.hour,
"minutes": start_time.minute,
"seconds": start_time.second,
},
},
"transferSpec": {
"gcsDataSource": {"bucketName": source_bucket},
"gcsDataSink": {"bucketName": sink_bucket},
"objectConditions": {
"minTimeElapsedSinceLastModification": "2592000s" # 30 days
},
"transferOptions": {"deleteObjectsFromSourceAfterTransfer": "true"},
},
}
result = storagetransfer.transferJobs().create(body=transfer_job).execute()
print("Returned transferJob: {}".format(json.dumps(result, indent=4)))
Cloud Client Library

Note the import of google.protobuf.duration_pb2.Duration.
from datetime import datetime
from google.cloud import storage_transfer
from google.protobuf.duration_pb2 import Duration
def create_daily_nearline_30_day_migration(
project_id: str,
description: str,
source_bucket: str,
sink_bucket: str,
start_date: datetime,
):
"""Create a daily migration from a GCS bucket to a Nearline GCS bucket
for objects untouched for 30 days."""
client = storage_transfer.StorageTransferServiceClient()
# The ID of the Google Cloud Platform Project that owns the job
# project_id = 'my-project-id'
# A useful description for your transfer job
# description = 'My transfer job'
# Google Cloud Storage source bucket name
# source_bucket = 'my-gcs-source-bucket'
# Google Cloud Storage destination bucket name
# sink_bucket = 'my-gcs-destination-bucket'
transfer_job_request = storage_transfer.CreateTransferJobRequest(
{
"transfer_job": {
"project_id": project_id,
"description": description,
"status": storage_transfer.TransferJob.Status.ENABLED,
"schedule": {
"schedule_start_date": {
"day": start_date.day,
"month": start_date.month,
"year": start_date.year,
}
},
"transfer_spec": {
"gcs_data_source": {
"bucket_name": source_bucket,
},
"gcs_data_sink": {
"bucket_name": sink_bucket,
},
"object_conditions": {
"min_time_elapsed_since_last_modification": Duration(
seconds=2592000 # 30 days
)
},
"transfer_options": {
"delete_objects_from_source_after_transfer": True
},
},
}
}
)
result = client.create_transfer_job(transfer_job_request)
print(f"Created transferJob: {result.name}")
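Rather than hard-coding 2592000 seconds, the 30-day threshold can be derived with the standard library, which keeps the intent visible (the Duration message itself still takes whole seconds):

```python
from datetime import timedelta

# 30 days expressed in seconds, as required by Duration(seconds=...).
THIRTY_DAYS_SECONDS = int(timedelta(days=30).total_seconds())

# Passing this to google.protobuf.duration_pb2.Duration is equivalent to
# Duration(seconds=2592000) in the sample above.
```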
Check the latest transfer operation

API Client Library
"""Command-line sample that checks the latest operation of a transfer.
This sample is used on this page:
https://cloud.google.com/storage/transfer/create-transfer
For more information, see README.md.
"""
import argparse
import json
import googleapiclient.discovery
def check_latest_transfer_operation(project_id, job_name):
"""Check the latest transfer operation associated with a transfer job."""
storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")
transferJob = (
storagetransfer.transferJobs()
.get(projectId=project_id, jobName=job_name)
.execute()
)
latestOperationName = transferJob.get("latestOperationName")
if latestOperationName:
result = (
storagetransfer.transferOperations().get(name=latestOperationName).execute()
)
print(
"The latest operation for job "
+ job_name
+ " is: {}".format(json.dumps(result, indent=4, sort_keys=True))
)
else:
print(
"Transfer job "
+ job_name
+ " does not have an operation scheduled yet, "
+ "try again once the job starts running."
)
if __name__ == "__main__":
parser = argparse.ArgumentParser(
description=__doc__, formatter_class=argparse.RawDescriptionHelpFormatter
)
parser.add_argument("project_id", help="Your Google Cloud project ID.")
parser.add_argument("job_name", help="Your job name.")
args = parser.parse_args()
check_latest_transfer_operation(args.project_id, args.job_name)
Cloud Client Library

Note the use of storage_transfer.TransferOperation.deserialize.
from google.cloud import storage_transfer
def check_latest_transfer_operation(project_id: str, job_name: str):
"""Checks the latest transfer operation for a given transfer job."""
client = storage_transfer.StorageTransferServiceClient()
# The ID of the Google Cloud Platform Project that owns the job
# project_id = 'my-project-id'
# Storage Transfer Service job name
# job_name = 'transferJobs/1234567890'
transfer_job = client.get_transfer_job(
{
"project_id": project_id,
"job_name": job_name,
}
)
if transfer_job.latest_operation_name:
response = client.transport.operations_client.get_operation(
transfer_job.latest_operation_name
)
operation = storage_transfer.TransferOperation.deserialize(
response.metadata.value
)
print(f"Latest transfer operation for `{job_name}`: {operation}")
else:
        print(f"Transfer job {job_name} has not run yet.")
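The samples in this section pass job names in the transferJobs/<id> form shown in the comments. A loose client-side guard, sketched here under that assumed format (the helper and regex are our own, not part of the library), can catch an obviously malformed name before the API call:

```python
import re

# Loose pattern for Storage Transfer Service job names, e.g. "transferJobs/1234567890".
_JOB_NAME_RE = re.compile(r"^transferJobs/\S+$")


def looks_like_job_name(job_name: str) -> bool:
    """Return True if the string matches the transferJobs/<id> shape."""
    return bool(_JOB_NAME_RE.match(job_name))
```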