To ensure higher quality and consistency with the other Cloud libraries, the Storage Transfer Service documentation now uses the Cloud Client Libraries instead of the Google API Client Libraries. For details on the two options, see the client libraries explanation.
The Google API Client Libraries continue to receive updates, but they are no longer referenced in the documentation.
This guide describes the main differences that apply when working with Storage Transfer Service and explains how to update your clients when migrating to the Cloud Client Libraries.
Java
Update dependencies
To switch to the new library, replace your dependency on google-api-services-storagetransfer with google-cloud-storage-transfer.
If you're using Maven without the BOM, add the following to your dependencies:
<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>google-cloud-storage-transfer</artifactId>
  <version>0.2.3</version>
</dependency>
If you're using Gradle without the BOM, add the following to your dependencies:
implementation 'com.google.cloud:google-cloud-storage-transfer:0.2.3'
If you're using Maven with the BOM, add the following to your pom.xml:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>24.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
  </dependency>
</dependencies>
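If you're using Gradle with the BOM, the equivalent declarations (assuming the same libraries-bom version as in the Maven snippet above) would look like this:

```groovy
dependencies {
    // Import the BOM so the library version is managed for you
    implementation platform('com.google.cloud:libraries-bom:24.1.0')
    // No version needed here; the BOM supplies it
    implementation 'com.google.cloud:google-cloud-storage-transfer'
}
```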
In most cases, code translates readily from the API Client Library to the Cloud Client Library. The key differences between the two Java clients are as follows.
Client instantiation
The Cloud Client Library significantly reduces the boilerplate involved in client instantiation by handling it in the background.
API Client Library
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer = new Storagetransfer.Builder(Utils.getDefaultTransport(),
Utils.getDefaultJsonFactory(), new HttpCredentialsAdapter(credential))
.build();
Cloud Client Library
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
Builders for model classes
Model classes in the Cloud Client Library use builders instead of constructors.
API Client Library
TransferJob transferJob =
new TransferJob()
.setStatus("ENABLED");
Cloud Client Library
TransferJob transferJob =
TransferJob.newBuilder()
.setStatus(Status.ENABLED)
.build();
List operations return iterables
In the Cloud Client Library, list operations return a simple iterable rather than the paginated results returned by the API Client Library.
API Client Library
public class StoragetransferExample {
public static void main(String args[]) throws IOException, GeneralSecurityException {
Storagetransfer storagetransferService = createStoragetransferService();
Storagetransfer.TransferJobs.List request = storagetransferService.transferJobs().list();
ListTransferJobsResponse response;
do {
response = request.execute();
if (response.getTransferJobs() == null) {
  request.setPageToken(response.getNextPageToken());
  continue;
}
for (TransferJob transferJob : response.getTransferJobs()) {
System.out.println(transferJob);
}
request.setPageToken(response.getNextPageToken());
} while (response.getNextPageToken() != null);
}
public static Storagetransfer createStoragetransferService()
    throws IOException, GeneralSecurityException {
  HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
  JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();
  GoogleCredential credential = GoogleCredential.getApplicationDefault();
  return new Storagetransfer.Builder(httpTransport, jsonFactory, credential)
      .build();
}
}
Cloud Client Library
public class StoragetransferExample {
public static void main(String args[]) throws Exception {
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
ListTransferJobsRequest request = ListTransferJobsRequest.newBuilder().build();
for (TransferJob job : storageTransfer.listTransferJobs(request).iterateAll()) {
System.out.println(job);
}
}
}
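The pagination difference above is language-independent. As a rough illustration only (the `fake_list_page` helper below is hypothetical and not part of either library), here is what an `iterateAll()`-style helper effectively does compared with the manual page-token loop:

```python
# Illustration only: a hypothetical paged API, the manual page-token loop
# (API Client Library pattern), and a generator that hides paging
# (what iterateAll() provides in the Cloud Client Library).

def fake_list_page(page_token=None, page_size=2):
    """Hypothetical list call that returns jobs two at a time."""
    jobs = ["jobs/1", "jobs/2", "jobs/3", "jobs/4", "jobs/5"]
    start = int(page_token) if page_token else 0
    end = start + page_size
    return {
        "transferJobs": jobs[start:end],
        "nextPageToken": str(end) if end < len(jobs) else None,
    }

def list_all_manually():
    """Old pattern: the caller drives the page-token loop."""
    results, token = [], None
    while True:
        response = fake_list_page(token)
        results.extend(response["transferJobs"])
        token = response["nextPageToken"]
        if token is None:
            return results

def iterate_all():
    """New pattern: a generator hides pagination from the caller."""
    token = None
    while True:
        response = fake_list_page(token)
        yield from response["transferJobs"]
        token = response["nextPageToken"]
        if token is None:
            return

assert list(iterate_all()) == list_all_manually()
```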
Sample comparisons
This section compares the older API Client Library samples with equivalent samples that use the Cloud Client Library. If you have used these samples before, this comparison can help you understand how to move your code to the new Cloud Client Library.
Transferring from Amazon S3
API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.AwsAccessKey;
import com.google.api.services.storagetransfer.v1.model.AwsS3Data;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;
public class TransferFromAwsApiary {
// Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
public static void transferFromAws(
String projectId,
String jobDescription,
String awsSourceBucket,
String gcsSinkBucket,
long startDateTime)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// A short description of this job
// String jobDescription = "Sample transfer job from S3 to GCS.";
// The name of the source AWS bucket to transfer data from
// String awsSourceBucket = "yourAwsSourceBucket";
// The name of the GCS bucket to transfer data to
// String gcsSinkBucket = "your-gcs-bucket";
// What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
// If this is in the past relative to when the job is created, it will run the next day.
// long startDateTime =
// new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
// The ID used to access your AWS account. Should be accessed via environment variable.
String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");
// The Secret Key used to access your AWS account. Should be accessed via environment variable.
String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");
// Set up source and sink
TransferSpec transferSpec =
new TransferSpec()
.setAwsS3DataSource(
new AwsS3Data()
.setBucketName(awsSourceBucket)
.setAwsAccessKey(
new AwsAccessKey()
.setAccessKeyId(awsAccessKeyId)
.setSecretAccessKey(awsSecretAccessKey)))
.setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket));
// Parse epoch timestamp into the model classes
Calendar startCalendar = Calendar.getInstance();
startCalendar.setTimeInMillis(startDateTime);
// Note that this is a Date from the model class package, not a java.util.Date
Date startDate =
new Date()
.setYear(startCalendar.get(Calendar.YEAR))
.setMonth(startCalendar.get(Calendar.MONTH) + 1)
.setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
TimeOfDay startTime =
new TimeOfDay()
.setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
.setMinutes(startCalendar.get(Calendar.MINUTE))
.setSeconds(startCalendar.get(Calendar.SECOND));
Schedule schedule =
new Schedule()
.setScheduleStartDate(startDate)
.setScheduleEndDate(startDate)
.setStartTimeOfDay(startTime);
// Set up the transfer job
TransferJob transferJob =
new TransferJob()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(transferSpec)
.setSchedule(schedule)
.setStatus("ENABLED");
// Create a Transfer Service client
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
new Storagetransfer.Builder(
Utils.getDefaultTransport(),
Utils.getDefaultJsonFactory(),
new HttpCredentialsAdapter(credential))
.build();
// Create the transfer job
TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();
System.out.println("Created transfer job from AWS to GCS:");
System.out.println(response.toPrettyString());
}
}
Cloud Client Library
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsAccessKey;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsS3Data;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;
public class TransferFromAws {
// Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
public static void transferFromAws(
String projectId,
String jobDescription,
String awsSourceBucket,
String gcsSinkBucket,
long startDateTime)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// A short description of this job
// String jobDescription = "Sample transfer job from S3 to GCS.";
// The name of the source AWS bucket to transfer data from
// String awsSourceBucket = "yourAwsSourceBucket";
// The name of the GCS bucket to transfer data to
// String gcsSinkBucket = "your-gcs-bucket";
// What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
// If this is in the past relative to when the job is created, it will run the next day.
// long startDateTime =
// new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
// The ID used to access your AWS account. Should be accessed via environment variable.
String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");
// The Secret Key used to access your AWS account. Should be accessed via environment variable.
String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");
// Set up source and sink
TransferSpec transferSpec =
TransferSpec.newBuilder()
.setAwsS3DataSource(
AwsS3Data.newBuilder()
.setBucketName(awsSourceBucket)
.setAwsAccessKey(
AwsAccessKey.newBuilder()
.setAccessKeyId(awsAccessKeyId)
.setSecretAccessKey(awsSecretAccessKey)))
.setGcsDataSink(GcsData.newBuilder().setBucketName(gcsSinkBucket))
.build();
// Parse epoch timestamp into the model classes
Calendar startCalendar = Calendar.getInstance();
startCalendar.setTimeInMillis(startDateTime);
// Note that this is a Date from the model class package, not a java.util.Date
Date startDate =
Date.newBuilder()
.setYear(startCalendar.get(Calendar.YEAR))
.setMonth(startCalendar.get(Calendar.MONTH) + 1)
.setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
.build();
TimeOfDay startTime =
TimeOfDay.newBuilder()
.setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
.setMinutes(startCalendar.get(Calendar.MINUTE))
.setSeconds(startCalendar.get(Calendar.SECOND))
.build();
Schedule schedule =
Schedule.newBuilder()
.setScheduleStartDate(startDate)
.setScheduleEndDate(startDate)
.setStartTimeOfDay(startTime)
.build();
// Set up the transfer job
TransferJob transferJob =
TransferJob.newBuilder()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(transferSpec)
.setSchedule(schedule)
.setStatus(Status.ENABLED)
.build();
// Create a Transfer Service client
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
// Create the transfer job
TransferJob response =
storageTransfer.createTransferJob(
CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());
System.out.println("Created transfer job from AWS to GCS:");
System.out.println(response.toString());
}
}
Transferring to Nearline
API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.ObjectConditions;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferOptions;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;
public class TransferToNearlineApiary {
/**
* Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
* than 30 days old to a Nearline GCS bucket.
*/
public static void transferToNearlineApiary(
String projectId,
String jobDescription,
String gcsSourceBucket,
String gcsNearlineSinkBucket,
long startDateTime)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// A short description of this job
// String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";
// The name of the source GCS bucket to transfer data from
// String gcsSourceBucket = "your-gcs-source-bucket";
// The name of the Nearline GCS bucket to transfer old objects to
// String gcsSinkBucket = "your-nearline-gcs-bucket";
// What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
// If this is in the past relative to when the job is created, it will run the next day.
// long startDateTime =
// new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
// Parse epoch timestamp into the model classes
Calendar startCalendar = Calendar.getInstance();
startCalendar.setTimeInMillis(startDateTime);
// Note that this is a Date from the model class package, not a java.util.Date
Date date =
new Date()
.setYear(startCalendar.get(Calendar.YEAR))
.setMonth(startCalendar.get(Calendar.MONTH) + 1)
.setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
TimeOfDay time =
new TimeOfDay()
.setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
.setMinutes(startCalendar.get(Calendar.MINUTE))
.setSeconds(startCalendar.get(Calendar.SECOND));
TransferJob transferJob =
new TransferJob()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(
new TransferSpec()
.setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
.setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
.setObjectConditions(
new ObjectConditions()
.setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
.setTransferOptions(
new TransferOptions().setDeleteObjectsFromSourceAfterTransfer(true)))
.setSchedule(new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
.setStatus("ENABLED");
// Create a Transfer Service client
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
new Storagetransfer.Builder(
Utils.getDefaultTransport(),
Utils.getDefaultJsonFactory(),
new HttpCredentialsAdapter(credential))
.build();
// Create the transfer job
TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();
System.out.println("Created transfer job from standard bucket to Nearline bucket:");
System.out.println(response.toPrettyString());
}
}
Cloud Client Library
import com.google.protobuf.Duration;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.ObjectConditions;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOptions;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;
public class TransferToNearline {
/**
* Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
* than 30 days old to a Nearline GCS bucket.
*/
public static void transferToNearline(
String projectId,
String jobDescription,
String gcsSourceBucket,
String gcsNearlineSinkBucket,
long startDateTime)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// A short description of this job
// String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";
// The name of the source GCS bucket to transfer data from
// String gcsSourceBucket = "your-gcs-source-bucket";
// The name of the Nearline GCS bucket to transfer old objects to
// String gcsSinkBucket = "your-nearline-gcs-bucket";
// What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
// If this is in the past relative to when the job is created, it will run the next day.
// long startDateTime =
// new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
// Parse epoch timestamp into the model classes
Calendar startCalendar = Calendar.getInstance();
startCalendar.setTimeInMillis(startDateTime);
// Note that this is a Date from the model class package, not a java.util.Date
Date date =
Date.newBuilder()
.setYear(startCalendar.get(Calendar.YEAR))
.setMonth(startCalendar.get(Calendar.MONTH) + 1)
.setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
.build();
TimeOfDay time =
TimeOfDay.newBuilder()
.setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
.setMinutes(startCalendar.get(Calendar.MINUTE))
.setSeconds(startCalendar.get(Calendar.SECOND))
.build();
TransferJob transferJob =
TransferJob.newBuilder()
.setDescription(jobDescription)
.setProjectId(projectId)
.setTransferSpec(
TransferSpec.newBuilder()
.setGcsDataSource(GcsData.newBuilder().setBucketName(gcsSourceBucket))
.setGcsDataSink(GcsData.newBuilder().setBucketName(gcsNearlineSinkBucket))
.setObjectConditions(
ObjectConditions.newBuilder()
.setMinTimeElapsedSinceLastModification(
Duration.newBuilder().setSeconds(2592000 /* 30 days */)))
.setTransferOptions(
TransferOptions.newBuilder().setDeleteObjectsFromSourceAfterTransfer(true)))
.setSchedule(Schedule.newBuilder().setScheduleStartDate(date).setStartTimeOfDay(time))
.setStatus(Status.ENABLED)
.build();
// Create a Transfer Service client
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
// Create the transfer job
TransferJob response =
storageTransfer.createTransferJob(
CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());
System.out.println("Created transfer job from standard bucket to Nearline bucket:");
System.out.println(response.toString());
}
}
Checking the latest transfer operation
API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Operation;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
public class CheckLatestTransferOperationApiary {
// Gets the requested transfer job and checks its latest operation
public static void checkLatestTransferOperationApiary(String projectId, String jobName)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// The name of the job to check
// String jobName = "myJob/1234567890";
// Create Storage Transfer client
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
new Storagetransfer.Builder(
Utils.getDefaultTransport(),
Utils.getDefaultJsonFactory(),
new HttpCredentialsAdapter(credential))
.build();
// Get transfer job and check latest operation
TransferJob transferJob = storageTransfer.transferJobs().get(jobName, projectId).execute();
String latestOperationName = transferJob.getLatestOperationName();
if (latestOperationName != null) {
Operation latestOperation =
storageTransfer.transferOperations().get(latestOperationName).execute();
System.out.println("The latest operation for transfer job " + jobName + " is:");
System.out.println(latestOperation.toPrettyString());
} else {
System.out.println(
"Transfer job "
+ jobName
+ " does not have an operation scheduled yet,"
+ " try again once the job starts running.");
}
}
}
Cloud Client Library
import com.google.longrunning.Operation;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.GetTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOperation;
import java.io.IOException;
public class CheckLatestTransferOperation {
// Gets the requested transfer job and checks its latest operation
public static void checkLatestTransferOperation(String projectId, String jobName)
throws IOException {
// Your Google Cloud Project ID
// String projectId = "your-project-id";
// The name of the job to check
// String jobName = "myJob/1234567890";
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
// Get transfer job and check latest operation
TransferJob transferJob =
storageTransfer.getTransferJob(
GetTransferJobRequest.newBuilder().setJobName(jobName).setProjectId(projectId).build());
String latestOperationName = transferJob.getLatestOperationName();
if (!latestOperationName.isEmpty()) {
Operation operation = storageTransfer.getOperationsClient().getOperation(latestOperationName);
TransferOperation latestOperation =
TransferOperation.parseFrom(operation.getMetadata().getValue());
System.out.println("The latest operation for transfer job " + jobName + " is:");
System.out.println(latestOperation.toString());
} else {
System.out.println(
"Transfer job "
+ jobName
+ " hasn't run yet,"
+ " try again once the job starts running.");
}
}
}
Python
Update dependencies
To use the new library, add a dependency on google-cloud-storage-transfer. It replaces the discovery client from google-api-python-client.
pip install --upgrade google-cloud-storage-transfer
Client instantiation
Use the storage_transfer module in place of googleapiclient.discovery.
API Client Library
"""A sample for creating a Storage Transfer Service client."""
import googleapiclient.discovery
def create_transfer_client():
return googleapiclient.discovery.build('storagetransfer', 'v1')
Cloud Client Library
"""A sample for creating a Storage Transfer Service client."""
from google.cloud import storage_transfer
def create_transfer_client():
return storage_transfer.StorageTransferServiceClient()
Sample comparisons
The following shows the older API client samples side by side with their Cloud Client Library equivalents to illustrate the differences between the two libraries.
Transferring from Amazon S3
API Client Library
import json

import googleapiclient.discovery

def main(description, project_id, start_date, start_time, source_bucket,
access_key_id, secret_access_key, sink_bucket):
"""Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')
# Edit this template with desired parameters.
transfer_job = {
'description': description,
'status': 'ENABLED',
'projectId': project_id,
'schedule': {
'scheduleStartDate': {
'day': start_date.day,
'month': start_date.month,
'year': start_date.year
},
'scheduleEndDate': {
'day': start_date.day,
'month': start_date.month,
'year': start_date.year
},
'startTimeOfDay': {
'hours': start_time.hour,
'minutes': start_time.minute,
'seconds': start_time.second
}
},
'transferSpec': {
'awsS3DataSource': {
'bucketName': source_bucket,
'awsAccessKey': {
'accessKeyId': access_key_id,
'secretAccessKey': secret_access_key
}
},
'gcsDataSink': {
'bucketName': sink_bucket
}
}
}
result = storagetransfer.transferJobs().create(body=transfer_job).execute()
print('Returned transferJob: {}'.format(
json.dumps(result, indent=4)))
Cloud Client Library
from datetime import datetime
from google.cloud import storage_transfer
def create_one_time_aws_transfer(
project_id: str, description: str,
source_bucket: str, aws_access_key_id: str,
aws_secret_access_key: str, sink_bucket: str):
"""Creates a one-time transfer job from Amazon S3 to Google Cloud
Storage."""
client = storage_transfer.StorageTransferServiceClient()
# The ID of the Google Cloud Platform Project that owns the job
# project_id = 'my-project-id'
# A useful description for your transfer job
# description = 'My transfer job'
# AWS S3 source bucket name
# source_bucket = 'my-s3-source-bucket'
# AWS Access Key ID
# aws_access_key_id = 'AKIA...'
# AWS Secret Access Key
# aws_secret_access_key = 'HEAoMK2.../...ku8'
# Google Cloud Storage destination bucket name
# sink_bucket = 'my-gcs-destination-bucket'
now = datetime.utcnow()
# Setting the start date and the end date as
# the same time creates a one-time transfer
one_time_schedule = {
'day': now.day,
'month': now.month,
'year': now.year
}
transfer_job_request = storage_transfer.CreateTransferJobRequest({
'transfer_job': {
'project_id': project_id,
'description': description,
'status': storage_transfer.TransferJob.Status.ENABLED,
'schedule': {
'schedule_start_date': one_time_schedule,
'schedule_end_date': one_time_schedule
},
'transfer_spec': {
'aws_s3_data_source': {
'bucket_name': source_bucket,
'aws_access_key': {
'access_key_id': aws_access_key_id,
'secret_access_key': aws_secret_access_key,
}
},
'gcs_data_sink': {
'bucket_name': sink_bucket,
}
}
}
})
result = client.create_transfer_job(transfer_job_request)
print(f'Created transferJob: {result.name}')
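Much of the migration in the sample above is a mechanical rename from the REST-style camelCase keys used with the discovery client to the snake_case keys the Cloud Client Library accepts. A rough, illustrative helper (`snakeify` is our name, not a library function) captures that mechanical part; it does not cover the non-mechanical changes, such as string statuses becoming enums or duration strings becoming Duration messages:

```python
# Illustrative only: recursively rename camelCase dict keys (discovery
# client style) to snake_case (Cloud Client Library style).
import re

def camel_to_snake(name: str) -> str:
    # Insert an underscore before each interior capital, then lowercase.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def snakeify(value):
    if isinstance(value, dict):
        return {camel_to_snake(k): snakeify(v) for k, v in value.items()}
    if isinstance(value, list):
        return [snakeify(v) for v in value]
    return value

old_style = {"transferSpec": {"gcsDataSink": {"bucketName": "my-bucket"}}}
assert snakeify(old_style) == {
    "transfer_spec": {"gcs_data_sink": {"bucket_name": "my-bucket"}}
}
```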
Transferring to Nearline
API Client Library
import json

import googleapiclient.discovery

def main(description, project_id, start_date, start_time, source_bucket,
sink_bucket):
"""Create a daily transfer from Standard to Nearline Storage class."""
storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')
# Edit this template with desired parameters.
transfer_job = {
'description': description,
'status': 'ENABLED',
'projectId': project_id,
'schedule': {
'scheduleStartDate': {
'day': start_date.day,
'month': start_date.month,
'year': start_date.year
},
'startTimeOfDay': {
'hours': start_time.hour,
'minutes': start_time.minute,
'seconds': start_time.second
}
},
'transferSpec': {
'gcsDataSource': {
'bucketName': source_bucket
},
'gcsDataSink': {
'bucketName': sink_bucket
},
'objectConditions': {
'minTimeElapsedSinceLastModification': '2592000s' # 30 days
},
'transferOptions': {
'deleteObjectsFromSourceAfterTransfer': 'true'
}
}
}
result = storagetransfer.transferJobs().create(body=transfer_job).execute()
print('Returned transferJob: {}'.format(
json.dumps(result, indent=4)))
Cloud Client Library
Note the import of google.protobuf.duration_pb2.Duration.
from datetime import datetime
from google.cloud import storage_transfer
from google.protobuf.duration_pb2 import Duration
def create_daily_nearline_30_day_migration(
project_id: str, description: str, source_bucket: str,
sink_bucket: str, start_date: datetime):
"""Create a daily migration from a GCS bucket to a Nearline GCS bucket
for objects untouched for 30 days."""
client = storage_transfer.StorageTransferServiceClient()
# The ID of the Google Cloud Platform Project that owns the job
# project_id = 'my-project-id'
# A useful description for your transfer job
# description = 'My transfer job'
# Google Cloud Storage source bucket name
# source_bucket = 'my-gcs-source-bucket'
# Google Cloud Storage destination bucket name
# sink_bucket = 'my-gcs-destination-bucket'
transfer_job_request = storage_transfer.CreateTransferJobRequest({
'transfer_job': {
'project_id': project_id,
'description': description,
'status': storage_transfer.TransferJob.Status.ENABLED,
'schedule': {
'schedule_start_date': {
'day': start_date.day,
'month': start_date.month,
'year': start_date.year
}
},
'transfer_spec': {
'gcs_data_source': {
'bucket_name': source_bucket,
},
'gcs_data_sink': {
'bucket_name': sink_bucket,
},
'object_conditions': {
'min_time_elapsed_since_last_modification': Duration(
seconds=2592000 # 30 days
)
},
'transfer_options': {
'delete_objects_from_source_after_transfer': True
}
}
}
})
result = client.create_transfer_job(transfer_job_request)
print(f'Created transferJob: {result.name}')
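One concrete difference in the Nearline samples above: the discovery client takes minTimeElapsedSinceLastModification as a string such as '2592000s', while the Cloud Client Library takes a protobuf Duration in seconds. A small illustrative converter (`duration_string_to_seconds` is our name, not a library function) makes the equivalence explicit:

```python
# Illustrative only: convert an API-client-style duration string such as
# "2592000s" into the integer seconds a protobuf Duration expects.

def duration_string_to_seconds(value: str) -> int:
    if not value.endswith("s"):
        raise ValueError(f"expected a duration ending in 's', got {value!r}")
    return int(value[:-1])

# 30 days, matching the '2592000s' used in the samples above.
assert duration_string_to_seconds("2592000s") == 30 * 24 * 60 * 60
```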
Checking the latest transfer operation
API Client Library
"""Command-line sample that checks the latest operation of a transfer.
This sample is used on this page:
https://cloud.google.com/storage/transfer/create-transfer
For more information, see README.md.
"""
import argparse
import json
import googleapiclient.discovery
def check_latest_transfer_operation(project_id, job_name):
"""Check the latest transfer operation associated with a transfer job."""
storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")
transferJob = (
storagetransfer.transferJobs()
.get(projectId=project_id, jobName=job_name)
.execute()
)
latestOperationName = transferJob.get("latestOperationName")
if latestOperationName:
result = (
storagetransfer.transferOperations().get(name=latestOperationName).execute()
)
print(
"The latest operation for job"
+ job_name
+ " is: {}".format(json.dumps(result, indent=4, sort_keys=True))
)
else:
print(
"Transfer job "
+ job_name
+ " does not have an operation scheduled yet, "
+ "try again once the job starts running."
)
if __name__ == "__main__":
parser = argparse.ArgumentParser(
description=__doc__, formatter_class=argparse.RawDescriptionHelpFormatter
)
parser.add_argument("project_id", help="Your Google Cloud project ID.")
parser.add_argument("job_name", help="Your job name.")
args = parser.parse_args()
check_latest_transfer_operation(args.project_id, args.job_name)
Cloud Client Library
Note the use of storage_transfer.TransferOperation.deserialize.
from google.cloud import storage_transfer
def check_latest_transfer_operation(project_id: str, job_name: str):
"""Checks the latest transfer operation for a given transfer job."""
client = storage_transfer.StorageTransferServiceClient()
# The ID of the Google Cloud Platform Project that owns the job
# project_id = 'my-project-id'
# Storage Transfer Service job name
# job_name = 'transferJobs/1234567890'
transfer_job = client.get_transfer_job({
'project_id': project_id,
'job_name': job_name,
})
if transfer_job.latest_operation_name:
response = client.transport.operations_client.get_operation(
transfer_job.latest_operation_name)
operation = storage_transfer.TransferOperation.deserialize(
response.metadata.value)
print(f"Latest transfer operation for `{job_name}`: {operation}")
else:
print(f"Transfer job {job_name} has not run yet.")