Creating and managing data transfers programmatically

This page shows you how to use Storage Transfer Service directly through the REST API, and programmatically with Java and Python, in two common scenarios. To create a transfer job using the Google Cloud Console, see Creating and Managing Transfers with the Console.

When configuring or editing transfer jobs programmatically using the Storage Transfer API, the time must be in UTC. For details on specifying the schedule of a transfer job, see Schedule.
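The schedule fields in the samples on this page all take UTC values, so a local wall-clock time must be converted first. A minimal Python sketch (the offset and times here are hypothetical):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical local schedule: 18:30 on 2015-01-01 in a UTC-8 zone.
local_tz = timezone(timedelta(hours=-8))
local_start = datetime(2015, 1, 1, 18, 30, tzinfo=local_tz)

# Convert to UTC before filling in scheduleStartDate / startTimeOfDay.
utc_start = local_start.astimezone(timezone.utc)

schedule = {
    'scheduleStartDate': {
        'day': utc_start.day,
        'month': utc_start.month,
        'year': utc_start.year
    },
    'startTimeOfDay': {
        'hours': utc_start.hour,
        'minutes': utc_start.minute
    }
}
print(schedule)
```

Note that the date rolls over to January 2 in UTC, which is why converting only the time of day is not enough.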

Before you begin

Before you set up a transfer job in Storage Transfer Service, you must have the required access:

  • Storage Transfer Service access: You must be assigned one of the following roles:

    • roles/owner
    • roles/editor
    • roles/storagetransfer.admin
    • roles/storagetransfer.user
    • A custom role that includes, at minimum, the permissions of roles/storagetransfer.user.

      For details on adding and viewing project-level permissions, see Using IAM permissions with projects.

    For details on the IAM roles and permissions in Storage Transfer Service, see Access control using IAM roles and permissions.

  • Source and sink access: Storage Transfer Service uses a service account to perform transfers. To access the data source and the data sink, this service account must have source permissions and sink permissions.

Transfer from Amazon S3 to Cloud Storage

In this example, you'll learn how to move files from Amazon S3 to a Cloud Storage bucket. To understand the implications of moving data from Amazon S3 to Cloud Storage, review Configuring access and Pricing.

To create a transfer job, use the following:

When creating transfer jobs, do not include the s3:// prefix in the bucketName of the Amazon S3 bucket source.

REST

Request using transferJobs create:
POST https://storagetransfer.googleapis.com/v1/transferJobs
{
    "description": "YOUR DESCRIPTION",
    "status": "ENABLED",
    "projectId": "PROJECT_ID",
    "schedule": {
        "scheduleStartDate": {
            "day": 1,
            "month": 1,
            "year": 2015
        },
        "scheduleEndDate": {
            "day": 1,
            "month": 1,
            "year": 2015
        },
        "startTimeOfDay": {
            "hours": 1,
            "minutes": 1
        }
    },
    "transferSpec": {
        "awsS3DataSource": {
            "bucketName": "AWS_SOURCE_NAME",
            "awsAccessKey": {
                "accessKeyId": "AWS_ACCESS_KEY_ID",
                "secretAccessKey": "AWS_SECRET_ACCESS_KEY"
            }
        },
        "gcsDataSink": {
            "bucketName": "GCS_SINK_NAME"
        }
    }
}
Response:
200 OK
{
    "transferJob": [
        {
            "creationTime": "2015-01-01T01:01:00.000000000Z",
            "description": "YOUR DESCRIPTION",
            "name": "transferJobs/JOB_ID",
            "status": "ENABLED",
            "lastModificationTime": "2015-01-01T01:01:00.000000000Z",
            "projectId": "PROJECT_ID",
            "schedule": {
                "scheduleStartDate": {
                    "day": 1,
                    "month": 1,
                    "year": 2015
                },
                "scheduleEndDate": {
                    "day": 1,
                    "month": 1,
                    "year": 2015
                },
                "startTimeOfDay": {
                    "hours": 1,
                    "minutes": 1
                }
            },
            "transferSpec": {
                "awsS3DataSource": {
                    "bucketName": "AWS_SOURCE_NAME"
                },
                "gcsDataSink": {
                    "bucketName": "GCS_SINK_NAME"
                },
                "objectConditions": {},
                "transferOptions": {}
            }
        }
    ]
}

Java

For how to create a Storage Transfer Service client, see Creating a client for a Google API library.


package com.google.cloud.storage.storagetransfer.samples;

import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.model.AwsAccessKey;
import com.google.api.services.storagetransfer.v1.model.AwsS3Data;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import java.io.IOException;
import java.io.PrintStream;

/** Creates a one-off transfer job from Amazon S3 to Google Cloud Storage. */
public final class AwsRequester {
  /**
   * Creates and executes a request for a TransferJob from Amazon S3 to Cloud Storage.
   *
   * The {@code startDate} and {@code startTime} parameters should be set according to the UTC
   * Time Zone. See:
   * https://developers.google.com/resources/api-libraries/documentation/storagetransfer/v1/java/latest/com/google/api/services/storagetransfer/v1/model/Schedule.html#getStartTimeOfDay()
   *
   * @return the response TransferJob if the request is successful
   * @throws InstantiationException if instantiation fails when building the TransferJob
   * @throws IllegalAccessException if an illegal access occurs when building the TransferJob
   * @throws IOException if the client failed to complete the request
   */
  public static TransferJob createAwsTransferJob(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      String startDate,
      String startTime,
      String awsAccessKeyId,
      String awsSecretAccessKey)
      throws InstantiationException, IllegalAccessException, IOException {
    Date date = TransferJobUtils.createDate(startDate);
    TimeOfDay time = TransferJobUtils.createTimeOfDay(startTime);
    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                new TransferSpec()
                    .setAwsS3DataSource(
                        new AwsS3Data()
                            .setBucketName(awsSourceBucket)
                            .setAwsAccessKey(
                                new AwsAccessKey()
                                    .setAccessKeyId(awsAccessKeyId)
                                    .setSecretAccessKey(awsSecretAccessKey)))
                    .setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket)))
            .setSchedule(
                new Schedule()
                    .setScheduleStartDate(date)
                    .setScheduleEndDate(date)
                    .setStartTimeOfDay(time))
            .setStatus("ENABLED");

    Storagetransfer client = TransferClientCreator.createStorageTransferClient();
    return client.transferJobs().create(transferJob).execute();
  }

  public static void run(PrintStream out)
      throws InstantiationException, IllegalAccessException, IOException {
    String projectId = TransferJobUtils.getPropertyOrFail("projectId");
    String jobDescription = TransferJobUtils.getPropertyOrFail("jobDescription");
    String awsSourceBucket = TransferJobUtils.getPropertyOrFail("awsSourceBucket");
    String gcsSinkBucket = TransferJobUtils.getPropertyOrFail("gcsSinkBucket");
    String startDate = TransferJobUtils.getPropertyOrFail("startDate");
    String startTime = TransferJobUtils.getPropertyOrFail("startTime");
    String awsAccessKeyId = TransferJobUtils.getEnvOrFail("AWS_ACCESS_KEY_ID");
    String awsSecretAccessKey = TransferJobUtils.getEnvOrFail("AWS_SECRET_ACCESS_KEY");

    TransferJob responseT =
        createAwsTransferJob(
            projectId,
            jobDescription,
            awsSourceBucket,
            gcsSinkBucket,
            startDate,
            startTime,
            awsAccessKeyId,
            awsSecretAccessKey);
    out.println("Return transferJob: " + responseT.toPrettyString());
  }

  /** Output the contents of a successfully created TransferJob. */
  public static void main(String[] args) {
    try {
      run(System.out);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}

Python

For how to create a Storage Transfer Service client, see Creating a client for a Google API library.

"""Command-line sample that creates a one-time transfer from Amazon S3 to
Google Cloud Storage.

This sample is used on this page:

    https://cloud.google.com/storage/transfer/create-transfer

For more information, see README.md.
"""

import argparse
import datetime
import json

import googleapiclient.discovery

def main(description, project_id, start_date, start_time, source_bucket,
         access_key_id, secret_access_key, sink_bucket):
    """Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
    storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Edit this template with desired parameters.
    transfer_job = {
        'description': description,
        'status': 'ENABLED',
        'projectId': project_id,
        'schedule': {
            'scheduleStartDate': {
                'day': start_date.day,
                'month': start_date.month,
                'year': start_date.year
            },
            'scheduleEndDate': {
                'day': start_date.day,
                'month': start_date.month,
                'year': start_date.year
            },
            'startTimeOfDay': {
                'hours': start_time.hour,
                'minutes': start_time.minute,
                'seconds': start_time.second
            }
        },
        'transferSpec': {
            'awsS3DataSource': {
                'bucketName': source_bucket,
                'awsAccessKey': {
                    'accessKeyId': access_key_id,
                    'secretAccessKey': secret_access_key
                }
            },
            'gcsDataSink': {
                'bucketName': sink_bucket
            }
        }
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print('Returned transferJob: {}'.format(
        json.dumps(result, indent=4)))

if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('description', help='Transfer description.')
    parser.add_argument('project_id', help='Your Google Cloud project ID.')
    parser.add_argument('start_date', help='Date YYYY/MM/DD.')
    parser.add_argument('start_time', help='UTC Time (24hr) HH:MM:SS.')
    parser.add_argument('source_bucket', help='AWS source bucket name.')
    parser.add_argument('access_key_id', help='Your AWS access key id.')
    parser.add_argument(
        'secret_access_key',
        help='Your AWS secret access key.'
    )
    parser.add_argument('sink_bucket', help='GCS sink bucket name.')

    args = parser.parse_args()
    start_date = datetime.datetime.strptime(args.start_date, '%Y/%m/%d')
    start_time = datetime.datetime.strptime(args.start_time, '%H:%M:%S')

    main(
        args.description,
        args.project_id,
        start_date,
        start_time,
        args.source_bucket,
        args.access_key_id,
        args.secret_access_key,
        args.sink_bucket)

Transfer between Microsoft Azure Blob Storage and Cloud Storage

In this example, you'll learn how to move files from Microsoft Azure Storage to a Cloud Storage bucket. To understand the implications of moving data from Microsoft Azure Storage to Cloud Storage, review Configuring access and Pricing.

REST

Request using transferJobs create:
POST https://storagetransfer.googleapis.com/v1/transferJobs
{
    "description": "YOUR DESCRIPTION",
    "status": "ENABLED",
    "projectId": "PROJECT_ID",
    "schedule": {
        "scheduleStartDate": {
            "day": 14,
            "month": 2,
            "year": 2020
        },
        "scheduleEndDate": {
            "day": 14,
            "month": 2,
            "year": 2020
        },
        "startTimeOfDay": {
            "hours": 1,
            "minutes": 1
        }
    },
    "transferSpec": {
        "azureBlobStorageDataSource": {
            "storageAccount": "AZURE_SOURCE_NAME",
            "azureCredentials": {
                "sasToken": "AZURE_SAS_TOKEN"
            },
            "container": "AZURE_CONTAINER"
        },
        "gcsDataSink": {
            "bucketName": "GCS_SINK_NAME"
        }
    }
}
Response:
200 OK
{
    "transferJob": [
        {
            "creationTime": "2020-02-14T01:01:00.000000000Z",
            "description": "YOUR DESCRIPTION",
            "name": "transferJobs/JOB_ID",
            "status": "ENABLED",
            "lastModificationTime": "2020-02-14T01:01:00.000000000Z",
            "projectId": "PROJECT_ID",
            "schedule": {
                "scheduleStartDate": {
                    "day": 14,
                    "month": 2,
                    "year": 2020
                },
                "scheduleEndDate": {
                    "day": 14,
                    "month": 2,
                    "year": 2020
                },
                "startTimeOfDay": {
                    "hours": 1,
                    "minutes": 1
                }
            },
            "transferSpec": {
                "azureBlobStorageDataSource": {
                    "storageAccount": "AZURE_SOURCE_NAME",
                    "azureCredentials": {
                        "sasToken": "AZURE_SAS_TOKEN"
                    },
                    "container": "AZURE_CONTAINER"
                },
                "objectConditions": {},
                "transferOptions": {}
            }
        }
    ]
}
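The Azure scenario above only shows the REST request. As a rough sketch, the same request body can be assembled in Python along the lines of the S3 sample earlier on this page (the function name and placeholder values here are illustrative, not part of any official sample):

```python
import datetime
import json

def build_azure_transfer_job(description, project_id, start_date, start_time,
                             storage_account, container, sas_token, sink_bucket):
    """Assemble a transferJobs.create request body for an Azure Blob source."""
    return {
        'description': description,
        'status': 'ENABLED',
        'projectId': project_id,
        'schedule': {
            'scheduleStartDate': {
                'day': start_date.day,
                'month': start_date.month,
                'year': start_date.year
            },
            'startTimeOfDay': {
                'hours': start_time.hour,
                'minutes': start_time.minute
            }
        },
        'transferSpec': {
            'azureBlobStorageDataSource': {
                'storageAccount': storage_account,
                'azureCredentials': {'sasToken': sas_token},
                'container': container
            },
            'gcsDataSink': {'bucketName': sink_bucket}
        }
    }

# The body would then be submitted the same way as the S3 Python sample:
#   storagetransfer.transferJobs().create(body=transfer_job).execute()
transfer_job = build_azure_transfer_job(
    'YOUR DESCRIPTION', 'PROJECT_ID',
    datetime.date(2020, 2, 14), datetime.time(1, 1),
    'AZURE_SOURCE_NAME', 'AZURE_CONTAINER', 'AZURE_SAS_TOKEN', 'GCS_SINK_NAME')
print(json.dumps(transfer_job, indent=4))
```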

Transfer between Cloud Storage buckets

In this example, you'll learn how to move files from one Cloud Storage bucket to another. For example, you can replicate data into a bucket in a different location.

To create a transfer job, use the following:

REST

Request using transferJobs create:
POST https://storagetransfer.googleapis.com/v1/transferJobs
{
    "description": "YOUR DESCRIPTION",
    "status": "ENABLED",
    "projectId": "PROJECT_ID",
    "schedule": {
        "scheduleStartDate": {
            "day": 1,
            "month": 1,
            "year": 2015
        },
        "startTimeOfDay": {
            "hours": 1,
            "minutes": 1
        }
    },
    "transferSpec": {
        "gcsDataSource": {
            "bucketName": "GCS_SOURCE_NAME"
        },
        "gcsDataSink": {
            "bucketName": "GCS_NEARLINE_SINK_NAME"
        },
        "objectConditions": {
            "minTimeElapsedSinceLastModification": "2592000s"
        },
        "transferOptions": {
            "deleteObjectsFromSourceAfterTransfer": true
        }
    }
}
Response:
200 OK
{
    "transferJob": [
        {
            "creationTime": "2015-01-01T01:01:00.000000000Z",
            "description": "YOUR DESCRIPTION",
            "name": "transferJobs/JOB_ID",
            "status": "ENABLED",
            "lastModificationTime": "2015-01-01T01:01:00.000000000Z",
            "projectId": "PROJECT_ID",
            "schedule": {
                "scheduleStartDate": {
                    "day": 1,
                    "month": 1,
                    "year": 2015
                },
                "startTimeOfDay": {
                    "hours": 1,
                    "minutes": 1
                }
            },
            "transferSpec": {
                "gcsDataSource": {
                    "bucketName": "GCS_SOURCE_NAME"
                },
                "gcsDataSink": {
                    "bucketName": "GCS_NEARLINE_SINK_NAME"
                },
                "objectConditions": {
                    "minTimeElapsedSinceLastModification": "2592000.000s"
                },
                "transferOptions": {
                    "deleteObjectsFromSourceAfterTransfer": true
                }
            }
        }
    ]
}

Java

For how to create a Storage Transfer Service client, see Creating a client for a Google API library.


package com.google.cloud.storage.storagetransfer.samples;

import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.ObjectConditions;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferOptions;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import java.io.IOException;
import java.io.PrintStream;

/**
 * Creates a daily transfer from a standard Cloud Storage bucket to a Cloud Storage Nearline bucket
 * for files untouched for 30 days.
 */
public final class NearlineRequester {

  /**
   * Creates and executes a request for a TransferJob to Cloud Storage Nearline.
   *
   * The {@code startDate} and {@code startTime} parameters should be set according to the UTC
   * Time Zone. See:
   * https://developers.google.com/resources/api-libraries/documentation/storagetransfer/v1/java/latest/com/google/api/services/storagetransfer/v1/model/Schedule.html#getStartTimeOfDay()
   *
   * @return the response TransferJob if the request is successful
   * @throws InstantiationException if instantiation fails when building the TransferJob
   * @throws IllegalAccessException if an illegal access occurs when building the TransferJob
   * @throws IOException if the client failed to complete the request
   */
  public static TransferJob createNearlineTransferJob(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      String startDate,
      String startTime)
      throws InstantiationException, IllegalAccessException, IOException {
    Date date = TransferJobUtils.createDate(startDate);
    TimeOfDay time = TransferJobUtils.createTimeOfDay(startTime);
    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                new TransferSpec()
                    .setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        new ObjectConditions()
                            .setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
                    .setTransferOptions(
                        new TransferOptions().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus("ENABLED");

    Storagetransfer client = TransferClientCreator.createStorageTransferClient();
    return client.transferJobs().create(transferJob).execute();
  }

  public static void run(PrintStream out)
      throws InstantiationException, IllegalAccessException, IOException {
    String projectId = TransferJobUtils.getPropertyOrFail("projectId");
    String jobDescription = TransferJobUtils.getPropertyOrFail("jobDescription");
    String gcsSourceBucket = TransferJobUtils.getPropertyOrFail("gcsSourceBucket");
    String gcsNearlineSinkBucket = TransferJobUtils.getPropertyOrFail("gcsNearlineSinkBucket");
    String startDate = TransferJobUtils.getPropertyOrFail("startDate");
    String startTime = TransferJobUtils.getPropertyOrFail("startTime");

    TransferJob responseT =
        createNearlineTransferJob(
            projectId,
            jobDescription,
            gcsSourceBucket,
            gcsNearlineSinkBucket,
            startDate,
            startTime);
    out.println("Return transferJob: " + responseT.toPrettyString());
  }

  /**
   * Output the contents of a successfully created TransferJob.
   *
   * @param args arguments from the command line
   */
  public static void main(String[] args) {
    try {
      run(System.out);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}

Python

For how to create a Storage Transfer Service client, see Creating a client for a Google API library.


"""Command-line sample that creates a daily transfer from a standard
GCS bucket to a Nearline GCS bucket for objects untouched for 30 days.

This sample is used on this page:

    https://cloud.google.com/storage/transfer/create-transfer

For more information, see README.md.
"""

import argparse
import datetime
import json

import googleapiclient.discovery

def main(description, project_id, start_date, start_time, source_bucket,
         sink_bucket):
    """Create a daily transfer from Standard to Nearline Storage class."""
    storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')

    # Edit this template with desired parameters.
    transfer_job = {
        'description': description,
        'status': 'ENABLED',
        'projectId': project_id,
        'schedule': {
            'scheduleStartDate': {
                'day': start_date.day,
                'month': start_date.month,
                'year': start_date.year
            },
            'startTimeOfDay': {
                'hours': start_time.hour,
                'minutes': start_time.minute,
                'seconds': start_time.second
            }
        },
        'transferSpec': {
            'gcsDataSource': {
                'bucketName': source_bucket
            },
            'gcsDataSink': {
                'bucketName': sink_bucket
            },
            'objectConditions': {
                'minTimeElapsedSinceLastModification': '2592000s'  # 30 days
            },
            'transferOptions': {
                'deleteObjectsFromSourceAfterTransfer': True
            }
        }
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print('Returned transferJob: {}'.format(
        json.dumps(result, indent=4)))

if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('description', help='Transfer description.')
    parser.add_argument('project_id', help='Your Google Cloud project ID.')
    parser.add_argument('start_date', help='Date YYYY/MM/DD.')
    parser.add_argument('start_time', help='UTC Time (24hr) HH:MM:SS.')
    parser.add_argument('source_bucket', help='Standard GCS bucket name.')
    parser.add_argument('sink_bucket', help='Nearline GCS bucket name.')

    args = parser.parse_args()
    start_date = datetime.datetime.strptime(args.start_date, '%Y/%m/%d')
    start_time = datetime.datetime.strptime(args.start_time, '%H:%M:%S')

    main(
        args.description,
        args.project_id,
        start_date,
        start_time,
        args.source_bucket,
        args.sink_bucket)

Checking transfer job status

After running the examples above, you may want to check the status of your transfer job. The following example code returns a transfer job's status, given its job ID and project ID.

REST

Request using transferOperations list:
GET https://storagetransfer.googleapis.com/v1/transferOperations?filter=%7B"project_id":"PROJECT_ID","job_names":%5B"transferJobs/JOB_ID"%5D%7D
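The filter query parameter in this request is a URL-encoded JSON object. One way to build an equivalent URL in Python (quote() escapes a few more characters than the example here, which is harmless):

```python
import json
import urllib.parse

# Hypothetical IDs; the filter query parameter is a URL-encoded JSON object.
filter_json = json.dumps(
    {'project_id': 'PROJECT_ID', 'job_names': ['transferJobs/JOB_ID']},
    separators=(',', ':'))
url = ('https://storagetransfer.googleapis.com/v1/transferOperations'
       '?filter=' + urllib.parse.quote(filter_json))
print(url)
```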
Response:

Cloud Storage

If the source is a Cloud Storage bucket, the response looks like this:

200 OK
{
    "operations": [
        {
            "done": true,
            "metadata": {
                "@type": "type.googleapis.com/google.storagetransfer.v1.TransferOperation",
                "counters": {},
                "endTime": "2015-01-01T01:01:00.000Z",
                "name": "transferOperations/000000000000000000",
                "projectId": "PROJECT_ID",
                "startTime": "2015-01-01T01:01:00.000Z",
                "transferSpec": {
                    "gcsDataSink": {
                        "bucketName": "GCS_NEARLINE_SINK_NAME"
                    },
                    "gcsDataSource": {
                        "bucketName": "GCS_SOURCE_NAME"
                    },
                    "objectConditions": {
                        "minTimeElapsedSinceLastModification": "2592000.000s"
                    },
                    "transferOptions": {
                        "deleteObjectsFromSourceAfterTransfer": true
                    }
                },
                "transferStatus": "SUCCESS"
            },
            "name": "transferOperations/000000000000000000",
            "response": {
                "@type": "type.googleapis.com/google.protobuf.Empty"
            }
        }
    ]
}

Amazon S3

If the source is an Amazon S3 bucket, the response looks like this:

200 OK
{
    "operations": [
        {
            "done": true,
            "metadata": {
                "@type": "type.googleapis.com/google.storagetransfer.v1.TransferOperation",
                "counters": {},
                "endTime": "2015-01-01T01:01:00.000Z",
                "name": "transferOperations/000000000000000000",
                "projectId": "PROJECT_ID",
                "startTime": "2015-01-01T01:01:00.000Z",
                "transferSpec": {
                    "awsS3DataSource": {
                        "bucketName": "AWS_SOURCE_NAME"
                    },
                    "gcsDataSink": {
                        "bucketName": "GCS_SINK_NAME"
                    },
                    "objectConditions": {},
                    "transferOptions": {}
                },
                "transferStatus": "SUCCESS"
            },
            "name": "transferOperations/000000000000000000",
            "response": {
                "@type": "type.googleapis.com/google.protobuf.Empty"
            }
        }
    ]
}

Microsoft Azure Blob Storage

If the source is a Microsoft Azure Storage bucket, the response looks like this:

200 OK
{
    "operations": [
        {
            "done": true,
            "metadata": {
                "@type": "type.googleapis.com/google.storagetransfer.v1.TransferOperation",
                "counters": {},
                "endTime": "2020-02-14T01:01:00.000Z",
                "name": "transferOperations/000000000000000000",
                "projectId": "PROJECT_ID",
                "startTime": "2020-02-14T01:01:00.000Z",
                "transferSpec": {
                    "azureBlobStorageDataSource": {
                        "storageAccount": "AZURE_SOURCE_NAME"
                    },
                    "gcsDataSink": {
                        "bucketName": "GCS_SINK_NAME"
                    },
                    "objectConditions": {},
                    "transferOptions": {}
                },
                "transferStatus": "SUCCESS"
            },
            "name": "transferOperations/000000000000000000",
            "response": {
                "@type": "type.googleapis.com/google.protobuf.Empty"
            }
        }
    ]
}

Java

For how to create a Storage Transfer Service client, see Setting up your application.


package com.google.cloud.storage.storagetransfer.samples;

import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.model.ListOperationsResponse;
import java.io.IOException;
import java.util.logging.Logger;

/**
 * Queries for TransferOperations associated with a specific TransferJob. A TransferJob is done when
 * all of its associated TransferOperations have completed.
 */
public final class RequestChecker {

  private static final String PROJECT_ID = "YOUR_PROJECT_ID";
  private static final String JOB_NAME = "YOUR_JOB_NAME";

  private static final Logger LOG = Logger.getLogger(RequestChecker.class.getName());

  /**
   * Creates and executes a query for all associated TransferOperations.
   *
   * @param client a Storagetransfer client, for interacting with the Storage Transfer API
   * @param projectId the project to query within
   * @param jobName the job Name of the relevant TransferJob
   * @return an object containing information on associated TransferOperations
   * @throws IOException if the client failed to complete the request
   */
  public static ListOperationsResponse checkTransfer(
      Storagetransfer client, String projectId, String jobName) throws IOException {
    return client
        .transferOperations()
        .list("transferOperations")
        .setFilter("{\"project_id\": \"" + projectId + "\", \"job_names\": [\"" + jobName + "\"] }")
        .execute();
  }

  /**
   * Output the returned list of TransferOperations.
   *
   * @param args arguments from the command line
   */
  public static void main(String[] args) {
    try {
      ListOperationsResponse resp =
          checkTransfer(TransferClientCreator.createStorageTransferClient(), PROJECT_ID, JOB_NAME);
      LOG.info("Result of transferOperations/list: " + resp.toPrettyString());
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}

Python

For how to create a Storage Transfer Service client, see Setting up your application.


"""Command-line sample that checks the status of an in-process transfer.

This sample is used on this page:

    https://cloud.google.com/storage/transfer/create-transfer

For more information, see README.md.
"""

import argparse
import json

import googleapiclient.discovery

def main(project_id, job_name):
    """Review the transfer operations associated with a transfer job."""
    storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')

    filterString = (
        '{{"project_id": "{project_id}", '
        '"job_names": ["{job_name}"]}}'
    ).format(project_id=project_id, job_name=job_name)

    result = storagetransfer.transferOperations().list(
        name="transferOperations",
        filter=filterString).execute()
    print('Result of transferOperations/list: {}'.format(
        json.dumps(result, indent=4, sort_keys=True)))

if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('project_id', help='Your Google Cloud project ID.')
    parser.add_argument('job_name', help='Your job name.')

    args = parser.parse_args()

    main(args.project_id, args.job_name)

Canceling transfers

To cancel a single transfer operation, use the transferOperations cancel method. To delete an entire transfer job, including future transfer operations scheduled for it, set the transfer job's status to DELETED using the transferJobs patch method. Updating a job's transfer status does not affect transfer operations that are already running; to cancel an in-progress transfer operation, use the transferOperations cancel method.
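As a sketch of the patch described above: the request body pairs the new status with a field mask so that only the status field is updated. The field names follow the v1 UpdateTransferJobRequest shape and should be verified against the API reference:

```python
import json

def build_delete_patch(project_id):
    """Build a transferJobs.patch request body that marks a job DELETED.

    The field mask restricts the update to the status field only.
    """
    return {
        'projectId': project_id,
        'transferJob': {'status': 'DELETED'},
        'updateTransferJobFieldMask': 'status'
    }

# With a googleapiclient client (as in the samples above), the call would be:
#   storagetransfer.transferJobs().patch(
#       jobName='transferJobs/JOB_ID',
#       body=build_delete_patch('PROJECT_ID')).execute()
print(json.dumps(build_delete_patch('PROJECT_ID'), indent=4))
```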

What's next

Learn how to work with Cloud Storage.