# Copy a dataset
Create a transfer configuration to copy all tables in a dataset across projects, locations, or both.
Explore further
---------------
For detailed documentation that includes this code sample, see the following:

- [Manage datasets](/bigquery/docs/managing-datasets)
Code sample
-----------
[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["이해하기 어려움","hardToUnderstand","thumb-down"],["잘못된 정보 또는 샘플 코드","incorrectInformationOrSampleCode","thumb-down"],["필요한 정보/샘플이 없음","missingTheInformationSamplesINeed","thumb-down"],["번역 문제","translationIssue","thumb-down"],["기타","otherDown","thumb-down"]],[],[[["\u003cp\u003eA transfer configuration can be created to copy all tables within a dataset to another project, location, or both, using the BigQuery Data Transfer Service.\u003c/p\u003e\n"],["\u003cp\u003eThe Java and Python code samples provided demonstrate how to programmatically copy a dataset to a destination project and dataset, while specifying a scheduled run time.\u003c/p\u003e\n"],["\u003cp\u003eThe dataset transfer configuration requires specifying source and destination project and dataset IDs, as well as a display name and the \u003ccode\u003ecross_region_copy\u003c/code\u003e data source.\u003c/p\u003e\n"],["\u003cp\u003eAuthentication to BigQuery is required using Application Default Credentials, as outlined in the documentation for setting up client libraries.\u003c/p\u003e\n"],["\u003cp\u003eA created transfer configuration will run according to the defined schedule, such as every 24 hours, as specified during its setup.\u003c/p\u003e\n"]]],[],null,["# Copy a dataset\n\nCreate a transfer configuration to copy all tables in a dataset across projects, locations, or both.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [Manage datasets](/bigquery/docs/managing-datasets)\n\nCode sample\n-----------\n\n### Java\n\n\nBefore trying this sample, follow the Java setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Java API\nreference documentation](/java/docs/reference/google-cloud-bigquery/latest/overview).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import com.google.api.gax.rpc.https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html;\n import com.google.protobuf.https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Struct.html;\n import com.google.protobuf.https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Value.html;\n import java.io.IOException;\n import java.util.HashMap;\n import java.util.Map;\n\n // Sample to copy dataset from another gcp project\n public 
class CopyDataset {\n\n public static void main(String[] args) throws IOException {\n // TODO(developer): Replace these variables before running the sample.\n final String destinationProjectId = \"MY_DESTINATION_PROJECT_ID\";\n final String destinationDatasetId = \"MY_DESTINATION_DATASET_ID\";\n final String sourceProjectId = \"MY_SOURCE_PROJECT_ID\";\n final String sourceDatasetId = \"MY_SOURCE_DATASET_ID\";\n Map\u003cString, Value\u003e params = new HashMap\u003c\u003e();\n params.put(\"source_project_id\", https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Value.html.newBuilder().setStringValue(sourceProjectId).build());\n params.put(\"source_dataset_id\", https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Value.html.newBuilder().setStringValue(sourceDatasetId).build());\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html transferConfig =\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html.newBuilder()\n .setDestinationDatasetId(destinationDatasetId)\n .setDisplayName(\"Your Dataset Copy Name\")\n .setDataSourceId(\"cross_region_copy\")\n .setParams(https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Struct.html.newBuilder().https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Struct.Builder.html#com_google_protobuf_Struct_Builder_putAllFields_java_util_Map_java_lang_String_com_google_protobuf_Value__(params).build())\n .setSchedule(\"every 24 hours\")\n .build();\n copyDataset(destinationProjectId, transferConfig);\n }\n\n public static void copyDataset(String projectId, https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html transferConfig)\n throws IOException {\n try (https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html dataTransferServiceClient = https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html.create()) {\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html parent = https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html.of(projectId);\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest.html request =\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest.html.newBuilder()\n .setParent(parent.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html#com_google_cloud_bigquery_datatransfer_v1_ProjectName_toString__())\n .setTransferConfig(transferConfig)\n .build();\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html config = dataTransferServiceClient.createTransferConfig(request);\n 
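The configuration created above runs on its schedule (here, every 24 hours). If you want to kick off a copy run right away rather than waiting for the schedule, the Data Transfer Service also supports manually requested runs. The following is a minimal sketch, not part of the original sample: the class name `StartDatasetCopyRun` and the `MY_TRANSFER_CONFIG_NAME` placeholder are illustrative assumptions; in practice you would pass the `name` returned by `createTransferConfig`.

```java
import com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient;
import com.google.cloud.bigquery.datatransfer.v1.StartManualTransferRunsRequest;
import com.google.cloud.bigquery.datatransfer.v1.StartManualTransferRunsResponse;
import com.google.protobuf.Timestamp;
import java.io.IOException;
import java.time.Instant;

// Sketch: start a manual run of an existing dataset-copy transfer configuration.
public class StartDatasetCopyRun {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace with the name returned when the transfer config was created,
    // e.g. "projects/{project_number}/locations/{location}/transferConfigs/{config_id}".
    String transferConfigName = "MY_TRANSFER_CONFIG_NAME";
    try (DataTransferServiceClient client = DataTransferServiceClient.create()) {
      // Request a run for the current time instead of waiting for the next scheduled run.
      Instant now = Instant.now();
      Timestamp runTime =
          Timestamp.newBuilder().setSeconds(now.getEpochSecond()).setNanos(now.getNano()).build();
      StartManualTransferRunsRequest request =
          StartManualTransferRunsRequest.newBuilder()
              .setParent(transferConfigName)
              .setRequestedRunTime(runTime)
              .build();
      StartManualTransferRunsResponse response = client.startManualTransferRuns(request);
      response.getRunsList().forEach(run -> System.out.println("Started run: " + run.getName()));
    }
  }
}
```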
System.out.println(\"Copy dataset created successfully :\" + config.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html#com_google_cloud_bigquery_datatransfer_v1_TransferConfig_getName__());\n } catch (https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html ex) {\n System.out.print(\"Copy dataset was not created.\" + ex.toString());\n }\n }\n }\n\n### Python\n\n\nBefore trying this sample, follow the Python setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Python API\nreference documentation](/python/docs/reference/bigquery/latest).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n from google.cloud import bigquery_datatransfer\n\n transfer_client = bigquery_datatransfer.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html()\n\n destination_project_id = \"my-destination-project\"\n destination_dataset_id = \"my_destination_dataset\"\n source_project_id = \"my-source-project\"\n source_dataset_id = \"my_source_dataset\"\n transfer_config = bigquery_datatransfer.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.TransferConfig.html(\n destination_dataset_id=destination_dataset_id,\n display_name=\"Your Dataset Copy Name\",\n data_source_id=\"cross_region_copy\",\n params={\n \"source_project_id\": source_project_id,\n \"source_dataset_id\": source_dataset_id,\n },\n schedule=\"every 24 hours\",\n )\n transfer_config = transfer_client.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html#google_cloud_bigquery_datatransfer_v1_services_data_transfer_service_DataTransferServiceClient_create_transfer_config(\n parent=transfer_client.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html#google_cloud_bigquery_datatransfer_v1_services_data_transfer_service_DataTransferServiceClient_common_project_path(destination_project_id),\n transfer_config=transfer_config,\n )\n print(f\"Created transfer config: {transfer_config.name}\")\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=bigquerydatatransfer)."]]