Menyalin set data
Tetap teratur dengan koleksi
Simpan dan kategorikan konten berdasarkan preferensi Anda.
Membuat konfigurasi transfer untuk menyalin semua tabel dalam set data di seluruh project, lokasi, atau keduanya.
Jelajahi lebih lanjut
Untuk melihat dokumentasi mendetail yang menyertakan contoh kode ini, lihat referensi berikut:
Contoh kode
Kecuali dinyatakan lain, konten di halaman ini dilisensikan berdasarkan Lisensi Creative Commons Attribution 4.0, sedangkan contoh kode dilisensikan berdasarkan Lisensi Apache 2.0. Untuk mengetahui informasi selengkapnya, lihat Kebijakan Situs Google Developers. Java adalah merek dagang terdaftar dari Oracle dan/atau afiliasinya.
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],[],[[["\u003cp\u003eA transfer configuration can be created to copy all tables within a dataset to another project, location, or both, using the BigQuery Data Transfer Service.\u003c/p\u003e\n"],["\u003cp\u003eThe Java and Python code samples provided demonstrate how to programmatically copy a dataset to a destination project and dataset, while specifying a scheduled run time.\u003c/p\u003e\n"],["\u003cp\u003eThe dataset transfer configuration requires specifying source and destination project and dataset IDs, as well as a display name and the \u003ccode\u003ecross_region_copy\u003c/code\u003e data source.\u003c/p\u003e\n"],["\u003cp\u003eAuthentication to BigQuery is required using Application Default Credentials, as outlined in the documentation for setting up client libraries.\u003c/p\u003e\n"],["\u003cp\u003eA created transfer configuration will run according to the defined schedule, such as every 24 hours, as specified during its setup.\u003c/p\u003e\n"]]],[],null,["# Copy a dataset\n\nCreate a transfer configuration to copy all tables in a dataset across projects, locations, or both.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [Manage datasets](/bigquery/docs/managing-datasets)\n\nCode sample\n-----------\n\n### Java\n\n\nBefore trying this sample, follow the Java setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Java API\nreference documentation](/java/docs/reference/google-cloud-bigquery/latest/overview).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import com.google.api.gax.rpc.https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html;\n import com.google.protobuf.https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Struct.html;\n import com.google.protobuf.https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Value.html;\n import java.io.IOException;\n import java.util.HashMap;\n import java.util.Map;\n\n // Sample to copy dataset from another gcp project\n public class CopyDataset {\n\n public static void main(String[] args) throws IOException {\n // TODO(developer): Replace these variables before running the sample.\n final String destinationProjectId = \"MY_DESTINATION_PROJECT_ID\";\n final String destinationDatasetId = \"MY_DESTINATION_DATASET_ID\";\n final String sourceProjectId = \"MY_SOURCE_PROJECT_ID\";\n final String sourceDatasetId = \"MY_SOURCE_DATASET_ID\";\n Map\u003cString, Value\u003e params = new HashMap\u003c\u003e();\n params.put(\"source_project_id\", https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Value.html.newBuilder().setStringValue(sourceProjectId).build());\n params.put(\"source_dataset_id\", https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Value.html.newBuilder().setStringValue(sourceDatasetId).build());\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html transferConfig =\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html.newBuilder()\n .setDestinationDatasetId(destinationDatasetId)\n .setDisplayName(\"Your Dataset Copy Name\")\n .setDataSourceId(\"cross_region_copy\")\n .setParams(https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Struct.html.newBuilder().https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Struct.Builder.html#com_google_protobuf_Struct_Builder_putAllFields_java_util_Map_java_lang_String_com_google_protobuf_Value__(params).build())\n .setSchedule(\"every 24 hours\")\n .build();\n copyDataset(destinationProjectId, transferConfig);\n }\n\n public static void copyDataset(String projectId, https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html transferConfig)\n throws IOException {\n try (https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html dataTransferServiceClient = https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html.create()) {\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html parent = https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html.of(projectId);\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest.html request =\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest.html.newBuilder()\n .setParent(parent.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html#com_google_cloud_bigquery_datatransfer_v1_ProjectName_toString__())\n .setTransferConfig(transferConfig)\n .build();\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html config = dataTransferServiceClient.createTransferConfig(request);\n System.out.println(\"Copy dataset created successfully :\" + config.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html#com_google_cloud_bigquery_datatransfer_v1_TransferConfig_getName__());\n } catch (https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html ex) {\n System.out.print(\"Copy dataset was not created.\" + ex.toString());\n }\n }\n }\n\n### Python\n\n\nBefore trying this sample, follow the Python setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Python API\nreference documentation](/python/docs/reference/bigquery/latest).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n from google.cloud import bigquery_datatransfer\n\n transfer_client = bigquery_datatransfer.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html()\n\n destination_project_id = \"my-destination-project\"\n destination_dataset_id = \"my_destination_dataset\"\n source_project_id = \"my-source-project\"\n source_dataset_id = \"my_source_dataset\"\n transfer_config = bigquery_datatransfer.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.TransferConfig.html(\n destination_dataset_id=destination_dataset_id,\n display_name=\"Your Dataset Copy Name\",\n data_source_id=\"cross_region_copy\",\n params={\n \"source_project_id\": source_project_id,\n \"source_dataset_id\": source_dataset_id,\n },\n schedule=\"every 24 hours\",\n )\n transfer_config = transfer_client.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html#google_cloud_bigquery_datatransfer_v1_services_data_transfer_service_DataTransferServiceClient_create_transfer_config(\n parent=transfer_client.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html#google_cloud_bigquery_datatransfer_v1_services_data_transfer_service_DataTransferServiceClient_common_project_path(destination_project_id),\n transfer_config=transfer_config,\n )\n print(f\"Created transfer config: {transfer_config.name}\")\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=bigquerydatatransfer)."]]