# Load data from Amazon Redshift
Schedule recurring load jobs from Amazon Redshift into BigQuery.
Explore further
---------------
For detailed documentation that includes this code sample, see the following:

- [Migrate schema and data from Amazon Redshift](/bigquery/docs/migration/redshift)
Code sample
-----------
[[["わかりやすい","easyToUnderstand","thumb-up"],["問題の解決に役立った","solvedMyProblem","thumb-up"],["その他","otherUp","thumb-up"]],[["わかりにくい","hardToUnderstand","thumb-down"],["情報またはサンプルコードが不正確","incorrectInformationOrSampleCode","thumb-down"],["必要な情報 / サンプルがない","missingTheInformationSamplesINeed","thumb-down"],["翻訳に関する問題","translationIssue","thumb-down"],["その他","otherDown","thumb-down"]],[],[[["\u003cp\u003eThis code sample demonstrates how to schedule recurring data load jobs from Amazon Redshift into BigQuery.\u003c/p\u003e\n"],["\u003cp\u003eThe process involves setting up a transfer configuration with details like JDBC URL, database credentials, AWS access keys, S3 bucket, and Redshift schema.\u003c/p\u003e\n"],["\u003cp\u003eThe code uses the BigQuery Data Transfer Service client library for Java to create and manage the transfer configuration.\u003c/p\u003e\n"],["\u003cp\u003eYou must authenticate to BigQuery and set up Application Default Credentials before executing this transfer job, as well as fulfill the setup instructions detailed in the BigQuery quickstart using client libraries guide.\u003c/p\u003e\n"],["\u003cp\u003eThe defined transfer configuration can be set to run on a recurring basis, such as every 24 hours.\u003c/p\u003e\n"]]],[],null,["# Load data from Amazon Redshift\n\nSchedule recurring load jobs from Amazon Redshift into BigQuery.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [Migrate schema and data from Amazon Redshift](/bigquery/docs/migration/redshift)\n\nCode sample\n-----------\n\n### Java\n\n\nBefore trying this sample, follow the Java setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Java API\nreference documentation](/java/docs/reference/google-cloud-bigquery/latest/overview).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import com.google.api.gax.rpc.https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ProjectName.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.TransferConfig.html;\n import com.google.protobuf.https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Struct.html;\n import com.google.protobuf.https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Value.html;\n import java.io.IOException;\n import java.util.HashMap;\n import java.util.Map;\n\n // Sample to create redshift transfer config\n public class CreateRedshiftTransfer {\n\n public static void main(String[] args) throws 
```java
import com.google.api.gax.rpc.ApiException;
import com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest;
import com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient;
import com.google.cloud.bigquery.datatransfer.v1.ProjectName;
import com.google.cloud.bigquery.datatransfer.v1.TransferConfig;
import com.google.protobuf.Struct;
import com.google.protobuf.Value;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Sample to create redshift transfer config
public class CreateRedshiftTransfer {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    final String projectId = "MY_PROJECT_ID";
    String datasetId = "MY_DATASET_ID";
    String datasetRegion = "US";
    String jdbcUrl = "MY_JDBC_URL_CONNECTION_REDSHIFT";
    String dbUserName = "MY_USERNAME";
    String dbPassword = "MY_PASSWORD";
    String accessKeyId = "MY_AWS_ACCESS_KEY_ID";
    String secretAccessId = "MY_AWS_SECRET_ACCESS_ID";
    String s3Bucket = "MY_S3_BUCKET_URI";
    String redShiftSchema = "MY_REDSHIFT_SCHEMA";
    String tableNamePatterns = "*";
    String vpcAndReserveIpRange = "MY_VPC_AND_IP_RANGE";
    Map<String, Value> params = new HashMap<>();
    params.put("jdbc_url", Value.newBuilder().setStringValue(jdbcUrl).build());
    params.put("database_username", Value.newBuilder().setStringValue(dbUserName).build());
    params.put("database_password", Value.newBuilder().setStringValue(dbPassword).build());
    params.put("access_key_id", Value.newBuilder().setStringValue(accessKeyId).build());
    params.put("secret_access_key", Value.newBuilder().setStringValue(secretAccessId).build());
    params.put("s3_bucket", Value.newBuilder().setStringValue(s3Bucket).build());
    params.put("redshift_schema", Value.newBuilder().setStringValue(redShiftSchema).build());
    params.put("table_name_patterns", Value.newBuilder().setStringValue(tableNamePatterns).build());
    params.put(
        "migration_infra_cidr", Value.newBuilder().setStringValue(vpcAndReserveIpRange).build());
    TransferConfig transferConfig =
        TransferConfig.newBuilder()
            .setDestinationDatasetId(datasetId)
            .setDatasetRegion(datasetRegion)
            .setDisplayName("Your Redshift Config Name")
            .setDataSourceId("redshift")
            .setParams(Struct.newBuilder().putAllFields(params).build())
            .setSchedule("every 24 hours")
            .build();
    createRedshiftTransfer(projectId, transferConfig);
  }

  public static void createRedshiftTransfer(String projectId, TransferConfig transferConfig)
      throws IOException {
    try (DataTransferServiceClient client = DataTransferServiceClient.create()) {
      ProjectName parent = ProjectName.of(projectId);
      CreateTransferConfigRequest request =
          CreateTransferConfigRequest.newBuilder()
              .setParent(parent.toString())
              .setTransferConfig(transferConfig)
              .build();
      TransferConfig config = client.createTransferConfig(request);
      System.out.println("Cloud redshift transfer created successfully :" + config.getName());
    } catch (ApiException ex) {
      System.out.print("Cloud redshift transfer was not created." + ex.toString());
    }
  }
}
```
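The transfer created above only runs on its schedule (`every 24 hours`). If you want to exercise the configuration immediately rather than wait for the next scheduled run, the Data Transfer Service also accepts manually requested runs. The following is a minimal sketch under that assumption; the class name and the choice of the current time as the requested run time are illustrative, not part of the original sample.

```java
import com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient;
import com.google.cloud.bigquery.datatransfer.v1.StartManualTransferRunsRequest;
import com.google.cloud.bigquery.datatransfer.v1.StartManualTransferRunsResponse;
import com.google.cloud.bigquery.datatransfer.v1.TransferConfig;
import com.google.protobuf.Timestamp;
import java.io.IOException;
import java.time.Instant;

public class StartManualRedshiftRun {

  public static void startManualRun(TransferConfig config) throws IOException {
    try (DataTransferServiceClient client = DataTransferServiceClient.create()) {
      // Request a run for "now"; the run time chosen here is an illustrative assumption.
      Instant now = Instant.now();
      Timestamp runTime =
          Timestamp.newBuilder().setSeconds(now.getEpochSecond()).setNanos(now.getNano()).build();
      StartManualTransferRunsRequest request =
          StartManualTransferRunsRequest.newBuilder()
              // Target the transfer config returned by createTransferConfig.
              .setParent(config.getName())
              .setRequestedRunTime(runTime)
              .build();
      StartManualTransferRunsResponse response = client.startManualTransferRuns(request);
      System.out.println("Requested " + response.getRunsCount() + " manual transfer run(s).");
    }
  }
}
```

Passing the resource name returned by `createTransferConfig` (via `config.getName()`) targets the specific transfer configuration that was just created, so the manual run uses the same connection and schema parameters as the scheduled job.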
What's next
-----------

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=bigquerydatatransfer).