Schedule recurring load jobs from Amazon S3 into BigQuery.
Explore further
For detailed documentation that includes this code sample, see the following:
Code sample
Java
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
import com.google.api.gax.rpc.ApiException;
import com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest;
import com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient;
import com.google.cloud.bigquery.datatransfer.v1.ProjectName;
import com.google.cloud.bigquery.datatransfer.v1.TransferConfig;
import com.google.protobuf.Struct;
import com.google.protobuf.Value;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Sample to create an Amazon S3 transfer config.
public class CreateAmazonS3Transfer {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    final String projectId = "MY_PROJECT_ID";
    String datasetId = "MY_DATASET_ID";
    String tableId = "MY_TABLE_ID";
    // Amazon S3 bucket URI; the AWS credentials must grant read access.
    String sourceUri = "s3://your-bucket-name/*";
    String awsAccessKeyId = "MY_AWS_ACCESS_KEY_ID";
    String awsSecretAccessId = "MY_AWS_SECRET_ACCESS_KEY";
    String sourceFormat = "CSV";
    String fieldDelimiter = ",";
    String skipLeadingRows = "1";
    Map<String, Value> params = new HashMap<>();
    params.put(
        "destination_table_name_template", Value.newBuilder().setStringValue(tableId).build());
    params.put("data_path", Value.newBuilder().setStringValue(sourceUri).build());
    params.put("access_key_id", Value.newBuilder().setStringValue(awsAccessKeyId).build());
    params.put("secret_access_key", Value.newBuilder().setStringValue(awsSecretAccessId).build());
    params.put("source_format", Value.newBuilder().setStringValue(sourceFormat).build());
    params.put("field_delimiter", Value.newBuilder().setStringValue(fieldDelimiter).build());
    params.put("skip_leading_rows", Value.newBuilder().setStringValue(skipLeadingRows).build());
    TransferConfig transferConfig =
        TransferConfig.newBuilder()
            .setDestinationDatasetId(datasetId)
            .setDisplayName("Your AWS S3 Config Name")
            .setDataSourceId("amazon_s3")
            .setParams(Struct.newBuilder().putAllFields(params).build())
            .setSchedule("every 24 hours")
            .build();
    createAmazonS3Transfer(projectId, transferConfig);
  }

  public static void createAmazonS3Transfer(String projectId, TransferConfig transferConfig)
      throws IOException {
    try (DataTransferServiceClient client = DataTransferServiceClient.create()) {
      ProjectName parent = ProjectName.of(projectId);
      CreateTransferConfigRequest request =
          CreateTransferConfigRequest.newBuilder()
              .setParent(parent.toString())
              .setTransferConfig(transferConfig)
              .build();
      TransferConfig config = client.createTransferConfig(request);
      System.out.println("Amazon S3 transfer created successfully: " + config.getName());
    } catch (ApiException ex) {
      System.out.println("Amazon S3 transfer was not created: " + ex.toString());
    }
  }
}
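Because the `TransferConfig` and its `params` `Struct` are plain protobuf messages, they can be assembled and inspected locally before any API call is made, which is a convenient way to sanity-check the parameter keys without credentials or network access. The sketch below does exactly that; the helper name `buildS3TransferConfig` and the placeholder values are illustrative, not part of the BigQuery Data Transfer API.

```java
import com.google.cloud.bigquery.datatransfer.v1.TransferConfig;
import com.google.protobuf.Struct;
import com.google.protobuf.Value;

// Illustrative sketch: build an Amazon S3 TransferConfig locally and inspect
// its fields. No client is created and no request is sent, so no credentials
// are needed.
public class BuildS3TransferConfig {

  // Hypothetical helper for this sketch; not part of the client library.
  static TransferConfig buildS3TransferConfig(
      String datasetId, String tableId, String sourceUri) {
    Struct params =
        Struct.newBuilder()
            .putFields(
                "destination_table_name_template",
                Value.newBuilder().setStringValue(tableId).build())
            .putFields("data_path", Value.newBuilder().setStringValue(sourceUri).build())
            .putFields("source_format", Value.newBuilder().setStringValue("CSV").build())
            .build();
    return TransferConfig.newBuilder()
        .setDestinationDatasetId(datasetId)
        .setDataSourceId("amazon_s3")
        .setParams(params)
        .setSchedule("every 24 hours")
        .build();
  }

  public static void main(String[] args) {
    TransferConfig config =
        buildS3TransferConfig("my_dataset", "my_table", "s3://my-bucket/*");
    // Read back a parameter to confirm the key was set as expected.
    System.out.println(
        config.getParams().getFieldsOrThrow("data_path").getStringValue());
  }
}
```

The same pattern works for any data source: build the `Struct` of source-specific parameters, attach it with `setParams`, and only then hand the finished `TransferConfig` to `createTransferConfig`.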
What's next
To search and filter code samples for other Google Cloud products, see the Google Cloud sample browser.