public static final class DataSourceServiceGrpc.DataSourceServiceStub extends AbstractStub<DataSourceServiceGrpc.DataSourceServiceStub>
The Google BigQuery Data Transfer API allows BigQuery users to
configure transfers of their data from other Google products into BigQuery.
This service exposes methods that should be used by data source backends.
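The examples in this page are minimal sketches of using the asynchronous stub; the endpoint, resource names, and request builder fields shown are illustrative assumptions rather than values taken from this page, and imports for the generated request/response classes are omitted because their package is not shown here. The stub itself is obtained with DataSourceServiceGrpc.newStub:

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

// Endpoint is assumed for illustration; production code would also attach
// credentials, e.g. via withCallCredentials(...).
ManagedChannel channel =
    ManagedChannelBuilder.forTarget("bigquerydatatransfer.googleapis.com:443")
        .useTransportSecurity()
        .build();

// Asynchronous stub: each call takes a request plus a StreamObserver callback.
DataSourceServiceGrpc.DataSourceServiceStub stub = DataSourceServiceGrpc.newStub(channel);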
Inheritance
java.lang.Object >
io.grpc.stub.AbstractStub >
DataSourceServiceGrpc.DataSourceServiceStub
Inherited Members
io.grpc.stub.AbstractStub.<T>newStub(io.grpc.stub.AbstractStub.StubFactory<T>,io.grpc.Channel)
io.grpc.stub.AbstractStub.<T>newStub(io.grpc.stub.AbstractStub.StubFactory<T>,io.grpc.Channel,io.grpc.CallOptions)
io.grpc.stub.AbstractStub.<T>withOption(io.grpc.CallOptions.Key<T>,T)
io.grpc.stub.AbstractStub.build(io.grpc.Channel,io.grpc.CallOptions)
io.grpc.stub.AbstractStub.getCallOptions()
io.grpc.stub.AbstractStub.getChannel()
io.grpc.stub.AbstractStub.withCallCredentials(io.grpc.CallCredentials)
io.grpc.stub.AbstractStub.withChannel(io.grpc.Channel)
io.grpc.stub.AbstractStub.withCompression(java.lang.String)
io.grpc.stub.AbstractStub.withDeadline(io.grpc.Deadline)
io.grpc.stub.AbstractStub.withDeadlineAfter(long,java.util.concurrent.TimeUnit)
io.grpc.stub.AbstractStub.withExecutor(java.util.concurrent.Executor)
io.grpc.stub.AbstractStub.withInterceptors(io.grpc.ClientInterceptor...)
io.grpc.stub.AbstractStub.withMaxInboundMessageSize(int)
io.grpc.stub.AbstractStub.withMaxOutboundMessageSize(int)
io.grpc.stub.AbstractStub.withWaitForReady()
Methods
build(Channel channel, CallOptions callOptions)
protected DataSourceServiceGrpc.DataSourceServiceStub build(Channel channel, CallOptions callOptions)
Parameters
Name | Description
channel | io.grpc.Channel
callOptions | io.grpc.CallOptions
Returns
Type | Description
DataSourceServiceGrpc.DataSourceServiceStub
Overrides
io.grpc.stub.AbstractStub.build(io.grpc.Channel,io.grpc.CallOptions)
createDataSourceDefinition(CreateDataSourceDefinitionRequest request, StreamObserver<DataSourceDefinition> responseObserver)
public void createDataSourceDefinition(CreateDataSourceDefinitionRequest request, StreamObserver<DataSourceDefinition> responseObserver)
Creates a data source definition. Calling this method will automatically
use your credentials to create the following Google Cloud resources in
YOUR Google Cloud project:
- An OAuth client.
- Pub/Sub topics and subscriptions for each of the supported_location_ids, e.g.,
projects/{project_id}/{topics|subscriptions}/bigquerydatatransfer.{data_source_id}.{location_id}.run
The field data_source.client_id should be left empty in the input request,
as the API will create a new OAuth client on behalf of the caller. On the
other hand, data_source.scopes usually needs to be set when there are OAuth
scopes that need to be granted by end users.
- We need a longer deadline due to the 60-second SLO on Pub/Sub admin
operations. This also applies to updating and deleting data source
definitions (see the sketch below).
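A minimal sketch of an asynchronous create call with an extended deadline, as suggested by the note above. It assumes the stub from the class-level example; the parent value and the CreateDataSourceDefinitionRequest builder field names are assumptions for illustration:

import io.grpc.stub.StreamObserver;
import java.util.concurrent.TimeUnit;

CreateDataSourceDefinitionRequest request =
    CreateDataSourceDefinitionRequest.newBuilder()
        .setParent("projects/my-project/locations/us")  // field name assumed
        // data_source.client_id is left unset, as noted above.
        .build();

stub.withDeadlineAfter(120, TimeUnit.SECONDS)  // extended deadline for Pub/Sub admin work
    .createDataSourceDefinition(request, new StreamObserver<DataSourceDefinition>() {
      @Override public void onNext(DataSourceDefinition created) {
        // Created definition, including the OAuth client generated by the API.
      }
      @Override public void onError(Throwable t) { /* handle failure */ }
      @Override public void onCompleted() { }
    });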
Parameters
deleteDataSourceDefinition(DeleteDataSourceDefinitionRequest request, StreamObserver<Empty> responseObserver)
public void deleteDataSourceDefinition(DeleteDataSourceDefinitionRequest request, StreamObserver<Empty> responseObserver)
Deletes a data source definition. All of the transfer configs associated
with this data source definition (if any) must first be deleted by the user
in ALL regions before the data source definition itself can be deleted.
This method is primarily meant for deleting data sources created during the
testing stage.
If the data source is referenced by transfer configs in the region
specified in the request URL, the method will fail immediately. If the data
source is not used by any transfer configs in the current region (e.g., US)
but is used in another region (e.g., EU), the method will succeed in region
US but fail when the deletion operation is replicated to region EU.
Eventually, the system will replicate the data source definition back from
EU to US in order to bring all regions into consistency, so the final effect
is that the data source appears to be 'undeleted' in the US region.
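A minimal sketch of the asynchronous delete, assuming the stub from the class-level example and that no transfer configs in any region still reference the definition; the resource name format and the setName field are assumed:

import com.google.protobuf.Empty;
import io.grpc.stub.StreamObserver;

DeleteDataSourceDefinitionRequest request =
    DeleteDataSourceDefinitionRequest.newBuilder()
        .setName("projects/my-project/locations/us/dataSourceDefinitions/my-source")  // field assumed
        .build();

stub.deleteDataSourceDefinition(request, new StreamObserver<Empty>() {
  @Override public void onNext(Empty ignored) { }
  @Override public void onError(Throwable t) {
    // Fails immediately if transfer configs in this region still reference the definition.
  }
  @Override public void onCompleted() { }
});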
Parameters
finishRun(FinishRunRequest request, StreamObserver<Empty> responseObserver)
public void finishRun(FinishRunRequest request, StreamObserver<Empty> responseObserver)
Notify the Data Transfer Service that the data source is done processing
the run. No more status updates or requests to start/monitor jobs will be
accepted. The run will be finalized by the Data Transfer Service when all
monitored jobs are completed.
Does not need to be called if the run is set to FAILED.
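A minimal sketch, assuming the stub from the class-level example; the run resource name and the setName field are assumed:

import com.google.protobuf.Empty;
import io.grpc.stub.StreamObserver;

FinishRunRequest request =
    FinishRunRequest.newBuilder()
        .setName("projects/my-project/locations/us/transferConfigs/my-config/runs/my-run")  // field assumed
        .build();

stub.finishRun(request, new StreamObserver<Empty>() {
  @Override public void onNext(Empty ignored) { }
  @Override public void onError(Throwable t) { /* handle failure */ }
  @Override public void onCompleted() { /* run is finalized once all monitored jobs complete */ }
});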
Parameters
getDataSourceDefinition(GetDataSourceDefinitionRequest request, StreamObserver<DataSourceDefinition> responseObserver)
public void getDataSourceDefinition(GetDataSourceDefinitionRequest request, StreamObserver<DataSourceDefinition> responseObserver)
Retrieves an existing data source definition.
Parameters
listDataSourceDefinitions(ListDataSourceDefinitionsRequest request, StreamObserver<ListDataSourceDefinitionsResponse> responseObserver)
public void listDataSourceDefinitions(ListDataSourceDefinitionsRequest request, StreamObserver<ListDataSourceDefinitionsResponse> responseObserver)
Lists supported data source definitions.
Parameters
logTransferRunMessages(LogTransferRunMessagesRequest request, StreamObserver<Empty> responseObserver)
public void logTransferRunMessages(LogTransferRunMessagesRequest request, StreamObserver<Empty> responseObserver)
Log messages for a transfer run. If successful (at least one message), resets
the data_source.update_deadline_seconds timer.
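A minimal sketch, assuming the stub from the class-level example; the TransferMessage builder fields and the request field names are assumed for illustration:

import com.google.protobuf.Empty;
import io.grpc.stub.StreamObserver;

LogTransferRunMessagesRequest request =
    LogTransferRunMessagesRequest.newBuilder()
        .setName("projects/my-project/locations/us/transferConfigs/my-config/runs/my-run")  // field assumed
        .addTransferMessages(TransferMessage.newBuilder()                                    // field assumed
            .setSeverity(TransferMessage.MessageSeverity.INFO)
            .setMessageText("Processed 3 of 10 files")
            .build())
        .build();

stub.logTransferRunMessages(request, new StreamObserver<Empty>() {
  @Override public void onNext(Empty ignored) { /* update_deadline_seconds timer was reset */ }
  @Override public void onError(Throwable t) { /* handle failure */ }
  @Override public void onCompleted() { }
});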
Parameters
startBigQueryJobs(StartBigQueryJobsRequest request, StreamObserver<Empty> responseObserver)
public void startBigQueryJobs(StartBigQueryJobsRequest request, StreamObserver<Empty> responseObserver)
Notify the Data Transfer Service that data is ready for loading.
The Data Transfer Service will start and monitor multiple BigQuery load
jobs for a transfer run. Monitored jobs will be retried automatically and
will produce log messages when a job starts and finishes.
Can be called multiple times for the same transfer run.
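A minimal sketch, assuming the stub from the class-level example; the request field names are assumed, and the descriptions of the data that is ready to load (omitted here) would be carried in the request as defined by StartBigQueryJobsRequest:

import com.google.protobuf.Empty;
import io.grpc.stub.StreamObserver;

StartBigQueryJobsRequest request =
    StartBigQueryJobsRequest.newBuilder()
        .setName("projects/my-project/locations/us/transferConfigs/my-config/runs/my-run")  // field assumed
        // Details describing the data to load would also be set here.
        .build();

stub.startBigQueryJobs(request, new StreamObserver<Empty>() {
  @Override public void onNext(Empty ignored) { /* load jobs will be started and monitored */ }
  @Override public void onError(Throwable t) { /* handle failure */ }
  @Override public void onCompleted() { }
});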
Parameters
updateDataSourceDefinition(UpdateDataSourceDefinitionRequest request, StreamObserver<DataSourceDefinition> responseObserver)
public void updateDataSourceDefinition(UpdateDataSourceDefinitionRequest request, StreamObserver<DataSourceDefinition> responseObserver)
Updates an existing data source definition. If supported_location_ids is
changed, this triggers the same effects as described for
createDataSourceDefinition.
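A minimal sketch, assuming the stub from the class-level example; the builder field names and the use of a FieldMask are assumed for illustration. Because changing supported_location_ids triggers the same Pub/Sub admin work as create, an extended deadline is used here as well:

import com.google.protobuf.FieldMask;
import io.grpc.stub.StreamObserver;
import java.util.concurrent.TimeUnit;

// A DataSourceDefinition carrying the desired changes; construction details omitted.
DataSourceDefinition updatedDefinition = DataSourceDefinition.newBuilder()
    .setName("projects/my-project/locations/us/dataSourceDefinitions/my-source")  // field assumed
    .build();

UpdateDataSourceDefinitionRequest request =
    UpdateDataSourceDefinitionRequest.newBuilder()
        .setDataSourceDefinition(updatedDefinition)                                        // field assumed
        .setUpdateMask(FieldMask.newBuilder().addPaths("supported_location_ids").build())  // field assumed
        .build();

stub.withDeadlineAfter(120, TimeUnit.SECONDS)
    .updateDataSourceDefinition(request, new StreamObserver<DataSourceDefinition>() {
      @Override public void onNext(DataSourceDefinition updated) { /* updated definition */ }
      @Override public void onError(Throwable t) { /* handle failure */ }
      @Override public void onCompleted() { }
    });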
Parameters
updateTransferRun(UpdateTransferRunRequest request, StreamObserver<TransferRun> responseObserver)
public void updateTransferRun(UpdateTransferRunRequest request, StreamObserver<TransferRun> responseObserver)
Update a transfer run. If successful, resets the
data_source.update_deadline_seconds timer.
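A minimal sketch, assuming the stub from the class-level example; the builder field names are assumed for illustration:

import io.grpc.stub.StreamObserver;

// A TransferRun carrying the updated state; real code would set the run fields here.
TransferRun updatedRun = TransferRun.newBuilder().build();

UpdateTransferRunRequest request =
    UpdateTransferRunRequest.newBuilder()
        .setTransferRun(updatedRun)  // field assumed
        .build();

stub.updateTransferRun(request, new StreamObserver<TransferRun>() {
  @Override public void onNext(TransferRun run) { /* update_deadline_seconds timer was reset */ }
  @Override public void onError(Throwable t) { /* handle failure */ }
  @Override public void onCompleted() { }
});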
Parameters