Interface PredictionServiceGrpc.AsyncService (3.24.0)

public static interface PredictionServiceGrpc.AsyncService

A service for online predictions and explanations.

Methods

explain(ExplainRequest request, StreamObserver<ExplainResponse> responseObserver)

public default void explain(ExplainRequest request, StreamObserver<ExplainResponse> responseObserver)

Perform an online explanation. If deployed_model_id is specified, the corresponding DeployedModel must have explanation_spec populated. If deployed_model_id is not specified, all DeployedModels must have explanation_spec populated.

Parameters
Name              Description
request           ExplainRequest
responseObserver  io.grpc.stub.StreamObserver<ExplainResponse>
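A minimal server-side sketch of overriding this method. Real callers normally invoke Vertex AI's hosted service rather than implement it themselves; the validation logic and the empty response below are illustrative assumptions, not part of the API contract.

```java
import com.google.cloud.aiplatform.v1.ExplainRequest;
import com.google.cloud.aiplatform.v1.ExplainResponse;
import com.google.cloud.aiplatform.v1.PredictionServiceGrpc;
import io.grpc.Status;
import io.grpc.stub.StreamObserver;

// Hypothetical implementation of the AsyncService interface.
public class ExplainServiceSketch implements PredictionServiceGrpc.AsyncService {
  @Override
  public void explain(ExplainRequest request,
                      StreamObserver<ExplainResponse> responseObserver) {
    // For this sketch we require deployed_model_id; checking that the
    // DeployedModel actually has explanation_spec populated is assumed
    // to happen elsewhere and is not shown.
    if (request.getDeployedModelId().isEmpty()) {
      responseObserver.onError(Status.INVALID_ARGUMENT
          .withDescription("deployed_model_id is required in this sketch")
          .asRuntimeException());
      return;
    }
    // Placeholder response; a real implementation would compute attributions.
    responseObserver.onNext(ExplainResponse.getDefaultInstance());
    responseObserver.onCompleted();
  }
}
```

Exactly one terminal call (`onCompleted` or `onError`) is made on the observer, as the StreamObserver contract requires.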

predict(PredictRequest request, StreamObserver<PredictResponse> responseObserver)

public default void predict(PredictRequest request, StreamObserver<PredictResponse> responseObserver)

Perform an online prediction.

Parameters
Name              Description
request           PredictRequest
responseObserver  io.grpc.stub.StreamObserver<PredictResponse>
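A client-side sketch of calling this method through the generated async stub. The regional hostname and endpoint resource name are placeholders, and authentication setup is omitted.

```java
import com.google.cloud.aiplatform.v1.PredictRequest;
import com.google.cloud.aiplatform.v1.PredictResponse;
import com.google.cloud.aiplatform.v1.PredictionServiceGrpc;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.stub.StreamObserver;

public class PredictClientSketch {
  public static void main(String[] args) {
    // Placeholder region and endpoint; credentials are not configured here.
    ManagedChannel channel = ManagedChannelBuilder
        .forAddress("us-central1-aiplatform.googleapis.com", 443)
        .useTransportSecurity()
        .build();
    PredictionServiceGrpc.PredictionServiceStub stub =
        PredictionServiceGrpc.newStub(channel);

    PredictRequest request = PredictRequest.newBuilder()
        .setEndpoint("projects/PROJECT/locations/us-central1/endpoints/ENDPOINT")
        .build();

    // The async stub returns immediately; results arrive on the observer.
    stub.predict(request, new StreamObserver<PredictResponse>() {
      @Override public void onNext(PredictResponse response) {
        System.out.println("predictions: " + response.getPredictionsCount());
      }
      @Override public void onError(Throwable t) {
        System.err.println("predict failed: " + t.getMessage());
        channel.shutdown();
      }
      @Override public void onCompleted() {
        channel.shutdown();
      }
    });
  }
}
```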

rawPredict(RawPredictRequest request, StreamObserver<HttpBody> responseObserver)

public default void rawPredict(RawPredictRequest request, StreamObserver<HttpBody> responseObserver)

Perform an online prediction with an arbitrary HTTP payload. The response includes the following HTTP headers:

  • X-Vertex-AI-Endpoint-Id: ID of the Endpoint that served this prediction.
  • X-Vertex-AI-Deployed-Model-Id: ID of the Endpoint's DeployedModel that served this prediction.
Parameters
Name              Description
request           RawPredictRequest
responseObserver  io.grpc.stub.StreamObserver<com.google.api.HttpBody>
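A sketch of building a RawPredictRequest with an arbitrary JSON payload. The endpoint name and request body are placeholders; the `X-Vertex-AI-*` headers noted above arrive as response metadata and would be read with a gRPC ClientInterceptor, which is not shown here.

```java
import com.google.api.HttpBody;
import com.google.cloud.aiplatform.v1.RawPredictRequest;
import com.google.protobuf.ByteString;

public class RawPredictRequestSketch {
  public static RawPredictRequest build() {
    // Arbitrary HTTP payload: content type and bytes are caller-defined.
    HttpBody body = HttpBody.newBuilder()
        .setContentType("application/json")
        .setData(ByteString.copyFromUtf8("{\"instances\": []}"))
        .build();
    return RawPredictRequest.newBuilder()
        .setEndpoint("projects/PROJECT/locations/us-central1/endpoints/ENDPOINT")
        .setHttpBody(body)
        .build();
  }
}
```

The response is likewise an HttpBody, so the observer's `onNext` receives raw bytes to decode according to the model's own output format.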

serverStreamingPredict(StreamingPredictRequest request, StreamObserver<StreamingPredictResponse> responseObserver)

public default void serverStreamingPredict(StreamingPredictRequest request, StreamObserver<StreamingPredictResponse> responseObserver)

Perform a server-side streaming online prediction request for Vertex LLM streaming.

Parameters
Name              Description
request           StreamingPredictRequest
responseObserver  io.grpc.stub.StreamObserver<StreamingPredictResponse>
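Because this is a server-streaming call, the observer's `onNext` may fire many times before `onCompleted`. A client-side sketch, with the endpoint name a placeholder and the chunk-handling logic assumed:

```java
import com.google.cloud.aiplatform.v1.PredictionServiceGrpc;
import com.google.cloud.aiplatform.v1.StreamingPredictRequest;
import com.google.cloud.aiplatform.v1.StreamingPredictResponse;
import io.grpc.stub.StreamObserver;

public class StreamingPredictSketch {
  // The stub is assumed to be constructed and authenticated elsewhere.
  static void stream(PredictionServiceGrpc.PredictionServiceStub stub) {
    StreamingPredictRequest request = StreamingPredictRequest.newBuilder()
        .setEndpoint("projects/PROJECT/locations/us-central1/endpoints/ENDPOINT")
        .build();

    stub.serverStreamingPredict(request,
        new StreamObserver<StreamingPredictResponse>() {
          @Override public void onNext(StreamingPredictResponse chunk) {
            // Each chunk carries a partial result, e.g. incremental LLM output.
            System.out.println("chunk outputs: " + chunk.getOutputsCount());
          }
          @Override public void onError(Throwable t) {
            System.err.println("stream failed: " + t.getMessage());
          }
          @Override public void onCompleted() {
            System.out.println("stream finished");
          }
        });
  }
}
```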