Interface SafetyRatingOrBuilder (3.33.0)

public interface SafetyRatingOrBuilder extends MessageOrBuilder

The read-only accessor interface generated for the SafetyRating protocol buffer message.

Implements

MessageOrBuilder

Methods

getBlocked()

public abstract boolean getBlocked()

Output only. Indicates whether the content was filtered out because of this rating.

bool blocked = 3 [(.google.api.field_behavior) = OUTPUT_ONLY];

Returns

Type       Description
boolean    The blocked.

getCategory()

public abstract HarmCategory getCategory()

Output only. Harm category.

.google.cloud.aiplatform.v1.HarmCategory category = 1 [(.google.api.field_behavior) = OUTPUT_ONLY];

Returns

Type            Description
HarmCategory    The category.

getCategoryValue()

public abstract int getCategoryValue()

Output only. Harm category.

.google.cloud.aiplatform.v1.HarmCategory category = 1 [(.google.api.field_behavior) = OUTPUT_ONLY];

Returns

Type    Description
int     The enum numeric value on the wire for category.
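Protobuf-generated messages expose each enum field through two accessors: `getCategory()` returns the enum constant (substituting `UNRECOGNIZED` when the wire number is unknown to this client version), while `getCategoryValue()` always returns the raw wire number. A minimal self-contained sketch of that pattern; the `HarmCategory` constants and numbers below are illustrative stand-ins, not the library's actual definitions:

```java
public class EnumValueDemo {
    // Stand-in enum mimicking a protobuf-generated enum; names and
    // numbers are illustrative only.
    enum HarmCategory {
        HARM_CATEGORY_UNSPECIFIED(0),
        HARM_CATEGORY_HATE_SPEECH(1),
        UNRECOGNIZED(-1);

        private final int number;
        HarmCategory(int number) { this.number = number; }

        int getNumber() { return number; }

        // Returns null for wire numbers this version does not know about.
        static HarmCategory forNumber(int value) {
            for (HarmCategory c : values()) {
                if (c != UNRECOGNIZED && c.number == value) return c;
            }
            return null;
        }
    }

    // Mimics a generated message, which stores the raw wire number.
    static class Rating {
        private final int categoryValue;
        Rating(int categoryValue) { this.categoryValue = categoryValue; }

        // Raw number, preserved even for unknown values.
        int getCategoryValue() { return categoryValue; }

        // Enum constant, with UNRECOGNIZED substituted for unknown numbers.
        HarmCategory getCategory() {
            HarmCategory c = HarmCategory.forNumber(categoryValue);
            return c == null ? HarmCategory.UNRECOGNIZED : c;
        }
    }

    public static void main(String[] args) {
        Rating known = new Rating(1);
        System.out.println(known.getCategory());       // HARM_CATEGORY_HATE_SPEECH
        System.out.println(known.getCategoryValue());  // 1

        // A wire number newer than this client version:
        Rating unknown = new Rating(99);
        System.out.println(unknown.getCategory());      // UNRECOGNIZED
        System.out.println(unknown.getCategoryValue()); // 99
    }
}
```

Preferring `getCategoryValue()` lets a consumer round-trip values produced by a newer server without losing information.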

getProbability()

public abstract SafetyRating.HarmProbability getProbability()

Output only. Harm probability levels in the content.

.google.cloud.aiplatform.v1.SafetyRating.HarmProbability probability = 2 [(.google.api.field_behavior) = OUTPUT_ONLY];

Returns

Type                           Description
SafetyRating.HarmProbability   The probability.

getProbabilityValue()

public abstract int getProbabilityValue()

Output only. Harm probability levels in the content.

.google.cloud.aiplatform.v1.SafetyRating.HarmProbability probability = 2 [(.google.api.field_behavior) = OUTPUT_ONLY];

Returns

Type    Description
int     The enum numeric value on the wire for probability.
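Together, these accessors let a consumer inspect why generated content was filtered. A self-contained sketch using a stand-in `Rating` type with the same read-only accessors as this interface; in real code the ratings would come from the library's response objects, and the category strings and `HarmProbability` constants here are illustrative:

```java
import java.util.List;

public class SafetyCheckDemo {
    // Illustrative stand-in for SafetyRating.HarmProbability.
    enum HarmProbability { NEGLIGIBLE, LOW, MEDIUM, HIGH }

    // Stand-in mirroring SafetyRatingOrBuilder's read-only accessors.
    record Rating(String category, HarmProbability probability, boolean blocked) {
        boolean getBlocked() { return blocked; }
        HarmProbability getProbability() { return probability; }
        String getCategory() { return category; }
    }

    // Collects the categories whose rating caused the content to be
    // filtered out (getBlocked() == true).
    static List<String> blockedCategories(List<Rating> ratings) {
        return ratings.stream()
                .filter(Rating::getBlocked)
                .map(Rating::getCategory)
                .toList();
    }

    public static void main(String[] args) {
        List<Rating> ratings = List.of(
                new Rating("HATE_SPEECH", HarmProbability.NEGLIGIBLE, false),
                new Rating("DANGEROUS_CONTENT", HarmProbability.HIGH, true));
        System.out.println(blockedCategories(ratings)); // [DANGEROUS_CONTENT]
    }
}
```

Because every field on this interface is output only, a consumer only reads the ratings; there is no builder-side mutation to perform.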