Class ModelMonitoringObjectiveConfig.Types.ExplanationConfig (1.6.0)

public sealed class ExplanationConfig : IMessage<ModelMonitoringObjectiveConfig.Types.ExplanationConfig>, IEquatable<ModelMonitoringObjectiveConfig.Types.ExplanationConfig>, IDeepCloneable<ModelMonitoringObjectiveConfig.Types.ExplanationConfig>, IBufferMessage, IMessage

The config for integrating with Vertex Explainable AI. Only applicable if the Model has explanation_spec populated.

Inheritance

Object > ModelMonitoringObjectiveConfig.Types.ExplanationConfig

Namespace

Google.Cloud.AIPlatform.V1

Assembly

Google.Cloud.AIPlatform.V1.dll

Constructors

ExplanationConfig()

public ExplanationConfig()

ExplanationConfig(ModelMonitoringObjectiveConfig.Types.ExplanationConfig)

public ExplanationConfig(ModelMonitoringObjectiveConfig.Types.ExplanationConfig other)
Parameter

Name: other
Type: ModelMonitoringObjectiveConfig.Types.ExplanationConfig
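
Example

A minimal sketch of the copy constructor, which builds a new instance from an existing one (the deep-copy behavior is inferred from the IDeepCloneable interface and is an assumption, not documented on this page):

using Google.Cloud.AIPlatform.V1;

// Hedged sketch: create a config, then copy it via the copy constructor.
var original = new ModelMonitoringObjectiveConfig.Types.ExplanationConfig
{
    EnableFeatureAttributes = true
};
var copy = new ModelMonitoringObjectiveConfig.Types.ExplanationConfig(original);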

Properties

EnableFeatureAttributes

public bool EnableFeatureAttributes { get; set; }

Whether to analyze the Vertex Explainable AI feature attribution scores. If set to true, Vertex AI logs the feature attributions from the explain response and runs skew/drift detection on them.

Property Value
Type: Boolean
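
Example

The following sketch shows how this property might be set when building a monitoring objective config. The ExplanationConfig property on the parent ModelMonitoringObjectiveConfig is an assumption based on the enclosing type and is not documented on this page.

using Google.Cloud.AIPlatform.V1;

// Hedged sketch: enable feature attribution monitoring.
var explanationConfig = new ModelMonitoringObjectiveConfig.Types.ExplanationConfig
{
    EnableFeatureAttributes = true
};

// Attach the config to its parent objective config (assumed property name).
var objectiveConfig = new ModelMonitoringObjectiveConfig
{
    ExplanationConfig = explanationConfig
};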

ExplanationBaseline

public ModelMonitoringObjectiveConfig.Types.ExplanationConfig.Types.ExplanationBaseline ExplanationBaseline { get; set; }

Predictions generated by the BatchPredictionJob using the baseline dataset.

Property Value
Type: ModelMonitoringObjectiveConfig.Types.ExplanationConfig.Types.ExplanationBaseline
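
Example

A hedged sketch of pointing the baseline at batch prediction output stored in Cloud Storage. The Gcs and PredictionFormat members of ExplanationBaseline, the GcsDestination type, and the Jsonl enum value are assumptions not documented on this page.

using Google.Cloud.AIPlatform.V1;

// Hedged sketch: use stored batch prediction output as the attribution baseline.
var config = new ModelMonitoringObjectiveConfig.Types.ExplanationConfig
{
    EnableFeatureAttributes = true,
    ExplanationBaseline = new ModelMonitoringObjectiveConfig.Types.ExplanationConfig.Types.ExplanationBaseline
    {
        // Assumed members: a Cloud Storage destination and a JSONL prediction format.
        Gcs = new GcsDestination { OutputUriPrefix = "gs://example-bucket/baseline-predictions/" },
        PredictionFormat = ModelMonitoringObjectiveConfig.Types.ExplanationConfig.Types.ExplanationBaseline.Types.PredictionFormat.Jsonl
    }
};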