Metadata of the input of a feature.
Fields other than InputMetadata.input_baselines are applicable only for Models that are using Vertex AI-provided images for Tensorflow.
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
Methods
#dense_shape_tensor_name
def dense_shape_tensor_name() -> ::String
- (::String) — Specifies the shape of the values of the input if the input is a sparse representation. Refer to Tensorflow documentation for more details: https://www.tensorflow.org/api_docs/python/tf/sparse/SparseTensor.
#dense_shape_tensor_name=
def dense_shape_tensor_name=(value) -> ::String
- value (::String) — Specifies the shape of the values of the input if the input is a sparse representation. Refer to Tensorflow documentation for more details: https://www.tensorflow.org/api_docs/python/tf/sparse/SparseTensor.
- (::String) — Specifies the shape of the values of the input if the input is a sparse representation. Refer to Tensorflow documentation for more details: https://www.tensorflow.org/api_docs/python/tf/sparse/SparseTensor.
#encoded_baselines
def encoded_baselines() -> ::Array<::Google::Protobuf::Value>
- (::Array<::Google::Protobuf::Value>) — A list of baselines for the encoded tensor. The shape of each baseline should match the shape of the encoded tensor. If a scalar is provided, Vertex AI broadcasts it to the same shape as the encoded tensor.
#encoded_baselines=
def encoded_baselines=(value) -> ::Array<::Google::Protobuf::Value>
- value (::Array<::Google::Protobuf::Value>) — A list of baselines for the encoded tensor. The shape of each baseline should match the shape of the encoded tensor. If a scalar is provided, Vertex AI broadcasts it to the same shape as the encoded tensor.
- (::Array<::Google::Protobuf::Value>) — A list of baselines for the encoded tensor. The shape of each baseline should match the shape of the encoded tensor. If a scalar is provided, Vertex AI broadcasts it to the same shape as the encoded tensor.
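A minimal sketch of what encoded baselines might look like. A plain Ruby hash with the proto's JSON field names stands in for the typed message, so the google-cloud-ai_platform gem is not required; the tensor name and embedding size are hypothetical.

```ruby
# Suppose a lookup table encodes each token into a 4-dim embedding,
# so the encoded tensor has 4 elements per token.
embedding_dim = 4

encoded_baselines = [
  Array.new(embedding_dim, 0.0),  # explicit baseline matching the encoded tensor's shape
  0.0                             # a scalar: Vertex AI broadcasts it to the encoded shape
]

# Hash keys mirror InputMetadata's JSON field names.
input_metadata = {
  "encodedTensorName" => "embedded_tokens",  # hypothetical tensor name
  "encodedBaselines"  => encoded_baselines
}
```

In the real API these entries would be `Google::Protobuf::Value` objects built by the gem; the hash only illustrates the shapes involved.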
#encoded_tensor_name
def encoded_tensor_name() -> ::String
- (::String) — The encoded tensor is a transformation of the input tensor. Must be provided if choosing Integrated Gradients attribution or XRAI attribution and the input tensor is not differentiable. An encoded tensor is generated if the input tensor is encoded by a lookup table.
#encoded_tensor_name=
def encoded_tensor_name=(value) -> ::String
- value (::String) — The encoded tensor is a transformation of the input tensor. Must be provided if choosing Integrated Gradients attribution or XRAI attribution and the input tensor is not differentiable. An encoded tensor is generated if the input tensor is encoded by a lookup table.
- (::String) — The encoded tensor is a transformation of the input tensor. Must be provided if choosing Integrated Gradients attribution or XRAI attribution and the input tensor is not differentiable. An encoded tensor is generated if the input tensor is encoded by a lookup table.
#encoding
def encoding() -> ::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Encoding
- (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Encoding) — Defines how the feature is encoded into the input tensor. Defaults to IDENTITY.
#encoding=
def encoding=(value) -> ::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Encoding
- value (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Encoding) — Defines how the feature is encoded into the input tensor. Defaults to IDENTITY.
- (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Encoding) — Defines how the feature is encoded into the input tensor. Defaults to IDENTITY.
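A sketch of how the encoding choice might be declared for two features. Plain hashes mirror the proto's JSON field names in place of the typed messages; the tensor names are hypothetical.

```ruby
# IDENTITY (the default): each element of the input tensor is the
# feature value itself.
identity_input = {
  "inputTensorName" => "age",
  "encoding"        => "IDENTITY"
}

# A tensor that packs several features needs a non-identity encoding,
# plus an index_feature_mapping naming each index (declared separately).
packed_input = {
  "inputTensorName" => "numeric_stats",
  "encoding"        => "BAG_OF_FEATURES"
}
```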
#feature_value_domain
def feature_value_domain() -> ::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::FeatureValueDomain
- (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::FeatureValueDomain) — The domain details of the input feature value, such as the min/max values, or the original mean and standard deviation if the feature is normalized.
#feature_value_domain=
def feature_value_domain=(value) -> ::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::FeatureValueDomain
- value (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::FeatureValueDomain) — The domain details of the input feature value, such as the min/max values, or the original mean and standard deviation if the feature is normalized.
- (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::FeatureValueDomain) — The domain details of the input feature value, such as the min/max values, or the original mean and standard deviation if the feature is normalized.
#group_name
def group_name() -> ::String
- (::String) — Name of the group that the input belongs to. Features with the same group name are treated as one feature when computing attributions. Features grouped together can have different shapes in their values. If provided, a single attribution is generated in Attribution.feature_attributions, keyed by the group name.
#group_name=
def group_name=(value) -> ::String
- value (::String) — Name of the group that the input belongs to. Features with the same group name are treated as one feature when computing attributions. Features grouped together can have different shapes in their values. If provided, a single attribution is generated in Attribution.feature_attributions, keyed by the group name.
- (::String) — Name of the group that the input belongs to. Features with the same group name are treated as one feature when computing attributions. Features grouped together can have different shapes in their values. If provided, a single attribution is generated in Attribution.feature_attributions, keyed by the group name.
#index_feature_mapping
def index_feature_mapping() -> ::Array<::String>
- (::Array<::String>) — A list of feature names for each index in the input tensor. Required when the input InputMetadata.encoding is BAG_OF_FEATURES, BAG_OF_FEATURES_SPARSE, or INDICATOR.
#index_feature_mapping=
def index_feature_mapping=(value) -> ::Array<::String>
- value (::Array<::String>) — A list of feature names for each index in the input tensor. Required when the input InputMetadata.encoding is BAG_OF_FEATURES, BAG_OF_FEATURES_SPARSE, or INDICATOR.
- (::Array<::String>) — A list of feature names for each index in the input tensor. Required when the input InputMetadata.encoding is BAG_OF_FEATURES, BAG_OF_FEATURES_SPARSE, or INDICATOR.
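A sketch of index_feature_mapping for a packed tensor. A plain hash mirrors the proto's JSON field names; the tensor and feature names are hypothetical.

```ruby
# One feature name per index of the input tensor, in order.
packed_input = {
  "inputTensorName"     => "numeric_features",
  "encoding"            => "BAG_OF_FEATURES",
  "indexFeatureMapping" => ["age", "income", "tenure_months"]
}

# Attributions for index i of the tensor are then reported under the
# feature name packed_input["indexFeatureMapping"][i].
```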
#indices_tensor_name
def indices_tensor_name() -> ::String
- (::String) — Specifies the index of the values of the input tensor. Required when the input tensor is a sparse representation. Refer to Tensorflow documentation for more details: https://www.tensorflow.org/api_docs/python/tf/sparse/SparseTensor.
#indices_tensor_name=
def indices_tensor_name=(value) -> ::String
- value (::String) — Specifies the index of the values of the input tensor. Required when the input tensor is a sparse representation. Refer to Tensorflow documentation for more details: https://www.tensorflow.org/api_docs/python/tf/sparse/SparseTensor.
- (::String) — Specifies the index of the values of the input tensor. Required when the input tensor is a sparse representation. Refer to Tensorflow documentation for more details: https://www.tensorflow.org/api_docs/python/tf/sparse/SparseTensor.
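A sketch of how the three tensor names for a SparseTensor-style input fit together: the values tensor is named as the input, and the indices and dense-shape tensors are declared alongside it. Plain hash, hypothetical names.

```ruby
# A sparse representation is carried by three tensors
# (values, indices, dense_shape), as in tf.sparse.SparseTensor.
sparse_input = {
  "inputTensorName"      => "terms_values",      # the values tensor
  "indicesTensorName"    => "terms_indices",     # [N, ndims] coordinates of each value
  "denseShapeTensorName" => "terms_dense_shape"  # shape of the equivalent dense tensor
}
```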
#input_baselines
def input_baselines() -> ::Array<::Google::Protobuf::Value>
- (::Array<::Google::Protobuf::Value>) — Baseline inputs for this feature. If no baseline is specified, Vertex AI chooses the baseline for this feature. If multiple baselines are specified, Vertex AI returns the average attributions across them in Attribution.feature_attributions.
For Vertex AI-provided Tensorflow images (both 1.x and 2.x), the shape of each baseline must match the shape of the input tensor. If a scalar is provided, it is broadcast to the same shape as the input tensor.
For custom images, each element of the baselines must be in the same format as the feature's input in the instance[]. The schema of any single instance may be specified via the Endpoint's DeployedModels' [Model's][google.cloud.aiplatform.v1.DeployedModel.model] [PredictSchemata's][google.cloud.aiplatform.v1.Model.predict_schemata] instance_schema_uri.
#input_baselines=
def input_baselines=(value) -> ::Array<::Google::Protobuf::Value>
- value (::Array<::Google::Protobuf::Value>) — Baseline inputs for this feature. If no baseline is specified, Vertex AI chooses the baseline for this feature. If multiple baselines are specified, Vertex AI returns the average attributions across them in Attribution.feature_attributions.
For Vertex AI-provided Tensorflow images (both 1.x and 2.x), the shape of each baseline must match the shape of the input tensor. If a scalar is provided, it is broadcast to the same shape as the input tensor.
For custom images, each element of the baselines must be in the same format as the feature's input in the instance[]. The schema of any single instance may be specified via the Endpoint's DeployedModels' [Model's][google.cloud.aiplatform.v1.DeployedModel.model] [PredictSchemata's][google.cloud.aiplatform.v1.Model.predict_schemata] instance_schema_uri.
- (::Array<::Google::Protobuf::Value>) — Baseline inputs for this feature. If no baseline is specified, Vertex AI chooses the baseline for this feature. If multiple baselines are specified, Vertex AI returns the average attributions across them in Attribution.feature_attributions.
For Vertex AI-provided Tensorflow images (both 1.x and 2.x), the shape of each baseline must match the shape of the input tensor. If a scalar is provided, it is broadcast to the same shape as the input tensor.
For custom images, each element of the baselines must be in the same format as the feature's input in the instance[]. The schema of any single instance may be specified via the Endpoint's DeployedModels' [Model's][google.cloud.aiplatform.v1.DeployedModel.model] [PredictSchemata's][google.cloud.aiplatform.v1.Model.predict_schemata] instance_schema_uri.
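A sketch of input baselines for a Vertex AI-provided Tensorflow image. With multiple baselines, the attributions are averaged across them. Plain Ruby values stand in for Google::Protobuf::Value entries; the tensor name, shapes, and values are hypothetical.

```ruby
# The input tensor here is a length-3 vector, so each explicit baseline
# is also length 3 (a scalar would be broadcast to that shape instead).
input_baselines = [
  [0.0, 0.0, 0.0],   # all-zeros baseline
  [1.5, 200.0, 0.2]  # e.g. a per-element training-set mean
]

input_metadata = {
  "inputTensorName" => "features",
  "inputBaselines"  => input_baselines
}
```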
#input_tensor_name
def input_tensor_name() -> ::String
- (::String) — Name of the input tensor for this feature. Required; applicable only to Vertex AI-provided images for Tensorflow.
#input_tensor_name=
def input_tensor_name=(value) -> ::String
- value (::String) — Name of the input tensor for this feature. Required; applicable only to Vertex AI-provided images for Tensorflow.
- (::String) — Name of the input tensor for this feature. Required; applicable only to Vertex AI-provided images for Tensorflow.
#modality
def modality() -> ::String
- (::String) — Modality of the feature. Valid values are: numeric, image. Defaults to numeric.
#modality=
def modality=(value) -> ::String
- value (::String) — Modality of the feature. Valid values are: numeric, image. Defaults to numeric.
- (::String) — Modality of the feature. Valid values are: numeric, image. Defaults to numeric.
#visualization
def visualization() -> ::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Visualization
- (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Visualization) — Visualization configurations for image explanation.
#visualization=
def visualization=(value) -> ::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Visualization
- value (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Visualization) — Visualization configurations for image explanation.
- (::Google::Cloud::AIPlatform::V1::ExplanationMetadata::InputMetadata::Visualization) — Visualization configurations for image explanation.
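Putting the fields above together, here is a sketch of two complete InputMetadata entries as they appear under ExplanationMetadata.inputs, keyed by feature name. Plain hashes mirror the proto's JSON form; all names and values are hypothetical, and the gem would normally build these as typed messages instead.

```ruby
explanation_metadata = {
  "inputs" => {
    # An image feature with a visualization config.
    "image" => {
      "inputTensorName" => "pixels",
      "modality"        => "image",
      "visualization"   => { "type" => "PIXELS" }
    },
    # A numeric feature with an explicit baseline and value domain.
    "age" => {
      "inputTensorName"    => "age",
      "inputBaselines"     => [0.0],
      "featureValueDomain" => { "minValue" => 0.0, "maxValue" => 120.0 }
    }
  },
  "outputs" => {
    "score" => { "outputTensorName" => "probabilities" }
  }
}
```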