Class MultiConfidenceMetrics (2.7.0)

MultiConfidenceMetrics(mapping=None, *, ignore_unknown_fields=False, **kwargs)

Metrics across multiple confidence levels.
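Like other proto-plus messages in this library, an instance can be built directly from keyword arguments or from a dict passed as mapping. A minimal sketch, assuming the type is importable from the documentai_v1beta3 package root as generated clients typically expose it; the numeric values are illustrative placeholders:

    from google.cloud import documentai_v1beta3

    # Build the message directly; unset fields default to empty/zero values.
    metrics = documentai_v1beta3.Evaluation.MultiConfidenceMetrics(
        auprc=0.91,
        estimated_calibration_error=0.04,
    )

    # Equivalent construction from a mapping.
    same_metrics = documentai_v1beta3.Evaluation.MultiConfidenceMetrics(
        mapping={"auprc": 0.91, "estimated_calibration_error": 0.04}
    )

    print(metrics.auprc)  # 0.91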

Attributes

confidence_level_metrics (MutableSequence[google.cloud.documentai_v1beta3.types.Evaluation.ConfidenceLevelMetrics])
  Metrics across confidence levels with fuzzy matching enabled.

confidence_level_metrics_exact (MutableSequence[google.cloud.documentai_v1beta3.types.Evaluation.ConfidenceLevelMetrics])
  Metrics across confidence levels with only exact matching.

auprc (float)
  The calculated area under the precision-recall curve (AUPRC), computed by integrating over all confidence thresholds.

estimated_calibration_error (float)
  The Estimated Calibration Error (ECE) of the confidence of the predicted entities.

auprc_exact (float)
  The AUPRC for metrics with fuzzy matching disabled, i.e., exact matching only.

estimated_calibration_error_exact (float)
  The ECE for the predicted entities with fuzzy matching disabled, i.e., exact matching only.

metrics_type (google.cloud.documentai_v1beta3.types.Evaluation.MultiConfidenceMetrics.MetricsType)
  The metrics type for the label.
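A hedged sketch of reading these fields from a populated message. The helper below assumes each nested ConfidenceLevelMetrics entry exposes confidence_level and a metrics sub-message with precision, recall, and f1_score, and the commented usage line assumes the parent Evaluation exposes this message as all_entities_metrics; neither assumption is stated on this page.

    def summarize(metrics) -> None:
        """Print fuzzy- vs. exact-match summary statistics for one MultiConfidenceMetrics."""
        print(f"AUPRC  fuzzy={metrics.auprc:.3f}  exact={metrics.auprc_exact:.3f}")
        print(f"ECE    fuzzy={metrics.estimated_calibration_error:.3f}  "
              f"exact={metrics.estimated_calibration_error_exact:.3f}")

        # Per-threshold precision/recall with fuzzy matching enabled.
        # (Field names on the nested messages are assumptions from the proto definitions.)
        for level in metrics.confidence_level_metrics:
            print(f"  threshold={level.confidence_level:.2f}  "
                  f"precision={level.metrics.precision:.3f}  "
                  f"recall={level.metrics.recall:.3f}  "
                  f"f1={level.metrics.f1_score:.3f}")

    # Typical usage with an Evaluation obtained from the service
    # (the all_entities_metrics field name is assumed from the parent type):
    # summarize(evaluation.all_entities_metrics)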

Classes

MetricsType

MetricsType(value)

A type that determines how metrics should be interpreted.
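A brief sketch of branching on this enum; the METRICS_TYPE_UNSPECIFIED and AGGREGATE member names are assumptions taken from the corresponding proto enum rather than from this page.

    from google.cloud import documentai_v1beta3

    MetricsType = documentai_v1beta3.Evaluation.MultiConfidenceMetrics.MetricsType

    def is_aggregate(metrics) -> bool:
        """Return True if this label's metrics aggregate those of its child entity types."""
        # AGGREGATE is an assumed enum member; unspecified metrics are treated as leaf-level.
        return metrics.metrics_type == MetricsType.AGGREGATE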