MultiConfidenceMetrics

Metrics across multiple confidence levels.

JSON representation
{
  "confidenceLevelMetrics": [
    {
      object (ConfidenceLevelMetrics)
    }
  ],
  "confidenceLevelMetricsExact": [
    {
      object (ConfidenceLevelMetrics)
    }
  ],
  "auprc": number,
  "estimatedCalibrationError": number,
  "auprcExact": number,
  "estimatedCalibrationErrorExact": number,
  "metricsType": enum (MetricsType)
}
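
A minimal sketch of consuming this object on the client side, assuming the response has already been returned as JSON and decoded with Python's standard library. All names and placeholder values below are illustrative, not part of the API.

import json

# Illustrative payload; the numeric values are placeholders, not real
# evaluation output, and the enum value assumes the conventional
# *_UNSPECIFIED default.
raw_json = """
{
  "confidenceLevelMetrics": [],
  "confidenceLevelMetricsExact": [],
  "auprc": 0.0,
  "estimatedCalibrationError": 0.0,
  "auprcExact": 0.0,
  "estimatedCalibrationErrorExact": 0.0,
  "metricsType": "METRICS_TYPE_UNSPECIFIED"
}
"""

def summarize(metrics: dict) -> None:
    # Scalar summary metrics: fuzzy-matching values and their exact-match counterparts.
    print("AUPRC (fuzzy):", metrics.get("auprc"))
    print("AUPRC (exact):", metrics.get("auprcExact"))
    print("ECE   (fuzzy):", metrics.get("estimatedCalibrationError"))
    print("ECE   (exact):", metrics.get("estimatedCalibrationErrorExact"))
    print("Metrics type :", metrics.get("metricsType"))

    # Per-threshold breakdowns; each entry is a ConfidenceLevelMetrics object.
    for level in metrics.get("confidenceLevelMetrics", []):
        print(level)

summarize(json.loads(raw_json))
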
Fields
confidenceLevelMetrics[]

object (ConfidenceLevelMetrics)

Metrics across confidence levels with fuzzy matching enabled.

confidenceLevelMetricsExact[]

object (ConfidenceLevelMetrics)

Metrics across confidence levels with only exact matching.

auprc

number

The area under the precision-recall curve (AUPRC), computed by integrating over all confidence thresholds.
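
As a rough sketch of what that integration looks like (the exact procedure used by the service is not documented in this reference): with precision p(t_k) and recall r(t_k) measured at each confidence threshold t_k, ordered so that recall decreases as the threshold increases, the area can be approximated by the step-wise sum

\mathrm{AUPRC} \approx \sum_{k} p(t_k)\,\bigl(r(t_k) - r(t_{k+1})\bigr)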

estimatedCalibrationError

number

The Estimated Calibration Error (ECE) of the confidence scores of the predicted entities.
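
A common definition of ECE, given here only as a sketch since this reference does not specify the binning scheme used by the service, partitions predictions into B confidence bins S_1, ..., S_B and averages the per-bin gap between mean confidence and accuracy, weighted by bin size:

\mathrm{ECE} = \sum_{b=1}^{B} \frac{|S_b|}{n}\,\bigl|\mathrm{acc}(S_b) - \mathrm{conf}(S_b)\bigr|

where n is the total number of predicted entities, acc(S_b) is the fraction of predictions in bin b that are correct, and conf(S_b) is their mean confidence.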

auprcExact

number

The AUPRC for metrics with fuzzy matching disabled, i.e., exact matching only.

estimatedCalibrationErrorExact

number

The ECE for the predicted entities with fuzzy matching disabled, i.e., exact matching only.

metricsType

enum (MetricsType)

The metrics type for the label.