Class ConfidenceMetricsEntry

Metrics for a single confidence threshold.

Attributes

confidence_threshold (float)
    Output only. Metrics are computed assuming the model never returns predictions with a score lower than this value.

position_threshold (int)
    Output only. Metrics are computed assuming the model returns at most this many predictions (ordered by score, descending), all of which must still meet the confidence_threshold.

recall (float)
    Output only. Recall (True Positive Rate) for the given confidence threshold.

precision (float)
    Output only. Precision for the given confidence threshold.

false_positive_rate (float)
    Output only. False Positive Rate for the given confidence threshold.

f1_score (float)
    Output only. The harmonic mean of recall and precision.

recall_at1 (float)
    Output only. Recall (True Positive Rate) when, for each example, only the label with the highest prediction score (and not below the confidence threshold) is considered.

precision_at1 (float)
    Output only. Precision when, for each example, only the label with the highest prediction score (and not below the confidence threshold) is considered.

false_positive_rate_at1 (float)
    Output only. False Positive Rate when, for each example, only the label with the highest prediction score (and not below the confidence threshold) is considered.

f1_score_at1 (float)
    Output only. The harmonic mean of recall_at1 and precision_at1.

true_positive_count (int)
    Output only. The number of model-created labels that match a ground truth label.

false_positive_count (int)
    Output only. The number of model-created labels that do not match a ground truth label.

false_negative_count (int)
    Output only. The number of ground truth labels that are not matched by any model-created label.

true_negative_count (int)
    Output only. The number of labels the model did not create which, had it created them, would not have matched a ground truth label.
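To make the combined effect of confidence_threshold and position_threshold concrete, the sketch below filters a list of scored predictions the way the description above implies: keep at most position_threshold predictions ordered by score descending, and require each to meet the confidence threshold. This is an illustration of the documented semantics, not service code; the predictions list and its (label, score) shape are assumptions for the example.

```python
def apply_thresholds(predictions, confidence_threshold, position_threshold):
    """Illustrative filter: top-N predictions by score, each meeting the
    confidence threshold. `predictions` is a list of (label, score) pairs."""
    ranked = sorted(predictions, key=lambda p: p[1], reverse=True)
    # Truncate to at most position_threshold entries, then drop any whose
    # score falls below confidence_threshold.
    return [p for p in ranked[:position_threshold]
            if p[1] >= confidence_threshold]

predictions = [("cat", 0.9), ("dog", 0.8), ("bird", 0.3)]
apply_thresholds(predictions, 0.5, 2)   # keeps ("cat", 0.9) and ("dog", 0.8)
apply_thresholds(predictions, 0.85, 2)  # keeps only ("cat", 0.9)
```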
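For reference, the derived metric fields relate to the count fields by the standard definitions (precision = TP / (TP + FP), recall = TP / (TP + FN), f1_score = harmonic mean of the two). The service populates these fields server-side; the helper below is only a minimal sketch of that arithmetic, with the zero-division guards as an assumption of the example.

```python
def metrics_from_counts(tp, fp, fn):
    """Recompute precision, recall, and f1_score from
    true_positive_count, false_positive_count, and false_negative_count."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # f1_score is the harmonic mean of precision and recall.
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# e.g. 8 true positives, 2 false positives, 4 false negatives:
precision, recall, f1 = metrics_from_counts(8, 2, 4)  # 0.8, 2/3, 8/11
```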

Inheritance

builtins.object > proto.message.Message > ConfidenceMetricsEntry