annotation_spec
.annotation_spec_set.AnnotationSpec
The annotation spec of the label for which
the precision-recall curve is calculated. If this
field is empty, the precision-recall curve is an
aggregate curve over all labels.
area_under_curve
float
Area under the precision-recall curve. Not to
be confused with area under a receiver operating
characteristic (ROC) curve.
confidence_metrics_entries
Sequence[.evaluation.PrCurve.ConfidenceMetricsEntry]
Entries that make up the precision-recall graph. Each entry
is a "point" on the graph drawn for a different
confidence_threshold.
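The fields above can be sketched with plain Python dataclasses. This is an illustrative model, not the actual client-library types: the class and helper names (`ConfidenceMetricsEntry`, `PrCurve`, `area_under_pr_curve`) and the trapezoidal-rule approximation of the area under the curve are assumptions for the sketch, not how the service necessarily computes `area_under_curve`.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ConfidenceMetricsEntry:
    # One "point" on the precision-recall graph, measured at a
    # specific confidence_threshold.
    confidence_threshold: float
    precision: float
    recall: float

@dataclass
class PrCurve:
    # Mirrors the fields described above. annotation_spec is None for
    # an aggregate curve over all labels (hypothetical stand-in for the
    # .annotation_spec_set.AnnotationSpec message).
    annotation_spec: Optional[str] = None
    area_under_curve: float = 0.0
    confidence_metrics_entries: List[ConfidenceMetricsEntry] = field(
        default_factory=list
    )

def area_under_pr_curve(entries: List[ConfidenceMetricsEntry]) -> float:
    """Approximate the area under the precision-recall curve with the
    trapezoidal rule over (recall, precision) points."""
    pts = sorted((e.recall, e.precision) for e in entries)
    area = 0.0
    for (r0, p0), (r1, p1) in zip(pts, pts[1:]):
        area += (r1 - r0) * (p0 + p1) / 2.0
    return area

# Each entry is one point on the graph at a different confidence_threshold.
curve = PrCurve(
    annotation_spec="dog",
    confidence_metrics_entries=[
        ConfidenceMetricsEntry(0.9, 1.0, 0.2),
        ConfidenceMetricsEntry(0.5, 0.8, 0.6),
        ConfidenceMetricsEntry(0.1, 0.5, 1.0),
    ],
)
curve.area_under_curve = area_under_pr_curve(curve.confidence_metrics_entries)
```

Note that this area is under the precision-recall curve, not under a ROC curve: the x-axis is recall and the y-axis is precision, rather than false-positive rate and true-positive rate.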
Last updated 2024-12-17 UTC.