Cloud AutoML V1beta1 Client - Class ConfidenceMetricsEntry (1.5.4)

Reference documentation and code samples for the Cloud AutoML V1beta1 Client class ConfidenceMetricsEntry.

Metrics for a single confidence threshold.

Generated from protobuf message google.cloud.automl.v1beta1.ClassificationEvaluationMetrics.ConfidenceMetricsEntry

Namespace

Google \ Cloud \ AutoMl \ V1beta1 \ ClassificationEvaluationMetrics

Methods

__construct

Constructor.

Parameters
Name | Description
data array

Optional. Data for populating the Message object.

↳ confidence_threshold float

Output only. Metrics are computed with an assumption that the model never returns predictions with score lower than this value.

↳ position_threshold int

Output only. Metrics are computed with an assumption that the model always returns at most this many predictions (ordered by their score in descending order), but they all still need to meet the confidence_threshold.

↳ recall float

Output only. Recall (True Positive Rate) for the given confidence threshold.

↳ precision float

Output only. Precision for the given confidence threshold.

↳ false_positive_rate float

Output only. False Positive Rate for the given confidence threshold.

↳ f1_score float

Output only. The harmonic mean of recall and precision.

↳ recall_at1 float

Output only. The Recall (True Positive Rate) when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

↳ precision_at1 float

Output only. The precision when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

↳ false_positive_rate_at1 float

Output only. The False Positive Rate when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

↳ f1_score_at1 float

Output only. The harmonic mean of recall_at1 and precision_at1.

↳ true_positive_count int|string

Output only. The number of model created labels that match a ground truth label.

↳ false_positive_count int|string

Output only. The number of model created labels that do not match a ground truth label.

↳ false_negative_count int|string

Output only. The number of ground truth labels that are not matched by a model created label.

↳ true_negative_count int|string

Output only. The number of labels that were not created by the model but, had they been created, would not have matched a ground truth label.
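
The optional data array shown above maps directly to the fields listed. The following is a minimal, illustrative construction sketch; in practice this message is returned by the service inside a ClassificationEvaluationMetrics response, and all values below are placeholders.

```php
use Google\Cloud\AutoMl\V1beta1\ClassificationEvaluationMetrics\ConfidenceMetricsEntry;

// Illustrative values only; this message is normally produced by the service.
$entry = new ConfidenceMetricsEntry([
    'confidence_threshold' => 0.5,
    'recall'               => 0.82,
    'precision'            => 0.91,
    'f1_score'             => 0.86,
    'true_positive_count'  => 410,
    'false_positive_count' => 40,
    'false_negative_count' => 90,
]);

printf(
    "threshold=%.2f precision=%.2f recall=%.2f\n",
    $entry->getConfidenceThreshold(),
    $entry->getPrecision(),
    $entry->getRecall()
);
```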

getConfidenceThreshold

Output only. Metrics are computed with an assumption that the model never returns predictions with score lower than this value.

Returns
Type | Description
float

setConfidenceThreshold

Output only. Metrics are computed with an assumption that the model never returns predictions with score lower than this value.

Parameter
Name | Description
var float
Returns
Type | Description
$this
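
Because each setter documented on this page returns $this, fields can also be populated fluently instead of through the constructor's data array. A brief sketch with placeholder values, assuming the optional data argument is simply omitted:

```php
use Google\Cloud\AutoMl\V1beta1\ClassificationEvaluationMetrics\ConfidenceMetricsEntry;

// The data argument is optional, so an empty construction followed by
// chained setters is equivalent to passing the array shown earlier.
$entry = (new ConfidenceMetricsEntry())
    ->setConfidenceThreshold(0.5)
    ->setPrecision(0.91)
    ->setRecall(0.82);
```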

getPositionThreshold

Output only. Metrics are computed with an assumption that the model always returns at most this many predictions (ordered by their score in descending order), but they all still need to meet the confidence_threshold.

Returns
Type | Description
int

setPositionThreshold

Output only. Metrics are computed with an assumption that the model always returns at most this many predictions (ordered by their score in descending order), but they all still need to meet the confidence_threshold.

Parameter
Name | Description
var int
Returns
Type | Description
$this

getRecall

Output only. Recall (True Positive Rate) for the given confidence threshold.

Returns
Type | Description
float

setRecall

Output only. Recall (True Positive Rate) for the given confidence threshold.

Parameter
Name | Description
var float
Returns
Type | Description
$this

getPrecision

Output only. Precision for the given confidence threshold.

Returns
Type | Description
float

setPrecision

Output only. Precision for the given confidence threshold.

Parameter
Name | Description
var float
Returns
Type | Description
$this

getFalsePositiveRate

Output only. False Positive Rate for the given confidence threshold.

Returns
Type | Description
float

setFalsePositiveRate

Output only. False Positive Rate for the given confidence threshold.

Parameter
Name | Description
var float
Returns
Type | Description
$this

getF1Score

Output only. The harmonic mean of recall and precision.

Returns
Type | Description
float

setF1Score

Output only. The harmonic mean of recall and precision.

Parameter
Name | Description
var float
Returns
Type | Description
$this
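
As noted above, f1_score is the harmonic mean of precision and recall at the same confidence threshold. A short sketch of that relationship, using a hypothetical local helper (not part of the client library) and the $entry instance from the earlier examples:

```php
// Harmonic mean of precision and recall, matching the f1_score definition above.
// harmonicMean() is a local helper for illustration, not a library function.
function harmonicMean(float $precision, float $recall): float
{
    return ($precision + $recall) > 0.0
        ? 2 * $precision * $recall / ($precision + $recall)
        : 0.0;
}

$expectedF1 = harmonicMean($entry->getPrecision(), $entry->getRecall());
// $expectedF1 should agree with $entry->getF1Score() up to rounding.
```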

getRecallAt1

Output only. The Recall (True Positive Rate) when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

Returns
Type | Description
float

setRecallAt1

Output only. The Recall (True Positive Rate) when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

Parameter
Name | Description
var float
Returns
Type | Description
$this

getPrecisionAt1

Output only. The precision when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

Returns
Type | Description
float

setPrecisionAt1

Output only. The precision when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

Parameter
Name | Description
var float
Returns
Type | Description
$this

getFalsePositiveRateAt1

Output only. The False Positive Rate when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

Returns
Type | Description
float

setFalsePositiveRateAt1

Output only. The False Positive Rate when only considering the label that has the highest prediction score and is not below the confidence threshold for each example.

Parameter
Name | Description
var float
Returns
Type | Description
$this

getF1ScoreAt1

Output only. The harmonic mean of recall_at1 and precision_at1.

Returns
Type | Description
float

setF1ScoreAt1

Output only. The harmonic mean of recall_at1 and precision_at1.

Parameter
Name | Description
var float
Returns
Type | Description
$this

getTruePositiveCount

Output only. The number of model created labels that match a ground truth label.

Returns
Type | Description
int|string

setTruePositiveCount

Output only. The number of model created labels that match a ground truth label.

Parameter
Name | Description
var int|string
Returns
Type | Description
$this

getFalsePositiveCount

Output only. The number of model created labels that do not match a ground truth label.

Returns
Type | Description
int|string

setFalsePositiveCount

Output only. The number of model created labels that do not match a ground truth label.

Parameter
Name | Description
var int|string
Returns
Type | Description
$this

getFalseNegativeCount

Output only. The number of ground truth labels that are not matched by a model created label.

Returns
Type | Description
int|string

setFalseNegativeCount

Output only. The number of ground truth labels that are not matched by a model created label.

Parameter
Name | Description
var int|string
Returns
Type | Description
$this

getTrueNegativeCount

Output only. The number of labels that were not created by the model but, had they been created, would not have matched a ground truth label.

Returns
Type | Description
int|string

setTrueNegativeCount

Output only. The number of labels that were not created by the model but, had they been created, would not have matched a ground truth label.

Parameter
Name | Description
var int|string
Returns
Type | Description
$this
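
The four count fields form a per-threshold confusion matrix, so the precision, recall, and false positive rate reported above can be recomputed from them. A sketch, assuming $entry is a populated ConfidenceMetricsEntry; the counts may be returned as int or string (64-bit safety), hence the casts:

```php
// Counts may arrive as int or string (64-bit values), so normalize first.
$tp = (int) $entry->getTruePositiveCount();
$fp = (int) $entry->getFalsePositiveCount();
$fn = (int) $entry->getFalseNegativeCount();
$tn = (int) $entry->getTrueNegativeCount();

// Standard confusion-matrix definitions; guards avoid division by zero.
$precision         = ($tp + $fp) > 0 ? $tp / ($tp + $fp) : 0.0; // TP / (TP + FP)
$recall            = ($tp + $fn) > 0 ? $tp / ($tp + $fn) : 0.0; // TP / (TP + FN)
$falsePositiveRate = ($fp + $tn) > 0 ? $fp / ($fp + $tn) : 0.0; // FP / (FP + TN)
```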