Class SafeSearchAnnotation

Set of features pertaining to the image, computed by computer vision methods over safe-search verticals (for example, adult, spoof, medical, violence).

Attributes
adult (google.cloud.vision_v1.types.Likelihood)
Represents the adult content likelihood for the image. Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities.

spoof (google.cloud.vision_v1.types.Likelihood)
Spoof likelihood. The likelihood that a modification was made to the image's canonical version to make it appear funny or offensive.

medical (google.cloud.vision_v1.types.Likelihood)
Likelihood that this is a medical image.

violence (google.cloud.vision_v1.types.Likelihood)
Likelihood that this image contains violent content.

racy (google.cloud.vision_v1.types.Likelihood)
Likelihood that the request image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas.

adult_confidence (float)
Confidence of adult_score. Range [0, 1]. 0 means not confident, 1 means very confident.

spoof_confidence (float)
Confidence of spoof_score. Range [0, 1]. 0 means not confident, 1 means very confident.

medical_confidence (float)
Confidence of medical_score. Range [0, 1]. 0 means not confident, 1 means very confident.

violence_confidence (float)
Confidence of violence_score. Range [0, 1]. 0 means not confident, 1 means very confident.

racy_confidence (float)
Confidence of racy_score. Range [0, 1]. 0 means not confident, 1 means very confident.

nsfw_confidence (float)
Confidence of nsfw_score. Range [0, 1]. 0 means not confident, 1 means very confident.

Inheritance

builtins.object > proto.message.Message > SafeSearchAnnotation
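Each vertical above is reported as a Likelihood enum value rather than a raw score, so client code typically compares the returned values against a chosen threshold. The sketch below illustrates one way to do that. It is a minimal stand-in that mirrors the Likelihood enum locally (the real type is google.cloud.vision_v1.types.Likelihood, whose values run from UNKNOWN = 0 through VERY_LIKELY = 5); the `is_flagged` helper and the plain-dict annotation are hypothetical conveniences, not part of the library.

```python
from enum import IntEnum

class Likelihood(IntEnum):
    # Local mirror of google.cloud.vision_v1.types.Likelihood.
    UNKNOWN = 0
    VERY_UNLIKELY = 1
    UNLIKELY = 2
    POSSIBLE = 3
    LIKELY = 4
    VERY_LIKELY = 5

def is_flagged(annotation: dict, threshold: Likelihood = Likelihood.LIKELY) -> bool:
    """Return True if any safe-search vertical meets or exceeds `threshold`.

    `annotation` is a hypothetical plain-dict stand-in for a
    SafeSearchAnnotation message, mapping vertical names to enum values.
    Missing verticals default to UNKNOWN (0), which never trips the check.
    """
    verticals = ("adult", "spoof", "medical", "violence", "racy")
    return any(Likelihood(annotation.get(v, 0)) >= threshold for v in verticals)

# Example: only the violence vertical reaches LIKELY.
annotation = {"adult": 1, "spoof": 2, "medical": 1, "violence": 4, "racy": 3}
print(is_flagged(annotation))  # prints True
```

Because Likelihood values are ordered, an IntEnum comparison against a threshold is a common way to turn the five-bucket scale into a boolean moderation decision; stricter pipelines can pass `Likelihood.POSSIBLE` instead of the default.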