Class SafeSearchAnnotation (3.1.2)

SafeSearchAnnotation(mapping=None, *, ignore_unknown_fields=False, **kwargs)

Set of features pertaining to the image, computed by computer vision methods over safe-search verticals (for example, adult, spoof, medical, violence).

Attributes

Name | Description

adult (google.cloud.vision_v1p4beta1.types.Likelihood)
    Represents the adult content likelihood for the image. Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities.

spoof (google.cloud.vision_v1p4beta1.types.Likelihood)
    Spoof likelihood. The likelihood that a modification was made to the image's canonical version to make it appear funny or offensive.

medical (google.cloud.vision_v1p4beta1.types.Likelihood)
    Likelihood that this is a medical image.

violence (google.cloud.vision_v1p4beta1.types.Likelihood)
    Likelihood that this image contains violent content.

racy (google.cloud.vision_v1p4beta1.types.Likelihood)
    Likelihood that the request image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas.

Inheritance

builtins.object > proto.message.Message > SafeSearchAnnotation