NAME
    gcloud ml vision detect-safe-search - detect explicit content in an image

SYNOPSIS
    gcloud ml vision detect-safe-search IMAGE_PATH [GCLOUD_WIDE_FLAG ...]

DESCRIPTION
    Safe Search Detection detects adult, violent, medical, and spoof content
    in an image.

EXAMPLES
    To detect adult, violent, medical, and spoof content in the image
    'gs://my_bucket/input_file', run:

        $ gcloud ml vision detect-safe-search gs://my_bucket/input_file
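
    IMAGE_PATH can also be a local file, whose contents are sent directly to
    Google Cloud Vision; for example, with a hypothetical image 'photo.jpg' in
    the current directory:

        $ gcloud ml vision detect-safe-search photo.jpg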

POSITIONAL ARGUMENTS
    IMAGE_PATH
        Path to the image to be analyzed. This can be either a local path or a
        URL. If you provide a local file, the contents will be sent directly to
        Google Cloud Vision. If you provide a URL, it must be in Google Cloud
        Storage format (gs://bucket/object) or an HTTP URL (http://... or
        https://...).
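
        For example, with a hypothetical publicly reachable HTTPS URL:

            $ gcloud ml vision detect-safe-search https://example.com/image.jpg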

GCLOUD WIDE FLAGS
    These flags are available to all commands: --access-token-file, --account,
    --billing-project, --configuration, --flags-file, --flatten, --format,
    --help, --impersonate-service-account, --log-http, --project, --quiet,
    --trace-token, --user-output-enabled, --verbosity.

    Run $ gcloud help for details.

API REFERENCE
    This command uses the vision/v1 API. The full documentation for this API
    can be found at: https://cloud.google.com/vision/
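
    For illustration only, a minimal sketch of the underlying vision/v1
    request issued directly with curl (the bucket and object are the
    placeholder values from the example above):

        $ curl -X POST \
            -H "Authorization: Bearer $(gcloud auth print-access-token)" \
            -H "Content-Type: application/json" \
            https://vision.googleapis.com/v1/images:annotate \
            -d '{
              "requests": [{
                "image": {"source": {"imageUri": "gs://my_bucket/input_file"}},
                "features": [{"type": "SAFE_SEARCH_DETECTION"}]
              }]
            }'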

NOTES
    These variants are also available:

        $ gcloud alpha ml vision detect-safe-search

        $ gcloud beta ml vision detect-safe-search