Usar la API de Cloud Vision para determinar si una imagen es segura
Organízate con las colecciones
Guarda y clasifica el contenido según tus preferencias.
En este tutorial se muestra cómo usar Cloud Run, la API Cloud Vision e ImageMagick para detectar y difuminar imágenes ofensivas subidas a un segmento de Cloud Storage.
Investigar más
Para obtener documentación detallada que incluya este código de muestra, consulta lo siguiente:
Código de ejemplo
A menos que se indique lo contrario, el contenido de esta página está sujeto a la licencia Reconocimiento 4.0 de Creative Commons y las muestras de código están sujetas a la licencia Apache 2.0. Para obtener más información, consulta las políticas del sitio web de Google Developers. Java es una marca registrada de Oracle o sus afiliados.
[[["Es fácil de entender","easyToUnderstand","thumb-up"],["Me ofreció una solución al problema","solvedMyProblem","thumb-up"],["Otro","otherUp","thumb-up"]],[["Es difícil de entender","hardToUnderstand","thumb-down"],["La información o el código de muestra no son correctos","incorrectInformationOrSampleCode","thumb-down"],["Me faltan las muestras o la información que necesito","missingTheInformationSamplesINeed","thumb-down"],["Problema de traducción","translationIssue","thumb-down"],["Otro","otherDown","thumb-down"]],[],[],[],null,["# Use Cloud Vision API to determine if image is safe\n\nThis tutorial demonstrates using Cloud Run, Cloud Vision API, and ImageMagick to detect and blur offensive images uploaded to a Cloud Storage bucket.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [Process images from Cloud Storage tutorial](/run/docs/tutorials/image-processing)\n- [Processing images asynchronously](/anthos/run/archive/docs/tutorials/image-processing)\n\nCode sample\n-----------\n\n### Go\n\n\nTo authenticate to Cloud Run, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).\n\n\n // GCSEvent is the payload of a GCS event.\n type GCSEvent struct {\n \tBucket string `json:\"bucket\"`\n \tName string `json:\"name\"`\n }\n\n // BlurOffensiveImages blurs offensive images uploaded to GCS.\n func BlurOffensiveImages(ctx context.Context, e GCSEvent) error {\n \toutputBucket := os.Getenv(\"BLURRED_BUCKET_NAME\")\n \tif outputBucket == \"\" {\n \t\treturn errors.New(\"BLURRED_BUCKET_NAME must be set\")\n \t}\n\n \timg := vision.NewImageFromURI(fmt.Sprintf(\"gs://%s/%s\", e.Bucket, e.Name))\n\n \tresp, err := visionClient.DetectSafeSearch(ctx, img, nil)\n \tif err != nil {\n \t\treturn fmt.Errorf(\"AnnotateImage: %w\", err)\n \t}\n\n \tif resp.GetAdult() == visionpb.Likelihood_VERY_LIKELY ||\n \t\tresp.GetViolence() == visionpb.Likelihood_VERY_LIKELY {\n \t\treturn blur(ctx, e.Bucket, outputBucket, e.Name)\n \t}\n \tlog.Printf(\"The image %q was detected as OK.\", e.Name)\n \treturn nil\n }\n\n### Java\n\n\nTo authenticate to Cloud Run, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).\n\n // Blurs uploaded images that are flagged as Adult or Violence.\n public static void blurOffensiveImages(JsonObject data) {\n String fileName = data.get(\"name\").getAsString();\n String bucketName = data.get(\"bucket\").getAsString();\n BlobInfo blobInfo = BlobInfo.newBuilder(bucketName, fileName).build();\n // Construct URI to GCS bucket and file.\n String gcsPath = String.format(\"gs://%s/%s\", bucketName, fileName);\n System.out.println(String.format(\"Analyzing %s\", fileName));\n\n // Construct request.\n List\u003cAnnotateImageRequest\u003e requests = new ArrayList\u003c\u003e();\n ImageSource imgSource = ImageSource.newBuilder().setImageUri(gcsPath).build();\n Image img = Image.newBuilder().setSource(imgSource).build();\n Feature feature = Feature.newBuilder().setType(Type.SAFE_SEARCH_DETECTION).build();\n AnnotateImageRequest request =\n AnnotateImageRequest.newBuilder().addFeatures(feature).setImage(img).build();\n requests.add(request);\n\n // Send request to the Vision API.\n try (ImageAnnotatorClient client = ImageAnnotatorClient.create()) {\n BatchAnnotateImagesResponse response = client.batchAnnotateImages(requests);\n List\u003cAnnotateImageResponse\u003e responses = response.getResponsesList();\n for (AnnotateImageResponse res : responses) {\n if (res.hasError()) {\n System.out.println(String.format(\"Error: %s\\n\", res.getError().getMessage()));\n return;\n }\n // Get Safe Search Annotations\n SafeSearchAnnotation annotation = res.getSafeSearchAnnotation();\n if (annotation.getAdultValue() == 5 || annotation.getViolenceValue() == 5) {\n System.out.println(String.format(\"Detected %s as inappropriate.\", fileName));\n blur(blobInfo);\n } else {\n System.out.println(String.format(\"Detected %s as OK.\", fileName));\n }\n }\n } catch (Exception e) {\n System.out.println(String.format(\"Error with Vision API: %s\", e.getMessage()));\n }\n }\n\n### Node.js\n\n\nTo authenticate to Cloud Run, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).\n\n // Blurs uploaded images that are flagged as Adult or Violence.\n exports.blurOffensiveImages = async event =\u003e {\n // This event represents the triggering Cloud Storage object.\n const object = event;\n\n const file = storage.bucket(object.bucket).file(object.name);\n const filePath = `gs://${object.bucket}/${object.name}`;\n\n console.log(`Analyzing ${file.name}.`);\n\n try {\n const [result] = await client.safeSearchDetection(filePath);\n const detections = result.safeSearchAnnotation || {};\n\n if (\n // Levels are defined in https://cloud.google.com/vision/docs/reference/rest/v1/AnnotateImageResponse#likelihood\n detections.adult === 'VERY_LIKELY' ||\n detections.violence === 'VERY_LIKELY'\n ) {\n console.log(`Detected ${file.name} as inappropriate.`);\n return blurImage(file, BLURRED_BUCKET_NAME);\n } else {\n console.log(`Detected ${file.name} as OK.`);\n }\n } catch (err) {\n console.error(`Failed to analyze ${file.name}.`, err);\n throw err;\n }\n };\n\n### Python\n\n\nTo authenticate to Cloud Run, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).\n\n def blur_offensive_images(data):\n \"\"\"Blurs uploaded images that are flagged as Adult or Violence.\n\n Args:\n data: Pub/Sub message data\n \"\"\"\n file_data = data\n\n file_name = file_data[\"name\"]\n bucket_name = file_data[\"bucket\"]\n\n blob = storage_client.bucket(bucket_name).get_blob(file_name)\n blob_uri = f\"gs://{bucket_name}/{file_name}\"\n blob_source = vision.Image(source=vision.ImageSource(image_uri=blob_uri))\n\n # Ignore already-blurred files\n if file_name.startswith(\"blurred-\"):\n print(f\"The image {file_name} is already blurred.\")\n return\n\n print(f\"Analyzing {file_name}.\")\n\n result = vision_client.safe_search_detection(image=blob_source)\n detected = result.safe_search_annotation\n\n # Process image\n if detected.adult == 5 or detected.violence == 5:\n print(f\"The image {file_name} was detected as inappropriate.\")\n return __blur_image(blob)\n else:\n print(f\"The image {file_name} was detected as OK.\")\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=cloudrun)."]]