Use the Cloud Vision API to determine if an image is safe
This tutorial shows how to use Cloud Run, the Cloud Vision API, and ImageMagick to detect and blur offensive images uploaded to a Cloud Storage bucket.
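Each language sample below flags an image with the Vision API's SafeSearch annotation and then hands the file to a blur helper (`blur`, `blurImage`, or `__blur_image`) whose body is not reproduced on this page. As a rough illustration of that second step, here is a minimal Python sketch, assuming ImageMagick's `convert` binary is installed in the container image and that the `BLURRED_BUCKET_NAME` environment variable names the output bucket, as in the Go sample; the blur radius and error handling are illustrative, not the tutorial's exact code.

```python
import os
import subprocess
import tempfile

from google.cloud import storage

storage_client = storage.Client()


def __blur_image(current_blob):
    """Illustrative sketch: blur a Cloud Storage blob with ImageMagick."""
    file_name = current_blob.name
    _, temp_local_filename = tempfile.mkstemp()

    # Download the flagged image to a local temp file.
    current_blob.download_to_filename(temp_local_filename)

    # Blur the image in place. Assumes ImageMagick's `convert` is present
    # in the container; the 0x16 blur radius is an arbitrary example value.
    subprocess.run(
        ["convert", temp_local_filename, "-blur", "0x16", temp_local_filename],
        check=True,
    )

    # Upload the blurred copy to the output bucket, then clean up.
    blurred_bucket = storage_client.bucket(os.environ["BLURRED_BUCKET_NAME"])
    blurred_bucket.blob(file_name).upload_from_filename(temp_local_filename)
    os.remove(temp_local_filename)
```

Writing the blurred copy to a separate bucket (and, in the Python sample, skipping files whose names start with `blurred-`) keeps the service from reprocessing images it produced itself.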
Explore further

For detailed documentation that includes this code sample, see the following:

- [Process images from Cloud Storage tutorial](/run/docs/tutorials/image-processing)
- [Processing images asynchronously](/anthos/run/archive/docs/tutorials/image-processing)
Code sample

To authenticate to Cloud Run, set up Application Default Credentials. For more information, see [Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

### Go

```go
// GCSEvent is the payload of a GCS event.
type GCSEvent struct {
	Bucket string `json:"bucket"`
	Name   string `json:"name"`
}

// BlurOffensiveImages blurs offensive images uploaded to GCS.
func BlurOffensiveImages(ctx context.Context, e GCSEvent) error {
	outputBucket := os.Getenv("BLURRED_BUCKET_NAME")
	if outputBucket == "" {
		return errors.New("BLURRED_BUCKET_NAME must be set")
	}

	img := vision.NewImageFromURI(fmt.Sprintf("gs://%s/%s", e.Bucket, e.Name))

	resp, err := visionClient.DetectSafeSearch(ctx, img, nil)
	if err != nil {
		return fmt.Errorf("AnnotateImage: %w", err)
	}

	if resp.GetAdult() == visionpb.Likelihood_VERY_LIKELY ||
		resp.GetViolence() == visionpb.Likelihood_VERY_LIKELY {
		return blur(ctx, e.Bucket, outputBucket, e.Name)
	}
	log.Printf("The image %q was detected as OK.", e.Name)
	return nil
}
```

### Java

```java
// Blurs uploaded images that are flagged as Adult or Violence.
public static void blurOffensiveImages(JsonObject data) {
  String fileName = data.get("name").getAsString();
  String bucketName = data.get("bucket").getAsString();
  BlobInfo blobInfo = BlobInfo.newBuilder(bucketName, fileName).build();
  // Construct URI to GCS bucket and file.
  String gcsPath = String.format("gs://%s/%s", bucketName, fileName);
  System.out.println(String.format("Analyzing %s", fileName));

  // Construct request.
  List<AnnotateImageRequest> requests = new ArrayList<>();
  ImageSource imgSource = ImageSource.newBuilder().setImageUri(gcsPath).build();
  Image img = Image.newBuilder().setSource(imgSource).build();
  Feature feature = Feature.newBuilder().setType(Type.SAFE_SEARCH_DETECTION).build();
  AnnotateImageRequest request =
      AnnotateImageRequest.newBuilder().addFeatures(feature).setImage(img).build();
  requests.add(request);

  // Send request to the Vision API.
  try (ImageAnnotatorClient client = ImageAnnotatorClient.create()) {
    BatchAnnotateImagesResponse response = client.batchAnnotateImages(requests);
    List<AnnotateImageResponse> responses = response.getResponsesList();
    for (AnnotateImageResponse res : responses) {
      if (res.hasError()) {
        System.out.println(String.format("Error: %s\n", res.getError().getMessage()));
        return;
      }
      // Get the Safe Search annotation. A likelihood value of 5 is VERY_LIKELY.
      SafeSearchAnnotation annotation = res.getSafeSearchAnnotation();
      if (annotation.getAdultValue() == 5 || annotation.getViolenceValue() == 5) {
        System.out.println(String.format("Detected %s as inappropriate.", fileName));
        blur(blobInfo);
      } else {
        System.out.println(String.format("Detected %s as OK.", fileName));
      }
    }
  } catch (Exception e) {
    System.out.println(String.format("Error with Vision API: %s", e.getMessage()));
  }
}
```

### Node.js

```javascript
// Blurs uploaded images that are flagged as Adult or Violence.
exports.blurOffensiveImages = async event => {
  // This event represents the triggering Cloud Storage object.
  const object = event;

  const file = storage.bucket(object.bucket).file(object.name);
  const filePath = `gs://${object.bucket}/${object.name}`;

  console.log(`Analyzing ${file.name}.`);

  try {
    const [result] = await client.safeSearchDetection(filePath);
    const detections = result.safeSearchAnnotation || {};

    if (
      // Levels are defined in https://cloud.google.com/vision/docs/reference/rest/v1/AnnotateImageResponse#likelihood
      detections.adult === 'VERY_LIKELY' ||
      detections.violence === 'VERY_LIKELY'
    ) {
      console.log(`Detected ${file.name} as inappropriate.`);
      return blurImage(file, BLURRED_BUCKET_NAME);
    } else {
      console.log(`Detected ${file.name} as OK.`);
    }
  } catch (err) {
    console.error(`Failed to analyze ${file.name}.`, err);
    throw err;
  }
};
```

### Python

```python
def blur_offensive_images(data):
    """Blurs uploaded images that are flagged as Adult or Violence.

    Args:
        data: Pub/Sub message data
    """
    file_data = data

    file_name = file_data["name"]
    bucket_name = file_data["bucket"]

    blob = storage_client.bucket(bucket_name).get_blob(file_name)
    blob_uri = f"gs://{bucket_name}/{file_name}"
    blob_source = vision.Image(source=vision.ImageSource(image_uri=blob_uri))

    # Ignore already-blurred files.
    if file_name.startswith("blurred-"):
        print(f"The image {file_name} is already blurred.")
        return

    print(f"Analyzing {file_name}.")

    result = vision_client.safe_search_detection(image=blob_source)
    detected = result.safe_search_annotation

    # Process the image. A likelihood value of 5 is VERY_LIKELY.
    if detected.adult == 5 or detected.violence == 5:
        print(f"The image {file_name} was detected as inappropriate.")
        return __blur_image(blob)
    else:
        print(f"The image {file_name} was detected as OK.")
```

What's next

To search and filter code samples for other Google Cloud products, see the [Google Cloud sample browser](/docs/samples?product=cloudrun).

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.