Output text to Cloud Storage
Use the TextIO connector to write text files to a Cloud Storage bucket.
Explore further

For detailed documentation that includes this code sample, see [Write from Dataflow to Cloud Storage](/dataflow/docs/guides/write-to-cloud-storage).
Code sample

### Java

To authenticate to Dataflow, set up Application Default Credentials. For more information, see [Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```java
import java.util.Arrays;
import java.util.List;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.Compression;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class BatchWriteStorage {
  public interface Options extends PipelineOptions {
    @Description("The Cloud Storage bucket to write to")
    String getBucketName();

    void setBucketName(String value);
  }

  // Write text data to Cloud Storage
  public static void main(String[] args) {
    final List<String> wordsList = Arrays.asList("1", "2", "3", "4");

    var options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
    var pipeline = Pipeline.create(options);
    pipeline
        .apply(Create.of(wordsList))
        .apply(TextIO
            .write()
            .to(options.getBucketName())
            .withSuffix(".txt")
            .withCompression(Compression.GZIP));
    pipeline.run().waitUntilFinish();
  }
}
```

### Python

To authenticate to Dataflow, set up Application Default Credentials. For more information, see [Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```python
import argparse
from typing import List

import apache_beam as beam
from apache_beam.io.textio import WriteToText
from apache_beam.options.pipeline_options import PipelineOptions

from typing_extensions import Self


def write_to_cloud_storage(argv: List[str] = None) -> None:
    # Parse the pipeline options passed into the application.
    class MyOptions(PipelineOptions):
        # Define a custom pipeline option that specifies the Cloud Storage bucket.
        @classmethod
        def _add_argparse_args(cls: Self, parser: argparse.ArgumentParser) -> None:
            parser.add_argument("--output", required=True)

    wordsList = ["1", "2", "3", "4"]
    options = MyOptions()

    with beam.Pipeline(options=options.view_as(PipelineOptions)) as pipeline:
        (
            pipeline
            | "Create elements" >> beam.Create(wordsList)
            | "Write Files" >> WriteToText(options.output, file_name_suffix=".txt")
        )
```

What's next

To search and filter code samples for other Google Cloud products, see the [Google Cloud sample browser](/docs/samples?product=dataflow).

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
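Both samples write sharded output: Beam splits the data across one or more files and, by default, names each file from the output prefix, a shard template of the form `-SSSSS-of-NNNNN`, and the suffix. The helper below is only an illustration of that default naming convention, not part of the Beam API:

```python
def shard_file_names(prefix: str, num_shards: int, suffix: str) -> list[str]:
    # Beam's default shard name template is "-SSSSS-of-NNNNN":
    # a five-digit shard index followed by the five-digit shard count.
    return [
        f"{prefix}-{i:05d}-of-{num_shards:05d}{suffix}"
        for i in range(num_shards)
    ]

# For example, three shards written to gs://my-bucket/out with suffix ".txt":
names = shard_file_names("gs://my-bucket/out", 3, ".txt")
# -> ["gs://my-bucket/out-00000-of-00003.txt",
#     "gs://my-bucket/out-00001-of-00003.txt",
#     "gs://my-bucket/out-00002-of-00003.txt"]
```

The number of shards is chosen by the runner unless you pin it (for example with `num_shards` in `WriteToText`), so treat the shard count above as an assumption for illustration.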