# Format received events

A pipeline connects a bus to a target destination and routes event messages to that destination. You can configure a pipeline to expect event data in a specific format or, before events are delivered to a destination, convert event data from one supported format to another. For example, you might need to route events to an endpoint that only accepts Avro data.
Supported formats
-----------------
The following format conversions are supported:
- Avro to JSON
- Avro to Protobuf
- JSON to Avro
- JSON to Protobuf
- Protobuf to Avro
- Protobuf to JSON
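For instance, for the "Avro to Protobuf" pair, the pipeline needs an inbound Avro schema and an outbound Protobuf schema that describe the same payload. The following minimal sketch shows a hypothetical `OrderPlaced` record in both notations; the record name and fields are invented for this illustration.

```json
{
  "type": "record",
  "name": "OrderPlaced",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "quantity", "type": "long"}
  ]
}
```

```proto
syntax = "proto3";

message OrderPlaced {
  string order_id = 1;
  int64 quantity = 2;  // Avro's long maps to Protobuf's int64
}
```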
#### Note the following:

- When you convert the format of events, *only* the event payload is converted, not the entire event message.

- If an inbound data format is specified for a pipeline, all events must match that format. Any events that don't match the expected format are treated as [persistent errors](/eventarc/advanced/docs/retry-events#persistent).

- If an inbound data format is *not* specified for a pipeline, an outbound format *can't* be set.

- Before an event format is converted for a specific destination, any [data transformation](/eventarc/advanced/docs/receive-events/transform-events) that is configured is applied first.

- Events are always delivered in a [CloudEvents format using an HTTP request in binary content mode](/eventarc/docs/cloudevents) unless you specify a [message binding](/eventarc/advanced/docs/receive-events/transform-events#message-binding).

- JSON schemas are detected dynamically. For Protobuf schema definitions, you can define only one top-level type, and import statements that refer to other types are not supported. Schema definitions without a `syntax` identifier default to `proto2`. Note that there is a [schema size limit](/eventarc/docs/quotas#limits).
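To make the Protobuf schema constraints concrete, here is a minimal sketch of a definition that satisfies them: a single top-level type, an explicit `syntax` identifier, and no imports. The `LogEntry` message and its fields are invented for this illustration.

```proto
// Import statements that refer to other types are not supported, so a line
// such as `import "google/protobuf/timestamp.proto";` can't appear here.
syntax = "proto3";  // if omitted, the schema is treated as proto2

// Exactly one top-level type can be defined.
message LogEntry {
  string name = 1;
  string severity = 2;
}
```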
Configure a pipeline to format events
-------------------------------------
You can configure a pipeline to expect event data in a specific format, or to convert event data from one format to another, in the Google Cloud console or by using the gcloud CLI.
### Console

1. In the Google Cloud console, go to the **Eventarc** > **Pipelines** page.

   [Go to Pipelines](https://console.cloud.google.com/eventarc/pipelines)
2. You can [create a pipeline](/eventarc/advanced/docs/receive-events/create-enrollment#console) or, if you are updating a pipeline, click the name of the pipeline.
3. In the **Pipeline details** page, click **Edit**.
4. In the **Event mediation** pane, do the following:

   1. Select the **Apply a transformation** checkbox.
   2. In the **Inbound format** list, select the applicable format.

      Note that if an inbound data format is specified for a pipeline, all events must match that format. Any events that don't match the expected format are treated as [persistent errors](/eventarc/advanced/docs/retry-events#persistent).
   3. For Avro or Protobuf formats, you must specify an inbound schema. (Optionally, instead of specifying it directly, you can upload an inbound schema.)
   4. In the **CEL expression** field, write a transformation expression [using CEL](/eventarc/advanced/docs/receive-events/use-cel).
   5. Click **Continue**.
5. In the **Destination** pane, do the following:

   1. If applicable, in the **Outbound format** list, select a format.

      Note that if an inbound data format is *not* specified for a pipeline, an outbound format *can't* be set.
   2. Optional: Apply a **Message binding**. For more information, see [Message binding](/eventarc/advanced/docs/receive-events/transform-events#message-binding).
6. Click **Save**.

   It can take a couple of minutes to update a pipeline.

### gcloud

1. Open a terminal.
2. You can [create a pipeline](/eventarc/advanced/docs/receive-events/create-enrollment#gcloud) or you can update a pipeline using the [`gcloud eventarc pipelines update`](/sdk/gcloud/reference/eventarc/pipelines/update) command:

   ```bash
   gcloud eventarc pipelines update PIPELINE_NAME \
       --location=REGION \
       --INPUT_PAYLOAD_FLAG \
       --destinations=OUTPUT_PAYLOAD_KEY
   ```

   Replace the following:

   - PIPELINE_NAME: the ID of the pipeline or a fully qualified name
   - REGION: a [supported Eventarc Advanced location](/eventarc/docs/locations#advanced-regions)

     Alternatively, you can set the gcloud CLI location property:

     ```bash
     gcloud config set eventarc/location REGION
     ```

   - INPUT_PAYLOAD_FLAG: an input data format flag that can be one of the following:

     - `--input-payload-format-avro-schema-definition`
     - `--input-payload-format-json`
     - `--input-payload-format-protobuf-schema-definition`

     Note that if an input data format is specified for a pipeline, all events must match that format. Any events that don't match the expected format are treated as [persistent errors](/eventarc/advanced/docs/retry-events#persistent).

   - OUTPUT_PAYLOAD_KEY: an output data format key that can be one of the following:

     - `output_payload_format_avro_schema_definition`
     - `output_payload_format_json`
     - `output_payload_format_protobuf_schema_definition`

     Note that if you set an output data format key, you must also specify an input data format flag.

   It can take a couple of minutes to update a pipeline.

### Examples

The following example uses an `--input-payload-format-protobuf-schema-definition` flag to specify that the pipeline should expect events in a Protobuf data format with a specific schema:

```bash
gcloud eventarc pipelines update my-pipeline \
    --input-payload-format-protobuf-schema-definition \
'
syntax = "proto3";
message schema {
  string name = 1;
  string severity = 2;
}
'
```

The following example uses an `output_payload_format_avro_schema_definition` key and an `--input-payload-format-avro-schema-definition` flag to create a pipeline that expects events in an Avro format and outputs them in the same format:

```bash
gcloud eventarc pipelines create my-pipeline \
    --location=us-central1 \
    --destinations=http_endpoint_uri='https://example-endpoint.com',output_payload_format_avro_schema_definition='{"type": "record", "name": "my_record", "fields": [{"name": "my_field", "type": "string"}]}' \
    --input-payload-format-avro-schema-definition='{"type": "record", "name": "my_record", "fields": [{"name": "my_field", "type": "string"}]}'
```

The following example uses an `output_payload_format_protobuf_schema_definition` key and an `--input-payload-format-avro-schema-definition` flag to update a pipeline and convert its event data from Avro to Protobuf using schema definitions:

```bash
gcloud eventarc pipelines update my-pipeline \
    --location=us-central1 \
    --destinations=output_payload_format_protobuf_schema_definition='message MessageProto {string prop1 = 1; string prop2 = 2;}' \
    --input-payload-format-avro-schema-definition='
{
  "type": "record",
  "name": "MessageProto",
  "fields": [
    { "name": "prop1", "type": "string" },
    { "name": "prop2", "type": "string" }
  ]
}'
```
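After an update completes, you can check what the pipeline stored. A minimal sketch, assuming the `my-pipeline` name and region from the examples above and the standard `describe` verb of the `gcloud eventarc pipelines` command group:

```bash
# Print the pipeline's configuration, including its input payload format
# and per-destination output payload format.
gcloud eventarc pipelines describe my-pipeline \
    --location=us-central1
```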