# Enrich streaming data

Apache Beam simplifies the data enrichment workflow by providing a turnkey enrichment transform that you can add to your pipeline. This page explains how to use the Apache Beam enrichment transform to enrich your streaming data.

When you enrich data, you augment the raw data from one source by adding related data from a second source. The additional data can come from a variety of sources, such as [Bigtable](/bigtable/docs/overview) or [BigQuery](/bigquery/docs/introduction). The Apache Beam enrichment transform uses a key-value lookup to connect the additional data to the raw data.

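Conceptually, the lookup joins each raw element with the record that shares its key in the reference source. The following plain-Python sketch is illustrative only; the field names, keys, and values are invented for this example and are not part of the transform's API.

    # Illustrative sketch of a key-value lookup join using plain Python dictionaries.
    # The enrichment transform performs this kind of lookup against a source such as
    # Bigtable or BigQuery instead of an in-memory dictionary.
    raw_event = {"customer_id": "c-42", "product_id": "p-7", "quantity": 2}

    # Reference data, keyed by customer_id (the lookup key).
    customer_profiles = {
        "c-42": {"segment": "loyalty", "region": "EMEA"},
    }

    # Merge the raw event with the fields found for its key.
    enriched_event = {**raw_event, **customer_profiles[raw_event["customer_id"]]}
    # {'customer_id': 'c-42', 'product_id': 'p-7', 'quantity': 2,
    #  'segment': 'loyalty', 'region': 'EMEA'}
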
The following examples provide some cases where data enrichment is useful:

- You want to create an ecommerce pipeline that captures user activities from a website or app and provides customized recommendations. The transform incorporates the activities into your pipeline data so that you can provide the customized recommendations.
- You have user data that you want to join with geographical data to do geography-based analytics.
- You want to create a pipeline that gathers data from internet-of-things (IoT) devices that send out telemetry events.

Benefits
--------

The enrichment transform has the following benefits:

- Transforms your data without requiring you to write complex code or manage underlying libraries.
- Provides built-in source handlers.
  - Use the [`BigTableEnrichmentHandler`](https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.enrichment_handlers.bigtable.html#apache_beam.transforms.enrichment_handlers.bigtable.BigTableEnrichmentHandler) handler to enrich your data by using a Bigtable source without passing configuration details.
  - Use the [`BigQueryEnrichmentHandler`](https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.enrichment_handlers.bigquery.html#apache_beam.transforms.enrichment_handlers.bigquery.BigQueryEnrichmentHandler) handler to enrich your data by using a BigQuery source without passing configuration details.
  - Use the [`VertexAIFeatureStoreEnrichmentHandler`](https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.enrichment_handlers.vertex_ai_feature_store.html#apache_beam.transforms.enrichment_handlers.vertex_ai_feature_store.VertexAIFeatureStoreEnrichmentHandler) handler with [Vertex AI Feature Store](/vertex-ai/docs/featurestore/latest/overview) and [Bigtable online serving](/vertex-ai/docs/featurestore/latest/overview#online_serving).
- Uses client-side throttling to manage rate limiting of the requests. Requests are retried with exponential backoff by using a default retry strategy. You can configure rate limiting to suit your use case.

Support and limitations
-----------------------

The enrichment transform has the following requirements:

- Available for batch and streaming pipelines.
- The `BigTableEnrichmentHandler` handler is available in the Apache Beam Python SDK versions 2.54.0 and later.
- The `BigQueryEnrichmentHandler` handler is available in the Apache Beam Python SDK versions 2.57.0 and later.
- The `VertexAIFeatureStoreEnrichmentHandler` handler is available in the Apache Beam Python SDK versions 2.55.0 and later.
- When using the Apache Beam Python SDK versions 2.55.0 and later, you also need to install the [Python client for Redis](https://pypi.org/project/redis/).
- Dataflow jobs must use [Runner v2](/dataflow/docs/runner-v2).

Use the enrichment transform
----------------------------

To use the enrichment transform, include the following code in your pipeline:

    import apache_beam as beam
    from apache_beam.transforms.enrichment import Enrichment
    from apache_beam.transforms.enrichment_handlers.bigtable import BigTableEnrichmentHandler

    bigtable_handler = BigTableEnrichmentHandler(...)

    with beam.Pipeline() as p:
      output = (p
                ...
                | "Create" >> beam.Create(data)
                | "Enrich with Bigtable" >> Enrichment(bigtable_handler)
                ...
                )

Because the enrichment transform performs a cross join by default, design a custom join to enrich the input data. This design ensures that the join includes only the specified fields.

In the following example, `left` is the input element of the enrichment transform, and `right` is the data fetched from an external service for that input element.

    from typing import Any, Dict

    def custom_join(left: Dict[str, Any], right: Dict[str, Any]):
      # Copy only the fields that you want in the enriched element. Replace
      # FIELD_NAME with the name of the field to copy.
      enriched = {}
      enriched['FIELD_NAME'] = left['FIELD_NAME']
      ...
      return beam.Row(**enriched)

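To apply the custom join, pass it to the enrichment transform through the `join_fn` configuration parameter, which is described in the next section along with `timeout`. The following sketch assumes the `bigtable_handler`, `data`, and `custom_join` objects from the preceding snippets; the 10-second timeout is only an example value.

    import apache_beam as beam
    from apache_beam.transforms.enrichment import Enrichment

    # `bigtable_handler`, `data`, and `custom_join` are defined in the preceding
    # snippets. `join_fn` and `timeout` are described in the Parameters section.
    with beam.Pipeline() as p:
      output = (p
                | "Create" >> beam.Create(data)
                | "Enrich with Bigtable" >> Enrichment(bigtable_handler,
                                                       join_fn=custom_join,
                                                       timeout=10))
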
### Parameters

To use the enrichment transform, the `EnrichmentHandler` parameter is required.

You can also use a configuration parameter to specify a `lambda` function for a join function, a timeout, a throttler, or a repeater (retry strategy). The following configuration parameters are available:

- `join_fn`: A `lambda` function that takes dictionaries as input and returns an enriched row (`Callable[[Dict[str, Any], Dict[str, Any]], beam.Row]`). The enriched row specifies how to join the data fetched from the API. Defaults to a cross join.
- `timeout`: The number of seconds to wait for the request to be completed by the API before timing out. Defaults to 30 seconds.
- `throttler`: Specifies the throttling mechanism. The only supported option is default client-side adaptive throttling.
- `repeater`: Specifies the retry strategy when errors like `TooManyRequests` and `TimeoutException` occur. Defaults to `ExponentialBackOffRepeater`.

What's next
-----------

- For more examples, see [Enrichment transform](https://beam.apache.org/documentation/transforms/python/elementwise/enrichment) in the Apache Beam transform catalog.
- [Use Apache Beam and Bigtable to enrich data](/dataflow/docs/notebooks/bigtable_enrichment_transform).
- [Use Apache Beam and BigQuery to enrich data](/dataflow/docs/notebooks/bigquery_enrichment_transform).
- [Use Apache Beam and Vertex AI Feature Store to enrich data](/dataflow/docs/notebooks/vertex_ai_feature_store_enrichment).