# Manage data assets in a lake

*Last updated: 2025-08-19 (UTC)*

This page explains how to add, upgrade, and remove Cloud Storage buckets and
BigQuery datasets as assets in existing Dataplex Universal Catalog zones.

Overview
--------

An asset maps to data stored in either Cloud Storage or BigQuery. You
can map data stored in separate Google Cloud projects as assets into a single
zone within a lake.
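If you manage assets programmatically, the mapping above corresponds to the `lakes.zones.assets.create` REST method covered later on this page. The following is a minimal, illustrative Python sketch that builds the request path and body for attaching a bucket from another project; the project, lake, zone, and bucket names are hypothetical, and the field names should be verified against the Dataplex v1 API reference.

```python
# Illustrative sketch (not an official client library): build the URL and
# request body for the lakes.zones.assets.create REST method.
def build_asset_create_request(project, location, lake, zone,
                               asset_id, bucket_project, bucket):
    """Return (url, body) for attaching a Cloud Storage bucket as an asset."""
    parent = (f"projects/{project}/locations/{location}"
              f"/lakes/{lake}/zones/{zone}")
    url = f"https://dataplex.googleapis.com/v1/{parent}/assets?assetId={asset_id}"
    body = {
        "displayName": asset_id,
        "resourceSpec": {
            # A bucket in another project is referenced by its full
            # resource name, not just the bucket name.
            "name": f"projects/{bucket_project}/buckets/{bucket}",
            "type": "STORAGE_BUCKET",
        },
    }
    return url, body

# Hypothetical names, for illustration only.
url, body = build_asset_create_request(
    "my-project", "us-central1", "my-lake", "my-zone",
    "sales-raw", "data-project", "sales-raw-bucket")
```

For a BigQuery dataset the sketch would presumably use a `BIGQUERY_DATASET` resource type instead; check the `ResourceSpec` reference before relying on either value.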
You can attach existing Cloud Storage buckets or
BigQuery datasets to be managed from within the lake.

| **Note:** Attached assets cannot be managed by another data zone or lake.

Before you begin
----------------

- If you haven't already, [create a lake](/dataplex/docs/create-lake) and a
  [zone](/dataplex/docs/add-zone) in that lake.

- Most `gcloud lakes` commands require a location. You can specify
  the location by using the `--location` flag.

### Required roles

- To remove assets, grant an IAM role that includes the
  `dataplex.lakes.delete`, `dataplex.zones.delete`, or
  `dataplex.assets.delete` permission. The Dataplex Universal Catalog
  specific roles `roles/dataplex.admin` and `roles/dataplex.editor`
  can be used to grant these permissions.

- To add assets, grant an IAM role that includes the
  `dataplex.lakes.create`, `dataplex.zones.create`, or
  `dataplex.assets.create` permission.
  The `roles/dataplex.admin` and `roles/dataplex.editor` roles contain these
  permissions.

- You can also give permission to users or groups by using the `roles/owner`
  and `roles/editor` legacy roles.

- You must authorize the Dataplex Universal Catalog service on resources being
  attached to the Dataplex Universal Catalog lake.
The authorization is automatically and
  implicitly granted for resources in the project in which the lake is created.
  For other projects, authorize the Dataplex Universal Catalog service
  on resources explicitly.

For more information, see [Dataplex Universal Catalog IAM and access control](/dataplex/docs/iam-and-access-control).

#### Grant roles for Cloud Storage buckets

To attach a Cloud Storage bucket from another project to your lake, you
must grant the Dataplex Universal Catalog service account
(`service-`<var translate="no">PROJECT_NUMBER</var>`@gcp-sa-dataplex.iam.gserviceaccount.com`,
retrieved from the lake details page in the console) the Dataplex Universal Catalog
service account role (`roles/dataplex.serviceAgent`) in the project that
contains the bucket. This role provides the
Dataplex Universal Catalog service with the prerequisite administrator-level role on the bucket so that
permissions can be set on the bucket itself.

#### Grant roles for BigQuery datasets

To attach a BigQuery dataset from another project to your lake,
you must grant the Dataplex Universal Catalog service account the
BigQuery Administrator role on the dataset.

### VPC Service Controls considerations

Dataplex Universal Catalog doesn't violate VPC Service Controls perimeters. Before
adding an asset to the lake, make sure that the underlying bucket or dataset is
in the same VPC Service Controls perimeter as the lake.

For more information, see [VPC Service Controls with
Dataplex Universal Catalog](/dataplex/docs/vpc-sc).

Add an asset
------------

| **Note:** You can create multiple assets in a data zone concurrently.
You can continue to use the data zone while the asset is being added.

If the region of a Cloud Storage bucket doesn't overlap with the region of the
Dataplex Universal Catalog lake, you can't add the bucket to a zone in your
lake.

To learn more about the region location of a
Cloud Storage asset and how Dataplex Universal Catalog handles the
location of a bucket when creating the publishing dataset, see
[Regional resources](/dataplex/docs/regional-resources).

To add an asset, follow these steps:

### Console

1. In the Google Cloud console, go to the Dataplex Universal Catalog page.

   [Go to Dataplex Universal Catalog](https://console.cloud.google.com/dataplex/lakes)

2. On the **Manage** page, click the lake to which
   you want to add a Cloud Storage bucket or BigQuery
   dataset. The lake page opens.

3. On the **Zones** tab, click the name of the data zone to which
   you want to add the asset. The data zone page for that data zone
   opens.

4. On the **Assets** tab, click **+ Add Assets**. The **Add assets** page
   opens.

5. Click **Add an Asset**.

6. In the **Type** field, select either
   **BigQuery dataset** or **Cloud Storage bucket**.

7. In the **Display name** field, enter a name for the new asset.

8. In the **ID** field, enter a unique ID for the asset.

9. Optional: Enter a **Description**.

10. In the **Dataset** or **Bucket** field (based on the type of your asset),
    click **Browse** to find and select your Cloud Storage bucket or
    BigQuery dataset.

11. Optional: If your asset type is **Cloud Storage bucket** and you
    want Dataplex Universal Catalog to manage the asset, select the
    **Upgrade to Managed** checkbox. If you choose this option, you don't
    have to upgrade the asset separately. This option isn't available
    for BigQuery datasets.

12. Click **Continue**.

13. Choose the rest of the parameter values.
    For more information about
    security settings, see [Lake security](/dataplex/docs/lake-security).

14. Click **Submit**.

15. Verify that you have returned to the data zone page, and that your new
    asset appears in the assets list.

### REST

To add an asset, use the
[lakes.zones.assets.create](/dataplex/docs/reference/rest/v1/projects.locations.lakes.zones.assets/create)
method.

When the addition succeeds, the data zone automatically enters the active
state. If it fails, the data zone is rolled back to its previous
healthy state.

Upgrade a Cloud Storage bucket asset
------------------------------------

When you add an asset of type Cloud Storage bucket,
Dataplex Universal Catalog automatically publishes BigQuery
[external tables](/bigquery/docs/external-tables) for the tables hosted in the
asset.

When you [upgrade a Cloud Storage bucket asset](/dataplex/docs/lake-security#upgrade),
Dataplex Universal Catalog removes the attached external tables and creates
[BigLake tables](/bigquery/docs/biglake-intro).
BigLake tables support finer-grained security,
including row-level security, column-level security, and dynamic data masking.

| **Note:** Unstructured data of type `Fileset` in Cloud Storage buckets that is marked as `Managed` is published as [BigQuery object tables](/bigquery/docs/object-table-introduction).

To upgrade a Cloud Storage bucket asset, follow these steps:

### Console

1. In the Google Cloud console, go to the Dataplex Universal Catalog page.

   [Go to Dataplex Universal Catalog](https://console.cloud.google.com/dataplex/lakes)

2. On the **Manage** page, click the name of the lake. The lake page opens.

3. On the **Zones** tab, click the name of the data zone. The
   data zone page opens.

4. On the **Assets** tab, click the name of the asset that you want to
   upgrade.

5. 
Click **Upgrade to Managed**.

### REST

To upgrade a bucket asset, use the
[lakes.zones.assets.patch](/dataplex/docs/reference/rest/v1/projects.locations.lakes.zones.assets/patch)
method.

Downgrade a Cloud Storage bucket asset
--------------------------------------

When you [downgrade a Cloud Storage bucket asset](/dataplex/docs/lake-security#upgrade),
Dataplex Universal Catalog removes the attached
[BigLake tables](/bigquery/docs/biglake-intro) and creates
external tables.

### Console

1. In the Google Cloud console, go to the Dataplex Universal Catalog page.

   [Go to Dataplex Universal Catalog](https://console.cloud.google.com/dataplex/lakes)

2. On the **Manage** page, click the name of the lake. The lake page opens.

3. On the **Zones** tab, click the name of the data zone. The
   data zone page opens.

4. On the **Assets** tab, click the name of the asset that you want to
   downgrade.

5. Click **Downgrade from Managed**.

### REST

To downgrade a bucket asset, use the
[lakes.zones.assets.patch](/dataplex/docs/reference/rest/v1/projects.locations.lakes.zones.assets/patch)
method. Make sure that you set the `readAccessMode` field to `DIRECT` in
[`ResourceSpec`](/dataplex/docs/reference/rest/v1/projects.locations.lakes.zones.assets#resourcespec).

Remove an asset
---------------

Remove the asset from the data zone or lake before attaching it to a
different one.

| **Note:** Your Cloud Storage bucket isn't deleted when you remove it from your data zone or lake. You must explicitly delete it, if required.

To remove an asset, follow these steps:

### Console

1. In the Google Cloud console, go to the Dataplex Universal Catalog page.

   [Go to Dataplex Universal Catalog](https://console.cloud.google.com/dataplex/lakes)

2. On the **Manage** page, click the lake from which
   you want to remove a Cloud Storage bucket or
   BigQuery dataset. The lake page for that lake opens.

3. 
On the **Zones** tab, click the name of the data zone that you
   want to remove the Cloud Storage bucket or BigQuery
   dataset from. The data zone page for that data zone opens.

4. On the **Assets** tab, select the asset by checking the box to the left
   of the asset name.

5. Click **Delete Asset**.

6. In the confirmation dialog, click **Delete**.

### REST

To remove a bucket, use the
[lakes.zones.assets.delete](/dataplex/docs/reference/rest/v1/projects.locations.lakes.zones.assets/delete)
method.

What's next
-----------

- Learn more about [discovering data](/dataplex/docs/discover-data).
- Learn how to [create a lake](/dataplex/docs/create-lake).
- Learn more about [Cloud Audit Logs](/dataplex/docs/audit-logging).
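As a worked example of the `lakes.zones.assets.patch` downgrade call described on this page, here is a hedged Python sketch that builds the request URL and body for setting `readAccessMode` to `DIRECT`. The resource names are hypothetical, and the `updateMask` query-parameter syntax is an assumption that should be checked against the API reference.

```python
# Illustrative sketch (not an official client library): build the URL and
# body for the assets.patch call that downgrades a bucket asset.
def build_downgrade_request(project, location, lake, zone, asset_id):
    """Return (url, body) for setting resourceSpec.readAccessMode to DIRECT."""
    name = (f"projects/{project}/locations/{location}"
            f"/lakes/{lake}/zones/{zone}/assets/{asset_id}")
    # Limit the update to the single field this page says must change.
    url = (f"https://dataplex.googleapis.com/v1/{name}"
           f"?updateMask=resourceSpec.readAccessMode")
    body = {"resourceSpec": {"readAccessMode": "DIRECT"}}
    return url, body

# Hypothetical names, for illustration only.
url, body = build_downgrade_request(
    "my-project", "us-central1", "my-lake", "my-zone", "sales-raw")
```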