If you're already using Data Catalog, you must already have a project with the Data Catalog API enabled. For more information on the recommended way to use multiple projects with Data Catalog, see Using tag templates in multiple projects.
If this is the first time you are interacting with Data Catalog, do the following:
Sign in to your Google Cloud account. If you're new to
Google Cloud,
create an account to evaluate how our products perform in
real-world scenarios. New customers also get $300 in free credits to
run, test, and deploy workloads.
In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
Verify that billing is enabled for your Google Cloud project.
Enable the Data Catalog API.
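To confirm that the Data Catalog API is enabled and that your credentials can reach it, you can run a quick lookup. The following is a minimal sketch, not part of the official setup steps: it assumes the google-cloud-datacatalog Python client library is installed and that Application Default Credentials are configured, and it looks up the entry for a public BigQuery sample table.

```python
# Minimal verification sketch (illustrative only): look up the Data Catalog
# entry for a public BigQuery table. Assumes `pip install google-cloud-datacatalog`
# and configured Application Default Credentials.
from google.cloud import datacatalog_v1

client = datacatalog_v1.DataCatalogClient()

# Any BigQuery resource you can read works here; the public Shakespeare sample
# table is used only because it is always available.
entry = client.lookup_entry(
    request={
        "linked_resource": (
            "//bigquery.googleapis.com/projects/bigquery-public-data"
            "/datasets/samples/tables/shakespeare"
        )
    }
)
print(entry.name, entry.type_)
```

If the call succeeds, the API is enabled and your credentials work; a permission or API-not-enabled error points back to the steps above.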
Integrate Google Cloud data sources

Note: In a project with an organization policy that restricts resource locations, metadata for some Bigtable and Spanner resources might not be synced to Data Catalog.

Analytics Hub
When you subscribe to a listing in Analytics Hub, a linked dataset is created in your project. Data Catalog automatically generates metadata entries for that linked dataset and all of the tables it contains. For more information on linked datasets and other Analytics Hub features, see Introduction to Analytics Hub.
In Data Catalog search, linked datasets are displayed as standard BigQuery datasets, but you can filter them using the type=dataset.linked predicate. For more details, see Search for data assets.
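As an illustration of that filter, the following minimal Python sketch searches a project for linked datasets using the type=dataset.linked predicate. It assumes the google-cloud-datacatalog client library is installed; the project ID is a placeholder.

```python
# Minimal sketch: list linked datasets in a project by filtering search results
# with the type=dataset.linked predicate. "my-project" is a placeholder.
from google.cloud import datacatalog_v1

client = datacatalog_v1.DataCatalogClient()

scope = datacatalog_v1.SearchCatalogRequest.Scope(
    include_project_ids=["my-project"]  # hypothetical project ID
)

results = client.search_catalog(
    request={"scope": scope, "query": "type=dataset.linked"}
)

for result in results:
    print(result.relative_resource_name)
```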
BigQuery and Pub/Sub
If your organization already uses BigQuery and Pub/Sub, depending on your permissions, you can search for the metadata from those sources right away. If you can't see the corresponding entries in search results, look for the IAM roles that you and the users of your project might need in Identity and Access Management.
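For example, the following sketch scopes a search to an organization and narrows it to BigQuery tables. The organization ID and the keyword are placeholders, and the results only include assets that your IAM roles let you view.

```python
# Minimal sketch: search an organization's catalog for BigQuery tables whose
# name contains "orders". The organization ID below is a placeholder.
from google.cloud import datacatalog_v1

client = datacatalog_v1.DataCatalogClient()

scope = datacatalog_v1.SearchCatalogRequest.Scope(
    include_org_ids=["123456789012"]  # hypothetical organization ID
)

results = client.search_catalog(
    request={"scope": scope, "query": "system=bigquery type=table name:orders"}
)

for result in results:
    print(result.linked_resource)
```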
Bigtable
When you store data in Bigtable, metadata for the following Bigtable resources is automatically synced to Data Catalog:
[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["이해하기 어려움","hardToUnderstand","thumb-down"],["잘못된 정보 또는 샘플 코드","incorrectInformationOrSampleCode","thumb-down"],["필요한 정보/샘플이 없음","missingTheInformationSamplesINeed","thumb-down"],["번역 문제","translationIssue","thumb-down"],["기타","otherDown","thumb-down"]],["최종 업데이트: 2025-09-04(UTC)"],[[["\u003cp\u003eData Catalog imports and updates metadata from various Google Cloud data sources and many on-premises data sources.\u003c/p\u003e\n"],["\u003cp\u003eOnce metadata is ingested, Data Catalog enables metadata discovery through search and allows data enrichment with business metadata via tags.\u003c/p\u003e\n"],["\u003cp\u003eFor custom on-premises data sources, integration can be achieved using community-contributed connectors or through the Data Catalog API for custom entries.\u003c/p\u003e\n"],["\u003cp\u003eData Catalog automatically generates metadata entries for linked datasets in Analytics Hub, displaying them as standard BigQuery datasets in search results.\u003c/p\u003e\n"],["\u003cp\u003eMetadata for Bigtable, Spanner, and Vertex AI resources are automatically synced to Data Catalog, including details like tables, databases, and column schemas.\u003c/p\u003e\n"]]],[],null,["# Integrate your data sources with Data Catalog\n\nData Catalog can import and keep up-to-date metadata from\nseveral Google Cloud data sources as well as a number of popular\non-premises ones.\n\nWith metadata ingested, Data Catalog does the following:\n\n- Makes the existing metadata discoverable through search. For more information, see [How to search](/data-catalog/docs/how-to/search).\n- Allows the members of your organization to enrich your data with additional business metadata through tags. For more information, see [Tags and tag templates](/data-catalog/docs/tags-and-tag-templates).\n\nWhile the integration with Google Cloud sources is automatic, to\nintegrate with custom on-premises sources that your organization uses, you can\ndo either of the following:\n\n- Set up and run corresponding [connectors](/data-catalog/docs/integrate-data-sources#integrate_on-premises_data_sources) contributed by the community.\n- Use the [Data Catalog API for custom entries](/data-catalog/docs/how-to/custom-entries).\n\nBefore you begin\n----------------\n\nIf you're already using Data Catalog, you must already have a\nproject with the enabled Data Catalog API. For more information\non the recommended way to use multiple projects with\nData Catalog, see\n[Using tag templates in multiple projects](/data-catalog/docs/concepts/resource-project#multiple-projects).\n\nIf this is the first time you interact with the Data Catalog,\ndo the following:\n\n- Sign in to your Google Cloud account. If you're new to Google Cloud, [create an account](https://console.cloud.google.com/freetrial) to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.\n- In the Google Cloud console, on the project selector page,\n select or create a Google Cloud project.\n\n | **Note**: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. 
After you finish these steps, you can delete the project, removing all resources associated with the project.\n\n [Go to project selector](https://console.cloud.google.com/projectselector2/home/dashboard)\n-\n [Verify that billing is enabled for your Google Cloud project](/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).\n\n-\n\n\n Enable the Data Catalog API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=datacatalog.googleapis.com)\n\n- In the Google Cloud console, on the project selector page,\n select or create a Google Cloud project.\n\n | **Note**: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.\n\n [Go to project selector](https://console.cloud.google.com/projectselector2/home/dashboard)\n-\n [Verify that billing is enabled for your Google Cloud project](/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).\n\n-\n\n\n Enable the Data Catalog API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=datacatalog.googleapis.com)\n\n\u003cbr /\u003e\n\nIntegrate Google Cloud data sources\n-----------------------------------\n\n| **Note:** In a project with an organization policy that [restricts resource\n| locations](/resource-manager/docs/organization-policy/defining-locations), your metadata for some Bigtable and Spanner resources might not be synced to Data Catalog.\n\n### BigQuery sharing\n\nWhen you subscribe to a listing in BigQuery sharing (formerly Analytics Hub), a linked dataset\nis created in your project. Data Catalog\nautomatically generates metadata entries for that linked dataset and all tables\ncontained in it. For more information on linked datasets and other\nBigQuery sharing features, see\n[Introduction to Sharing](/bigquery/docs/analytics-hub-introduction).\n\nIn Data Catalog search, linked datasets are displayed as\nstandard BigQuery datasets, but you can filter them using\nthe `type=dataset.linked` predicate. For more details,\nsee [Search for data assets](/data-catalog/docs/how-to/search).\n\n### BigQuery and Pub/Sub\n\nIf your organization already uses BigQuery and\nPub/Sub, depending on your permissions, you can [search for the\nmetadata](/data-catalog/docs/how-to/search) from those sources right away. If\nyou can't see the corresponding entries in search results, look for the IAM\nroles that you and the users of your project might need in\n[Identity and Access Management](/data-catalog/docs/concepts/iam#searching_resources).\n\n### Bigtable\n\nWhen you store data in Bigtable, metadata is automatically synced to\nData Catalog for the following Bigtable\nresources:\n\n- Instances\n- Tables, including column family details\n\nFor guidance on using Data Catalog for data discovery and\ntagging, see [Manage data assets using\nData Catalog](/bigtable/docs/manage-data-assets-using-data-catalog) in the\nBigtable documentation.\n\n### Cloud SQL\n\nCloud SQL doesn't integrate with Data Catalog, but does\nintegrate with Dataplex Universal Catalog. 
For more information, see\n[Integrate your data sources with Dataplex Universal Catalog](/dataplex/docs/integrate-data-sources).\n\n### Dataproc Metastore\n\nTo integrate with Dataproc Metastore, enable the sync to\nData Catalog for new or existing services as described in\n[Enabling Data Catalog sync](/dataproc-metastore/docs/data-catalog-sync#enabling-Data%20Catalog-sync).\n\n### Sensitive Data Protection\n\nAdditionally, Data Catalog integrates with Sensitive Data Protection that\nlets you scan specific Google Cloud resources for sensitive data\nand send results back to Data Catalog in the form of tags.\n\nFor more information, see\n[Sending Sensitive Data Protection scan results to Data Catalog](/sensitive-data-protection/docs/sending-results-to-dc).\n\n### Spanner\n\nWhen you store data in Spanner, metadata for the following Spanner\nresources is synced to Data Catalog:\n\n- Instances\n- Databases\n- Tables and views with column schema\n\nFor guidance on using Data Catalog for data discovery and\ntagging, see [Manage data assets using\nData Catalog](/spanner/docs/dc-integration).\n\n### Vertex AI\n\nVertex AI syncs metadata for the following resources to Data Catalog:\n\n- [Model Registry Models](/vertex-ai/docs/model-registry/introduction)\n- [Datasets](/vertex-ai/docs/datasets/overview)\n- [Online store instances](/vertex-ai/docs/featurestore/latest/create-onlinestore)\n- [Feature views](/vertex-ai/docs/featurestore/latest/create-featureview)\n- [Feature groups](/vertex-ai/docs/featurestore/latest/create-featuregroup)\n\nIntegrate on-premises data sources\n----------------------------------\n\nTo integrate on-premises data sources, you can use the corresponding\nPython connectors contributed by the community:\n| **Note:** These connectors are not officially supported by Google.\n\n1. Find your data source in the following table.\n2. Open its GitHub repository.\n3. Follow the setup instructions in the readme file.\n\nSelect a category RDBMS BI Hive\n\n### Integrate unsupported data sources\n\nIf you can't find a connector for your data source, you can still manually\nintegrate it by creating entry groups and custom entries. To do that, you can:\n\n- Use one of the [Data Catalog Client Libraries](/data-catalog/docs/reference/libraries) in one of the following languages: C#, Go, Java, Node.js, PHP, Python, or Ruby.\n- Or manually build on the [Data Catalog API](/data-catalog/docs/reference).\n\nTo integrate your sources, first, learn about\n[Entries and entry groups](/data-catalog/docs/entries-and-entry-groups), then\nfollow the instructions in\n[Create custom Data Catalog entries for your data sources](/data-catalog/docs/how-to/custom-entries).\n\nWhat's next\n-----------\n\n- Learn more about [Identity and Access Management](/data-catalog/docs/concepts/iam).\n- Learn [How to search](/data-catalog/docs/how-to/search).\n- Go through the [Tagging tables](/data-catalog/docs/tag-bigquery-dataset) quickstart."]]
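The Python sketch below is the one referenced in the section on unsupported data sources. It shows one possible way to register an on-premises table as a custom entry; the project, location, system, and column names are hypothetical placeholders, and the authoritative steps are in Create custom Data Catalog entries for your data sources.

```python
# Illustrative sketch: create an entry group and a custom entry for an
# on-premises table. All identifiers below are hypothetical placeholders.
from google.cloud import datacatalog_v1

client = datacatalog_v1.DataCatalogClient()
project = "my-project"    # placeholder
location = "us-central1"  # placeholder

# 1. Create an entry group to hold custom entries for the on-premises system.
entry_group = client.create_entry_group(
    parent=f"projects/{project}/locations/{location}",
    entry_group_id="onprem_warehouse",
    entry_group=datacatalog_v1.EntryGroup(display_name="On-prem warehouse"),
)

# 2. Describe one table in that system as a custom entry.
entry = datacatalog_v1.Entry(
    display_name="orders",
    user_specified_system="onprem_sql_server",
    user_specified_type="table",
    description="Orders table replicated nightly from the on-prem warehouse.",
)
entry.schema.columns.append(
    datacatalog_v1.ColumnSchema(column="order_id", type_="INT64", mode="REQUIRED")
)

created = client.create_entry(
    parent=entry_group.name, entry_id="orders", entry=entry
)
print(created.name)
```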