# Import from a dump file

Before importing data, you must:

1. [Create a database cluster](/distributed-cloud/hosted/docs/latest/gdch/application/ao-user/db-service#create)
   to import the data to.

2. Upload the dump file to a storage bucket. See
   [Upload objects to storage buckets](/distributed-cloud/hosted/docs/latest/gdch/platform/pa-user/upload-download-storage-objects#upload_objects_to_storage_buckets)
   for instructions.

   The Database Service import service account must have access to the dump file.
   The service account is named `postgresql-import-DATABASE_CLUSTER_NAME` or
   `oracle-import-DATABASE_CLUSTER_NAME`, depending on the type of database you
   are importing.

   Replace `DATABASE_CLUSTER_NAME` with the name of the database cluster where
   you are importing data.

You can import a dump file into a database cluster using either the
GDC console or the Distributed Cloud CLI:

### Console

1. Open the **Database cluster overview** page in the GDC console to see the
   cluster that contains the database you are importing.

2. Click **Import**. The **Import data to accounts** panel opens.

3. In the **Source** section of the **Import data to accounts** panel, specify
   the location of the SQL data dump file you uploaded previously.

4. In the **Destination** field, specify an existing destination database for
   the import.

   **Note:** You can leave this field empty if the dump file already specifies
   a destination. If the dump file specifies a destination and you also fill
   the **Destination** field, the **Destination** field overrides the
   destination specified in the dump file.

5. Click **Import**. A banner in the GDC console shows the status of the
   import.

### gdcloud CLI

1. Before using the Distributed Cloud CLI,
   [install and initialize](/distributed-cloud/hosted/docs/latest/gdch/resources/gdcloud-install) it.
   Then, [authenticate](/distributed-cloud/hosted/docs/latest/gdch/resources/gdcloud-auth) with your
   organization.

2. Run the following command to import a dump file into a database:

   ```
   gdcloud database import sql DATABASE_CLUSTER s3://BUCKET_NAME/sample.dmp \
       --project=PROJECT_NAME
   ```

   Replace the following:

   - `DATABASE_CLUSTER`: the name of the database cluster to import data into.
   - `BUCKET_NAME/sample.dmp`: the location of the dump file.
   - `PROJECT_NAME`: the name of the project that the database cluster is in.

### API

```yaml
apiVersion: DBENGINE_NAME.dbadmin.gdc.goog/v1
kind: Import
metadata:
  name: IMPORT_NAME
  namespace: USER_PROJECT
spec:
  dbclusterRef: DBCLUSTER_NAME
  dumpStorage:
    s3Options:
      bucket: BUCKET_NAME
      key: DUMP_FILE_PATH
    type: S3
```

Replace the following variables:

- `DBENGINE_NAME`: the name of the database engine. This is one of `alloydbomni`, `postgresql`, or `oracle`.
- `IMPORT_NAME`: the name of the import operation.
- `USER_PROJECT`: the name of the user project where the database cluster you are importing to is created.
- `DBCLUSTER_NAME`: the name of the database cluster.
- `BUCKET_NAME`: the name of the object storage bucket that stores the import files.
- `DUMP_FILE_PATH`: the object storage path to the stored files.

*Last updated: 2025-09-04 (UTC)*
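If you create `Import` resources frequently, it can help to render the manifest from a template rather than editing YAML by hand. The following is a minimal sketch: the field layout and engine names follow the manifest shown in the API section, while the concrete values (`my-import`, `my-project`, and so on) are hypothetical examples, not names from your environment.

```python
# Render an Import resource manifest as a YAML string.
# Field layout mirrors the API manifest in this document; all concrete
# values passed in below are hypothetical placeholders.

VALID_ENGINES = {"alloydbomni", "postgresql", "oracle"}

def build_import_manifest(dbengine_name: str, import_name: str,
                          user_project: str, dbcluster_name: str,
                          bucket_name: str, dump_file_path: str) -> str:
    """Return an Import manifest for the given database cluster and dump file."""
    if dbengine_name not in VALID_ENGINES:
        raise ValueError(f"unknown database engine: {dbengine_name!r}")
    return f"""\
apiVersion: {dbengine_name}.dbadmin.gdc.goog/v1
kind: Import
metadata:
  name: {import_name}
  namespace: {user_project}
spec:
  dbclusterRef: {dbcluster_name}
  dumpStorage:
    s3Options:
      bucket: {bucket_name}
      key: {dump_file_path}
    type: S3
"""

manifest = build_import_manifest("postgresql", "my-import", "my-project",
                                 "my-db-cluster", "my-bucket",
                                 "dumps/sample.dmp")
print(manifest)
```

You could then save the rendered string to a file and apply it with whatever workflow you normally use for custom resources in your project namespace.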
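If you have several dump files to load with the CLI, you can script the command construction. This sketch only builds the command strings in the shape shown in the gdcloud CLI section and prints them; it does not execute anything, and the cluster, project, and bucket names are hypothetical.

```python
# Build gdcloud import commands for a batch of dump files.
# Command shape follows the gdcloud CLI step in this document; the names
# used in the example call are placeholders.
import shlex

def import_commands(cluster: str, project: str, bucket: str,
                    dump_keys: list[str]) -> list[str]:
    """Return one gdcloud import command per dump file key."""
    commands = []
    for key in dump_keys:
        argv = ["gdcloud", "database", "import", "sql", cluster,
                f"s3://{bucket}/{key}", f"--project={project}"]
        commands.append(shlex.join(argv))
    return commands

cmds = import_commands("my-db-cluster", "my-project", "my-bucket",
                       ["dumps/a.dmp", "dumps/b.dmp"])
for cmd in cmds:
    print(cmd)
```

Running the printed commands sequentially (rather than in parallel) avoids overlapping imports against the same database cluster.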