# Import and export data

This page lists the available methods for importing data into and exporting data from Bigtable.

Import data into Bigtable
-------------------------

To import BigQuery data into Bigtable, see [Export data to Bigtable (Reverse ETL)](/bigquery/docs/export-to-bigtable) in the BigQuery documentation.

You can run continuous queries on your BigQuery data and export the results to Bigtable in real time using reverse ETL. For more information, see [Introduction to continuous queries](/bigquery/docs/continuous-queries-introduction) in the BigQuery documentation.

Move or copy data using a template
----------------------------------

You can use the following Dataflow templates to move or copy data between Bigtable and other sources or destinations.

### BigQuery

The following Dataflow template lets you export data from BigQuery to Bigtable.

- [BigQuery to Bigtable](/dataflow/docs/guides/templates/provided/bigquery-to-bigtable)

### Apache Cassandra to Bigtable

The following Dataflow template lets you export data from Apache Cassandra to Bigtable.

- [Apache Cassandra to Bigtable](/dataflow/docs/guides/templates/provided/cassandra-to-bigtable)

### Avro files

The following Dataflow templates let you export data from Bigtable as Avro files and then import that data back into Bigtable. You can run the templates by using the Google Cloud CLI or the Google Cloud console. The [source code](https://github.com/GoogleCloudPlatform/DataflowTemplates/tree/main/v1/src/main/java/com/google/cloud/teleport/bigtable) is on GitHub.

- [Bigtable to Cloud Storage Avro](/dataflow/docs/guides/templates/provided-batch#cloudbigtabletoavrofile)
- [Cloud Storage Avro to Bigtable](/dataflow/docs/guides/templates/provided-batch#avrofiletocloudbigtable)
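As a sketch of what launching the export template from the gcloud CLI can look like, the following command runs the Bigtable to Cloud Storage Avro template. This is illustrative rather than authoritative: confirm the template path under `gs://dataflow-templates` and the parameter names against the template reference linked above, and replace the project, instance, table, region, and bucket values with your own. The same pattern applies to the Parquet and SequenceFile templates.

```bash
# Launch the Bigtable to Cloud Storage Avro template with the gcloud CLI.
# Values such as PROJECT_ID, my-instance, my-table, and my-bucket are placeholders;
# verify the template path and parameter names in the template reference before use.
gcloud dataflow jobs run bigtable-avro-export \
    --gcs-location gs://dataflow-templates/latest/Cloud_Bigtable_to_GCS_Avro \
    --region us-central1 \
    --parameters \
bigtableProjectId=PROJECT_ID,\
bigtableInstanceId=my-instance,\
bigtableTableId=my-table,\
outputDirectory=gs://my-bucket/avro-export/,\
filenamePrefix=my-table-
```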
### Parquet files

The following Dataflow templates let you export data from Bigtable as Parquet files and then import that data back into Bigtable. You can run the templates by using the gcloud CLI or the Google Cloud console. The [source code](https://github.com/GoogleCloudPlatform/DataflowTemplates/blob/main/v1/src/main/java/com/google/cloud/teleport/bigtable/BigtableToParquet.java) is on GitHub.

- [Bigtable to Cloud Storage Parquet](/dataflow/docs/guides/templates/provided-batch#cloudbigtabletoparquetfile)
- [Cloud Storage Parquet to Bigtable](/dataflow/docs/guides/templates/provided-batch#parquetfiletocloudbigtable)
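The import direction follows the same pattern. The following sketch launches the Cloud Storage Parquet to Bigtable template; the template path and parameter names should be verified against the template reference, the bucket and resource names are placeholders, and the destination table must already exist in Bigtable.

```bash
# Launch the Cloud Storage Parquet to Bigtable template (import direction).
# Placeholder values; confirm the template path and parameter names in the template reference.
# The destination Bigtable table must already exist.
gcloud dataflow jobs run bigtable-parquet-import \
    --gcs-location gs://dataflow-templates/latest/GCS_Parquet_to_Cloud_Bigtable \
    --region us-central1 \
    --parameters \
bigtableProjectId=PROJECT_ID,\
bigtableInstanceId=my-instance,\
bigtableTableId=my-table,\
inputFilePattern=gs://my-bucket/parquet-export/*.parquet
```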
### SequenceFiles

The following Dataflow templates let you export data from Bigtable as SequenceFiles and then import that data back into Bigtable. You can run the templates by using the Google Cloud CLI or the Google Cloud console.

- [Bigtable to Cloud Storage SequenceFile](/dataflow/docs/guides/templates/provided-batch#cloudbigtabletosequencefile)
- [Cloud Storage SequenceFile to Bigtable](/dataflow/docs/guides/templates/provided-batch#sequencefiletocloudbigtable)

Import from the Tables page
---------------------------

You can execute many of the import methods described on this page by using the Google Cloud console. Import the following types of data from the **Tables** page:

- CSV data
- BigQuery data
- Avro files
- Cassandra keyspaces and tables
- Parquet files
- SequenceFile files

### Console

1. Open the list of Bigtable instances in the Google Cloud console.

   [Open the instance list](https://console.cloud.google.com/bigtable/instances)
2. Click the instance that contains the table you want to import data into.
3. Click **Tables** in the left pane.

   The **Tables** page displays a list of tables in the instance.
4. Next to the name of the table that you want to import data into, click the more_vert **Table action** menu.
5. Click **Import data**, and then select the type of data that you want to import:

   - If you select Avro, Parquet, SequenceFile, or Cassandra, the console displays a partly completed Dataflow template. Fill out the job template and click **Run job**.
   - If you select CSV, the `cbt` CLI terminal window opens. For more information, see the [Import CSV data](#csv) section of this document.
   - If you select BigQuery, BigQuery Studio opens. Fill out the reverse ETL query and run it.

Export from the Tables page
---------------------------

You can execute some of the export methods described on this page by using the Google Cloud console. Export the following types of data from the **Tables** page:

- Avro files
- Parquet files
- SequenceFile files

### Console

1. Open the list of Bigtable instances in the Google Cloud console.

   [Open the instance list](https://console.cloud.google.com/bigtable/instances)
2. Click the instance that contains the table you want to export.
3. Click **Tables** in the left pane.

   The **Tables** page displays a list of tables in the instance.
4. Next to the name of the table, click the more_vert **Table action** menu.
5. Click **Export data**, and then select the file type that you want to export.

   The console displays a partly completed Dataflow template.
6. Fill out the job template and click **Run job**.
Import CSV data
---------------

You can import data from a CSV file into a Bigtable table by using the `cbt` CLI. To do this, your environment, such as Cloud Shell, must be able to access the CSV file. You can get your CSV file into Cloud Shell in one of the following ways:
**Upload a local CSV file**:

1. In Cloud Shell, click the more_vert **More** menu and select **Upload**.
2. Select the CSV file from your local machine.
3. After you upload the file, refer to the file by its name in the `cbt` CLI command.
**Copy a CSV file from Cloud Storage**:

The `cbt` CLI does not directly support importing from a Cloud Storage bucket. You must first copy the CSV file from Cloud Storage to your Cloud Shell environment, for example with the copy command shown below. For more information, see [Upload an object to a bucket](/storage/docs/uploading-objects#uploading-an-object).
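A minimal sketch of that copy step, run in Cloud Shell and assuming a hypothetical bucket `my-bucket` and file `data.csv`:

```bash
# Copy the CSV file from Cloud Storage into the Cloud Shell home directory.
# Bucket and file names are placeholders.
gcloud storage cp gs://my-bucket/data.csv ~/data.csv

# Older environments can use gsutil instead:
# gsutil cp gs://my-bucket/data.csv ~/data.csv
```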
After the CSV file is available in your environment, use the `cbt` CLI command to import the data. For a sample command, see [Batch write many rows based on the input file](/bigtable/docs/cbt-reference#batch_write_many_rows_based_on_the_input_file).
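As an illustration, an invocation along the lines of the sample in that reference might look like the following. The instance, table, file, and column-family names are placeholders, and the expected CSV layout (row keys in the first column, column headers naming the target columns) is described in the `cbt` reference entry linked above, so check it before running the command.

```bash
# Import rows from a CSV file into an existing Bigtable table with the cbt CLI.
# Instance, table, file, and column-family names are placeholders; see the
# cbt reference for the expected CSV header format and optional arguments.
cbt -instance=my-instance import my-table data.csv column-family=cf1
```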
[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["이해하기 어려움","hardToUnderstand","thumb-down"],["잘못된 정보 또는 샘플 코드","incorrectInformationOrSampleCode","thumb-down"],["필요한 정보/샘플이 없음","missingTheInformationSamplesINeed","thumb-down"],["번역 문제","translationIssue","thumb-down"],["기타","otherDown","thumb-down"]],["최종 업데이트: 2025-08-27(UTC)"],[[["\u003cp\u003eThis page provides various methods for importing and exporting data to and from Bigtable, including using Dataflow templates and the \u003ccode\u003ecbt\u003c/code\u003e CLI.\u003c/p\u003e\n"],["\u003cp\u003eData can be exported from BigQuery, Apache Cassandra, or directly from Bigtable tables, using Dataflow templates such as BigQuery to Bigtable and Apache Cassandra to Bigtable, among others.\u003c/p\u003e\n"],["\u003cp\u003eBigtable data can be exported as Avro, Parquet, or SequenceFile formats, then imported back into Bigtable using provided Dataflow templates, with source code available on GitHub.\u003c/p\u003e\n"],["\u003cp\u003eUsers can export Avro, Parquet, or SequenceFiles directly from the Tables page in the Google Cloud console by selecting the desired file type in the export menu of a table.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003ecbt\u003c/code\u003e CLI allows for the import of data from CSV files into a Bigtable table, enabling batch writes of many rows based on an input file.\u003c/p\u003e\n"]]],[],null,["# Import and export data\n======================\n\nThis page lists available methods for importing and exporting data into\nand from Bigtable.\n\nImport data into Bigtable\n-------------------------\n\nTo import BigQuery data into Bigtable, see\n[Export data to Bigtable (Reverse ETL)](/bigquery/docs/export-to-bigtable) in the\nBigQuery documentation.\n\nYou can run continuous queries on your BigQuery data and export\nthe results to Bigtable in real time using reverse ETL. For more\ninformation, see\n[Introduction to continuous queries](/bigquery/docs/continuous-queries-introduction) in the\nBigQuery documentation.\n\nMove or copy data using a template\n----------------------------------\n\nYou can use the following Dataflow templates to move or copy data\nbetween Bigtable and other sources or destinations.\n\n### BigQuery\n\nThe following Dataflow template lets you export data from\nBigQuery to Bigtable.\n\n- [BigQuery to Bigtable](/dataflow/docs/guides/templates/provided/bigquery-to-bigtable)\n\n### Apache Cassandra to Bigtable\n\nThe following Dataflow template lets you export data from\nApache Cassandra to Bigtable.\n\n- [Apache Cassandra to Bigtable](/dataflow/docs/guides/templates/provided/cassandra-to-bigtable)\n\n### Avro files\n\nThe following Dataflow templates allow you to export data from\nBigtable as Avro files and then import the data back into\nBigtable. You can execute the templates by using the\nGoogle Cloud CLI or the Google Cloud console. The [source code](https://github.com/GoogleCloudPlatform/DataflowTemplates/tree/main/v1/src/main/java/com/google/cloud/teleport/bigtable) is\non GitHub.\n\n- [Bigtable to Cloud Storage Avro](/dataflow/docs/guides/templates/provided-batch#cloudbigtabletoavrofile)\n- [Cloud Storage Avro to Bigtable](/dataflow/docs/guides/templates/provided-batch#avrofiletocloudbigtable)\n\n### Parquet files\n\nThe following Dataflow templates allow you to export data from\nBigtable as Parquet files and then import the data back into\nBigtable. 
You can execute the templates by using the\ngcloud CLI or the Google Cloud console. The\n[source code](https://github.com/GoogleCloudPlatform/DataflowTemplates/blob/main/v1/src/main/java/com/google/cloud/teleport/bigtable/BigtableToParquet.java) is on GitHub.\n\n- [Bigtable to Cloud Storage Parquet](/dataflow/docs/guides/templates/provided-batch#cloudbigtabletoparquetfile)\n- [Cloud Storage Parquet to Bigtable](/dataflow/docs/guides/templates/provided-batch#parquetfiletocloudbigtable)\n\n### SequenceFiles\n\nThe following Dataflow templates allow you to export data from\nBigtable as SequenceFiles and then import the data back into\nBigtable. You can execute the templates by using the\nGoogle Cloud CLI or the Google Cloud console.\n\n- [Bigtable to Cloud Storage SequenceFile](/dataflow/docs/guides/templates/provided-batch#cloudbigtabletosequencefile)\n- [Cloud Storage SequenceFile to Bigtable](/dataflow/docs/guides/templates/provided-batch#sequencefiletocloudbigtable)\n\nImport from the Tables page\n---------------------------\n\nYou can execute many of the import methods described on this page using the\nGoogle Cloud console. Import the following types of data from the **Tables**\npage:\n\n- CSV data\n- BigQuery data\n- Avro files\n- Cassandra keyspaces and tables\n- Parquet files\n- SequenceFile files\n\n### Console\n\n1.\n Open the list of Bigtable instances in the Google Cloud console.\n\n\n [Open the instance list](https://console.cloud.google.com/bigtable/instances)\n2. Click the instance that contains the table you want to import.\n\n3. Click **Tables** in the left pane.\n\n The **Tables** page displays a list of tables in the instance.\n4. Next to the name of the table that you want to import data into, click\n the more_vert **Table action** menu.\n\n5. Click **Import data**, and then select the type of data that you want to\n import:\n\n - If you select Avro, Parquet, SequenceFile or Cassandra, the console displays a partly completed Dataflow template. Fill out the job template and click **Run job**.\n - If you select CSV, the `cbt` CLI terminal window opens. For more information, see the [Import CSV data](#csv) section of this document.\n - If you select BigQuery, BigQuery Studio opens. Fill out the reverse ETL query and run it.\n\nExport from the Tables page\n---------------------------\n\nYou can execute some of the export methods described on this page using the\nGoogle Cloud console. Export the following types of data from the **Tables**\npage:\n\n- Avro files\n- Parquet files\n- SequenceFile files\n\n### Console\n\n1.\n Open the list of Bigtable instances in the Google Cloud console.\n\n\n [Open the instance list](https://console.cloud.google.com/bigtable/instances)\n2. Click the instance that contains the table you want to export.\n\n3. Click **Tables** in the left pane.\n\n The **Tables** page displays a list of tables in the instance.\n4. Next to the name of the table, click the more_vert **Table action** menu.\n\n5. Click **Export data**, and then select the file type that you want to export.\n\n The console displays a partly completed Dataflow template.\n6. Fill out the job template and click **Run job**.\n\nImport CSV data\n---------------\n\nYou can import data from a CSV file into a Bigtable table by using\nthe\n`cbt` CLI\n. To do this, you need to make sure that your environment,\nsuch as Cloud Shell, can access the CSV file. You can get your CSV file\ninto Cloud Shell in one of the following ways:\n\n**Upload a local CSV file**:\n\n1. 
In Cloud Shell, click the more_vert **More** menu and select **Upload**.\n2. Select the CSV file from your local machine.\n3. After you upload the file, refer to the file by its name in the `cbt` CLI command.\n\n**Copy a CSV file from Cloud Storage**:\n\nThe\n`cbt` CLI\ndoes not directly support importing from a Cloud Storage\nbucket. You must first copy the CSV file from Cloud Storage to your\nCloud Shell environment. For more information, see\n[Upload an object to a bucket](/storage/docs/uploading-objects#uploading-an-object).\n\nAfter the CSV file is available in your environment, use the\n`cbt` CLI\n\ncommand to import the data. For a sample command, see\n[Batch write many rows based on the input file](/bigtable/docs/cbt-reference#batch_write_many_rows_based_on_the_input_file).\n\nWhat's next\n-----------\n\n- [Create, copy, or restore from a Bigtable backup.](/bigtable/docs/backups)\n- [Explore concepts related to designing a Bigtable schema.](/bigtable/docs/schema-design)\n- [Migrate to Bigtable](/bigtable/docs/migrate)."]]