This page lists available methods for importing data into and exporting data from Cloud Bigtable.
Backing up, moving, or copying data
Avro files
The following Dataflow templates allow you to export data from
Cloud Bigtable as Avro files and then import the data back into
Cloud Bigtable. You can execute the templates by using the gcloud
command-line tool or the Google Cloud Console. The source code
is on GitHub.
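For example, you might start an Avro export job and a matching import job with the gcloud command-line tool, as in the following sketch. The job names, region, bucket, and resource IDs are placeholders, and you should confirm the template paths and parameter names against the template reference documentation before running.

    # Export a Cloud Bigtable table to Avro files in Cloud Storage (placeholder values)
    gcloud dataflow jobs run my-avro-export-job \
        --gcs-location gs://dataflow-templates/latest/Cloud_Bigtable_to_GCS_Avro \
        --region us-central1 \
        --parameters bigtableProjectId=my-project,bigtableInstanceId=my-instance,bigtableTableId=my-table,outputDirectory=gs://my-bucket/avro-export,filenamePrefix=my-table-

    # Import the exported Avro files back into a Cloud Bigtable table (placeholder values)
    gcloud dataflow jobs run my-avro-import-job \
        --gcs-location gs://dataflow-templates/latest/GCS_Avro_to_Cloud_Bigtable \
        --region us-central1 \
        --parameters bigtableProjectId=my-project,bigtableInstanceId=my-instance,bigtableTableId=my-table,inputFilePattern=gs://my-bucket/avro-export/my-table-*

The destination table for an import must already exist and have the same column families as the exported table.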
SequenceFiles
The following Dataflow templates allow you to export data from
Cloud Bigtable as SequenceFiles and then import the data back into
Cloud Bigtable. You can execute the templates by using the gcloud
command-line tool or the Google Cloud Console.
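As with the Avro templates, you can start a SequenceFile export from the command line. The following is a sketch only; the job name, region, bucket, and resource IDs are placeholders, and the parameter names, which differ slightly from the Avro templates, should be checked against the template reference before running.

    # Export a Cloud Bigtable table as SequenceFiles in Cloud Storage (placeholder values)
    gcloud dataflow jobs run my-sequencefile-export-job \
        --gcs-location gs://dataflow-templates/latest/Cloud_Bigtable_to_GCS_SequenceFile \
        --region us-central1 \
        --parameters bigtableProject=my-project,bigtableInstanceId=my-instance,bigtableTableId=my-table,destinationPath=gs://my-bucket/sequencefile-export,filenamePrefix=my-table-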
Exporting from the Tables page
You can also export Avro files or SequenceFiles directly from the Tables page in the Cloud Console:
1. Open the list of Cloud Bigtable instances in the Cloud Console.
2. Click the instance that contains the table you want to export.
3. Click Tables in the left pane. The Tables page displays a list of tables in the instance.
4. Next to the name of the table you want to export, click the Overflow menu.
5. Hold the pointer over Export to, then click the file type you want. The console displays a partly completed Dataflow template.
6. Fill out the rest of the form and click Run job.
Migrating data from another database to Cloud Bigtable
HBase
If you need to migrate data from HBase to Cloud Bigtable, use the following tutorial, which offers variants for different types of data:
Importing CSV data
Follow this tutorial to learn how to import a CSV file into Cloud Bigtable:
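Separately from the tutorial, if your CSV file is small and already has the row key in its first column, one possible shortcut is the cbt command-line tool's import command. The following is a sketch under the assumption that cbt is installed and configured; the project, instance, table, file name, and column family are placeholders, and the exact cbt import syntax should be verified against the cbt reference.

    # Create the destination table and a column family, then import the CSV (placeholder values)
    cbt -project my-project -instance my-instance createtable my-table
    cbt -project my-project -instance my-instance createfamily my-table cf1
    cbt -project my-project -instance my-instance import my-table my-data.csv column-family=cf1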