This page describes how to export data from an AlloyDB database to a CSV file.
The procedure to perform the export involves these tasks:
- Create a Cloud Storage bucket to store the CSV file in.
- Prepare a client host to perform the export operation.
- Export data to a CSV file on the client host, and then copy it to the storage bucket.
- Clean up the resources created to perform the procedure.
Before you begin
- You must have the Owner (roles/owner) or Editor (roles/editor) basic IAM role in the Google Cloud project you are using, or you must have these predefined IAM roles:
  - AlloyDB Admin (roles/alloydb.admin) or AlloyDB Viewer (roles/alloydb.viewer)
  - Storage Admin (roles/storage.admin)
  - Compute Instance Admin (v1) (roles/compute.instanceAdmin.v1)
Create a Cloud Storage bucket
Create a standard storage, regional storage bucket in the project and region where your AlloyDB database is located.
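For example, the bucket can be created with the gcloud CLI; this is a sketch in which BUCKET_NAME and REGION are placeholders you choose, and REGION should match the region of your AlloyDB cluster:

```shell
# Create a standard-class, regional bucket in the same region as the
# AlloyDB cluster (placeholders: BUCKET_NAME, REGION).
gcloud storage buckets create gs://BUCKET_NAME \
    --location=REGION \
    --default-storage-class=STANDARD \
    --uniform-bucket-level-access
```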
Prepare a client host
To prepare a client host to perform the export operation, you create a Compute Engine VM that can connect to the AlloyDB primary instance where your database is located, and then install the psql tool and the Google Cloud CLI on that VM.
- Follow the instructions in Connect a psql client to an instance to create a Compute Engine VM with the proper connectivity and the psql tool installed. When following these instructions, make sure to allocate enough local storage on the Compute Engine VM to accommodate the CSV file you will create.
- Install the gcloud CLI to provide command-line access for copying the CSV file to the Cloud Storage bucket.
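On a Debian-based Compute Engine VM, the client tooling can be set up along these lines; this is a sketch that assumes Debian/Ubuntu package names:

```shell
# Install the PostgreSQL client package, which provides psql
# (assumes a Debian-based image).
sudo apt-get update
sudo apt-get install -y postgresql-client

# Verify that both tools are available before continuing.
psql --version
gcloud --version
```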
Export data
To export data to a CSV file, you get the IP address of the AlloyDB primary instance where your database is located and then run the psql tool's \copy command.
- Get the IP address of the AlloyDB primary instance where your database is located by viewing its details.
- SSH into the Compute Engine VM.
Console
- In the Google Cloud console, go to the VM instances page.
- In the list of virtual machine instances, click SSH in the row of the instance you created.
gcloud
Use the gcloud compute ssh command to connect to the instance you created:
gcloud compute ssh --project=PROJECT_ID --zone=ZONE VM_NAME
Replace the following:
- PROJECT_ID: The ID of the project that contains the instance.
- ZONE: The name of the zone in which the instance is located.
- VM_NAME: The name of the instance.
- Export data to a CSV file:
- Run the psql client tool, and then, at the psql prompt, connect to the database:
  psql -h IP_ADDRESS -U postgres
  \c DB_NAME
  You will be prompted to enter the password of the postgres user. Replace the following:
  - IP_ADDRESS: The IP address of the primary instance.
  - DB_NAME: The name of the database where the data you want to export is located.
- Run the \copy command to export data to a CSV file on the client host's local file system:
  \copy TABLE_NAME TO 'CSV_FILE_NAME' WITH DELIMITER ',' CSV HEADER;
  Replace the following:
  - TABLE_NAME: The name of the table whose data you want to export.
  - CSV_FILE_NAME: A file name for the CSV file to create on the client host's local file system.
  For information about the wide range of options that the \copy command supports in psql, see the documentation for COPY, its underlying PostgreSQL command.
- Enter \q to exit psql, and then copy the CSV file to the Cloud Storage bucket you created earlier:
  gcloud storage cp CSV_FILE_NAME gs://BUCKET_NAME
  Replace BUCKET_NAME with the name of the bucket you created earlier.
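The first step above, getting the primary instance's IP address, can also be done from the command line instead of the console. A sketch using the gcloud CLI, where INSTANCE_ID, CLUSTER_ID, and REGION are placeholders identifying your primary instance:

```shell
# Print the primary instance's IP address
# (placeholders: INSTANCE_ID, CLUSTER_ID, REGION).
gcloud alloydb instances describe INSTANCE_ID \
    --cluster=CLUSTER_ID \
    --region=REGION \
    --format="value(ipAddress)"
```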
Clean up resources
After successfully exporting the CSV file and copying it to the storage bucket, you can delete the Compute Engine VM you used during the export procedure.
- Run the gcloud compute instances delete command to delete the VM.
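The VM can be deleted with the gcloud CLI; a sketch with placeholder values:

```shell
# Delete the client VM once the CSV file is safely in Cloud Storage
# (placeholders: VM_NAME, ZONE, PROJECT_ID).
gcloud compute instances delete VM_NAME \
    --zone=ZONE \
    --project=PROJECT_ID
```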