Exporting data

This page describes how to export data from Cloud SQL instances or from a database server not managed by Cloud SQL.

You can export data from Cloud SQL to Cloud Storage. You can download data from Cloud Storage to your local environment if you want to access it locally.

To export data to Cloud Storage, the instance's service account needs the legacyBucketWriter Cloud IAM role for the bucket you are exporting to. For more information, see Cloud Identity and Access Management for Cloud Storage.

For best practices for exporting data, see Best Practices for Importing and Exporting Data.

You can export data as a CSV file or as a SQL dump file (for example, to import into another SQL database).

Exporting data to a SQL dump file

When you use Cloud SQL to perform an export, whether from the Cloud Console, the gcloud command-line tool, or the API, you are using the pg_dump utility, with the options required to ensure that the resulting export file is valid for import back into Cloud SQL.

You can also run pg_dump manually if you are exporting a database not managed by Cloud SQL. However, if you plan to import your data into Cloud SQL, you must follow the instructions in Exporting data from an externally managed database server.

Exporting data using Cloud SQL

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Click the instance you want to export data from to open its Instance details page.
  3. Click Export in the button bar.
  4. Under Cloud Storage export location, select a Cloud Storage bucket or folder for your export.
  5. In the Name field, provide a name for your export file and click Select.
  6. For Format, select SQL.
  7. Select the name of the database you want to export.
  8. Click Export to start the export.

gcloud

To export a dump file from an instance to a Cloud Storage bucket:

  1. Create a bucket for the export:
      gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
      

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Describe the instance you are exporting from:
      gcloud sql instances describe [INSTANCE_NAME]
      
  3. Copy the serviceAccountEmailAddress field.
  4. Use gsutil iam to grant the legacyBucketWriter Cloud IAM role to the service account for the bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  5. Export the database:
      gcloud sql export sql [INSTANCE_NAME] gs://[BUCKET_NAME]/sqldumpfile.gz \
          --database=[DATABASE_NAME]
      

    For information about using the export sql command, see the sql export sql command reference page.

  6. If you do not need to retain the Cloud IAM role you set previously, remove it now.
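The role grant in step 4 is a single gsutil iam ch command. The sketch below only assembles and prints that command; the service account address and bucket name are placeholder values (the real address is the serviceAccountEmailAddress you copied in step 3):

```shell
# Placeholder values -- substitute the serviceAccountEmailAddress you copied
# in step 3 and the bucket you created in step 1.
SERVICE_ACCOUNT="p123456789-abcdef@gcp-sa-cloud-sql.iam.gserviceaccount.com"
BUCKET="my-export-bucket"

# gsutil iam ch adds a role binding for the service account on the bucket;
# roles/storage.legacyBucketWriter is the full name of the legacyBucketWriter role.
CMD="gsutil iam ch serviceAccount:${SERVICE_ACCOUNT}:roles/storage.legacyBucketWriter gs://${BUCKET}"
echo "${CMD}"
```

Running the printed command (rather than echoing it) applies the grant; step 6 reverses it with gsutil iam ch -d once the export finishes.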

REST

  1. Create a bucket for the export:
    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Provide your instance with the legacyBucketWriter Cloud IAM role for your bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  3. Export your database:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_dump_file: The path to the SQL dump file
    • database_name_1: The name of a database inside the Cloud SQL instance
    • database_name_2: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export

    Request JSON body:

    {
      "exportContext":
        {
          "fileType": "SQL",
          "uri": "gs://bucket_name/path_to_dump_file",
          "databases": ["database_name_1", "database_name_2"]
        }
    }
    


  4. If you do not need to retain the Cloud IAM role you set previously, remove it now.
For the complete list of parameters for this request, see the instances:export page.
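The request in step 3 can also be assembled from the command line and sent with curl. The sketch below only builds and prints the URL and body; the project, instance, bucket, and database names are made-up placeholders, and authentication is shown only in the trailing comment:

```shell
# Placeholder values -- substitute your own.
PROJECT_ID="my-project"
INSTANCE_ID="my-instance"
BUCKET="my-export-bucket"

# Endpoint for the instances.export method (v1beta4).
URL="https://www.googleapis.com/sql/v1beta4/projects/${PROJECT_ID}/instances/${INSTANCE_ID}/export"

# Request body with the same exportContext shape as the JSON in step 3.
BODY='{"exportContext": {"fileType": "SQL", "uri": "gs://'"${BUCKET}"'/sqldumpfile.gz", "databases": ["guestbook"]}}'

echo "${URL}"
echo "${BODY}"

# To actually send the request:
#   curl -X POST \
#     -H "Authorization: Bearer $(gcloud auth print-access-token)" \
#     -H "Content-Type: application/json; charset=utf-8" \
#     -d "${BODY}" "${URL}"
```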

Exporting data from an externally managed database server

To export a database that is not managed by Cloud SQL, you use the pg_dump utility with the following flags:

  • --no-owner

    Ownership-change commands must not be included in the SQL dump file; this flag omits them.

  • --format=plain

    Only plain SQL format is supported by Cloud SQL.

  • --no-acl

    This flag is required if your dump would otherwise contain statements to grant or revoke membership in a SUPERUSER role.

In addition, you must remove all of the following:

  • Extension-related statements, for any extension that Cloud SQL does not support. See PostgreSQL Extensions for the list of supported extensions.
  • CREATE EXTENSION or DROP EXTENSION statements referencing plpgsql. This extension comes pre-installed on Cloud SQL Postgres instances.
  • COMMENT ON EXTENSION statements.

From a command line, run pg_dump:

pg_dump -U [USERNAME] --format=plain --no-owner --no-acl [DATABASE_NAME] \
    | sed -E 's/(DROP|CREATE|COMMENT ON) EXTENSION/-- \1 EXTENSION/g' > [SQL_FILE].sql

The sed post-processing comments out all CREATE EXTENSION, DROP EXTENSION, and COMMENT ON EXTENSION statements in the SQL dump file.
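To check what the filter does before running it against a full dump, you can pipe a few sample lines through the same sed expression. The sample statements below are made up, but match the kind of lines pg_dump emits:

```shell
# Three sample lines: only the extension statements should end up commented out.
sample='CREATE EXTENSION IF NOT EXISTS plpgsql WITH SCHEMA pg_catalog;
COMMENT ON EXTENSION plpgsql IS '\''PL/pgSQL procedural language'\'';
SELECT 1;'

# The same sed expression used in the pg_dump pipeline.
filtered=$(printf '%s\n' "$sample" \
  | sed -E 's/(DROP|CREATE|COMMENT ON) EXTENSION/-- \1 EXTENSION/g')

printf '%s\n' "$filtered"
```

The first two lines come out prefixed with the SQL comment marker, while ordinary statements pass through unchanged.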

Confirm that the default encoding, as determined by the database settings, is correct for your data. If needed, you can override the default with the --encoding flag.

Exporting data to a CSV file

You can export your data in CSV format, which is usable by other tools and environments.

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Click the instance to open its Instance details page.
  3. Click Export.
  4. Under Cloud Storage export location, select a Cloud Storage bucket or folder for your export.
  5. In the Name field, provide a name for your export file and click Select.

    You can use a file extension of .gz to compress your export file.

  6. Set Format to CSV.
  7. Select the database you want to export from.
  8. Enter a SQL query to specify the data to export.

    For example, to export the entire contents of the entries table in the guestbook database, you would enter SELECT * FROM guestbook.entries;. Your query must specify a table in the specified database; you cannot export an entire database in CSV format.

  9. Click Export to start the export.

gcloud

  1. Create a bucket for the export:
    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Describe the instance you are exporting from:
    gcloud sql instances describe [INSTANCE_NAME]
    
  3. Use gsutil iam to grant the legacyBucketWriter Cloud IAM role to the service account for the bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  4. Export the database:
    gcloud sql export csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
        --database=[DATABASE_NAME] --query=[SELECT_QUERY]
    

    For information about using the export csv command, see the sql export csv command reference page.

  5. If you do not need to retain the Cloud IAM permissions you set previously, remove them now.

REST

  1. Create a bucket for the export:
    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Provide your instance with the legacyBucketWriter Cloud IAM role for your bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  3. Export your database:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_csv_file: The path to the CSV file
    • database_name: The name of a database inside the Cloud SQL instance
    • select_query: SQL query for export

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export

    Request JSON body:

    {
      "exportContext":
        {
          "fileType": "CSV",
          "uri": "gs://bucket_name/path_to_csv_file",
          "databases": ["database_name"],
          "csvExportOptions":
            {
              "selectQuery": "select_query"
            }
        }
    }
    


    You must specify exactly one database with the databases property, and if the select query specifies a database, it must be the same.

  4. If you do not need to retain the Cloud IAM permissions you set previously, remove them now.
For the complete list of parameters for this request, see the instances:export page.
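As with the SQL export, the CSV request can be built and sent from the command line. The sketch below only assembles and prints the URL and body; all names are placeholders, and note that the databases array contains exactly one entry, as the API requires for CSV exports:

```shell
# Placeholder values -- substitute your own.
PROJECT_ID="my-project"
INSTANCE_ID="my-instance"
BUCKET="my-export-bucket"
QUERY="SELECT * FROM entries;"

URL="https://www.googleapis.com/sql/v1beta4/projects/${PROJECT_ID}/instances/${INSTANCE_ID}/export"

# Exactly one database in "databases"; the query runs inside that database.
BODY='{"exportContext": {"fileType": "CSV", "uri": "gs://'"${BUCKET}"'/entries.csv", "databases": ["guestbook"], "csvExportOptions": {"selectQuery": "'"${QUERY}"'"}}}'

echo "${URL}"
echo "${BODY}"
```

The body can then be POSTed with curl, passing an OAuth 2.0 access token in the Authorization header, exactly as for the SQL export.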

CSV export creates standard CSV output. If you need a non-standard CSV format, you can use the following statement in a psql client:

  \copy [table_name] TO '[csv_file_name].csv' WITH
      (FORMAT csv, ESCAPE '[escape_character]', QUOTE '[quote_character]',
      DELIMITER '[delimiter_character]', ENCODING 'UTF8', NULL '[null_marker_string]');
To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:export page.

What's next