Exporting data from Cloud SQL

This page describes how to export data from Cloud SQL instances or from a database server not managed by Cloud SQL.

Exports use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.

For best practices for exporting data, see Best Practices for Importing and Exporting Data.

CSV and mysqldump exports behave the same in Cloud SQL as they would on any other MySQL database.

Before you begin

Make sure you have configured the required roles and permissions. To export data into Cloud Storage, the instance's service account or user must have:

  • Either the Cloud SQL Editor role, or a custom role that includes the cloudsql.instances.export permission.
  • The roles/storage.legacyBucketWriter IAM role.
  • If the service account or user is also performing import operations, grant the cloudsql.instances.import permission, and the roles/storage.legacyObjectReader IAM role.

For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.

You can find the instance's service account name in the Google Cloud Console on your instance's Overview page. You can verify the roles for your Cloud Storage bucket by using the gsutil tool to inspect the bucket:

gsutil iam get gs://[BUCKET_NAME]

Learn more about using IAM with buckets.
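For example, assuming the bucket already exists, you can grant the bucket-level role to the service account with gsutil iam ch (the service account address and bucket name below are placeholders):

gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:roles/storage.legacyBucketWriter gs://[BUCKET_NAME]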

Exporting data from Cloud SQL to a SQL dump file in Cloud Storage

To create a SQL dump file, you export data from Cloud SQL to Cloud Storage. Once the file is in Cloud Storage, you can import it into another Cloud SQL database. You can also download data from Cloud Storage to your local environment if you want to access it locally.

Exporting from Cloud SQL uses the mysqldump utility with the --single-transaction and --hex-blob options. With the --single-transaction option, mysqldump starts a transaction before running. Rather than locking the entire database, this lets mysqldump read the database in its current state, making for a consistent data dump.

If your SQL dump file contains DEFINER clauses (views, triggers, stored procedures, and so on), then depending on the order in which these statements are executed, using this file for import could fail. Learn more about DEFINER usage and potential workarounds in Cloud SQL.
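One commonly used, but unofficial, workaround is to strip the DEFINER clauses from the dump file before importing it. The sed invocation below is a sketch that assumes the definer is root@localhost; verify the pattern against your own dump file first:

# Remove DEFINER clauses that reference root@localhost
# (an assumption; adjust the pattern to match your dump file, and keep a copy).
sed -i 's/DEFINER=`root`@`localhost`//g' sqldumpfile.sql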

To export data from a database on a Cloud SQL instance to a SQL dump file in a Cloud Storage bucket:

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Click the instance you want to export data from to open its Overview page.
  3. Click EXPORT in the button bar.
  4. Under File format, click SQL to create a SQL dump file.
  5. Under Data to export, click One or more databases in this instance to export specific databases.
  6. Use the drop-down menu to select the databases you want to export from.
  7. Under Destination, select Browse to search for a Cloud Storage bucket or folder for your export.
  8. Click Export to begin the export.

gcloud

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Describe the instance you are exporting from:
      gcloud sql instances describe [INSTANCE_NAME]
      
  3. Copy the serviceAccountEmailAddress field.
  4. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
  5. Export the database:
      gcloud sql export sql [INSTANCE_NAME] gs://[BUCKET_NAME]/sqldumpfile.gz \
                                  --database=[DATABASE_NAME] --offload
      

    The export sql command does not include triggers or stored procedures, but does include views. To export triggers and stored procedures, use the mysqldump tool.

    For more information about using the export sql command, see the sql export sql command reference page.

  6. If you do not need to retain the IAM role you set previously, revoke it now.
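Taken together, a minimal end-to-end sketch of this flow might look like the following (the instance, bucket, and database names are placeholders):

# Look up the instance's service account, grant it the bucket role,
# export the database, then revoke the grant.
SA=$(gcloud sql instances describe my-instance \
    --format='value(serviceAccountEmailAddress)')
gsutil iam ch "serviceAccount:${SA}:roles/storage.objectAdmin" gs://my-bucket
gcloud sql export sql my-instance gs://my-bucket/sqldumpfile.gz \
    --database=my-database --offload
gsutil iam ch -d "serviceAccount:${SA}:roles/storage.objectAdmin" gs://my-bucket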

REST v1beta4

  1. Create a bucket for the export:
    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  3. Export your database:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_dump_file: The path to the SQL dump file
    • database_name: The name of a database inside the Cloud SQL instance
    • offload: Enables serverless export. Set to true to use serverless export.

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export

    Request JSON body:

    {
     "exportContext":
       {
          "fileType": "SQL",
          "uri": "gs://bucket_name/path_to_dump_file",
          "databases": ["database_name"]
          "offload": true | false
        }
    }
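One way to send the request is with curl, as sketched below; this assumes the gcloud CLI is installed and authenticated, and uses the placeholder values described above:

# A sketch of sending the export request with curl.
# project-id, instance-id, bucket_name, and database_name are placeholders.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"exportContext": {"fileType": "SQL",
        "uri": "gs://bucket_name/path_to_dump_file",
        "databases": ["database_name"]}}' \
  "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"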
    


    You should receive a JSON response describing the export operation; the export runs asynchronously, so the response arrives before the export completes.

  4. If you do not need to retain the IAM role you set previously, revoke it now.
For the complete list of parameters for this request, see the instances:export page.

Exporting data from Cloud SQL to a CSV file in Cloud Storage

You can export your data in CSV format, which is usable by other tools and environments. Exports happen at the database level. During a CSV export, you can specify the schemas to export. All schemas under the database level are eligible for export.

To export data from a database on a Cloud SQL instance to a CSV file in a Cloud Storage bucket:

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Click the instance to open its Overview page.
  3. Click EXPORT.
  4. Under Cloud Storage export location, select a Cloud Storage bucket or folder for your export.
  5. In the Name field, provide a name for your export file and click Select.

    You can use a file extension of .gz to compress your export file.

  6. Set Format to CSV.
  7. Click Show advanced options.
  8. Under Database, select the name of the database from the drop-down menu.
  9. Under SQL query, enter a SQL query to specify the table to export data from.

    For example, to export the entire contents of the entries table in the guestbook database, you enter:

    SELECT * FROM guestbook.entries;

    Your query must specify a table in the specified database; you cannot export an entire database in CSV format.

  10. Click Export to start the export.
  11. The Export database? dialog box opens with a message that the export process can take an hour or more for large databases. During the export, the only operation you can perform on the instance is viewing information. You can't stop the export once it starts. If this is a good time to start an export, click EXPORT. Otherwise, click CANCEL.

gcloud

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Describe the instance you are exporting from:
    gcloud sql instances describe [INSTANCE_NAME]
    
  3. Copy the serviceAccountEmailAddress field.
  4. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
  5. Export the database:
    gcloud sql export csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
                                --database=[DATABASE_NAME] \
                                --offload \
                                --query=[SELECT_QUERY]
    

    For information about using the export csv command, see the sql export csv command reference page.

  6. If you do not need to retain the IAM role you set previously, revoke it now.
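For example, to export the entries table from the guestbook database used elsewhere on this page (the instance and bucket names are placeholders):

gcloud sql export csv my-instance gs://my-bucket/entries.csv \
    --database=guestbook --offload \
    --query='SELECT * FROM guestbook.entries'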

REST v1beta4

  1. Create a bucket for the export:
    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  3. Export your database:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_csv_file: The path to the CSV file
    • database_name: The name of a database inside the Cloud SQL instance
    • offload: Enables serverless export. Set to true to use serverless export.
    • select_query: SQL query for export

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export

    Request JSON body:

    {
     "exportContext":
       {
          "fileType": "CSV",
          "uri": "gs://bucket_name/path_to_csv_file",
          "databases": ["database_name"],
          "offload": true | false
          "csvExportOptions":
           {
               "selectQuery":"select_query"
           }
       }
    }
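As with the SQL export, one way to send the request is with curl (a sketch; the placeholder values are as described above):

# A sketch of sending the CSV export request with curl.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"exportContext": {"fileType": "CSV",
        "uri": "gs://bucket_name/path_to_csv_file",
        "databases": ["database_name"],
        "csvExportOptions": {"selectQuery": "select_query"}}}' \
  "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"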
    


    You should receive a JSON response describing the export operation; the export runs asynchronously, so the response arrives before the export completes.

    If your select query specifies a database, it overrides the databases property.

  4. If you do not need to retain the IAM role you set previously, revoke it now.
For the complete list of parameters for this request, see the instances:export page.

Exporting in CSV format is equivalent to running the following SQL statement:

      SELECT [QUERY] INTO OUTFILE ... CHARACTER SET 'utf8mb4'
            FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
            ESCAPED BY '\"' LINES TERMINATED BY '\n'


Exporting data from your local MySQL server to a CSV file

To export a MySQL database that is not managed by Cloud SQL to a CSV file, for later import into Cloud SQL, use the following command:

mysql --host=[INSTANCE_IP] --user=[USER_NAME] --password [DATABASE] \
-e " SELECT * FROM [TABLE] INTO OUTFILE '[FILE_NAME]' CHARACTER SET 'utf8mb4'
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"' "

Exporting data from your local MySQL server using mysqldump

If you are exporting data from an on-premises MySQL database for import into a Cloud SQL database, you must use the mysqldump utility with the following flags:

  • --databases You must use the --databases option to specify an explicit list of databases to export, and this list must not contain the mysql system database.
  • --hex-blob If your database contains any binary fields, you must use this flag to ensure that your binary fields are imported correctly.
  • --set-gtid-purged=OFF GTID information must not be included in the SQL dump file, and binary logging must not be disabled by the SQL dump file. (Not required for MySQL 5.5 or external replication.)
  • --single-transaction Starts a transaction before running. Rather than locking the entire database, this lets mysqldump read the database in its current state, making for a consistent data dump.

Standard configuration

From a command line, run mysqldump:

mysqldump --databases [DATABASE_NAME] -h [INSTANCE_IP] -u [USERNAME] -p \
--hex-blob --single-transaction --set-gtid-purged=OFF \
--default-character-set=utf8mb4 > [SQL_FILE].sql

For help with mysqldump, see the mysqldump reference.

External replication

To create a dump file for use in an external server configuration, see Replicating from an external server.

Automating export operations

Although Cloud SQL doesn't provide a built-in way to automate database exports, you can build your own automation tool using several Google Cloud components. To learn more, see this tutorial.
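As an illustrative sketch only, not an official recipe, you could have Cloud Scheduler call the Admin API export endpoint directly with a service account's OAuth token (the job name, schedule, and request body below are hypothetical placeholders):

# Hypothetical weekly export job. The service account must hold the
# cloudsql.instances.export permission; adjust names and schedule as needed.
gcloud scheduler jobs create http weekly-sql-export \
    --schedule='0 3 * * 1' \
    --http-method=POST \
    --uri='https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export' \
    --oauth-service-account-email=[SERVICE_ACCOUNT_EMAIL] \
    --headers=Content-Type=application/json \
    --message-body='{"exportContext": {"fileType": "SQL", "uri": "gs://bucket_name/sqldumpfile.gz", "databases": ["database_name"]}}'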

Troubleshooting

Click the links in the table for details:

| For this problem... | The issue might be... | Try this... |
|---|---|---|
| Can't see the operation status. | The user interface only shows success or failure. | Use these database commands to find out more. |
| 408 Error (Timeout) during export. | SQL export can take a long time depending on database size and export content. | Use multiple CSV exports to reduce the size of each operation. |
| CSV export worked but SQL export failed. | SQL export is more likely to encounter compatibility issues with Cloud SQL. | Use CSV exports to export only what you need. |
| Export is taking too long. | Cloud SQL does not support concurrent synchronous operations. | Use export offloading. Learn more. |
| Error 1412: Table definition has changed. | The table changed during export. | Remove any table change statements from the dump operation. |
| Connection closed during the export operation. | The query must produce data within the first seven minutes. | Test the query manually. Learn more. |
| Unknown error during export. | Possible bandwidth issue. | Ensure that both the instance and the Cloud Storage bucket are in the same region. |
| You want to automate exports. | Cloud SQL does not provide a way to automate exports. | Build your own pipeline to perform this functionality. Learn more. |
| Access denied; you need (at least one of) the SUPER privilege(s) for this operation. | An event, a view, a function, or a procedure in the dump file uses super user@localhost (such as root@localhost), which is not supported by Cloud SQL. | Learn more about DEFINER usage and potential workarounds in Cloud SQL. |

Can't see the operation status

You can't see the status of an ongoing operation.

The issue might be

The Google Cloud Console reports only success or failure when done, and is not designed to return warnings.

Things to try

Connect to the database and run SHOW WARNINGS.


408 Error (Timeout) during export

You see the error message 408 Error (Timeout) while performing an export job in Cloud SQL.

The issue might be

CSV and SQL formats export differently. The SQL format exports the entire database and likely takes longer to complete. The CSV format lets you define which elements of the database to include in the export.

Things to try

Use the CSV format, and run multiple, smaller export jobs to reduce the size and length of each operation.
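For example, a sketch of splitting one large table export into ranged chunks (the table, key column, and ranges are hypothetical):

# Two smaller CSV exports over disjoint id ranges instead of one large export.
gcloud sql export csv my-instance gs://my-bucket/entries-1.csv \
    --database=guestbook \
    --query='SELECT * FROM entries WHERE id BETWEEN 1 AND 1000000'
gcloud sql export csv my-instance gs://my-bucket/entries-2.csv \
    --database=guestbook \
    --query='SELECT * FROM entries WHERE id > 1000000'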


CSV export worked but SQL export failed

CSV export worked but SQL export failed.

The issue might be

CSV and SQL formats export differently. The SQL format exports the entire database and likely takes longer to complete. The CSV format lets you define which elements of the database to include in the export.

Things to try

Use CSV exports to export only what you need.


Export is taking too long

Export is taking too long, blocking other operations.

The issue might be

Cloud SQL does not support concurrent synchronous operations.

Things to try

Try exporting smaller datasets at a time.


mysqldump: Error 1412: Table definition has changed

You see the error message mysqldump: Error 1412: Table definition has changed, retry transaction when dumping the table.

The issue might be

During the export process, there was a change in the table.

Things to try

The dump transaction can fail if you use the following statements during the export operation:

  • ALTER TABLE
  • CREATE TABLE
  • DROP TABLE
  • RENAME TABLE
  • TRUNCATE TABLE
Remove any of these statements from the dump operation.


Connection closed during the export operation

Connection closed during the export operation.

The issue might be

The connection to Cloud Storage may be timing out because the query running in the export does not produce any data within the first seven minutes after the export is initiated.

Things to try

Test the query manually by connecting from any client and sending the output of your query to STDOUT, for example:

mysql --host=[INSTANCE_IP] --user=[USER_NAME] --password [DATABASE_NAME] -e "[SELECT_QUERY]"

This is expected behavior: when the export is initiated, the client is expected to start sending data right away. Keeping the connection open with no data sent eventually breaks the connection, causing the export to fail and leaving the operation in an uncertain state. This is also what the following error message from gcloud indicates:

operation is taking longer than expected.


Unknown error during export

You see the error message Unknown error while trying to export a database to a Cloud Storage bucket.

The issue might be

The transfer might be failing due to a bandwidth issue.

Things to try

The Cloud SQL instance may be located in a different region from the Cloud Storage bucket. Reading and writing data from one continent to another involves a lot of network usage, and can cause intermittent issues like this. Check the regions of your instance and bucket.


Want to automate exports

You want to automate exports.

The issue might be

Cloud SQL does not provide a way to automate exports.

Things to try

You could build your own automated export system using Google Cloud products such as Cloud Scheduler, Pub/Sub, and Cloud Functions.


ERROR_RDBMS system error occurred

You see the error message [ERROR_RDBMS] system error occurred.

The issue might be

  • The user might not have all the Cloud Storage permissions it needs.
  • The database table might not exist.

Things to try

  1. Check that you have at least WRITER permissions on the bucket and READER permissions on the export file. For more information on configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
  2. Ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket.

Access denied; you need (at least one of) the SUPER privilege(s) for this operation

You see the error Access denied; you need (at least one of) the SUPER privilege(s) for this operation.

The issue might be

There could be an event, a view, a function, or a procedure in the dump file using super user@localhost (such as root@localhost). This is not supported by Cloud SQL.

Things to try

Refer to this document on importing a database with DEFINER clauses.

What's next