Exporting data from Cloud SQL

This page describes how to export data from Cloud SQL instances or from a database server not managed by Cloud SQL.

You can export data from Cloud SQL to Cloud Storage. You can also download data from Cloud Storage to your local environment if you want to access it locally.
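
For example, after an export completes, you could copy the exported file from Cloud Storage to your local machine with gsutil; this is a sketch in which the bucket and file names are placeholders:

  gsutil cp gs://[BUCKET_NAME]/[EXPORT_FILE_NAME] .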

To export data to Cloud Storage, the instance's service account must have the storage.objectAdmin Cloud IAM role set in the project. For more information, see Cloud Identity and Access Management for Cloud Storage.

You can export data as a CSV file. You can also export data as a SQL dump file, for example, if you plan to import it into another SQL database.

For best practices for exporting data, see Best Practices for Importing and Exporting Data.

Use the same SQL Mode for import and export

The SQL Mode setting affects how Cloud SQL interprets SQL queries. For example, if you export from a database without Strict SQL enabled and then try to import into Cloud SQL (which enables Strict SQL by default), the import might fail. The best practice is to use the same SQL Mode on import that you used for export.

Review the SQL Mode on both the source and target databases for compatibility. Pay particular attention to the flags that enable Strict SQL mode. If Strict SQL is not set on your source database, you will likely want to remove it from your Cloud SQL instance. If you remove Strict SQL, you must set the sql_mode flag to another value.

To verify that your Cloud SQL instance has the desired mode set, run SELECT @@GLOBAL.sql_mode;.
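
If you need to change the mode on the Cloud SQL instance, one way is to update the sql_mode database flag with gcloud. The following is a minimal sketch, assuming a single-value mode such as NO_ENGINE_SUBSTITUTION; note that --database-flags replaces all database flags currently set on the instance and may restart it:

  gcloud sql instances patch [INSTANCE_NAME] \
      --database-flags=sql_mode=NO_ENGINE_SUBSTITUTION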

Exporting data using Cloud SQL

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Click the instance you want to export data from to open its Instance details page.
  3. Click Export in the button bar.
  4. Under Cloud Storage export location, select a Cloud Storage bucket or folder for your export.
  5. In the Name field, provide a name for your export file and click Select.
  6. For Format, select SQL.
  7. Click Show advanced options and enter a comma-separated list of all the databases that you want to export. Do not include system databases.
  8. Click Export to start the export.

gcloud

To export a dump file from an instance to a Cloud Storage bucket:

  1. Create a bucket for the export:
      gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
      

    This step is not required, but strongly recommended, so you do not open up access to any other data.

  2. Describe the instance you are exporting from:
      gcloud sql instances describe [INSTANCE_NAME]
      
  3. Copy the serviceAccountEmailAddress field.
  4. Use gsutil iam to grant the storage.objectAdmin Cloud IAM role to the service account for the bucket; an example command is shown after these steps. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  5. Export the database:
      gcloud sql export sql [INSTANCE_NAME] gs://[BUCKET_NAME]/sqldumpfile.gz \
                                  --database=[DATABASE_NAME]
      

    For information about using the export sql command, see the sql export sql command reference page.

  6. If you do not need to retain the Cloud IAM role you set previously, remove it now.
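
As an example of steps 4 and 6, the grant and later removal might look like the following sketch, where the bucket name and the service account address you copied in step 3 are placeholders:

  # Grant the objectAdmin role (storage.objectAdmin) to the instance's service account.
  gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:objectAdmin gs://[BUCKET_NAME]

  # Remove the grant once the export has finished.
  gsutil iam ch -d serviceAccount:[SERVICE_ACCOUNT_EMAIL]:objectAdmin gs://[BUCKET_NAME]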

REST

  1. Create a bucket for the export:
    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but strongly recommended, so you do not open up access to any other data.

  2. Provide your instance with the storage.objectAdmin Cloud IAM role for your bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  3. Export your database:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_dump_file: The path to the SQL dump file
    • database_name_1: The name of a database inside the Cloud SQL instance
    • database_name_2: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export

    Request JSON body:

    {
     "exportContext":
       {
          "fileType": "SQL",
          "uri": "gs://bucket_name/path_to_dump_file",
          "databases": ["database_name_1", "database_name_2"]
        }
    }
    

    To send the request, use an HTTP client such as curl; a sample command is shown after these steps. A successful request returns a JSON response that describes the export operation.

  4. If you do not need to retain the Cloud IAM role you set previously, remove it now.
For the complete list of parameters for this request, see the instances:export page.
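
For example, assuming the request body above is saved in a local file named request.json (a hypothetical file name), you could send the request with curl as in this sketch:

  curl -X POST \
       -H "Authorization: Bearer $(gcloud auth print-access-token)" \
       -H "Content-Type: application/json; charset=utf-8" \
       -d @request.json \
       "https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"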

Exporting data to a CSV file

You can export your data in CSV format, which is usable by other tools and environments. Exports happen at the database level. During a CSV export, you can specify the schemas to export. All schemas under the database level are eligible for export.

You cannot import a CSV file that was created from a MySQL instance into a PostgreSQL or SQL Server instance, or vice versa.

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Click the instance to open its Instance overview page.
  3. Click Export.
  4. Under Cloud Storage export location, select a Cloud Storage bucket or folder for your export.
  5. In the Name field, provide a name for your export file and click Select.

    You can use a file extension of .gz to compress your export file.

  6. Set Format to CSV.
  7. Enter a SQL query to specify the data to export.

    For example, to export the entire contents of the entries table in the guestbook database, you would enter SELECT * FROM guestbook.entries;. Your query must specify a table; you cannot export an entire database in CSV format.

  8. Click Export to start the export.

gcloud

  1. Create a bucket for the export:
    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but strongly recommended, so you do not open up access to any other data.

  2. Describe the instance you are exporting from:
    gcloud sql instances describe [INSTANCE_NAME]
    
  3. Use gsutil iam to grant the storage.objectAdmin Cloud IAM role to the service account for the bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  4. Export the database:
    gcloud sql export csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
                                --database=[DATABASE_NAME] --query=[SELECT_QUERY]
    

    For information about using the export csv command, see the sql export csv command reference page. A worked example with placeholder values follows these steps.

  5. If you do not need to retain the Cloud IAM permissions you set previously, remove them now.
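
For example, using the guestbook database from the console instructions, a complete command might look like the following sketch (the instance and bucket names are placeholders):

  gcloud sql export csv [INSTANCE_NAME] gs://[BUCKET_NAME]/guestbook-entries.csv \
      --database=guestbook \
      --query="SELECT * FROM entries"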

REST

  1. Create a bucket for the export:
    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but strongly recommended, so you do not open up access to any other data.

  2. Provide your instance with the storage.objectAdmin Cloud IAM role for your bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  3. Export your database:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_csv_file: The path to the CSV file
    • database_name: The name of a database inside the Cloud SQL instance
    • select_query: SQL query for export

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export

    Request JSON body:

    {
     "exportContext":
       {
          "fileType": "CSV",
          "uri": "gs://bucket_name/path_to_csv_file",
          "databases": ["database_name"],
          "csvExportOptions":
           {
               "selectQuery":"select_query"
           }
       }
    }
    

    To send the request, use an HTTP client such as curl, as shown in the SQL export example earlier on this page. A successful request returns a JSON response that describes the export operation.

    If your select query specifies a database, it overrides the databases property.

  4. If you do not need to retain the Cloud IAM permissions you set previously, remove them now.
For the complete list of parameters for this request, see the instances:export page.

Exporting in CSV format is equivalent to running the following SQL statement:

  SELECT <query> INTO OUTFILE ... CHARACTER SET 'utf8mb4'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        ESCAPED BY '\\' LINES TERMINATED BY '\n'

Creating a CSV file from your local MySQL server

mysql --host=[INSTANCE_IP] --user=[USER_NAME] --password [DATABASE] \
-e " SELECT * FROM [TABLE] INTO OUTFILE '[FILE_NAME]' CHARACTER SET 'utf8mb4'
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"' "

Exporting using mysqldump

You can run the mysqldump utility directly against your Cloud SQL database, using whatever options you require. However, if you are exporting from Cloud SQL in order to import into another Cloud SQL database, you must use the mysqldump utility with the following flags:

  • --databases You must use the --databases option to specify an explicit list of databases to export, and this list must not contain the mysql system database.
  • --hex-blob If your database contains any binary fields, you must use this flag to ensure that your binary fields are imported correctly.
  • --set-gtid-purged=OFF GTID information must not be included in the SQL dump file, nor should binary logging be disabled by the SQL dump file. (Not required for MySQL 5.5 or external replication.)
  • --single-transaction Starts the export inside a single transaction, so mysqldump reads a consistent snapshot of the data instead of locking the tables for the duration of the dump. This option applies only to transactional storage engines such as InnoDB.

Standard configs

From a command line, run mysqldump:

mysqldump --databases [DATABASE_NAME] -h [INSTANCE_IP] -u [USERNAME] -p \
--hex-blob --single-transaction --set-gtid-purged=OFF \
--default-character-set=utf8mb4 > [SQL_FILE].sql

For more information about mysqldump, see the mysqldump reference.

External replication

To create a dump file for use in an external master configuration, see Replicating from an external server.

To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:export page.

What's next