Export and import using SQL dump files


This page describes how to export data from and import data into Cloud SQL instances by using SQL dump files.

Exporting from Cloud SQL to a SQL dump file is not supported for SQL Server.

Before you begin

Exports use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.

For best practices, see Best Practices for Importing and Exporting Data.

After completing an import operation, verify the results.
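
One quick way to verify an import, assuming a hypothetical instance named myinstance, is to check the status of the most recent operation on the instance:

    # List the most recent operation on the instance; a completed import
    # is listed with TYPE IMPORT and STATUS DONE.
    gcloud sql operations list --instance=myinstance --limit=1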

Export

Exporting from Cloud SQL to a SQL dump file is not supported for SQL Server.

Import

Required roles and permissions for importing

This procedure requires you to import a file from Cloud Storage. To import data from Cloud Storage, the Cloud SQL instance service account or user must have one of the following sets of roles:

  • The Cloud SQL Admin role and the roles/storage.legacyObjectReader IAM role
  • A custom role including the following permissions:
    • cloudsql.instances.get
    • cloudsql.instances.import
    • storage.buckets.get
    • storage.objects.get

If the service account or user is also performing export operations, it might be more convenient to grant the Storage Object Admin IAM role to have all the required permissions for both import and export. For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.
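
For example, you might grant these roles with the following commands. The project example-project, user alex@example.com, service account p123456789-abc@gcp-sa-cloud-sql.iam.gserviceaccount.com, and bucket example-import-bucket are hypothetical values; substitute your own.

    # Grant the Cloud SQL Admin role to the user who will run the import.
    gcloud projects add-iam-policy-binding example-project \
      --member="user:alex@example.com" \
      --role="roles/cloudsql.admin"

    # Grant read access on objects in the import bucket to the instance's
    # service account (taken from the instance's serviceAccountEmailAddress field).
    gsutil iam ch \
      serviceAccount:p123456789-abc@gcp-sa-cloud-sql.iam.gserviceaccount.com:legacyObjectReader \
      gs://example-import-bucket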

Import a SQL dump file

SQL files are plain text files with a sequence of SQL commands.

For the instructions below, prepare to specify a new database; do not create a database before starting the import of your SQL file.
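
As an illustration, a dump file is nothing more than ordinary SQL statements that run in order during the import. The following sketch of a file for a hypothetical guestbook database is an example only:

    CREATE TABLE guestbook (
        entry_id INT IDENTITY(1,1) PRIMARY KEY,
        guest_name NVARCHAR(255),
        content NVARCHAR(MAX)
    );

    INSERT INTO guestbook (guest_name, content)
    VALUES (N'first guest', N'I got here!'),
           (N'second guest', N'Me too!');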

Console

  1. In the Google Cloud console, go to the Cloud SQL Instances page.

    Go to Cloud SQL Instances

  2. To open the Overview page of an instance, click the instance name.
  3. Click Import.
  4. In the Choose the file you'd like to import data from section, enter the path to the bucket and SQL dump file to use for the import, or browse to an existing file.

    You can import a compressed (.gz) or an uncompressed (.sql) file.

  5. For Format, select SQL.
  6. Select the database you want the data to be imported into.

    This causes Cloud SQL to run the USE DATABASE statement before the import.

  7. If you want to specify a user to perform the import, select the user.

    If your import file contains statements that must be performed by a specific user, use this field to specify that user.

  8. Click Import to start the import.

gcloud

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Describe the instance you are importing to:
    gcloud sql instances describe INSTANCE_NAME
    
  4. Copy the serviceAccountEmailAddress field.
  5. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket.
    gsutil iam ch serviceAccount:SERVICE-ACCOUNT:objectAdmin \
      gs://BUCKET-NAME

    For help with setting IAM permissions, see Using IAM permissions.
  6. Import the database:
    gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME \
    --database=DATABASE_NAME
    

    For information about using the import sql command, see the sql import sql command reference page.

    If the command returns an error like ERROR_RDBMS, review the permissions; this error often indicates a permissions problem.

  7. If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
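
Taken together, the steps above might look like the following run. The instance name myinstance, bucket example-import-bucket, dump file guestbook-dump.sql, target database guestbook, and the service account address are hypothetical values; substitute your own.

    # Find the instance's service account (steps 3-4).
    gcloud sql instances describe myinstance \
      --format="value(serviceAccountEmailAddress)"

    # Grant it access to the bucket (step 5); the address below is an example.
    gsutil iam ch \
      serviceAccount:p123456789-abcdef@gcp-sa-cloud-sql.iam.gserviceaccount.com:objectAdmin \
      gs://example-import-bucket

    # Run the import (step 6).
    gcloud sql import sql myinstance \
      gs://example-import-bucket/guestbook-dump.sql \
      --database=guestbook

    # Remove the bucket-level grant if it is no longer needed (step 7).
    gsutil iam ch -d \
      serviceAccount:p123456789-abcdef@gcp-sa-cloud-sql.iam.gserviceaccount.com:objectAdmin \
      gs://example-import-bucket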

REST v1

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Provide your instance with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  4. Import your dump file:

    Before using any of the request data, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_sql_file: The path to the SQL file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import

    Request JSON body:

    {
      "importContext":
        {
          "fileType": "SQL",
          "uri": "gs://bucket_name/path_to_sql_file",
          "database": "database_name"
        }
    }

    You can send the request with a tool such as curl; a minimal sketch follows these steps. A successful request returns a JSON response that describes the long-running import operation.

    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the IAM permissions you set previously, remove them now.
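
As one way to send the request, here is a minimal curl sketch. The project example-project and instance myinstance are hypothetical values, and request.json is the request body shown above saved to a local file.

    # POST the import request to the Cloud SQL Admin API, authenticating
    # with your current gcloud credentials.
    curl -X POST \
      -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      -H "Content-Type: application/json; charset=utf-8" \
      -d @request.json \
      "https://sqladmin.googleapis.com/v1/projects/example-project/instances/myinstance/import"

The response describes a long-running operation that you can poll, for example with the operations.get method, until the import finishes.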

REST v1beta4

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  4. Import your dump file:

    Before using any of the request data, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_sql_file: The path to the SQL file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    {
      "importContext":
        {
          "fileType": "SQL",
          "uri": "gs://bucket_name/path_to_sql_file",
          "database": "database_name"
        }
    }

    You can send the request with a tool such as curl; see the sketch after these steps. A successful request returns a JSON response that describes the long-running import operation.

    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the IAM permissions you set previously, remove them now.
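
The same curl sketch shown for REST v1 applies here; only the URL path changes. Again, example-project and myinstance are hypothetical values.

    # POST the import request to the v1beta4 endpoint of the Admin API.
    curl -X POST \
      -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      -H "Content-Type: application/json; charset=utf-8" \
      -d @request.json \
      "https://sqladmin.googleapis.com/sql/v1beta4/projects/example-project/instances/myinstance/import"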

What's next