Importing data into Cloud SQL

For best practices for importing data, see Best Practices for Importing and Exporting Data.

To import data from Cloud Storage, the instance's service account must have the legacyBucketReader Cloud IAM role set in the project. For more information, see Cloud Identity and Access Management for Cloud Storage.

Importing SQL dump files to Cloud SQL

Before you begin

  • SQL dump files must have been created with the correct parameters.

    For help with creating SQL dump files ready for import into Cloud SQL, see Exporting Data.

Importing a SQL dump file

To import a SQL dump file to a Cloud SQL instance:

Console

    Create a SQL dump file using the instructions in Exporting Data to ensure that your dump file is usable by Cloud SQL.

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  3. Go to the Cloud SQL Instances page in the Google Cloud Console.

  4. Select the instance to open its Instance details page.
  5. Click Import in the button bar.
  6. Enter the path to the bucket and SQL dump file you uploaded, or browse to it: click Browse, double-click the name of the bucket, select the file, and then click Select.

    You can use a compressed (.gz) or an uncompressed (.sql) file.

  7. For Format, select SQL.
  8. Select the database you want the data to be imported into.

    This causes Cloud SQL to run the USE DATABASE statement before the import.

  9. Click Import to start the import.

gcloud

    Create a SQL dump file using the instructions in Exporting Data to ensure that your dump file is usable by Cloud SQL.

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  3. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  4. Copy the serviceAccountEmailAddress field.
  5. Use gsutil iam to grant the legacyBucketWriter and objectViewer Cloud IAM roles to the service account for the bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  6. Import the database:
    gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME] \
                                --database=[DATABASE_NAME]
    

    For information about using the import sql command, see the sql import sql command reference page.

    If the command returns an error like `ERROR_RDBMS`, check that the service account has the required permissions on the bucket; this error is often caused by insufficient permissions.

  7. If you do not need to retain the Cloud IAM permissions you set previously, remove them using gsutil iam.
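
Steps 3 through 7 above can be sketched as a single script. The instance, bucket, file, and database names below are hypothetical placeholders, and the script defaults to a dry run that only prints each command; set DRY_RUN=0 to actually execute them.

```shell
#!/usr/bin/env bash
# One-script version of the gcloud import steps (placeholder values;
# DRY_RUN=1 by default, so commands are printed rather than executed).
set -euo pipefail

INSTANCE="my-instance"        # assumption: replace with your instance name
BUCKET="my-import-bucket"     # assumption: replace with your bucket name
FILE="dump.sql.gz"
DB="mydb"
run() { if [ "${DRY_RUN:-1}" = 1 ]; then echo "+ $*"; else "$@"; fi; }

# Steps 3-4: find the instance's service account. In a real run, capture it:
#   SA=$(gcloud sql instances describe "$INSTANCE" --format='value(serviceAccountEmailAddress)')
SA="p12345-abcdef@gcp-sa-cloud-sql.iam.gserviceaccount.com"  # placeholder
run gcloud sql instances describe "$INSTANCE" --format="value(serviceAccountEmailAddress)"

# Step 5: grant the bucket roles the import needs.
run gsutil iam ch "serviceAccount:$SA:roles/storage.legacyBucketWriter" "gs://$BUCKET"
run gsutil iam ch "serviceAccount:$SA:roles/storage.objectViewer" "gs://$BUCKET"

# Step 6: run the import.
run gcloud sql import sql "$INSTANCE" "gs://$BUCKET/$FILE" --database="$DB"

# Step 7: revoke the temporary grants once the import finishes.
run gsutil iam ch -d "serviceAccount:$SA" "gs://$BUCKET"
```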

REST

    Create a SQL dump file using the instructions in Exporting Data to ensure that your dump file is usable by Cloud SQL.

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  3. Provide your instance with the legacyBucketWriter and objectViewer Cloud IAM roles for your bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  4. Import your dump file:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_dump_file: The path to the SQL dump file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    {
      "importContext": {
        "fileType": "SQL",
        "uri": "gs://bucket_name/path_to_dump_file",
        "database": "database_name"
      }
    }
    
    

    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the Cloud IAM permissions you set previously, remove them now.
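
The request above can be assembled and sanity-checked from the shell before anything is sent. The sketch below uses the same placeholder values as the list above and only builds and validates the URL and body locally.

```shell
# Build the instances.import URL and request body from placeholder values,
# then validate the JSON locally before sending anything.
PROJECT_ID="project-id"
INSTANCE_ID="instance-id"
BUCKET="bucket_name"
DUMP_PATH="path_to_dump_file"
DATABASE="database_name"

URL="https://www.googleapis.com/sql/v1beta4/projects/${PROJECT_ID}/instances/${INSTANCE_ID}/import"

BODY=$(cat <<EOF
{
  "importContext": {
    "fileType": "SQL",
    "uri": "gs://${BUCKET}/${DUMP_PATH}",
    "database": "${DATABASE}"
  }
}
EOF
)

echo "$URL"
# python3 -m json.tool fails on malformed JSON, so this is a cheap check.
echo "$BODY" | python3 -m json.tool >/dev/null && echo "body is valid JSON"
```

Assuming the gcloud CLI is installed and authenticated, one way to send the request is `curl -X POST -H "Authorization: Bearer $(gcloud auth print-access-token)" -H "Content-Type: application/json" -d "$BODY" "$URL"`.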

Importing BAK files to Cloud SQL

Before you begin

For the instructions below, plan to specify a new database name; do not create the database before starting the import of your BAK file.

Importing the BAK file

To import data to a Cloud SQL instance using a BAK file:

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

  2. Select the instance to open its Instance details page.
  3. Click Import in the button bar.
  4. Choose the BAK file you uploaded for import into your Cloud SQL instance.
  5. Under Format, select BAK.
  6. Specify the Database in your Cloud SQL instance where you wish to import the BAK file.
  7. Click Import to start the import.

gcloud

  1. Create a bucket for the import:

    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is optional but strongly recommended, so that you do not open up access to any other data.

  2. Upload the BAK file to the bucket.
  3. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  4. Copy the serviceAccountEmailAddress field.
  5. Use gsutil iam to grant the legacyBucketWriter and objectViewer Cloud IAM roles to the service account for the bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  6. Import the file:
    gcloud beta sql import bak [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
                                --database=[DATABASE_NAME]
    
  7. If you do not need to retain the Cloud IAM permissions you set previously, remove them using gsutil iam.
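
Steps 3 and 4 reduce to a short pipe if you are parsing the full describe output. The describe output below is simulated with a hypothetical service-account address so the extraction can be shown on its own.

```shell
# Extract serviceAccountEmailAddress from `gcloud sql instances describe`
# output. A simulated line stands in for the real command here.
DESCRIBE_OUTPUT='serviceAccountEmailAddress: p12345-abcdef@gcp-sa-cloud-sql.iam.gserviceaccount.com'

SA=$(printf '%s\n' "$DESCRIBE_OUTPUT" | awk '/^serviceAccountEmailAddress:/ {print $2}')
echo "$SA"
```

In practice, `gcloud sql instances describe [INSTANCE_NAME] --format='value(serviceAccountEmailAddress)'` returns the address directly, with no parsing step.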

REST

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  3. Provide your instance with the legacyBucketWriter and objectViewer Cloud IAM roles for your bucket. For help with setting Cloud IAM permissions, see Using Cloud IAM permissions.
  4. Import the file:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_bak_file: The path to the BAK file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    {
      "importContext": {
        "fileType": "BAK",
        "uri": "gs://bucket_name/path_to_bak_file",
        "database": "database_name"
      }
    }
    
    

    To use a different user for the import, specify the importContext.importUser property.

    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the Cloud IAM permissions you set previously, remove the permissions.
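
The importContext.importUser property mentioned above slots into the same request body. The sketch below builds a BAK import body with a hypothetical importUser value ("sqlserver") and validates it locally; substitute your own values and send it the same way as the request above.

```shell
# BAK import request body with importContext.importUser set. All values
# are placeholders; "sqlserver" stands in for the user to import as.
BODY=$(cat <<'EOF'
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_bak_file",
    "database": "database_name",
    "importUser": "sqlserver"
  }
}
EOF
)

# Fails if the body is not well-formed JSON.
echo "$BODY" | python3 -m json.tool
```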

If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For more information about configuring access control in Cloud Storage, see Create and Manage Access Control Lists.

To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.

What's next