Importing data into Cloud SQL

Importing data from BAK files into Cloud SQL

If your instance runs Microsoft SQL Server Enterprise Edition, you can import encrypted BAK files.

Before you begin

  • For the instructions below, prepare to specify a new database; do not create a database before starting the import of your BAK file.
  • This procedure requires you to import a file from Cloud Storage. To import data from Cloud Storage, the Cloud SQL instance's service account or the user must have the Storage Object Admin (storage.objectAdmin) IAM role set. For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.

    You can find the instance's service account name in the Google Cloud Console on your instance's Overview page. You can verify the roles for your Cloud Storage bucket by using the gsutil tool:

    gsutil iam get gs://[BUCKET_NAME]

    Learn more about using IAM with buckets.
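
    As a sketch, granting the role to the instance's service account can be done with gsutil iam ch. The service account and bucket name below are hypothetical placeholders; substitute the values from your instance's Overview page and your own bucket. The command is echoed rather than executed so you can review it first:

    ```shell
    # Hypothetical values; replace with your own before running.
    SERVICE_ACCOUNT="p123456789-abcdef@gcp-sa-cloud-sql.iam.gserviceaccount.com"
    BUCKET="my-import-bucket"

    # Grant the Storage Object Admin role on the bucket to the instance's
    # service account. Echoed for review; remove the echo to run it.
    CMD="gsutil iam ch serviceAccount:${SERVICE_ACCOUNT}:roles/storage.objectAdmin gs://${BUCKET}"
    echo "${CMD}"
    ```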

Importing data from a BAK file in Cloud Storage

To import data to a Cloud SQL instance using a BAK file:

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Select the instance to open its Overview page.
  3. Click Import in the button bar.
  4. Under Choose the file you'd like to import data from, enter the path to the bucket and BAK file to use for the import. Or to browse to the file:
    1. Click Browse.
    2. Under Location, double-click the name of the bucket in the list.
    3. Select the file in the list.
    4. Click Select.

    You can import a compressed (.gz) or an uncompressed file.

  5. Under Format, select BAK.
  6. Specify the Database in your Cloud SQL instance where you want to import the BAK file.
  7. Click Import to start the import.

gcloud

  1. Create a bucket for the import:

    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Upload the data from the BAK file to the bucket.
  3. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  4. Copy the serviceAccountEmailAddress field.
  5. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
  6. Import the data from the file:
    gcloud beta sql import bak [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
                                --database=[DATABASE_NAME]
    
  7. If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
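
The gcloud steps above can be sketched as a single reviewable sequence. All names (project, location, bucket, instance, file, database) are hypothetical placeholders, and the commands are written to a file and printed rather than executed, so you can inspect them before running anything for real:

```shell
# Dry-run sketch of the import steps; remove the echoes to run for real.
PROJECT="my-project"; LOCATION="us-central1"
BUCKET="my-import-bucket"; INSTANCE="my-sql-server-instance"
FILE="backup.bak"; DATABASE="my-database"

{
  echo "gsutil mb -p ${PROJECT} -l ${LOCATION} gs://${BUCKET}"
  echo "gsutil cp ${FILE} gs://${BUCKET}/"
  echo "gcloud sql instances describe ${INSTANCE} --format='value(serviceAccountEmailAddress)'"
  echo "gsutil iam ch serviceAccount:[SERVICE_ACCOUNT]:roles/storage.objectAdmin gs://${BUCKET}"
  echo "gcloud beta sql import bak ${INSTANCE} gs://${BUCKET}/${FILE} --database=${DATABASE}"
} > import-steps.txt
cat import-steps.txt
```

The describe command's `--format='value(...)'` flag prints only the serviceAccountEmailAddress field, which saves copying it out of the full output by hand.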

REST

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  4. Import the data from the file:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_bak_file: The path to the BAK file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    {
     "importContext":
       {
          "fileType": "BAK",
          "uri": "gs://bucket_name/path_to_bak_file",
          "database": "database_name"
        }
    }
    
    

    Send the request using an authenticated HTTP client such as curl or the APIs Explorer. A successful request returns a JSON response describing a long-running operation.

    To use a different user for the import, specify the importContext.importUser property.

    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the IAM permissions you set previously, remove the permissions.
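
The request body above can be composed and validated locally before sending. The bucket, file path, database, project, and instance names below are hypothetical placeholders:

```shell
# Sketch: write the import request body and check it is well-formed JSON.
cat > import-request.json <<'EOF'
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://my-import-bucket/backup.bak",
    "database": "my-database"
  }
}
EOF
python3 -m json.tool import-request.json

# The body could then be sent with an authenticated POST, for example:
#   curl -X POST \
#     -H "Authorization: Bearer $(gcloud auth print-access-token)" \
#     -H "Content-Type: application/json" \
#     -d @import-request.json \
#     "https://www.googleapis.com/sql/v1beta4/projects/my-project/instances/my-instance/import"
```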

If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.

To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.

Importing data from SQL files into Cloud SQL

SQL files are plain text files with a sequence of SQL commands.
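
For illustration, a minimal SQL file might look like the following; the table and row are purely hypothetical, and the file is written out here with a shell heredoc:

```shell
# Create a small example SQL import file. Everything in it is illustrative.
cat > example-import.sql <<'EOF'
CREATE TABLE guestbook (
  entry_id INT PRIMARY KEY,
  guest_name VARCHAR(255),
  entry TEXT
);
INSERT INTO guestbook (entry_id, guest_name, entry)
VALUES (1, 'first guest', 'hello');
EOF
```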

Before you begin

  • For the instructions below, prepare to specify a new database; do not create a database before starting the import of your SQL file.
  • This procedure requires you to import a file from Cloud Storage. To import data from Cloud Storage, the Cloud SQL instance's service account or the user must have the Storage Object Admin (storage.objectAdmin) IAM role set. For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.

    You can find the instance's service account name in the Google Cloud Console on your instance's Overview page. You can verify the roles for your Cloud Storage bucket by using the gsutil tool:

    gsutil iam get gs://[BUCKET_NAME]

    Learn more about using IAM with buckets.

Importing the SQL file

To import data to a Cloud SQL instance using a SQL file:

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Select the instance to open its Instance details page.
  3. Click Import in the button bar.
  4. Under Choose the file you'd like to import data from, enter the path to the bucket and SQL file to use for the import. Or to browse to the file:
    1. Click Browse.
    2. Under Location, double-click the name of the bucket in the list.
    3. Select the file in the list.
    4. Click Select.

    You can import a compressed (.gz) or an uncompressed file.

  5. Under Format, select SQL.
  6. Specify the Database in your Cloud SQL instance where you want to import the SQL file.
  7. Click Import to start the import.

gcloud

  1. Create a bucket for the import:

    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Upload the SQL file to the bucket.
  3. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  4. Copy the serviceAccountEmailAddress field.
  5. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
  6. Import the file:
    gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
                                --database=[DATABASE_NAME]
    
  7. If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
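
The describe, grant, and import steps above can be sketched as one sequence. The instance, bucket, file, and database names are hypothetical placeholders, and the commands are echoed for review rather than executed:

```shell
# Dry-run sketch; remove the echoes to run the commands for real.
INSTANCE="my-instance"; BUCKET="my-import-bucket"
FILE="dump.sql"; DATABASE="my-database"

{
  echo "SA=\$(gcloud sql instances describe ${INSTANCE} --format='value(serviceAccountEmailAddress)')"
  echo "gsutil iam ch serviceAccount:\${SA}:roles/storage.objectAdmin gs://${BUCKET}"
  echo "gcloud sql import sql ${INSTANCE} gs://${BUCKET}/${FILE} --database=${DATABASE}"
} > sql-import-steps.txt
cat sql-import-steps.txt
```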

REST

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  4. Import the file:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_sql_file: The path to the SQL file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    {
     "importContext":
       {
          "fileType": "SQL",
          "uri": "gs://bucket_name/path_to_sql_file",
          "database": "database_name"
        }
    }
    
    

    Send the request using an authenticated HTTP client such as curl or the APIs Explorer. A successful request returns a JSON response describing a long-running operation.

    To use a different user for the import, specify the importContext.importUser property.

    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the IAM permissions you set previously, remove the permissions.
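
As a sketch, a request body that also sets the importContext.importUser property described above might look like the following; all values, including the user name, are hypothetical placeholders:

```shell
# Write the request body and check it is well-formed JSON.
cat > sql-import-request.json <<'EOF'
{
  "importContext": {
    "fileType": "SQL",
    "uri": "gs://my-import-bucket/dump.sql",
    "database": "my-database",
    "importUser": "sqlserver"
  }
}
EOF
python3 -m json.tool sql-import-request.json
```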

If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For more information about configuring access control in Cloud Storage, see Create and Manage Access Control Lists.

To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.

What's next