This page describes importing data into and exporting data from Cloud SQL instances using SQL dump files.
Exporting from Cloud SQL to a SQL dump file isn't supported for SQL Server.
Before you begin
Exports use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.
For best practices, see Best Practices for Importing and Exporting Data.
After completing an import operation, verify the results.
Export data from Cloud SQL for SQL Server
Exporting from Cloud SQL to a SQL dump file isn't supported for SQL Server.
Import data to Cloud SQL for SQL Server
Required roles and permissions for importing to Cloud SQL for SQL Server
To import data from Cloud Storage into Cloud SQL, the user initiating the import must have one of the following roles:
- The Cloud SQL Admin role
- A custom role that includes the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.import
Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin IAM role
- A custom role that includes the following permissions:
  - storage.objects.get
  - storage.objects.list (for striped import only)
For help with IAM roles, see Identity and Access Management.
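For example, a project administrator can grant the Cloud SQL Admin role to the importing user with the gcloud CLI. The following is a minimal sketch that assumes a hypothetical project my-project and user importer@example.com; the bucket-level grant for the instance's service account is shown in the gcloud procedure below.
# Grant the Cloud SQL Admin role (roles/cloudsql.admin) to the user who runs the import.
gcloud projects add-iam-policy-binding my-project \
    --member=user:importer@example.com \
    --role=roles/cloudsql.admin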
Import a SQL dump file to Cloud SQL for SQL Server
SQL files are plain text files with a sequence of SQL commands.
The following procedure requires you to specify an existing database to import your SQL file into.
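For illustration, a dump file for SQL Server contains a sequence of T-SQL statements like the following; the table and rows shown here are hypothetical.
-- Hypothetical example of SQL dump file contents.
CREATE TABLE dbo.customers (
    id INT PRIMARY KEY,
    name NVARCHAR(100)
);
INSERT INTO dbo.customers (id, name) VALUES (1, N'Alice');
INSERT INTO dbo.customers (id, name) VALUES (2, N'Bob');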
Console
- In the Google Cloud console, go to the Cloud SQL Instances page.
- To open the Overview page of an instance, click the instance name.
- Click Import.
- In the Choose the file you'd like to import data from section, enter the path to the bucket and SQL dump file to use for the import, or browse to an existing file. You can import a compressed (.gz) or an uncompressed (.sql) file.
- For Format, select SQL.
- Select the database you want the data to be imported into. This causes Cloud SQL to run the USE DATABASE statement before the import.
- If you want to specify a user to perform the import, select the user. If your import file contains statements that must be performed by a specific user, use this field to specify that user.
- Click Import to start the import.
gcloud
- Create a Cloud Storage bucket.
- Upload the file to your bucket. For help with uploading files to buckets, see Uploading objects.
- Describe the instance you are importing to:
  gcloud sql instances describe INSTANCE_NAME
- Copy the serviceAccountEmailAddress field.
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
  gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member=serviceAccount:SERVICE-ACCOUNT \
    --role=roles/storage.objectAdmin
- Import the database:
  gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME \
    --database=DATABASE_NAME
  For information about using the import sql command, see the sql import sql command reference page.
  If the command returns an error like ERROR_RDBMS, review the permissions; this error is often caused by a permissions issue.
- If you do not need to retain the IAM permissions you set previously, remove them using gcloud storage buckets remove-iam-policy-binding.
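As a worked example, the whole sequence might look like the following; the instance name my-sqlserver-instance, bucket my-import-bucket, file dump.sql, database mydb, and the service account address are hypothetical placeholders.
# 1. Find the instance's service account in the describe output.
gcloud sql instances describe my-sqlserver-instance

# 2. Let that service account read the dump file in the bucket.
gcloud storage buckets add-iam-policy-binding gs://my-import-bucket \
    --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
    --role=roles/storage.objectAdmin

# 3. Run the import into the target database.
gcloud sql import sql my-sqlserver-instance gs://my-import-bucket/dump.sql \
    --database=mydb

# 4. Optionally remove the temporary bucket permission.
gcloud storage buckets remove-iam-policy-binding gs://my-import-bucket \
    --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
    --role=roles/storage.objectAdmin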
REST v1
- Create a Cloud Storage bucket.
- Upload the file to your bucket. For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Import your dump file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_sql_file: The path to the SQL file
- database_name: The name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://bucket_name/path_to_sql_file", "database": "database_name" } }
To send your request, use a command-line tool such as curl; a minimal sketch follows.
You should receive a JSON response that describes the import operation.
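The following curl sketch assumes that you replace the placeholders in the URL, that request.json is a hypothetical local file containing the request body shown above, and that your gcloud credentials are authorized for the project.
# Send the import request; request.json holds the importContext body shown above.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"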
For the complete list of parameters for the request, see the instances:import page.
- If you do not need to retain the IAM permissions you set previously, remove them now.
REST v1beta4
- Create a Cloud Storage bucket.
- Upload the file to your bucket. For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Import your dump file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_sql_file: The path to the SQL file
- database_name: The name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://bucket_name/path_to_sql_file", "database": "database_name" } }
Send the request the same way as for the v1 endpoint, for example with curl.
You should receive a JSON response that describes the import operation; a sketch for checking that operation's status follows.
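As a minimal sketch, assuming the response's name field contains a hypothetical operation ID abcd1234, you can poll the operation until its status is DONE.
# Check the status of the import operation returned by the previous request.
# abcd1234 is a hypothetical operation ID taken from the "name" field of the response.
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/abcd1234"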
For the complete list of parameters for the request, see the instances:import page.
- If you do not need to retain the IAM permissions you set previously, remove them now.
What's next
- Learn how to check the status of import and export operations.
- Learn more about best practices for importing and exporting data.
- Learn more about Cloud Storage.
- Review known issues for imports and exports.