This page describes how to export data from Cloud SQL instances.
Exports use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.
For best practices for exporting data, see Best Practices for Importing and Exporting Data.
Exporting data from Cloud SQL to a BAK file
Before you begin
Make sure you have configured the required roles and permissions.
Exporting data to a BAK file in Cloud Storage
To export data from a database on a Cloud SQL instance to a BAK file in a Cloud Storage bucket:
Console
- Go to the Cloud SQL Instances page in the Google Cloud Console.
- Click the instance you want to export data from to open its Overview page.
- Click EXPORT in the button bar.
- Under File format, click BAK.
- Under Data to export, use the drop-down menu to select the database you want to export from.
- Under Destination, select Browse to search for a Cloud Storage bucket or folder for your export.
- Click Export to begin the export.
gcloud
- Create a Cloud Storage bucket, if you haven't already.
For help with creating a bucket, see Creating Storage Buckets.
- Describe the instance you are exporting from:
gcloud sql instances describe [INSTANCE_NAME]
- Copy the serviceAccountEmailAddress field.
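If you prefer, you can print just that field, a sketch using gcloud's built-in formatting (substitute your own instance name):
gcloud sql instances describe [INSTANCE_NAME] \
--format="value(serviceAccountEmailAddress)"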
- Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket, as sketched below. For help with setting IAM permissions, see Using IAM permissions.
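A minimal sketch, assuming [SERVICE_ACCOUNT_EMAIL] is the address you copied in the previous step:
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:roles/storage.objectAdmin gs://[BUCKET_NAME]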
- Export the database:
gcloud beta sql export bak [INSTANCE_NAME] gs://[BUCKET_NAME]/sqlbakfile.bak \
--database=[DATABASE_NAME]
For information about using the gcloud beta sql export bak command, see the command reference page.
- If you do not need to retain the IAM role you set previously, revoke it now, as sketched below.
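A minimal sketch of revoking the role, assuming the same service account and bucket as above:
gsutil iam ch -d serviceAccount:[SERVICE_ACCOUNT_EMAIL]:roles/storage.objectAdmin gs://[BUCKET_NAME]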
REST v1beta4
- Create a bucket for the export:
gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
This step is not required, but is strongly recommended so that you do not open up access to any other data.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data below, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_dump_file: The path to the SQL dump file
- database_name: The name of a database inside the Cloud SQL instance
- offload: Enables serverless export. Set to true to use serverless export.
HTTP method and URL:
POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_dump_file", "databases": ["database_name"] "offload": true | false } }
Send the request using a tool such as curl, as sketched below. You should receive a JSON response describing the long-running export operation.
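For example, with curl (a sketch that assumes the JSON body above is saved as request.json and that you are authenticated with the gcloud CLI):
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"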
- If you do not need to retain the IAM role you set previously, revoke it now.
Required roles and permissions
This procedure requires you to export a file to Cloud Storage. To export data into Cloud Storage, the instance's service account or the user must have the Cloud SQL Editor role, or a custom role that includes the cloudsql.instances.export permission, and at least the roles/storage.legacyBucketWriter IAM role.
If the service account or user is also performing import operations, you can grant the account the storage.objectAdmin IAM role instead, which covers both import and export.
For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.
You can find the instance's service account name in the Google Cloud Console on your instance's Overview page. You can verify the roles for your Cloud Storage bucket by using the gsutil tool to inspect the bucket:
gsutil iam get gs://[BUCKET_NAME]
Learn more about using IAM with buckets.
Troubleshooting
Click the links in the table for details:
For this problem... | The issue might be... | Try this... |
---|---|---|
Can't see the operation status. | The user interface only shows success or failure. | Use these database commands to find out more. |
Export is taking too long. | Cloud SQL does not support concurrent synchronous operations. | Use export offloading. Learn more. |
Import is taking too long. | Too many active connections can interfere with import operations. | Close unused connections, or restart the Cloud SQL instance before beginning an import operation. |
Import fails. | Exported file may contain database users who do not yet exist. | Clean up the failed database before retrying the import. Create the database users before doing the import. |
Connection closed during the export operation. | Query must produce data within the first seven minutes. | Test the query manually. Learn more. |
Unknown error during export. | Possible bandwidth issue. | Ensure that both the instance and the Cloud Storage bucket are in the same region. |
You want to automate exports. | Cloud SQL does not provide a way to automate exports. | Build your own pipeline to perform this functionality. Learn more. |
ERROR_RDBMS: system error occurred. | Cloud Storage permissions or non-existent table. | Check permissions OR ensure the table exists. |
After import, you can't log in to Cloud SQL with your original database user. | The user doesn't have the LOGIN permission. | Log in with the default sqlserver user, then grant the LOGIN permission to the original user. |
Can't see the operation status
You can't see the status of an ongoing operation.
The issue might be
The Google Cloud Console reports only success or failure when done, and is not designed to return warnings.
Things to try
Connect to the database and run SHOW WARNINGS.
Export is taking too long
Export is taking too long, blocking other operations.
The issue might be
Cloud SQL does not support concurrent synchronous operations.
Things to try
Try exporting smaller datasets at a time.
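The troubleshooting table above also suggests export offloading. A sketch, assuming your gcloud version supports the --offload flag for BAK exports:
gcloud beta sql export bak [INSTANCE_NAME] gs://[BUCKET_NAME]/sqlbakfile.bak \
--database=[DATABASE_NAME] --offload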
Import is taking too long
Import is taking too long, blocking other operations.
The issue might be
Too many active connections can interfere with import operations. Connections consume CPU and memory, limiting the resources available.
Things to try
Close unused connections. Check CPU and memory usage to make sure there are plenty of resources available. The best way to ensure maximum resources for the import operation is to restart the instance before beginning the operation (see the sketch after this list). A restart:
- Closes all connections.
- Ends any tasks that may be consuming resources.
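You can restart the instance from the command line, a sketch using the standard gcloud command:
gcloud sql instances restart [INSTANCE_NAME]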
Import fails
Import fails when one or more users referenced in the exported SQL dump file do not exist.
The issue might be
Before importing a SQL dump, all the database users who own objects or were granted permissions on objects in the dumped database must exist. If they do not, the restore fails to recreate the objects with the original ownership and/or permissions.
Things to try
Clean up the failed database before retrying the import. Create the database users before importing the SQL dump.
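For example, a minimal sketch that creates a missing login with sqlcmd before retrying the import; the host, password, and login name are placeholders, and CREATE LOGIN is standard T-SQL:
sqlcmd -S [INSTANCE_IP] -U sqlserver -P [PASSWORD] \
-Q "CREATE LOGIN [LOGIN_NAME] WITH PASSWORD = '[STRONG_PASSWORD]'"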
Connection closed during the export operation
Connection closed during the export operation.
The issue might be
The connection to Cloud Storage may be timing out because the query running in the export does not produce any data within the first seven minutes after the export is initiated.
Things to try
Test the query manually by connecting from any client and sending the output of your query to STDOUT with the following command:
COPY (INSERT_YOUR_QUERY_HERE) TO STDOUT WITH ( FORMAT csv, DELIMITER ',', ENCODING 'UTF8', QUOTE '"', ESCAPE '"' )
This is expected behavior: when the export is initiated, the client is expected to start sending data right away. Keeping the connection open with no data sent eventually breaks the connection, causing the export to fail and leaving the operation in an uncertain state. This is also what the error message from gcloud indicates: operation is taking longer than expected.
Unknown error during export
You see the error message Unknown error
while trying to export a
database to a Cloud Storage bucket.
The issue might be
The transfer might be failing due to a bandwidth issue.
Things to try
The Cloud SQL instance may be located in a different region from the Cloud Storage bucket. Reading and writing data from one continent to another involves a lot of network usage, and can cause intermittent issues like this. Check the regions of your instance and bucket.
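One way to compare the two locations, a sketch using standard gcloud and gsutil commands:
gcloud sql instances describe [INSTANCE_NAME] --format="value(region)"
gsutil ls -L -b gs://[BUCKET_NAME] | grep -i location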
Want to automate exports
You want to automate exports.
The issue might be
Cloud SQL does not provide a way to automate exports.
Things to try
You could build your own automated export system using Google Cloud products such as Cloud Scheduler, Pub/Sub, and Cloud Functions.
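As one hedged starting point, Cloud Scheduler can also call the Admin API's export endpoint directly, without Pub/Sub or Cloud Functions in between; the job name, schedule, and service account below are placeholders:
gcloud scheduler jobs create http nightly-sql-export \
--schedule="0 3 * * *" \
--http-method=POST \
--uri="https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export" \
--message-body='{"exportContext":{"fileType":"BAK","uri":"gs://bucket_name/nightly.bak","databases":["database_name"]}}' \
--oauth-service-account-email=[SERVICE_ACCOUNT_EMAIL]
Because each run writes to the same object name, a production pipeline would generate dated file names, which is where Pub/Sub and Cloud Functions come in.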
ERROR_RDBMS system error occurred
You see the error message [ERROR_RDBMS] system error occurred.
The issue might be
- The user might not have all the Cloud Storage permissions it needs.
- The database table might not exist.
Things to try
- Check that you have at least WRITER permissions on the bucket and READER permissions on the export file. For more information on configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
- Ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket.
After import, you can't log in to Cloud SQL with your original database user
After importing a database into SQL Server, you can't log in to Cloud SQL with your original database user.
The issue might be
The user doesn't have the LOGIN permission.
Things to try
Log in with the default sqlserver user. Then grant the LOGIN permission to the original user.
What's next
- Learn how to check the status of import and export operations.
- Learn more about importing and exporting data.
- Learn more about Cloud Storage.