Best practices for importing and exporting data

This page provides best practices for importing and exporting data with Cloud SQL. For step-by-step instructions for importing data into Cloud SQL, see Importing Data. For step-by-step instructions for exporting your data, whether it is in Cloud SQL or an instance you manage, see Exporting Data.

Best practices for importing and exporting

The following are best practices to consider when importing and exporting data:

Don't use Cloud Storage Requester Pays buckets

You cannot use a Cloud Storage bucket that has Requester Pays enabled for imports and exports from Cloud SQL.
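If you aren't sure whether a bucket has Requester Pays enabled, one way to check the setting before starting an import or export is with gsutil (assuming you have permission on the bucket; the bucket name is a placeholder):

    gsutil requesterpays get gs://BUCKET_NAME
    gsutil requesterpays set off gs://BUCKET_NAME

The second command turns the setting off; run it only if disabling Requester Pays is acceptable for that bucket.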

Use the correct flags when you create a SQL dump file

If you create a SQL dump file without the export flags recommended for your database engine, the resulting file might not import into Cloud SQL correctly. Follow the engine-specific export instructions when you create the dump file.

Compress data to reduce cost

Cloud SQL supports importing and exporting both compressed and uncompressed files. Compression can save significant storage space on Cloud Storage and reduce your storage costs, especially when you are exporting large instances.

When you export a BAK file, use a .gz file extension to compress the data. When you import a file with a .gz extension, it is decompressed automatically.
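For example, an export command along the following lines writes a compressed file directly to Cloud Storage. This is a sketch; the instance, bucket, and database names are placeholders:

    gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/DATABASE_NAME.bak.gz \
        --database=DATABASE_NAME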

Reduce long-running import and export processes

Imports into Cloud SQL and exports out of Cloud SQL using a Cloud Storage bucket can take a long time to complete, depending on the size of the database. This can have the following impacts:

  • You cannot stop a long-running Cloud SQL instance operation.
  • You can perform only one import or export operation at a time for each instance.

You can decrease the amount of time it takes to complete each operation by using the Cloud SQL import or export functionality with smaller batches of data.
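For example, rather than exporting every database in a single operation, you might export one database at a time and confirm that each operation has finished before starting the next. A minimal sketch with placeholder instance, bucket, and database names:

    gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/db1.bak.gz --database=db1
    gcloud sql operations list --instance=INSTANCE_NAME --limit=1
    gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/db2.bak.gz --database=db2

The operations list command shows whether the previous export has completed, since only one import or export operation can run at a time for each instance.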

Verify the imported database

After an import operation is complete, connect to your database and run the appropriate database commands to make sure the contents are correct. For example, connect and list the databases, tables, and specific entries.
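For example, on a SQL Server instance you might connect with gcloud and run a few queries such as the following. The instance, database, and table names are placeholders, and the sqlserver user is the default administrative user; substitute another user if appropriate:

    gcloud sql connect INSTANCE_NAME --user=sqlserver

    -- In the resulting session:
    SELECT name FROM sys.databases;
    USE DATABASE_NAME;
    SELECT COUNT(*) FROM dbo.TABLE_NAME;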

Known limitations

For a list of known limitations, see Issues with importing and exporting data.

Automating export operations

Although Cloud SQL doesn't provide a built-in way to automate database exports, you can build your own automation tool using several Google Cloud components. To learn more, see this tutorial.
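As a rough sketch of the export step such a tool performs, a scheduled job (for example, a Cloud Function triggered through Pub/Sub) can call the export method of the Cloud SQL Admin API. The project, instance, bucket, and database names below are placeholders, and curl is used only to illustrate the shape of the request:

    curl -X POST \
        -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        -H "Content-Type: application/json" \
        -d '{"exportContext": {"fileType": "BAK", "uri": "gs://BUCKET_NAME/DATABASE_NAME.bak.gz", "databases": ["DATABASE_NAME"]}}' \
        "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_NAME/export"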

Troubleshooting

Troubleshooting import operations

Issue: The import operation is taking too long.

Troubleshooting: Too many active connections can interfere with import operations. Close unused connections, and check the CPU and memory usage of your Cloud SQL instance to make sure there are plenty of resources available. The best way to ensure maximum resources for the import is to restart the instance before beginning the operation (see the example command below).

A restart:

  • Closes all connections.
  • Ends any tasks that may be consuming resources.

Issue: The import operation fails because one or more users referenced in the dump file don't exist.

Troubleshooting: Before importing a dump file, all the database users who own objects or were granted permissions on objects in the dumped database must exist in the target database. If they don't, the import operation fails to recreate the objects with the original ownership or permissions. Create the database users before importing.
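If you restart the instance before a large import, as suggested above, the restart itself is a single command (the instance name is a placeholder). Keep in mind that a restart briefly takes the instance offline:

    gcloud sql instances restart INSTANCE_NAME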

Troubleshooting export operations

Issue: You want exports to be automated.

Troubleshooting: Cloud SQL does not provide a way to automate exports. You could build your own automated export system using Google Cloud products such as Cloud Scheduler, Pub/Sub, and Cloud Functions, similar to this article on automating backups (a minimal scheduling sketch follows).
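A minimal sketch of the scheduling half of such a system, assuming a hypothetical Pub/Sub topic named export-trigger that your export function subscribes to (the job name, schedule, and message body are also placeholders):

    gcloud scheduler jobs create pubsub nightly-export \
        --schedule="0 2 * * *" \
        --topic=export-trigger \
        --message-body='{"database": "DATABASE_NAME"}'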

What's next